Perception of Video Manipulation
Recent advances in deep learning-based techniques enable highly realistic facial video manipulations. We investigate human observers' responses to these manipulated videos in order to assess the perceived realness of modified faces and the emotions they convey.
Facial reenactment and face swapping offer great possibilities in creative fields such as the post-processing of movie material. However, they can also easily be abused to create defamatory video content that damages the reputation of the person depicted. As humans are highly specialized in processing and analyzing faces, we aim to investigate human perception of current facial manipulation techniques. Our insights can guide both the creation of virtual actors with high perceived realness and the detection of manipulations based on explicit and implicit observer feedback.
Investigating the Perceived Authenticity and Communicative Abilities of Face-Swapped Portrait Videos
PhD thesis, TU Braunschweig, February 2023.
Personality Analysis of Face Swaps: Can They be Used as Avatars?
in Proc. ACM International Conference on Intelligent Virtual Agents, no. 14, ACM, pp. 1-8, September 2022.
Automatic Generation of Customized AOIs and Evaluation of Observers' Gaze in Portrait Videos
in Proceedings of the ACM on Human-Computer Interaction, vol. 6, no. ETRA, ACM, pp. 144:1-144:14, May 2022.
EEG-based Analysis of the Impact of Familiarity in the Perception of Deepfake Videos
in Proc. IEEE International Conference on Image Processing (ICIP), IEEE, pp. 160-164, September 2021.
Towards Understanding Perceptual Differences between Genuine and Face-Swapped Videos
in Proc. ACM Human Factors in Computing Systems (CHI), no. 240, ACM, pp. 1-13, May 2021.
Altering the Conveyed Facial Emotion Through Automatic Reenactment of Video Portraits
in Proc. International Conference on Computer Animation and Social Agents (CASA), vol. 1300, Springer, Cham, pp. 128-135, November 2020.
PEFS: A Validated Dataset for Perceptual Experiments on Face Swap Portrait Videos
in Proc. International Conference on Computer Animation and Social Agents (CASA), vol. 1300, Springer, Cham, pp. 120-127, November 2020.
Automatic Face Re-Enactment in Real-World Portrait Videos to Manipulate Emotional Expression
Master's thesis, Institut für Computergraphik, TU Braunschweig, April 2020.
Awarded the 2020 "KI-Talent" prize of the Niedersächsisches Ministerium für Wirtschaft.
This project focuses on using electroencephalography (EEG) to analyze the human visual process. Human visual perception is becoming increasingly important in the analysis of rendering methods, animation results, interface design, and visualization techniques. Our work uses EEG data to provide concrete feedback on the perception of rendered videos and images, as opposed to user studies that capture only explicit user responses. Our results so far are very promising: not only have we been able to detect a reaction to artifacts in the EEG data, but we have also been able to differentiate between artifact types based on the EEG response.
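A common first step in this kind of analysis is to average EEG segments time-locked to artifact onsets, yielding an event-related potential (ERP) whose deflection indicates a perceptual reaction. The following is a minimal, self-contained sketch of that epoching step on synthetic data; it is an illustration of the general technique, not the project's actual pipeline, and all names and parameters are hypothetical.

```python
import numpy as np

def epoch_average(eeg, onsets, pre, post):
    """Average EEG segments time-locked to event onsets (a simple ERP).

    eeg    : array of shape (channels, samples)
    onsets : sample indices of artifact onsets
    pre    : samples to keep before each onset
    post   : samples to keep after each onset
    """
    epochs = [eeg[:, t - pre:t + post] for t in onsets
              if t - pre >= 0 and t + post <= eeg.shape[1]]
    return np.mean(epochs, axis=0)

# Synthetic demo: 4 channels, 1000 samples of noise, with a
# simulated evoked deflection injected after each "artifact" onset.
rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 1.0, (4, 1000))
onsets = [200, 500, 800]
for t in onsets:
    eeg[:, t:t + 50] += 5.0  # simulated artifact-evoked response

erp = epoch_average(eeg, onsets, pre=100, post=100)
print(erp.shape)  # (4, 200): onset sits at column index 100
```

Averaging across repeated onsets suppresses uncorrelated background activity, so the evoked deflection after the onset stands out against the pre-onset baseline.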
The aim of this work is to simulate glaring headlights on a conventional monitor by first measuring the time-dependent effect of glare on human contrast perception and then integrating the quantitative findings into a driving simulator by adjusting the displayed contrast according to human perception.
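To make the idea concrete, one way such findings could be integrated is to attenuate pixel contrast around the mean luminance by a time-dependent sensitivity-loss factor. The sketch below is purely hypothetical: the exponential decay, the `tau` and `peak` parameters, and the function names are illustrative stand-ins, not the measured model from this work.

```python
import math

def contrast_attenuation(t, tau=0.5, peak=0.8):
    """Hypothetical glare model: contrast-sensitivity loss that peaks at
    glare onset (t = 0) and decays exponentially; parameters illustrative."""
    return peak * math.exp(-t / tau)

def displayed_luminance(pixel, mean, t):
    """Compress contrast around the mean luminance to mimic the
    perceived contrast loss t seconds after glare onset."""
    return mean + (pixel - mean) * (1.0 - contrast_attenuation(t))

# At onset, contrast is strongly reduced; long after onset, the
# displayed value approaches the original pixel value again.
print(displayed_luminance(1.0, 0.5, 0.0))   # 0.6
print(round(displayed_luminance(1.0, 0.5, 10.0), 3))  # 1.0
```

In a real simulator, the measured threshold-elevation curve would replace the exponential, and the adjustment would be applied per frame relative to glare onset.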
The goal of this project is to assess the quality of rendered videos and, in particular, to detect frames that contain visible artifacts, e.g., ghosting, blurring, or popping.
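For one of the named artifact classes, blurring, a simple per-frame indicator is the variance of the image Laplacian: blurred frames lack high-frequency detail and thus score low. The sketch below illustrates this general idea on synthetic data; it is not the project's detection method, and the wrap-around edge handling via `np.roll` is a simplification.

```python
import numpy as np

def laplacian_variance(frame):
    """Blur score: variance of a discrete 4-neighbor Laplacian.
    Low values suggest a blurred (detail-poor) frame.
    Note: np.roll wraps around at the borders (fine for a sketch)."""
    lap = (-4 * frame
           + np.roll(frame, 1, axis=0) + np.roll(frame, -1, axis=0)
           + np.roll(frame, 1, axis=1) + np.roll(frame, -1, axis=1))
    return lap.var()

# Demo: a sharp noise "frame" vs. a box-filtered (blurred) copy.
rng = np.random.default_rng(1)
sharp = rng.normal(size=(64, 64))
kernel = np.ones(5) / 5
blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'same'), 1, sharp)
blurred = np.apply_along_axis(lambda c: np.convolve(c, kernel, 'same'), 0, blurred)

print(laplacian_variance(sharp) > laplacian_variance(blurred))  # True
```

Thresholding such a score over time could flag suspiciously blurry frames; temporal artifacts like popping would instead require comparing scores across consecutive frames.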