Video Quality Assessment
Abstract
The goal of this project is to assess the quality of rendered videos and, in particular, to detect frames that contain visible artifacts such as ghosting, blurring, or popping.
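As a rough illustration of what frame-level artifact detection involves (a generic full-reference baseline, not one of the detectors described in the publications below), the following Python sketch scores each rendered frame against a reference render with SSIM and flags frames whose score drops below a threshold; the file paths and the threshold value are illustrative assumptions.

```python
# Minimal full-reference baseline: compare each rendered frame against a
# reference render and flag frames whose SSIM drops below a threshold.
# Localized artifacts such as ghosting or popping typically depress the score.
import cv2
from skimage.metrics import structural_similarity as ssim

def flag_artifact_frames(rendered_path, reference_path, threshold=0.90):
    """Return (frame index, SSIM score) pairs for frames scoring below threshold."""
    rendered = cv2.VideoCapture(rendered_path)
    reference = cv2.VideoCapture(reference_path)
    flagged, index = [], 0
    while True:
        ok_r, frame_r = rendered.read()
        ok_g, frame_g = reference.read()
        if not (ok_r and ok_g):          # stop at the end of either sequence
            break
        gray_r = cv2.cvtColor(frame_r, cv2.COLOR_BGR2GRAY)
        gray_g = cv2.cvtColor(frame_g, cv2.COLOR_BGR2GRAY)
        score = ssim(gray_r, gray_g)
        if score < threshold:
            flagged.append((index, score))
        index += 1
    rendered.release()
    reference.release()
    return flagged
```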
Publications
Comparative analysis of three different modalities for perception of artifacts in videos
in ACM Transactions on Applied Perception, vol. 14, no. 4, pp. 1-12, September 2017.
Gaze Visualization for Immersive Video
in Burch, Michael and Chuang, Lewis and Fisher, Brian and Schmidt, Albrecht and Weiskopf, Daniel (Eds.): Eye Tracking and Visualization, Springer, ISBN 978-3319470238, pp. 57-71, March 2017.
Visualization and Analysis of Head Movement and Gaze Data for Immersive Video in Head-mounted Displays
in Proc. Workshop on Eye Tracking and Visualization (ETVIS), vol. 1, October 2015.
ElectroEncephaloGraphics: Making Waves in Computer Graphics Research
in IEEE Computer Graphics and Applications, vol. 34, no. 6, pp. 46-56, November 2014.
Single Trial EEG Classification of Artifacts in Videos
in ACM Transactions on Applied Perception, vol. 9, no. 3, pp. 12:1-12:15, July 2012.
Evaluation of Video Artifact Perception Using Event-Related Potentials
in Proc. ACM Applied Perception in Computer Graphics and Visualization (APGV), p. 5, August 2011.
A ghosting artifact detector for interpolated image quality assessment
in Proc. IEEE International Symposium on Consumer Electronics (ISCE), June 2010.
A ghosting artifact detector for interpolated image quality assessment
in Proc. ACM Applied Perception in Computer Graphics and Visualization (APGV), September 2009.
A ghosting artifact detector for interpolated image quality assessment
Technical Report, TU Braunschweig, July 2009.
Related Projects
This project focuses on using electroencephalography (EEG) to analyze the human visual process. Human visual perception is becoming increasingly important in the analysis of rendering methods, animation results, interface design, and visualization techniques. Our work uses EEG data to provide concrete feedback on the perception of rendered videos and images, in contrast to user studies that capture only the viewer's overt response. Our results so far are very promising: not only have we been able to detect a reaction to artifacts in the EEG data, but we have also been able to differentiate between artifacts based on the EEG response.
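As a hedged sketch of such a single-trial pipeline (the file name, trigger codes, filter band, and epoch window below are assumptions for the example, not the settings used in our studies), one could epoch the EEG around the frames shown to the viewer and train a regularized classifier to separate artifact trials from clean trials, e.g. with MNE-Python and scikit-learn:

```python
# Illustrative single-trial classification of artifact-evoked EEG responses.
# Recording name, event codes, filter band, and epoch window are assumptions.
import mne
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

raw = mne.io.read_raw_fif("viewer_session_raw.fif", preload=True)  # hypothetical recording
raw.filter(0.5, 30.0)                      # band-pass to the typical ERP range
events = mne.find_events(raw)              # stimulus markers inserted during video playback
event_id = {"clean_frame": 1, "artifact_frame": 2}   # assumed trigger codes
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)

X = epochs.get_data().reshape(len(epochs), -1)       # one feature vector per trial
y = (epochs.events[:, 2] == event_id["artifact_frame"]).astype(int)

clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, X, y, cv=5)
print(f"Single-trial accuracy: {scores.mean():.2f}")
```

Shrinkage-regularized LDA is a common choice for this kind of ERP classification because the number of features (channels times time points) usually far exceeds the number of trials.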
The scope of "Reality CG" is to pioneer a novel approach to modelling, editing, and rendering in computer graphics. Instead of manually creating digital models of virtual worlds, Reality CG explores new ways to achieve visual realism from the kind of approximate models that can be derived from conventional, real-world imagery as input.
Official music video "Who Cares" by Symbiz Sound; the first major production using our Virtual Video Camera.
Dubstep, spray cans, brush, and paint join forces with the latest digital production techniques. All imagery depicts live-action graffiti and performance. Camera motion was added in post-production using the Virtual Video Camera.