Immersive Digital Reality
a DFG Reinhart Koselleck Project

Project Summary
Motivated by the advent of mass-market head-mounted immersive displays, we set out to pioneer the technology needed to experience recordings of the real world with the sense of full immersion provided by VR goggles. To achieve this goal, a number of interdisciplinary, tightly interrelated challenges from video processing, computer graphics, computer vision, and applied visual perception must be addressed in concert. By importing the real world into immersive displays, we want to lay the foundations for the way we may watch movies in the future, leaving fixed-viewpoint, limited field-of-view screens behind for a completely immersive, collective experience.
Job Openings
We are always looking for excellent researchers. Want to join the project?
Invited Talks
January 17, 2020 | Invited talk by Thiemo Alldieck at Carnegie Mellon University, USA: "Reconstructing 3D Human Avatars from Monocular Images"
October 10, 2019 | Invited talk at DLR Braunschweig: "What's missing in Head-mounted VR Displays?"
April 26, 2019 | Invited talk by Thiemo Alldieck at TU Tampere, Finland: "Tell Me How You Look and I'll Tell You How You Move"
November 30, 2018 | Invited talk at FhG Heinrich Hertz Institut, Berlin: "Turning Reality into Virtual Reality"
January 12, 2018 | Keynote presentation at VR Walkthrough Technology Day, TU Tampere, Finland (presentation video)
April 20, 2017 | Invited talk at Stanford Computer Graphics Lab (GCafe), Stanford University, USA
January 23, 2017 | Invited talk at University of Konstanz/SFB TRR 161: "Visual Computing - Bridging Real and Digital Domain"
Events
May 2021 | CG&A special issue on Real VR (Call for Papers)
February 28, 2020 | Publication of our Springer book on Real VR (Editor)
June 30-July 3, 2019 | Real VR - Importing the Real World into Immersive VR and Optimizing the Perceptual Experience of Head-Mounted Displays, Dagstuhl Seminar 19272 (Organizer)
April 24-26, 2019 | Computational Visual Media Conference in Bath, UK (Program Co-Chair)
June 7-8, 2017 | Symposium on Visual Computing and Perception (SVCP) at TU Braunschweig (Organizer) |
In the News
March 23, 2018 | Interview in the local newspaper Braunschweiger Zeitung (in German)
November 10, 2017 | Article in the local chamber of commerce magazine standort 38 (in German)
May 4, 2016 | Articles in the local newspaper Braunschweiger Zeitung, the TU research magazine, and news38.de (in German)
Publications
Altering the Conveyed Facial Emotion Through Automatic Reenactment of Video Portraits
in Proc. International Conference on Computer Animation and Social Agents (CASA), vol. 1300, Springer, Cham, pp. 128-135, November 2020.
PEFS: A Validated Dataset for Perceptual Experiments on Face Swap Portrait Videos
in Proc. International Conference on Computer Animation and Social Agents (CASA), vol. 1300, Springer, Cham, pp. 120-127, November 2020.
Temporal Consistent Motion Parallax for Omnidirectional Stereo Panorama Video
in ACM Symposium on Virtual Reality Software and Technology (VRST), no. 21, Association for Computing Machinery, pp. 1-9, November 2020.
Stereo Inverse Brightness Modulation for Guidance in Dynamic Panorama Videos in Virtual Reality
in Computer Graphics Forum, vol. 39, no. 6, August 2020.
Exploring Neural and Peripheral Physiological Correlates of Simulator Sickness
in Computer Animation and Virtual Worlds, John Wiley & Sons, article e1953, August 2020.
Depth Augmented Omnidirectional Stereo for 6-DoF VR Photography
in Proc. IEEE Virtual Reality (VR) Workshop, IEEE, pp. 660-661, May 2020.
Real VR – Immersive Digital Reality: How to Import the Real World into Head-Mounted Immersive Displays
Springer, ISBN 978-3-030-41815-1, pp. 1-355, March 2020.
Reconstructing 3D Human Avatars from Monocular Images
in Magnor M., Sorkine-Hornung A. (Eds.): Real VR – Immersive Digital Reality: How to Import the Real World into Head-Mounted Immersive Displays, Springer International Publishing, Cham, ISBN 978-3-030-41816-8, pp. 188-218, March 2020.
Multiview Panorama Alignment and Optical Flow Refinement
in Magnor M., Sorkine-Hornung A. (Eds.): Real VR – Immersive Digital Reality: How to Import the Real World into Head-Mounted Immersive Displays, Springer International Publishing, Cham, ISBN 978-3-030-41816-8, pp. 96-108, March 2020.
Subtle Visual Attention Guidance in VR
in Magnor M., Sorkine-Hornung A. (Eds.): Real VR – Immersive Digital Reality: How to Import the Real World into Head-Mounted Immersive Displays, Springer International Publishing, Cham, ISBN 978-3-030-41816-8, pp. 272-284, March 2020.
Real VR - Importing the Real World into Immersive VR and Optimizing the Perceptual Experience of Head-Mounted Displays
Schloss Dagstuhl - Leibniz-Zentrum für Informatik, ISSN 2192-5283, pp. 143-156, November 2019.
Dagstuhl Seminar 19272
From Reality to Immersive VR: What’s missing in VR?
Dagstuhl Reports @ Dagstuhl Seminar 2019, p. 151, November 2019.
Dagstuhl Seminar 19272
Tex2Shape: Detailed Full Human Body Geometry from a Single Image
in IEEE International Conference on Computer Vision (ICCV), IEEE, pp. 2293-2303, October 2019.
Iterative Optical Flow Refinement for High Resolution Images
in Proc. IEEE International Conference on Image Processing (ICIP), September 2019.
Towards VR Attention Guidance: Environment-dependent Perceptual Threshold for Stereo Inverse Brightness Modulation
in Proc. ACM Symposium on Applied Perception (SAP), September 2019.
Learning to Reconstruct People in Clothing from a Single RGB Camera
in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), IEEE, pp. 1175-1186, June 2019.
Gaze and Motion-aware Real-Time Dome Projection System
in Proc. IEEE Virtual Reality (VR) Workshop, IEEE, pp. 1780-1783, March 2019.
PerGraVAR
Immersive EEG: Evaluating Electroencephalography in Virtual Reality
in Proc. IEEE Virtual Reality (VR) Workshop, IEEE, pp. 1794-1800, March 2019.
PerGraVAR
Comparing Unobtrusive Gaze Guiding Stimuli in Head-mounted Displays
in Proc. IEEE International Conference on Image Processing (ICIP), IEEE, October 2018.
Comparison of Unobtrusive Visual Guidance Methods in an Immersive Dome Environment
in ACM Transactions on Applied Perception, vol. 15, no. 4, ACM, pp. 27:1-27:11, October 2018.
Detailed Human Avatars from Monocular Video
in International Conference on 3D Vision, IEEE, pp. 98-109, September 2018.
Low Cost Setup for High Resolution Multiview Panorama Recording and Registration
in Proc. European Signal Processing Conference (EUSIPCO), September 2018.
Analysis of Neural Correlates of Saccadic Eye Movements
in Proc. ACM Symposium on Applied Perception (SAP), no. 17, ACM, pp. 17:1-17:9, August 2018.
On the Delay Performance of Browser-based Interactive TCP Free-viewpoint Streaming
in Proc. IFIP Networking 2018 Conference (NETWORKING 2018), IEEE, pp. 1-9, July 2018.
Video Based Reconstruction of 3D People Models
in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), IEEE, pp. 8387-8397, June 2018.
CVPR Spotlight Paper
Gaze Guidance in Immersive Environments
Poster @ IEEE Virtual Reality 2018, March 2018.
Automatic Upright Alignment of Multi-View Spherical Panoramas
Poster @ European Conference on Visual Media Production 2017, December 2017.
Best Student Poster Award
Subtle Gaze Guidance for Immersive Environments
in Proc. ACM Symposium on Applied Perception (SAP), ACM, pp. 4:1-4:7, September 2017.
Comparative analysis of three different modalities for perception of artifacts in videos
in ACM Transactions on Applied Perception, vol. 14, no. 4, ACM, pp. 1-12, September 2017.
Optical Flow-based 3D Human Motion Estimation from Monocular Video
in Proc. German Conference on Pattern Recognition (GCPR), Springer, pp. 347-360, September 2017.
Perception-driven Accelerated Rendering
in Computer Graphics Forum (Proc. of Eurographics EG), vol. 36, no. 2, The Eurographics Association and John Wiley & Sons Ltd., pp. 611-643, April 2017.
Gaze Visualization for Immersive Video
in Burch, Michael and Chuang, Lewis and Fisher, Brian and Schmidt, Albrecht and Weiskopf, Daniel (Eds.): Eye Tracking and Visualization, Springer, ISBN 978-3319470238, pp. 57-71, March 2017.
Adaptive Image-Space Sampling for Gaze-Contingent Real-time Rendering
Poster @ German Conference on Pattern Recognition 2016, September 2016.
Gaze-contingent Computational Displays: Boosting perceptual fidelity
in IEEE Signal Processing Magazine, vol. 33, no. 5, IEEE, pp. 139-148, September 2016.
Adaptive Image-Space Sampling for Gaze-Contingent Real-time Rendering
in Computer Graphics Forum (Proc. of Eurographics Symposium on Rendering EGSR), vol. 35, no. 4, pp. 129-139, July 2016.
EGSR'16 Best Paper Award
Related Projects
Comprehensive Human Performance Capture from Monocular Video Footage
Photo-realistic modeling and digital editing of image sequences with human actors are common tasks in the movie and games industry. These processes are still laborious, however, since available tools allow only basic manipulations. In cooperation with the Institut für Informationsverarbeitung (TNT) of the University of Hannover (http://www.tnt.uni-hannover.de/), this project aims to resolve this dilemma by providing algorithms and tools for automatic and semi-automatic digital editing of actors in monocular footage. To enable visually convincing renderings, a digital model of the human actor, detailed spatial scene information, and the scene illumination need to be reconstructed. Plausible appearance and motion of the digital model are crucial here.
This research project is partially funded by the German Research Foundation (DFG).
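At the core of such model-based reconstruction is the minimization of reprojection error between a parameterized 3D model and 2D observations in the footage. Below is a minimal, self-contained Python sketch of that idea; for brevity, the human body is reduced to a rigid 3D point set standing in for detected joints, and the camera parameters, pose parameterization, and all numeric values are illustrative assumptions rather than the project's actual pipeline.

```python
# Minimal sketch: pose estimation by minimizing 2D reprojection error,
# the core idea behind fitting a body model to monocular footage.
# The "body" is a rigid point set here; a real system would optimize
# articulated pose, shape, and illumination instead (all values below
# are illustrative assumptions, not the project's actual code).
import numpy as np
from scipy.optimize import least_squares

def rodrigues(rvec):
    """Axis-angle vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def project(points3d, rvec, tvec, f=800.0, c=(320.0, 240.0)):
    """Pinhole projection of 3D points under pose (rvec, tvec)."""
    cam = points3d @ rodrigues(rvec).T + tvec
    return f * cam[:, :2] / cam[:, 2:3] + np.asarray(c)

def residuals(params, points3d, observed2d):
    """Stacked 2D reprojection errors for a 6-DoF pose vector."""
    return (project(points3d, params[:3], params[3:]) - observed2d).ravel()

# Synthetic "skeleton" joints and a ground-truth pose to recover.
joints = np.array([[0.0, 0.0, 0.0], [0.0, 0.5, 0.0],
                   [0.2, 0.8, 0.0], [-0.2, 0.8, 0.1]])
rvec_true = np.array([0.1, 0.3, -0.2])
tvec_true = np.array([0.0, 0.0, 3.0])
keypoints2d = project(joints, rvec_true, tvec_true)  # stand-in for detections

x0 = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 2.5])  # initialize at positive depth
fit = least_squares(residuals, x0, args=(joints, keypoints2d))
print("recovered pose (rvec | tvec):", fit.x.round(3))
```

A full system would replace the rigid point set with an articulated, parameterized body model and add temporal and silhouette consistency terms to the objective.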
Digital Representations of the Real World
The book presents the state of the art in creating photo-realistic digital models of the real world. The result of work by experts from around the world, it offers a comprehensive overview of the entire pipeline, from acquisition, data processing, and modeling to content editing, photo-realistic rendering, and user interaction.
Eye-tracking Head-mounted Display
Immersion is the ultimate goal of head-mounted displays (HMDs) for Virtual Reality (VR) in order to produce a convincing user experience. Two important aspects in this context are motion sickness, often due to imprecise calibration, and the integration of reliable eye tracking. We propose an affordable hardware and software solution for drift-free eye tracking and user-friendly lens calibration within an HMD. The use of dichroic mirrors leads to a lean design that provides the full field of view (FOV) while using commodity cameras for eye tracking.
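As a rough illustration of the software side, the sketch below implements the classic dark-pupil approach that commodity-camera eye trackers commonly build on: threshold the infrared eye image, take the centroid of the darkest blob as the pupil center, and map pupil positions to display coordinates with a least-squares polynomial fitted during calibration. Image sizes, thresholds, and the calibration model are illustrative assumptions, not our actual implementation.

```python
# Minimal dark-pupil tracking sketch (illustrative assumptions throughout).
import numpy as np

def pupil_center(eye_image, dark_fraction=0.02):
    """Centroid of the darkest pixels -- a crude pupil-center estimate."""
    thresh = np.quantile(eye_image, dark_fraction)
    ys, xs = np.nonzero(eye_image <= thresh)
    return np.array([xs.mean(), ys.mean()])

def fit_gaze_map(pupil_pts, screen_pts):
    """Least-squares 2nd-order polynomial map from pupil to screen coords.

    Needs >= 6 calibration points; apply via the same feature expansion.
    """
    x, y = pupil_pts[:, 0], pupil_pts[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
    return coeffs  # shape (6, 2)

# Synthetic IR eye image: bright sclera/skin with a dark "pupil" disk.
img = np.full((240, 320), 200.0)
yy, xx = np.mgrid[0:240, 0:320]
img[(xx - 180) ** 2 + (yy - 130) ** 2 < 30 ** 2] = 20.0
print("estimated pupil center:", pupil_center(img))  # ~ [180, 130]
```

In practice, glint (corneal reflection) tracking is added on top of this to compensate for headset slippage and achieve drift-free operation.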
ICG Dome
Featuring more than 10 million pixels at 120 Hz refresh rate, full-body motion capture, and real-time gaze tracking, our 5-meter ICG Dome enables us to research peripheral visual perception, devise comprehensive foveal-peripheral rendering strategies, and explore multi-user immersive visualization and interaction.
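To make the foveal-peripheral idea concrete, the following sketch converts a tracked gaze position into a per-pixel relative shading density using the common hyperbolic model of acuity falloff with eccentricity. The falloff constant, pixel-to-degree conversion, and screen geometry are illustrative assumptions, not the dome's actual parameters.

```python
# Gaze-contingent shading-density sketch (illustrative parameters).
import numpy as np

def relative_sample_density(width, height, gaze_xy, deg_per_px=0.03, e2=2.3):
    """Per-pixel shading density in (0, 1], 1.0 at the tracked gaze point.

    Uses the common acuity falloff model acuity(e) = e2 / (e2 + e), with
    eccentricity e in visual degrees approximated from pixel distance.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    ecc_deg = deg_per_px * np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])
    return e2 / (e2 + ecc_deg)

density = relative_sample_density(1920, 1080, gaze_xy=(960, 540))
print("density at gaze point:", density[540, 960])             # 1.0
print("density at a corner:", round(float(density[0, 0]), 3))  # ~0.065

# A renderer could quantize this map into tiles and assign each tile one
# of a few discrete shading rates (e.g., 1, 1/2, or 1/4 samples per pixel).
```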
Scope of "Reality CG" is to pioneer a novel approach to modelling, editing and rendering in computer graphics. Instead of manually creating digital models of virtual worlds, Reality CG will explore new ways to achieve visual realism from the kind of approximate models that can be derived from conventional, real-world imagery as input.
Virtual Video Camera
The Virtual Video Camera research project aims to provide algorithms for rendering free-viewpoint video from asynchronous camcorder captures. We want to record our multi-video data without the need for specialized hardware or intrusive setup procedures (e.g., waving calibration patterns).
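One simple way to recover the temporal offset between asynchronously started consumer cameras, sketched below with synthetic data, is to cross-correlate per-frame mean-brightness traces and take the lag of maximum correlation as the frame offset. The project's actual alignment is more involved (e.g., sub-frame accuracy), so treat this purely as an illustration of the principle.

```python
# Frame-offset estimation by brightness cross-correlation (synthetic data).
import numpy as np

def frame_offset(sig_a, sig_b):
    """Integer offset (in frames) by which camera B's recording starts
    after camera A's, estimated from normalized cross-correlation."""
    a = (sig_a - sig_a.mean()) / sig_a.std()
    b = (sig_b - sig_b.mean()) / sig_b.std()
    corr = np.correlate(a, b, mode="full")
    return int(np.argmax(corr)) - (len(b) - 1)

# Synthetic per-frame mean-brightness traces of one scene, recorded by two
# cameras; camera B starts recording 17 frames after camera A.
rng = np.random.default_rng(0)
scene = rng.normal(size=500).cumsum()             # shared scene brightness
cam_a = scene[0:400] + 0.1 * rng.normal(size=400)
cam_b = scene[17:417] + 0.1 * rng.normal(size=400)
print("estimated start offset of B vs. A:", frame_offset(cam_a, cam_b))  # 17
```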