Events
Talk Invited Talks: Layered Weighted Blended Order-Independent Transparency and AR Carcassonne
05.05.2022 13:00
G30
Speaker(s): Fabian Friederichs, Jannis Malte Möller
Talk Disputation: Computer Graphics from a Bio-Signal Perspective - Exploration of Autonomic Human Physiological Responses to Synthetic and Natural Imagery
29.04.2022 10:30
IZ 812
Speaker(s): Jan-Philipp Tauscher
Talk Team Project Final Presentation: Special Effects with Video Matching
29.03.2022 13:00
Online: https://webconf.tu-bs.de/mar-uf3-wqy
Presentation of the results of the student team project.
Talk BA-Talk: Performance Analysis and Comparison of Differentiable Rendering Systems
28.03.2022 13:30
- 28.03.2022 14:00
Online: http://webconf.tu-bs.de/mar-3vy-aef
Speaker(s): Domenik Jaspers
Talk BA-Talk: Recognition and Mapping of Emotions into Semantic Space Using Deep Learning
28.03.2022 13:00
- 28.03.2022 13:30
Online: http://webconf.tu-bs.de/mar-3vy-aef
Speaker(s): Bill Matthias Thang
Talk BA-Talk: Neural Radiance Fields: A Systematic Review and an Outlook on Further Developments
16.03.2022 13:00
Online: https://webconf.tu-bs.de/mar-uf3-wqy
Speaker(s): Lars Christian Lund
Talk MA-Talk: Visualization of Scientific Data in Multi-User Augmented Reality
23.02.2022 15:30
Online: https://webconf.tu-bs.de/mar-uf3-wqy
Speaker(s): Jan Wulkop
Talk BA-Talk: Evaluation of Open-Source Experiment Management Systems to Support University Research
23.09.2021 17:00
Online?
Talk Dissertation Pre-Talk: Computer Graphics from a Bio-Signal Perspective - Exploration of Autonomic Human Physiological Responses to Synthetic and Natural Imagery
30.07.2021 13:00
Online
JP Tauscher is presenting his dissertation pre-talk "Computer Graphics from a Bio-Signal Perspective - Exploration of Autonomic Human Physiological Responses to Synthetic and Natural Imagery" on Friday, July 30, at 1 pm.
http://webconf.tu-bs.de/mar-3vy-aef
The impact of graphics on our perception is usually measured by asking users to complete self-assessment questionnaires. These psycho-physical rating scales and questionnaires reflect a subjective opinion expressed through conscious responses, but they may be (in)voluntarily biased and usually do not provide real-time feedback. Subjects may also have difficulty communicating their opinion because a rating scale may not reflect their intrinsic perception or may be biased by external factors such as mood, expectation, past experience, or even problems with task definition and understanding.
In this thesis, we investigate how the human body reacts involuntarily to computer-generated as well as real-world image content. Here, we add a whole new range of modalities to our perception quantification apparatus to abstract from subjective ratings towards objective bodily measures. These include electroencephalography (EEG), eye tracking, galvanic skin response (GSR), and cardiac and respiratory data. We seek to explore the gap between what humans consciously see and what they implicitly perceive when consuming generated and natural content. We include different display technologies ranging from traditional monitors to virtual reality (VR) devices commonly used to present computer graphical content.
This thesis shows how the human brain and the autonomic nervous system react to visual stimuli and how these bio-signals can be reliably measured to analyse and quantify the immediate physiological reactions to certain aspects of generated and natural graphical content. We advance the current frontiers in the context of perceptual graphics towards novel measurement and analysis methods for immediate and involuntary physiological reactions.
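As a rough illustration of how such immediate reactions can be quantified from the recorded bio-signals, the sketch below performs stimulus-locked averaging of a skin-conductance (GSR) trace around known stimulus onsets. The sampling rate, onset times, and the function name stimulus_locked_average are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def stimulus_locked_average(signal, onsets_s, fs, pre_s=1.0, post_s=5.0):
    """Average a bio-signal (e.g. GSR) in windows around stimulus onsets.

    signal   : 1-D array, raw signal sampled at fs Hz
    onsets_s : stimulus onset times in seconds
    fs       : sampling rate in Hz (illustrative value below)
    pre_s    : baseline duration before onset, in seconds
    post_s   : response duration after onset, in seconds
    """
    pre, post = int(pre_s * fs), int(post_s * fs)
    epochs = []
    for onset in onsets_s:
        i = int(round(onset * fs))
        if i - pre < 0 or i + post > len(signal):
            continue  # skip onsets too close to the recording edges
        epoch = np.asarray(signal[i - pre:i + post], dtype=float)
        epoch -= epoch[:pre].mean()  # baseline-correct against the pre-stimulus window
        epochs.append(epoch)
    return np.mean(epochs, axis=0)  # grand-average response across stimuli

# Synthetic example: 60 s of noise plus simulated skin-conductance responses
fs = 32                                   # assumed GSR sampling rate in Hz
t = np.arange(0, 60, 1 / fs)
gsr = 0.05 * np.random.randn(t.size)
for onset in (10.0, 30.0):
    mask = t > onset
    gsr[mask] += 0.8 * np.exp(-(t[mask] - onset) / 3.0)
avg = stimulus_locked_average(gsr, [10.0, 30.0], fs)
print("peak of averaged response:", avg.max())
```

The same windowing scheme applies to the other modalities mentioned above (eye-tracking features, cardiac or respiratory data), with modality-specific preprocessing before averaging.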
Talk BA-Talk: Real-time high-resolution playback of 360° stereoscopic videos in virtual reality
16.07.2021 13:00
Online: https://webconf.tu-bs.de/mar-3vy-aef
Speaker(s): Nikkel Heesen
Talk MA-Talk: Video Object Segmentation for Omnidirectional Stereo Panoramas
14.05.2021 13:00
Online: https://webconf.tu-bs.de/mar-3vy-aef
Speaker(s): Fan Song
Talk MA-Talk: Functional Volumetric Rendering for Industrial Applications
07.05.2021 13:00
Online: https://webconf.tu-bs.de/mar-3vy-aef
Speaker(s): Jan-Christopher Schmidt
Talk Team Project Final Presentation: Schoduvel in the Dome
31.03.2021 13:15
Dome (recording studio & visualization lab) / Online
Presentation of the results of the student team project.
Talk MA-Talk: Temporal Coherent Relighting in Portrait Videos from Neural Textures
29.03.2021 13:00
Online: https://webconf.tu-bs.de/mar-3vy-aef
Speaker(s): Jann-Ole Henningson
Talk BA-Talk: Combating Motion Sickness in VR with Dynamic Bipolar Galvanic Vestibular Stimulation
12.03.2021 13:00
Online: https://webconf.tu-bs.de/mar-3vy-aef
Speaker(s): Max Hattenbach
Talk BA-Talk: Neural Rendering - Perception-Based Evaluation of Depth Impression in VR
25.01.2021 13:00
Online: https://webconf.tu-bs.de/mar-3vy-aef
Speaker(s): Yannic Rühl
Talk Disputation: Guiding Visual Attention in Immersive Environments
22.01.2021 10:00
Online
Speaker(s): Steve Grogorick
Talk BA-Talk: Design of an Interactive Simulation for Learning Star Constellations in Public Planetariums
08.12.2020 13:00
Planetarium Wolfsburg
Speaker(s): Lars Richard
Talk Dissertation Pre-Talk: Guiding Visual Attention in Immersive Environments
30.10.2020 13:00
Online
The growing popularity of virtual reality (VR) technology, which presents content virtually all around the user, creates new challenges for digital content creators and presentation systems. In this dissertation, we investigate how to help viewers avoid missing important information when exploring unknown virtual environments. We examine different visual stimuli that guide viewers' attention towards predetermined target regions of the surrounding environment. To preserve the original visual appearance of scenes as far as possible, we aim for subtle visual modifications that operate as close as possible to viewers' perception threshold while still providing effective guidance.
In a first approach, we identify issues that keep existing visual guidance stimuli from being effective in VR environments. For use in large field-of-view (FOV) head-mounted displays (HMDs), we derive techniques to handle perspective distortions, the degradation of visual acuity in the peripheral visual field, and target regions outside the initial FOV. An existing visual stimulus, originally conceived for desktop environments, is adapted accordingly and successfully evaluated in a perceptual study.
Subsequently, the generalizability of these extension techniques is investigated with respect to different guidance methods and VR devices. To this end, additional methods from related work are re-implemented and updated accordingly. Two comparable perceptual studies are conducted to evaluate their effectiveness within a consumer-grade HMD and in an immersive dome projection system covering almost the full human visual field. Regardless of the actual success rates, all of the tested methods show a measurable effect on participants' viewing behavior, indicating the general applicability of our modification techniques across guidance methods and VR systems.
Finally, a novel visual guidance method (SIBM) is introduced, specifically designed for immersive systems. It builds on contrary manipulations of the two stereoscopic frames in VR rendering systems, turning the inevitable overhead of rendering each frame twice (once per eye) into an advantage that is not available in monocular systems. By exploiting our visual system's sensitivity to discrepancies in binocular input, it allows the required per-image contrast of the actual stimulus to be reduced noticeably below the previous state of the art.
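As a rough sketch of the general idea behind such contrary stereoscopic manipulations (not the implementation from the dissertation), the example below brightens a small region in the left-eye image and darkens the same region in the right-eye image by an equally small amount. The Gaussian shape, the amplitude, and the function name binocular_counter_modulation are illustrative assumptions.

```python
import numpy as np

def binocular_counter_modulation(left, right, cx, cy, sigma=25.0, amplitude=0.03):
    """Apply opposite low-contrast luminance modulation to the left and right
    eye images around a target pixel (cx, cy).

    left, right : float images in [0, 1], shape (H, W) or (H, W, 3)
    amplitude   : modulation strength; kept small so each single image
                  changes only subtly (illustrative value)
    """
    h, w = left.shape[:2]
    y, x = np.mgrid[0:h, 0:w]
    blob = amplitude * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2.0 * sigma ** 2))
    if left.ndim == 3:
        blob = blob[..., None]  # broadcast over color channels
    # Brighten one eye's view and darken the other by the same amount:
    # the binocular discrepancy is what attracts attention, while each
    # individual image stays close to its original appearance.
    left_mod = np.clip(left + blob, 0.0, 1.0)
    right_mod = np.clip(right - blob, 0.0, 1.0)
    return left_mod, right_mod

# Usage with dummy mid-gray stereo frames and a target at the image center
L = np.full((480, 640, 3), 0.5)
R = np.full((480, 640, 3), 0.5)
L_mod, R_mod = binocular_counter_modulation(L, R, cx=320, cy=240)
```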
Talk SEP Final Presentation: Massively distributed collaborative crowd input system for dome environments
31.08.2020 13:00
Dome (recording studio & visualization lab)
Presentation of the results of the student software development lab course (SEP).
Talk BA-Talk: Eye Tracking Analysis Framework for Video Portraits
28.08.2020 13:00
Online
Speaker(s): Moritz von Estorff
This final presentation will be streamed online.
Talk BA-Talk: Implementing Dynamic Stimuli in VR Environments for Visual Perception Research
04.08.2020 15:00
Dome (recording studio & visualization lab)
Speaker(s): Mai Hellmann
Talk Lab Course Final Presentation: Creating an interactive VR adventure for the ICG Dome
05.06.2020 13:30
Dome (recording studio & visualization lab)
Presentation of the results of the student computer graphics lab course (MA).
(A follow-up project of the computer graphics lab course (BA) from summer semester 2019)
Talk Team Project Final Presentation: Our Little Planetarium
05.06.2020 13:00
Dome (recording studio & visualization lab)
Presentation of the results of the student team project.
Talk MA-Talk: Automatic Face Re-enactment in Real-World Portrait Videos to Manipulate Emotional Expression
24.04.2020 13:15
https://webconf.tu-bs.de/jan-n7t-j7a
Speaker(s): Colin Groth