18 March 2005 Perceiving simulated ego-motions in virtual reality: comparing large screen displays with HMDs
In this keynote I will present some of the work from our virtual reality laboratory at the Max Planck Institute for Biological Cybernetics in Tübingen. Our research philosophy for understanding the brain is to study human information processing in experimental settings that are as close as possible to our natural environment. Using computer graphics and virtual reality technology, we can now study perception not only in a well-controlled natural setting but also within a closed perception-action loop, in which the observer's actions in turn change the input to the senses. In psychophysical studies we have shown that humans can integrate multimodal sensory information in a statistically optimal way, weighting each cue according to its reliability. A better understanding of multimodal sensor fusion will allow us to build new virtual reality platforms in which the design effort for visual, auditory, haptic, vestibular, and proprioceptive simulation is allocated according to the weight of each cue in multimodal sensor fusion.
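The statistically optimal integration scheme mentioned above is commonly modeled as inverse-variance (maximum-likelihood) weighting: each cue's estimate is weighted by its reliability (the inverse of its noise variance), and the fused estimate has lower variance than any single cue. A minimal sketch of this standard model (illustrative only; the function name, example numbers, and units are assumptions, not the authors' implementation):

```python
import numpy as np

def fuse_cues(estimates, variances):
    """Inverse-variance (maximum-likelihood) cue combination.

    estimates : per-cue estimates of the same quantity (e.g. heading in deg)
    variances : per-cue noise variances (reliability = 1 / variance)
    Returns the fused estimate, its variance, and the cue weights.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    reliabilities = 1.0 / variances
    weights = reliabilities / reliabilities.sum()   # weights sum to 1
    fused = float(np.sum(weights * estimates))      # reliability-weighted mean
    fused_var = float(1.0 / reliabilities.sum())    # <= min(variances)
    return fused, fused_var, weights

# Hypothetical example: a visual and a vestibular heading estimate (degrees).
# The visual cue is four times more reliable, so it dominates the fusion.
fused, fused_var, weights = fuse_cues([10.0, 14.0], [1.0, 4.0])
# fused = 10.8 deg, fused_var = 0.8 (below both single-cue variances)
```

Note that the fused variance is always smaller than the most reliable single cue's variance, which is the signature prediction of optimal multimodal integration that such psychophysical studies test.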
© 2005 Society of Photo-Optical Instrumentation Engineers (SPIE).
Bernhard E. Riecke, Joerg Schulte-Pelkum, Heinrich H. Buelthoff, "Perceiving simulated ego-motions in virtual reality: comparing large screen displays with HMDs", Proc. SPIE 5666, Human Vision and Electronic Imaging X (18 March 2005); https://doi.org/10.1117/12.610846
