Multisensory integration for the perception of self-motion


Berger, DR
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Berger, D. (2005). Multisensory integration for the perception of self-motion. Talk presented at Institutskolloquium, Max-Planck-Institut für medizinische Forschung. Heidelberg, Germany. 2005-07-07.

Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-D501-7
When we move in the environment, we perceive our position and motion in space with several senses. Among these are the visual sense and the body senses of self-motion (vestibular and somatosensory). The human brain combines information from the different senses to generate a unified and robust percept of self-motion.
We investigated the integration process in human observers using psychophysical methods. Experiments were performed on a hexapod platform with a projection screen, which allows the presentation of realistic movements during which visual cues and body cues for self-motion can be manipulated independently.
I will present a series of experiments in which we studied the multimodal perception of whole-body rotations around an earth-vertical axis (yaw rotations). In particular, we tested whether the integration of visual and body cues of self-motion follows the mathematically optimal maximum-likelihood integration principle. We also investigated how the influence of visual and body cues on the perception of yaw rotations depends on focusing attention on either cue, and on becoming aware of conflicts between the two modalities.
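For readers unfamiliar with the maximum-likelihood principle mentioned above, the standard formulation for two independent Gaussian cues weights each estimate by its reliability (inverse variance). The sketch below illustrates this general principle with hypothetical numbers; the estimates, variances, and function names are not taken from the study itself.

```python
def ml_integrate(est_a, var_a, est_b, var_b):
    """Combine two independent Gaussian cue estimates by maximum likelihood.

    Each cue is weighted by its reliability (inverse variance); the fused
    estimate has lower variance than either cue alone.
    """
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
    w_b = 1 - w_a
    fused = w_a * est_a + w_b * est_b
    fused_var = 1 / (1 / var_a + 1 / var_b)
    return fused, fused_var

# Hypothetical example: the visual cue signals a 30 deg yaw rotation
# (variance 4), the body cue signals 34 deg (variance 16).
fused, fused_var = ml_integrate(30.0, 4.0, 34.0, 16.0)
# The more reliable visual cue dominates (weight 0.8 vs 0.2),
# and the fused variance (3.2) is below either single-cue variance.
```

A hallmark prediction of this scheme, testable psychophysically, is precisely that reduction in variance of the combined estimate relative to either unimodal estimate.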