Book Chapter

A Bayesian view on multimodal cue integration


Ernst, M. O.
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society


Ernst, M. (2005). A Bayesian view on multimodal cue integration. In Human Body Perception From The Inside Out (pp. 105-131). Oxford, UK: Oxford University Press.

Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-D3A3-F
We perceive our own body and the world surrounding us via multiple sources of sensory information derived from several modalities, including vision, touch, and audition. To enable interactions with the environment, this information has to converge into a coherent and unambiguous multimodal percept of the body and the world. But how does the brain arrive at such a unique percept? In this chapter I review a model that describes a statistically optimal integration mechanism. The benefit of integrating sensory information comes from a reduction in the variance of the final perceptual estimate. Furthermore, I point out how this integration scheme can be incorporated into a larger framework using Bayesian decision theory (BDT).
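The variance-reduction claim in the abstract can be illustrated with the standard maximum-likelihood cue-combination scheme for independent Gaussian cues: each unimodal estimate is weighted by its reliability (inverse variance), and the fused estimate always has lower variance than any single cue. The sketch below is illustrative only; the numbers and function names are not taken from the chapter.

```python
# Minimal sketch of statistically optimal cue integration for independent
# Gaussian cues: combine unimodal estimates by inverse-variance weighting.
# All values below are hypothetical examples, not data from the chapter.

def integrate_cues(estimates, variances):
    """Fuse unimodal estimates into one multimodal estimate.

    Weights are proportional to reliability (1/variance); the fused
    variance 1/sum(1/var_i) is always <= the smallest input variance.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * s for w, s in zip(weights, estimates)) / total
    fused_var = 1.0 / total
    return fused, fused_var

# Example: visual and haptic size estimates of the same object (cm),
# with the visual cue being the more reliable of the two.
visual, haptic = 5.2, 4.8
var_v, var_h = 0.04, 0.16

size, var = integrate_cues([visual, haptic], [var_v, var_h])
print(size, var)  # fused estimate lies closer to the more reliable cue
```

Running this yields a fused estimate of 5.12 cm with variance 0.032, smaller than either unimodal variance, which is precisely the benefit of integration described in the abstract.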