
Item Details


Released

Book Chapter

A Bayesian view on multimodal cue integration

MPS-Authors
/persons/resource/persons83906

Ernst, MO
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

URL
There are no locators available
Fulltext (public)
There is no public fulltext available
Supplementary Material (public)
There is no public supplementary material available
Citation

Ernst, M. (2005). A Bayesian view on multimodal cue integration. In Human Body Perception From The Inside Out (pp. 105-131). Oxford, UK: Oxford University Press.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-D3A3-F
Abstract
We perceive our own body and the world surrounding us via multiple sources of sensory information derived from several modalities, including vision, touch and audition. To enable interactions with the environment, this information has to converge into a coherent and unambiguous multimodal percept of the body and the world. But how does the brain come up with such a unique percept? In this chapter I review a model that describes an integration mechanism that is optimal in the statistical sense. The benefit of integrating sensory information comes from a reduction in the variance of the final perceptual estimate. Furthermore, I point out how this integration scheme can be incorporated into a larger framework using Bayesian decision theory (BDT).
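The variance-reduction claim in the abstract is commonly formalized as reliability-weighted (maximum-likelihood) averaging; the following is a minimal sketch, assuming two cue estimates $\hat{s}_1$ and $\hat{s}_2$ corrupted by independent Gaussian noise with variances $\sigma_1^2$ and $\sigma_2^2$ (the chapter's exact formulation may differ):

\hat{s} = w_1 \hat{s}_1 + w_2 \hat{s}_2, \qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_1^2 + 1/\sigma_2^2}, \qquad
\sigma_{\hat{s}}^2 = \frac{\sigma_1^2 \, \sigma_2^2}{\sigma_1^2 + \sigma_2^2} \le \min\!\left(\sigma_1^2, \sigma_2^2\right).

Under these assumptions, the combined estimate is never less reliable than the best single cue, which is the sense in which the integration mechanism is statistically optimal.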