
Released

Book Chapter

A Bayesian view on multimodal cue integration

MPG Authors

Ernst, MO
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full texts (freely accessible)

Ernst_BayesIntegration_2006.pdf
(Any fulltext), 4MB

Supplementary Material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Ernst, M. (2006). A Bayesian view on multimodal cue integration. In G. Knoblich, M. Grosjean, I. Thornton, & M. Shiffrar (Eds.), Human body perception from the inside out (pp. 105-131). Oxford, UK: Oxford University Press.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-D3A3-F
Abstract
We perceive our own body and the world surrounding us via multiple sources of sensory information derived from several modalities, including vision, touch, and audition. To enable interactions with the environment, this information has to converge into a coherent and unambiguous multimodal percept of the body and the world. But how does the brain come up with such a unique percept? In this chapter I review a model that, in the statistical sense, describes an optimal integration mechanism. The benefit of integrating sensory information comes from a reduction in variance of the final perceptual estimate. Furthermore, I point out how this integration scheme can be incorporated into a larger framework using Bayesian decision theory (BDT).
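The variance reduction mentioned in the abstract is the hallmark of the standard maximum-likelihood (reliability-weighted) cue-integration model. The sketch below is a minimal illustration of that general scheme, not code from the chapter; the cue values, noise levels, and the vision/touch labels are made-up assumptions chosen only to show the effect.

```python
import numpy as np

# Two independent, Gaussian-noisy estimates of the same property
# (e.g. object size from vision and from touch). Numbers are illustrative.
s_vis, sigma_vis = 5.2, 0.4   # visual estimate and its noise SD
s_hap, sigma_hap = 4.8, 0.8   # haptic estimate and its noise SD

# Reliability (inverse-variance) weighting: the maximum-likelihood linear
# combination weights each cue in proportion to its reliability.
r_vis, r_hap = 1 / sigma_vis**2, 1 / sigma_hap**2
w_vis, w_hap = r_vis / (r_vis + r_hap), r_hap / (r_vis + r_hap)

s_combined = w_vis * s_vis + w_hap * s_hap
sigma_combined = np.sqrt(1 / (r_vis + r_hap))

print(f"combined estimate: {s_combined:.2f}")
print(f"combined SD: {sigma_combined:.2f} "
      f"(vs. {min(sigma_vis, sigma_hap):.2f} for the best single cue)")
```

Under these assumptions the combined standard deviation is always smaller than that of the most reliable single cue, which is the integration benefit the abstract refers to.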