  The "puzzle" of sensory perception: putting together multisensory information

Ernst, M. (2005). The "puzzle" of sensory perception: putting together multisensory information. In ICMI '05: 7th international conference on Multimodal interfaces (pp. 1). New York, NY, USA: ACM Press.


Basic data

Genre: Conference paper

External references

External reference:
https://dl.acm.org/citation.cfm?doid=1088463.1088464 (publisher version)
Description:
-
OA status:

Creators

Creators:
Ernst, MO (1, 2), Author
Affiliations:
1. Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794
2. Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497797

Content

Keywords: -
Abstract:
To perceive the environment, our brain uses multiple sources of sensory information derived from several different modalities, including vision, touch and audition. The question of how information derived from these different sensory modalities converges in the brain to form a coherent and robust percept is central to understanding the process of perception. My main research interest is the study of human perception, focusing on multimodal integration and visual-haptic interaction. For this, I use quantitative computational/statistical models together with psychophysical and neuropsychological methods.

A desirable goal for the perceptual system is to maximize the reliability of the various perceptual estimates. From a statistical viewpoint, the optimal strategy for achieving this goal is to integrate all available sensory information, which may be done using a "maximum-likelihood-estimation" (MLE) strategy. The combined percept is then a weighted average of the individual estimates, with weights proportional to their reliabilities.
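
As an illustration only (not part of the original record): a minimal sketch of such reliability-weighted MLE cue combination, assuming independent Gaussian noise on each unimodal estimate and defining reliability as the inverse variance.

import numpy as np

def mle_combine(estimates, sigmas):
    """Combine unimodal estimates (e.g. a visual and a haptic size estimate) by MLE.

    estimates -- unimodal estimates S_i
    sigmas    -- their noise standard deviations sigma_i
    Returns the combined estimate and its (reduced) standard deviation.
    """
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(sigmas, dtype=float) ** 2  # r_i = 1 / sigma_i^2
    weights = reliabilities / reliabilities.sum()               # w_i proportional to r_i
    combined = float(np.sum(weights * estimates))               # reliability-weighted average
    combined_sigma = float(np.sqrt(1.0 / reliabilities.sum()))  # never worse than the best single cue
    return combined, combined_sigma

# Example: vision is more reliable than touch, so the combined
# percept lies closer to the visual estimate.
print(mle_combine([10.0, 12.0], [0.5, 1.0]))  # -> (10.4, ~0.447)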

In a recent study we showed that humans actually integrate visual and haptic information in such a statistically optimal fashion (Ernst & Banks, Nature, 2002). Others have since demonstrated that this holds not only for the integration of vision and touch, but also for the integration of information across and within other modalities, such as audition or vision. This suggests that maximum-likelihood estimation is an effective and widely used strategy exploited by the perceptual system.

By integrating sensory information, the brain may or may not lose access to the individual input signals feeding into the integrated percept. The degree to which the original information remains accessible defines the strength of coupling between the signals. We found that the strength of coupling varies with the set of signals involved: for example, coupling is strong for stereo and texture signals to slant and weak for visual and haptic signals to size (Hillis, Ernst, Banks, & Landy, Science, 2002). As suggested by one of our recent learning studies, the strength of coupling, which can be modeled using Bayesian statistics, seems to depend on the natural statistical co-occurrence of the signals (Jäkel & Ernst, in prep.).
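
Again purely illustrative, and not the authors' actual model: one common way to express such graded coupling in Bayesian terms is a Gaussian "coupling prior" on the difference between the two signal values; the prior width (an assumption here) controls how strongly the two estimates are pulled together.

import numpy as np

def coupled_map_estimates(m1, s1, m2, s2, sigma_c):
    """MAP estimates of two signal values under a Gaussian coupling prior.

    m1, m2  -- noisy unimodal measurements
    s1, s2  -- their noise standard deviations
    sigma_c -- width of the prior on (x1 - x2); small = strong coupling
    """
    p1, p2, pc = 1 / s1**2, 1 / s2**2, 1 / sigma_c**2  # precisions
    # Setting the gradient of the negative log posterior to zero
    # gives a 2x2 linear system in the two estimates.
    A = np.array([[p1 + pc, -pc],
                  [-pc, p2 + pc]])
    b = np.array([p1 * m1, p2 * m2])
    return np.linalg.solve(A, b)

# Strong coupling: the estimates are pulled together (near-complete fusion).
print(coupled_map_estimates(10.0, 1.0, 12.0, 1.0, 0.1))   # ~[11.0, 11.0]
# Weak coupling: the estimates stay close to the raw measurements.
print(coupled_map_estimates(10.0, 1.0, 12.0, 1.0, 10.0))  # ~[10.0, 12.0]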

An important precondition for integrating signals is knowing which signals from the different modalities belong together and how reliable they are. Recently we showed that touch can teach the visual modality how to interpret its signals and their reliabilities; more specifically, by exploiting touch we can alter the visual perception of slant (Ernst, Banks & Bulthoff, Nature Neuroscience, 2000). This finding contributes to a very old debate postulating that we only perceive the world because of our interactions with the environment. Similarly, in one of our latest studies we showed that experience can change the so-called "light-from-above" prior. Prior knowledge is essential for the interpretation of sensory signals during perception; consequently, changing this prior changed the perception of shape (Adams, Graf & Ernst, Nature Neuroscience, 2004).

Integration is sensible only if the information sources carry redundant information. If the sources are complementary, different combination strategies must be used. Complementation of cross-modal information was demonstrated in a recent study investigating visual-haptic shape perception (Newell, Ernst, Tjan, & Bulthoff, Psychological Science, 2001).

Details

Language(s):
Date: 2005-10
Publication status: Published
Pages: -
Place, publisher, edition: -
Table of contents: -
Review method: -
Identifiers: DOI: 10.1145/1088463.1088464
Degree: -

Event

Title: Seventh International Conference on Multimodal Interfaces (ICMI 2005)
Venue: Trento, Italy
Start / end date: 2005-10-04 - 2005-10-06


Source 1

Title: ICMI '05: 7th international conference on Multimodal interfaces
Source genre: Conference proceedings
Creators:
Affiliations:
Place, publisher, edition: New York, NY, USA: ACM Press
Pages: -
Volume / issue: -
Article number: -
Start / end page: 1
Identifier: ISBN: 1-59593-028-0