  The "puzzle" of sensory perception: putting together multisensory information

Ernst, M. (2005). The "puzzle" of sensory perception: putting together multisensory information. In ICMI '05: 7th international conference on Multimodal interfaces (p. 1). New York, NY, USA: ACM Press.

Basic

Genre: Conference Paper

Locators

Description: -
OA-Status: -

Creators

Creators:
Ernst, MO (1, 2), Author
Affiliations:
1: Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794
2: Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497797

Content

Free keywords: -
 Abstract:
To perceive the environment, our brain uses multiple sources of sensory information derived from several different modalities, including vision, touch, and audition. The question of how information from these different sensory modalities converges in the brain to form a coherent and robust percept is central to understanding the process of perception. My main research interest is the study of human perception, with a focus on multimodal integration and visual-haptic interaction. For this I use quantitative computational/statistical models together with psychophysical and neuropsychological methods.

A desirable goal for the perceptual system is to maximize the reliability of its various perceptual estimates. From a statistical viewpoint, the optimal strategy for achieving this goal is to integrate all available sensory information. This may be done using a "maximum-likelihood-estimation" (MLE) strategy: the combined percept is then a weighted average of the individual estimates, with weights proportional to their reliabilities.
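As a concrete illustration, here is a minimal sketch of this weighted-average rule, assuming independent Gaussian noise on each unimodal estimate so that reliability is the inverse variance; the function name and the numbers are illustrative, not taken from the paper.

    def mle_combine(estimates, variances):
        # Reliability of a Gaussian estimate is its inverse variance.
        reliabilities = [1.0 / v for v in variances]
        total = sum(reliabilities)
        weights = [r / total for r in reliabilities]
        combined = sum(w * e for w, e in zip(weights, estimates))
        # The fused estimate is more reliable than any single cue.
        combined_variance = 1.0 / total
        return combined, combined_variance

    # Example: a visual size estimate of 10.0 (variance 1.0) and a haptic
    # estimate of 12.0 (variance 4.0) fuse closer to the more reliable cue.
    print(mle_combine([10.0, 12.0], [1.0, 4.0]))  # -> approximately (10.4, 0.8)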

In a recent study we showed that humans do integrate visual and haptic information in such a statistically optimal fashion (Ernst & Banks, Nature, 2002). Others have since demonstrated that this finding holds not only for integration across vision and touch, but also for the integration of information across and within other modalities, such as audition and vision. This suggests that maximum-likelihood estimation is an effective and widely used strategy of the perceptual system.

By integrating sensory information the brain may or may not lose access to the individual input signals feeding into the integrated percept. The degree to which the original information remains accessible defines the strength of coupling between the signals. We found that the strength of coupling varies with the set of signals involved: for example, coupling is strong for stereo and texture signals to slant but weak for visual and haptic signals to size (Hillis, Ernst, Banks, & Landy, Science, 2002). As suggested by one of our recent learning studies, the strength of coupling, which can be modeled using Bayesian statistics, seems to depend on the natural statistical co-occurrence of the signals (Jäkel & Ernst, in prep.).
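One common way to formalize coupling strength is a Gaussian "coupling prior" on the discrepancy between two signals: a narrow prior forces fusion, a broad prior leaves the signals independent. The sketch below is a minimal illustration of that idea under Gaussian noise; the model form and all numbers are my assumptions, not parameters from the cited studies.

    def coupled_estimates(m1, m2, var1, var2, var_c):
        # MAP estimates under a coupling prior N(0, var_c) on s1 - s2:
        # minimize (s1-m1)^2/var1 + (s2-m2)^2/var2 + (s1-s2)^2/var_c.
        # The quadratic's normal equations are solved here in closed form.
        a, b, c = 1.0 / var1, 1.0 / var2, 1.0 / var_c
        det = a * b + a * c + b * c
        s1 = (a * (b + c) * m1 + b * c * m2) / det
        s2 = (b * (a + c) * m2 + a * c * m1) / det
        return s1, s2

    # Strong coupling (small var_c): both estimates fuse near 10.4,
    # the reliability-weighted average.
    print(coupled_estimates(10.0, 12.0, 1.0, 4.0, 0.01))
    # Weak coupling (large var_c): each estimate stays near its own
    # measurement, so the original signals remain accessible.
    print(coupled_estimates(10.0, 12.0, 1.0, 4.0, 100.0))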

An important precondition for integrating signals is knowing which signals from the different modalities belong together and how reliable they are. We recently showed that touch can teach the visual modality how to interpret its signals and their reliabilities. More specifically, we showed that by exploiting touch we can alter the visual perception of slant (Ernst, Banks & Bülthoff, Nature Neuroscience, 2000). This finding speaks to a very old debate over the claim that we perceive the world only because of our interactions with the environment. Similarly, in one of our latest studies we showed that experience can change the so-called "light-from-above" prior. Prior knowledge is essential for the interpretation of sensory signals during perception; consequently, by changing the prior we introduced a change in the perception of shape (Adams, Graf & Ernst, Nature Neuroscience, 2004).
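To make the role of the prior concrete, here is a toy illustration (hypothetical numbers, not data from the cited study): an ambiguous shading pattern is equally consistent with a convex and a concave shape, so the prior alone decides the percept, and shifting the prior flips it.

    def map_percept(likelihoods, prior):
        # Posterior is proportional to likelihood times prior;
        # the percept is the most probable interpretation.
        posterior = {h: likelihoods[h] * prior[h] for h in likelihoods}
        return max(posterior, key=posterior.get)

    likelihoods = {"convex": 0.5, "concave": 0.5}  # ambiguous stimulus

    print(map_percept(likelihoods, {"convex": 0.7, "concave": 0.3}))  # convex
    # After experience shifts the prior, the same stimulus looks different:
    print(map_percept(likelihoods, {"convex": 0.3, "concave": 0.7}))  # concave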

Integration is only sensible if the information sources carry redundant information. If the information sources are complementary, different combination strategies must be employed. Such complementation of cross-modal information was demonstrated in a recent study investigating visual-haptic shape perception (Newell, Ernst, Tjan, & Bülthoff, Psychological Science, 2001).

Details

Language(s): -
 Dates: 2005-10
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: DOI: 10.1145/1088463.1088464
 Degree: -

Event

Title: Seventh International Conference on Multimodal Interfaces (ICMI 2005)
Place of Event: Trento, Italy
Start-/End Date: 2005-10-04 - 2005-10-06

Source 1

Title: ICMI '05: 7th international conference on Multimodal interfaces
Source Genre: Proceedings
Creator(s): -
Affiliations: -
Publ. Info: New York, NY, USA : ACM Press
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 1
Identifier: ISBN: 1-59593-028-0