
Journal Article

Combining Sensory Information: Mandatory Fusion Within, but Not Between, Senses

MPS-Authors

Ernst,  MO
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Hillis, J., Ernst, M., Banks, M., & Landy, M. (2002). Combining Sensory Information: Mandatory Fusion Within, but Not Between, Senses. Science, 298(5598), 1627-1630. doi:10.1126/science.1075396.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-DE44-D
Abstract
Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object's shape. The eyes pick up shape information from the object's projected outline, its disparity gradient, texture gradient, shading, and more. The hands supply tactile and haptic shape information (static and active cues, respectively). When multiple cues are available, it would be sensible to combine them in a way that yields a more accurate estimate of the object property in question than any single-cue estimate would. In combining information from multiple sources, however, the nervous system may lose access to single-cue information. Here we report that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when cues from different modalities (vision and haptics) are combined. When one considers the nature of within- and inter-modal information, this difference is perfectly reasonable.
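
Note (not part of the repository record): the "sensible combination" the abstract alludes to is standardly formalized as maximum-likelihood cue fusion, the model this line of work builds on. A minimal sketch, assuming independent, unbiased estimates with Gaussian noise from, say, texture (T) and disparity (D); the symbols below are illustrative, not taken from the paper:

    \hat{s}_{TD} = w_T \hat{s}_T + w_D \hat{s}_D,
    \qquad
    w_i = \frac{1/\sigma_i^2}{1/\sigma_T^2 + 1/\sigma_D^2},
    \qquad
    \sigma_{TD}^2 = \frac{\sigma_T^2 \, \sigma_D^2}{\sigma_T^2 + \sigma_D^2}
    \le \min(\sigma_T^2, \sigma_D^2)

Because the fused variance never exceeds the smaller single-cue variance, fusion improves (or at worst matches) single-cue precision. The cost described in the abstract is that a mandatorily fused system retains only \hat{s}_{TD}, losing access to \hat{s}_T and \hat{s}_D individually.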