Item Details

Released

Journal Article

Combining Sensory Information: Mandatory Fusion Within, but Not Between, Senses

MPS-Authors

Ernst,  MO
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resource
Fulltext (restricted access)
Fulltext (public)
There is no public fulltext available
Supplementary Material (public)
There is no public supplementary material available
Citation

Hillis, J., Ernst, M., Banks, M., & Landy, M. (2002). Combining Sensory Information: Mandatory Fusion Within, but Not Between, Senses. Science, 298(5598), 1627-1630. doi:10.1126/science.1075396.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-DE44-D
Abstract
Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object’s shape. The eyes pick up shape information from the object’s projected outline, its disparity gradient, texture gradient, shading, and more. The hands supply tactile and haptic shape information (respectively, static and active cues). When multiple cues are available, it would be sensible to combine them in a way that yields a more accurate estimate of the object property in question than any single-cue estimate would. By combining information from multiple sources, the nervous system might lose access to single-cue information. Here we report that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when cues from different modalities (vision and haptics) are combined. When one considers the nature of within- and inter-modal information, this difference is perfectly reasonable.
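The abstract's claim that combined cues yield a more accurate estimate than any single cue is usually formalized as maximum-likelihood (inverse-variance weighted) cue combination. A minimal sketch under that assumption follows; the function name and the example numbers are illustrative, not taken from the paper:

```python
def combine_cues(estimates, variances):
    """Reliability-weighted average of unbiased single-cue estimates.

    Each cue contributes in proportion to its inverse variance (reliability);
    the variance of the fused estimate is the reciprocal of the summed
    reliabilities, so it is always smaller than any single-cue variance.
    """
    inverse_vars = [1.0 / v for v in variances]
    total_reliability = sum(inverse_vars)
    combined = sum(w / total_reliability * s
                   for w, s in zip(inverse_vars, estimates))
    combined_var = 1.0 / total_reliability
    return combined, combined_var


# Hypothetical example: a disparity cue and a texture cue to surface slant.
slant, var = combine_cues([30.0, 34.0], [4.0, 1.0])
# The fused estimate lies closer to the more reliable cue, and its variance
# (0.8) is below the smaller single-cue variance (1.0).
```

Under this model, "mandatory fusion" within a modality corresponds to the observer retaining only the fused estimate, discarding the individual single-cue values that went into the weighted sum.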