  Within- and cross-modal distance information disambiguates visual size perception

Battaglia, P., Di Luca, M., Ernst, M., Schrater, P., Machulla, T., & Kersten, D. (2010). Within- and cross-modal distance information disambiguates visual size perception. PLoS Computational Biology, 6(3), 1-10. doi:10.1371/journal.pcbi.1000697.

Basic
Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0013-C100-E
Version Permalink: http://hdl.handle.net/21.11116/0000-0002-7745-B
Genre: Journal Article

Creators

Creators:
Battaglia, PW (1, 2, 3), Author
Di Luca, M (1, 2, 3), Author
Ernst, MO (1, 2, 3), Author
Schrater, PR, Author
Machulla, T (1, 2, 3), Author
Kersten, D (1, 3), Author
Affiliations:
1. Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497794
2. Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497806
3. Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797

Content

Free keywords: -
Abstract: Perception is fundamentally underconstrained because different combinations of object properties can generate the same sensory information. To disambiguate sensory information into estimates of scene properties, our brains incorporate prior knowledge and additional "auxiliary" (i.e., not directly relevant to the desired scene property) sensory information to constrain perceptual interpretations. For example, knowing the distance to an object helps in perceiving its size. The literature contains few demonstrations of the use of prior knowledge and auxiliary information in combined visual and haptic disambiguation, and almost no examination of haptic disambiguation of vision beyond "bistable" stimuli. Previous studies have reported that humans integrate multiple unambiguous sensations to perceive single, continuous object properties, like size or position. Here we test whether humans use visual and haptic information, individually and jointly, to disambiguate size from distance. We presented participants with a ball moving in depth with a changing diameter. Because no unambiguous distance information is available under monocular viewing, participants rely on prior assumptions about the ball's distance to disambiguate their size percept. Presenting auxiliary binocular and/or haptic distance information augments participants' prior distance assumptions and improves their size judgment accuracy, though binocular cues were trusted more than haptic cues. Our results suggest both visual and haptic distance information disambiguate size perception, and we interpret these results in the context of probabilistic perceptual reasoning.
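The abstract's probabilistic framing can be illustrated with a minimal sketch (not the paper's actual model; all function names, parameters, and numbers below are illustrative assumptions): an ambiguous visual angle is disambiguated by a Gaussian distance prior, which an auxiliary distance cue sharpens via standard precision-weighted fusion.

```python
import math

def combine_gaussian(mu_a, sigma_a, mu_b, sigma_b):
    """Precision-weighted fusion of two Gaussian estimates of the same quantity."""
    pa, pb = 1.0 / sigma_a**2, 1.0 / sigma_b**2
    mu = (pa * mu_a + pb * mu_b) / (pa + pb)
    return mu, math.sqrt(1.0 / (pa + pb))

def estimate_size(visual_angle_rad, dist_mu, dist_sigma, cue_mu=None, cue_sigma=None):
    """Size from visual angle alone is ambiguous; a belief about distance resolves it.
    Small-angle approximation: size ~ visual_angle * distance."""
    if cue_mu is not None:
        # Auxiliary distance cue (e.g., binocular or haptic) sharpens the prior.
        dist_mu, dist_sigma = combine_gaussian(dist_mu, dist_sigma, cue_mu, cue_sigma)
    return visual_angle_rad * dist_mu

# Hypothetical example: a 0.10 m ball at 4 m subtends roughly 0.025 rad.
angle = 0.10 / 4.0
prior_only = estimate_size(angle, dist_mu=2.0, dist_sigma=1.0)
with_cue = estimate_size(angle, dist_mu=2.0, dist_sigma=1.0, cue_mu=4.0, cue_sigma=0.5)
# prior_only is biased toward the prior (0.05 m); with_cue moves toward 0.10 m.
```

With only the (wrong) distance prior, the size estimate is half the true diameter; adding a reliable auxiliary distance cue pulls the fused distance, and hence the size estimate, toward the correct value, mirroring the paper's finding that auxiliary distance information improves size judgment accuracy.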

Details

Language(s): -
 Dates: 2010-03
 Publication Status: Published online
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Method: -
 Identifiers: DOI: 10.1371/journal.pcbi.1000697
BibTex Citekey: 5774
eDoc: e1000697
 Degree: -


Source 1

Title: PLoS Computational Biology
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: San Francisco, CA : Public Library of Science
Pages: -
Volume / Issue: 6 (3)
Sequence Number: -
Start / End Page: 1 - 10
Identifier: ISSN: 1553-734X
CoNE: https://pure.mpg.de/cone/journals/resource/1000000000017180_1