  Forced Fusion in Multisensory Heading Estimation

de Winkel, K., Katliar, M., & Bülthoff, H. (2015). Forced Fusion in Multisensory Heading Estimation. PLoS One, 10(5), 1-20. doi:10.1371/journal.pone.0127104.


Creators

Creators:
de Winkel, KN (1, 2), Author
Katliar, M (1, 2), Author
Bülthoff, HH (1, 2), Author
Affiliations:
1: Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
2: Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
Abstract: It has been shown that the Central Nervous System (CNS) integrates visual and inertial information in heading estimation for congruent multisensory stimuli and stimuli with small discrepancies. Multisensory information should, however, only be integrated when the cues are redundant. Here, we investigated how the CNS constructs an estimate of heading for combinations of visual and inertial heading stimuli with a wide range of discrepancies. Participants were presented with 2-s visual-only and inertial-only motion stimuli, and combinations thereof. Discrepancies between visual and inertial heading ranging from 0° to 90° were introduced for the combined stimuli. In the unisensory conditions, visual heading was generally biased towards the fore-aft axis, while inertial heading was biased away from the fore-aft axis. For multisensory stimuli, five out of nine participants integrated visual and inertial heading information regardless of the size of the discrepancy; for one participant, the data were best described by a model that explicitly performs causal inference. For the remaining three participants the evidence could not readily distinguish between these models. The finding that multisensory information is integrated is in line with earlier findings, but the finding that even large discrepancies are generally disregarded is surprising. Possibly, people are insensitive to discrepancies in visual-inertial heading angle because such discrepancies are only encountered in artificial environments, making a neural mechanism to account for them otiose. An alternative explanation is that detection of a discrepancy may depend on stimulus duration, with sensitivity to detect discrepancies differing between people.
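
The two models compared in the abstract can be summarized compactly. Below is a minimal sketch, not the paper's fitted model, of forced fusion (reliability-weighted averaging of the visual and inertial cues) and of a Körding-style causal-inference observer that weights the fused and unisensory estimates by the posterior probability that both cues share a single cause. The noise standard deviations, the Gaussian heading prior, and the prior probability of a common cause are illustrative assumptions, not values reported in the paper.

```python
import numpy as np

def forced_fusion(x_vis, x_ine, sigma_vis, sigma_ine):
    """Maximum-likelihood (forced) fusion of a visual and an inertial heading cue.

    Each cue is weighted by its inverse variance; the fused estimate has a
    lower variance than either unisensory estimate.
    """
    w_vis, w_ine = 1.0 / sigma_vis**2, 1.0 / sigma_ine**2
    s_hat = (w_vis * x_vis + w_ine * x_ine) / (w_vis + w_ine)
    var_hat = 1.0 / (w_vis + w_ine)
    return s_hat, var_hat


def causal_inference(x_vis, x_ine, sigma_vis, sigma_ine,
                     sigma_prior=60.0, p_common=0.5):
    """Model-averaging causal-inference estimate of heading (Koerding et al., 2007 style).

    The posterior probability that both cues arose from one heading is inferred
    from their discrepancy; the reported (visual) estimate mixes the fused
    one-cause estimate with the unisensory two-cause estimate by that probability.
    A zero-mean Gaussian prior over heading and the parameter defaults are
    purely illustrative assumptions.
    """
    v_v, v_i, v_p = sigma_vis**2, sigma_ine**2, sigma_prior**2

    # Likelihood of the cue pair under a single common heading
    denom_c1 = v_v * v_i + v_v * v_p + v_i * v_p
    like_c1 = np.exp(-0.5 * ((x_vis - x_ine)**2 * v_p
                             + x_vis**2 * v_i + x_ine**2 * v_v) / denom_c1) \
        / (2 * np.pi * np.sqrt(denom_c1))

    # Likelihood under two independent headings
    like_c2 = (np.exp(-0.5 * x_vis**2 / (v_v + v_p)) / np.sqrt(2 * np.pi * (v_v + v_p))
               * np.exp(-0.5 * x_ine**2 / (v_i + v_p)) / np.sqrt(2 * np.pi * (v_i + v_p)))

    # Posterior probability of a common cause
    p_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Optimal estimates under each causal structure (prior mean = 0)
    s_c1 = (x_vis / v_v + x_ine / v_i) / (1 / v_v + 1 / v_i + 1 / v_p)
    s_c2 = (x_vis / v_v) / (1 / v_v + 1 / v_p)  # visual estimate, separate causes

    # Model averaging: weight the two estimates by the causal posterior
    return p_c1 * s_c1 + (1 - p_c1) * s_c2


if __name__ == "__main__":
    # Visual and inertial headings differ by 60 deg; noise SDs are illustrative.
    print(forced_fusion(0.0, 60.0, sigma_vis=5.0, sigma_ine=10.0))
    print(causal_inference(0.0, 60.0, sigma_vis=5.0, sigma_ine=10.0))
```

With such parameters, forced fusion splits any discrepancy strictly according to cue reliability, whereas the causal-inference observer increasingly discounts the discrepant cue as the conflict grows; the paper's central result is that most participants behaved like the former even at large discrepancies.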

Details

Language(s):
Dates: 2015-05
Publication Status: Published online
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: DOI: 10.1371/journal.pone.0127104
eDoc: e0127104
BibTex Citekey: deWinkelKB2015
Degree: -

Source 1

Title: PLoS One
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: San Francisco, CA : Public Library of Science
Pages: -
Volume / Issue: 10 (5)
Sequence Number: -
Start / End Page: 1 - 20
Identifier: ISSN: 1932-6203
CoNE: https://pure.mpg.de/cone/journals/resource/1000000000277850