  Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies

Campos, J. L., Butler, J., & Bülthoff, H. (2014). Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies. Experimental Brain Research, 232(10), 3277-3289. doi:10.1007/s00221-014-4011-0.

Basic
Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0027-7FB1-5
Version Permalink: http://hdl.handle.net/21.11116/0000-0001-2766-1
Genre: Journal Article

Creators

Creators:
Campos, Jennifer L.1,2, Author
Butler, J. S.1,2, Author
Bülthoff, H. H.1,2, Author
Affiliations:
1Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797              
2Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794              

Content

Free keywords: -
Abstract: Recent research has provided evidence that visual and body-based cues (vestibular, proprioceptive and efference copy) are integrated using a weighted linear sum during walking and passive transport. However, little is known about the specific weighting of visual information when combined with proprioceptive inputs alone, in the absence of vestibular information about forward self-motion. Therefore, in this study, participants walked in place on a stationary treadmill while dynamic visual information was updated in real time via a head-mounted display. The task required participants to travel a predefined distance and subsequently match this distance by adjusting an egocentric, in-depth target using a game controller. Travelled distance information was provided either through visual cues alone, proprioceptive cues alone or both cues combined. In the combined cue condition, the relationship between the two cues was manipulated by either changing the visual gain across trials (0.7×, 1.0×, 1.4×; Exp. 1) or the proprioceptive gain across trials (0.7×, 1.0×, 1.4×; Exp. 2). Results demonstrated an overall higher weighting of proprioception over vision. These weights were scaled, however, as a function of which sensory input provided more stable information across trials. Specifically, when visual gain was constantly manipulated, proprioceptive weights were higher than when proprioceptive gain was constantly manipulated. These results therefore reveal interesting characteristics of cue-weighting within the context of unfolding spatio-temporal cue dynamics.
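The weighted-linear-sum model referenced in the abstract can be sketched as follows. This is a minimal illustration of the general model class, not the authors' analysis code; the weight and distance values are assumed for the example (the study reports only that proprioception received the higher weight).

```python
# Sketch of a weighted linear sum of two distance cues.
# Weights and cue estimates are illustrative values, not data from the study.

def combine_cues(d_visual: float, d_proprio: float, w_proprio: float) -> float:
    """Combined travelled-distance estimate as a weighted linear sum.

    w_proprio is the proprioceptive weight; the visual weight is its
    complement so the two weights sum to 1.
    """
    w_visual = 1.0 - w_proprio
    return w_visual * d_visual + w_proprio * d_proprio

# Example with proprioception weighted more heavily than vision,
# qualitatively consistent with the paper's overall finding.
estimate = combine_cues(d_visual=10.0, d_proprio=8.0, w_proprio=0.7)
print(estimate)  # 0.3 * 10.0 + 0.7 * 8.0 = 8.6
```

Because the weights sum to one, the combined estimate always lies between the two single-cue estimates; shifting weight toward the more stable cue moves the estimate toward it.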

Details

Language(s):
 Dates: 2014-10
 Publication Status: Published in print
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Method: -
 Identifiers: DOI: 10.1007/s00221-014-4011-0
BibTex Citekey: CamposBB2014
 Degree: -

Source 1

Title: Experimental Brain Research
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: 232 (10)
Sequence Number: -
Start / End Page: 3277 - 3289
Identifier: -