  Integration of visual and non-visual self-motion cues during voluntary head movements in the human brain

Schindler, A., & Bartels, A. (2018). Integration of visual and non-visual self-motion cues during voluntary head movements in the human brain. NeuroImage, 172, 597-607. doi:10.1016/j.neuroimage.2018.02.006.

Locators

Locator:
Link (Any fulltext)
Description:
-
OA-Status:
Not specified

Creators

Creators:
Schindler, A. 1, 2, Author
Bartels, A. 1, 2, Author
Affiliations:
1 Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497794
2 Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497798

Content

Free keywords: -
Abstract: Our phenomenological experience of the stable world is maintained by continuous integration of visual self-motion with extra-retinal signals. However, due to conventional constraints of fMRI acquisition in humans, neural responses to visuo-vestibular integration have only been studied using artificial stimuli, in the absence of voluntary head motion. We here circumvented these limitations and let participants move their heads during scanning. The slow dynamics of the BOLD signal allowed us to acquire neural signals related to head motion after the observer's head was stabilized by inflatable air cushions. Visual stimuli were presented on head-fixed display goggles and updated in real time as a function of head motion that was tracked using an external camera. Two conditions simulated forward translation of the participant. During physical head rotation, the congruent condition simulated a stable world, whereas the incongruent condition added arbitrary lateral motion. Importantly, both conditions were precisely matched in visual properties and head rotation. By comparing congruent with incongruent conditions we found evidence consistent with the multi-modal integration of visual cues with head motion into a coherent “stable world” percept in the parietal operculum and in an anterior part of parieto-insular cortex (aPIC). In the visual motion network, human regions MST, a dorsal part of VIP, the cingulate sulcus visual area (CSv) and a region in precuneus (Pc) showed differential responses to the same contrast. The results demonstrate for the first time neural multimodal interactions between precisely matched congruent versus incongruent visual and non-visual cues during physical head movement in the human brain. The methodological approach opens the path to a new class of fMRI studies with unprecedented temporal and spatial control over visuo-vestibular stimulation.
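To make the stimulus logic described in the abstract concrete, the following minimal Python sketch shows one way a real-time display update could be driven by tracked head rotation, with a congruent condition simulating a stable world and an incongruent condition adding arbitrary lateral motion on top of the simulated forward translation. This is not the authors' stimulus code: the function name update_view, the forward speed, and the lateral-drift parameters are purely illustrative assumptions.

import numpy as np

def update_view(head_yaw_deg, t, condition, forward_speed=1.0, lateral_amp=0.5):
    # Both conditions simulate forward translation of the observer.
    position = np.array([0.0, 0.0, forward_speed * t])

    # Congruent: the rendered scene counter-rotates with the measured head
    # rotation, so the simulated world appears stable during head movement.
    view_yaw_deg = -head_yaw_deg

    if condition == 'incongruent':
        # Incongruent: same visual properties and head rotation, but an
        # arbitrary lateral motion component is added to the simulated
        # viewpoint (hypothetical 0.4 Hz sinusoidal drift).
        position[0] += lateral_amp * np.sin(2.0 * np.pi * 0.4 * t)

    return position, view_yaw_deg

# Example frame update with hypothetical tracker values:
pos, yaw = update_view(head_yaw_deg=5.0, t=2.5, condition='incongruent')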

Details

Language(s):
Dates: 2018-05
Publication Status: Issued
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: DOI: 10.1016/j.neuroimage.2018.02.006
BibTex Citekey: SchindlerB2018
Degree: -

Source 1

Title: NeuroImage
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: 172
Sequence Number: -
Start / End Page: 597 - 607
Identifier: -