

Meeting Abstract

Neural systems involved in visual-tactile integration of shape information

MPS-Authors

Helbig,  HB
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Noppeney,  U
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Ernst,  M
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

Helbig, H., Noppeney, U., & Ernst, M. (2007). Neural systems involved in visual-tactile integration of shape information. In M. Falkenstein, M. Grosjean, G. Rinkenauer, & E. Wascher (Eds.), Psychologie und Gehirn 2007 (p. 3). Dortmund, Germany: Institut für Arbeitsphysiologie: Universität Dortmund.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-CD83-4
Abstract
The brain integrates multisensory information to create a coherent and more reliable perceptual estimate of the environment. This multisensory estimate is a linear combination of the individual unimodal estimates, each weighted by its relative reliability (e.g., Ernst and Banks, Nature, 2002). Here we explored the neural substrates underlying visual-tactile integration in shape processing. To identify multisensory integration sites, we correlated behavioural data with neural activity evoked by multisensory integration. Observers were presented with elliptical shapes that they could see and/or touch, and their task was to judge the shape of the ellipse. Introducing conflicts between the seen and felt shape allowed us to examine whether participants relied more on visual or tactile information (the relative weight of vision and touch). To manipulate the weight attributed to vision, we degraded the visual information. We observed a decrease in visual weight when vision was degraded and thus became less reliable. Discrimination performance increased when both modalities were presented together, indicating that visual and tactile shape information is indeed fused. The BOLD response bilaterally in the anterior IPS was modulated by visual input. The change in BOLD signal in these areas correlated with the cue weights, suggesting that this activity reflects the relative weighting of vision and touch.
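The reliability-weighted linear combination cited in the abstract (Ernst and Banks, Nature, 2002) can be sketched as follows. This is a minimal illustration of the standard maximum-likelihood cue-combination model, not the analysis used in the study; the function name and the example estimates and variances are hypothetical.

```python
def combine_cues(est_vision, var_vision, est_touch, var_touch):
    """Fuse two unimodal shape estimates, weighting each by its
    relative reliability (reliability = 1 / variance)."""
    r_v = 1.0 / var_vision
    r_t = 1.0 / var_touch
    w_v = r_v / (r_v + r_t)   # visual weight drops as vision is degraded
    w_t = r_t / (r_v + r_t)   # weights sum to 1
    fused = w_v * est_vision + w_t * est_touch
    # The fused estimate is more reliable (lower variance) than either cue alone,
    # matching the improved discrimination performance reported in the abstract.
    fused_var = 1.0 / (r_v + r_t)
    return fused, fused_var

# Degrading vision (larger variance) shifts the weight toward touch:
shape, var = combine_cues(est_vision=1.2, var_vision=4.0,
                          est_touch=1.0, var_touch=1.0)
# here w_v = 0.2, so the fused estimate lies close to the tactile estimate
```

In this model, degrading the visual stimulus raises `var_vision` and therefore lowers the visual weight, which is the behavioural signature the experiment used to identify integration.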