  A cortical hierarchy performs Bayesian Causal Inference for multisensory perception

Rohe, T., & Noppeney, U. (2014). A cortical hierarchy performs Bayesian Causal Inference for multisensory perception. Poster presented at 20th Annual Meeting of the Organization for Human Brain Mapping (OHBM 2014), Hamburg, Germany.

Basic

Item Permalink: http://hdl.handle.net/21.11116/0000-0001-32A2-F
Version Permalink: http://hdl.handle.net/21.11116/0000-0001-35C0-A
Genre: Poster

Files


Creators

Creators:
Rohe, Tim 1,2, Author
Noppeney, Uta 1,2, Author
Affiliations:
1 Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497804
2 Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497794

Content

Free keywords: -
Abstract:

Introduction: To form a reliable percept of the multisensory environment, the brain integrates signals across the senses. However, it should integrate signals only when they are caused by a common source and segregate those from different sources (Shams and Beierholm, 2010). Bayesian Causal Inference provides a rational strategy for arbitrating between information integration and segregation: in the case of a common source, signals should be integrated, weighted by their sensory reliability (Ernst and Banks, 2002; Alais and Burr, 2004; Fetsch et al., 2012); in the case of separate sources, they should be processed independently. Yet in everyday life the brain does not know whether signals come from common or different sources, but needs to infer the probabilities of these causal structures from the sensory signals. A final estimate can then be obtained by averaging the estimates under the two causal structures, weighted by their posterior probabilities (i.e. model averaging). Indeed, human observers locate audiovisual signal sources by combining the spatial estimates under the assumptions of common and separate sources, weighted by their probabilities (Kording et al., 2007). Yet the neural basis of Bayesian Causal Inference during spatial localization remains unknown. This study combines Bayesian modeling and multivariate fMRI decoding to characterize how Bayesian Causal Inference is performed by the auditory and visual cortical hierarchies (Fig. 1A-C).

Methods: Participants (N = 5) were presented with auditory and visual signals that were independently sampled from four locations along the azimuth. The spatial reliability of the visual signal was high or low. In a selective attention paradigm, participants localized either the auditory or the visual spatial signal. After fitting the Bayesian Causal Inference model to participants' localization responses, we obtained condition-specific auditory and visual spatial estimates under the assumption of (i) a common source (SAV,C=1), (ii) separate sources (SA,C=2, SV,C=2), and (iii) the final combined spatial estimate after model averaging (SA, SV), i.e. five spatial estimates in total (Fig. 1C). Using cross-validation, we trained a support vector regression model to decode these auditory or visual spatial estimates from fMRI voxel response patterns in regions along the visual and auditory cortical hierarchies. We evaluated the decoding accuracy for each spatial estimate as the correlation coefficient between the spatial estimate decoded from fMRI and that predicted by the Bayesian Causal Inference model. To determine the spatial estimate primarily encoded in a region, we then computed the exceedance probability that the correlation coefficient of one spatial estimate was greater than that of any other spatial estimate (Fig. 1D).

Results: Bayesian Causal Inference emerged along the auditory and visual hierarchies: lower-level visual and auditory areas encoded auditory and visual estimates under the assumption of separate sources (i.e. information segregation). Posterior intraparietal sulcus (IPS1-2) represented the reliability-weighted average of the signals under the common-source assumption. Anterior IPS (IPS3-4) represented the task-relevant auditory or visual spatial estimate obtained from model averaging.

Conclusions: This is the first demonstration that the computational operations underlying Bayesian Causal Inference are performed by the human brain in a hierarchical fashion. Critically, the brain explicitly encodes the spatial estimates not only under the assumption of full segregation (primary visual and auditory areas) but also under forced fusion (IPS1-2). These spatial estimates under the causal structures of common and separate sources are then averaged into task-relevant auditory or visual estimates according to model averaging (IPS3-4). Our study provides a novel hierarchical perspective on multisensory integration in human neocortex.
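The model-averaging computation described in the abstract can be sketched in a few lines. The following is a minimal illustration of the standard Bayesian Causal Inference equations for spatial localization (Kording et al., 2007), assuming Gaussian sensory noise and a zero-centered Gaussian spatial prior; the noise widths, prior width, and prior probability of a common source below are hypothetical placeholders, not the values fitted to participants' responses in this study.

```python
import numpy as np

def bci_estimates(x_a, x_v, sigma_a=2.0, sigma_v=1.0,
                  sigma_p=10.0, p_common=0.5):
    """Return model-averaged location estimates and P(C=1 | x_a, x_v).

    x_a, x_v         : internal auditory / visual measurements of location
    sigma_a, sigma_v : auditory / visual sensory noise (std. dev.) -- placeholders
    sigma_p          : std. dev. of the zero-centered spatial prior -- placeholder
    p_common         : prior probability of a common source -- placeholder
    """
    j_a, j_v, j_p = 1 / sigma_a**2, 1 / sigma_v**2, 1 / sigma_p**2  # precisions

    # (i) Common source (C=1): reliability-weighted fusion of both signals
    s_av_c1 = (j_a * x_a + j_v * x_v) / (j_a + j_v + j_p)

    # (ii) Separate sources (C=2): each signal combined with the prior only
    s_a_c2 = j_a * x_a / (j_a + j_p)
    s_v_c2 = j_v * x_v / (j_v + j_p)

    # Likelihood of the signal pair under each causal structure
    # (measurement noise marginalized over the unknown source location)
    var_1 = (sigma_a**2 * sigma_v**2 + sigma_a**2 * sigma_p**2
             + sigma_v**2 * sigma_p**2)
    like_c1 = np.exp(-0.5 * ((x_a - x_v)**2 * sigma_p**2
                             + x_a**2 * sigma_v**2
                             + x_v**2 * sigma_a**2) / var_1) \
        / (2 * np.pi * np.sqrt(var_1))
    var_2 = (sigma_a**2 + sigma_p**2) * (sigma_v**2 + sigma_p**2)
    like_c2 = np.exp(-0.5 * (x_a**2 / (sigma_a**2 + sigma_p**2)
                             + x_v**2 / (sigma_v**2 + sigma_p**2))) \
        / (2 * np.pi * np.sqrt(var_2))

    # Posterior probability of a common source (Bayes' rule)
    post_c1 = like_c1 * p_common / (like_c1 * p_common
                                    + like_c2 * (1 - p_common))

    # (iii) Model averaging: mix the estimates under the two causal
    # structures, weighted by their posterior probabilities
    s_hat_a = post_c1 * s_av_c1 + (1 - post_c1) * s_a_c2
    s_hat_v = post_c1 * s_av_c1 + (1 - post_c1) * s_v_c2
    return s_hat_a, s_hat_v, post_c1
```

For spatially congruent signals the posterior probability of a common source is high, so the final estimates are pulled toward the fused percept; for widely discrepant signals it drops and the estimates revert toward the segregated, unisensory ones.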

Details

Language(s):
 Dates: 2014-06
 Publication Status: Published in print
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Method: -
 Identifiers: BibTex Citekey: RoheN2014
 Degree: -

Event

Title: 20th Annual Meeting of the Organization for Human Brain Mapping (OHBM 2014)
Place of Event: Hamburg, Germany
Start-/End Date: -

Source 1

Title: 20th Annual Meeting of the Organization for Human Brain Mapping (OHBM 2014)
Source Genre: Proceedings
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: 4050
Start / End Page: -
Identifier: -