  Reliability-Weighted Integration of Audiovisual Signals Can Be Modulated by Top-down Control

Rohe, T., & Noppeney, U. (2018). Reliability-Weighted Integration of Audiovisual Signals Can Be Modulated by Top-down Control. eNeuro, 5(1), 1-20. doi:10.1523/ENEURO.0315-17.2018.

Basic
Item Permalink: http://hdl.handle.net/21.11116/0000-0001-7D14-D
Version Permalink: http://hdl.handle.net/21.11116/0000-0004-B5E2-0
Genre: Journal Article


Locators

Locator: Link (Any fulltext)
Description: -

Creators

Creators:
Rohe, T. (1, 2, 3), Author
Noppeney, U. (1, 2, 3), Author
Affiliations:
1 Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
2 Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497794
3 Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497804

Content

Free keywords: -
 Abstract: Behaviorally, it is well established that human observers integrate signals near-optimally weighted in proportion to their reliabilities as predicted by maximum likelihood estimation. Yet, despite abundant behavioral evidence, it is unclear how the human brain accomplishes this feat. In a spatial ventriloquist paradigm, participants were presented with auditory, visual, and audiovisual signals and reported the location of the auditory or the visual signal. Combining psychophysics, multivariate functional MRI (fMRI) decoding, and models of maximum likelihood estimation (MLE), we characterized the computational operations underlying audiovisual integration at distinct cortical levels. We estimated observers' behavioral weights by fitting psychometric functions to participants' localization responses. Likewise, we estimated the neural weights by fitting neurometric functions to spatial locations decoded from regional fMRI activation patterns. Our results demonstrate that low-level auditory and visual areas encode predominantly the spatial location of the signal component of a region's preferred auditory (or visual) modality. By contrast, intraparietal sulcus forms spatial representations by integrating auditory and visual signals weighted by their reliabilities. Critically, the neural and behavioral weights and the variance of the spatial representations depended not only on the sensory reliabilities as predicted by the MLE model but also on participants' modality-specific attention and report (i.e., visual vs. auditory). These results suggest that audiovisual integration is not exclusively determined by bottom-up sensory reliabilities. Instead, modality-specific attention and report can flexibly modulate how intraparietal sulcus integrates sensory signals into spatial representations to guide behavioral responses (e.g., localization and orienting).
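The maximum likelihood estimation scheme the abstract refers to can be sketched as inverse-variance (reliability) weighting of the two cues. This is a minimal illustration of the standard MLE cue-combination formula, not the authors' analysis code; the function name and the example locations and noise levels are made up:

```python
from math import sqrt

def mle_fusion(s_a, sigma_a, s_v, sigma_v):
    """Fuse auditory and visual location estimates by inverse-variance
    (reliability) weighting, as in maximum likelihood estimation."""
    r_a = 1.0 / sigma_a ** 2              # auditory reliability
    r_v = 1.0 / sigma_v ** 2              # visual reliability
    w_a = r_a / (r_a + r_v)               # auditory weight (visual weight = 1 - w_a)
    s_av = w_a * s_a + (1.0 - w_a) * s_v  # fused location estimate
    sigma_av = sqrt(1.0 / (r_a + r_v))    # fused SD, never larger than either cue's
    return s_av, sigma_av

# Example: a precise visual cue (sigma 1) pulls the fused estimate
# toward itself and away from a noisy auditory cue (sigma 4),
# as in the ventriloquist effect.
loc, sd = mle_fusion(s_a=10.0, sigma_a=4.0, s_v=2.0, sigma_v=1.0)
```

The fused variance is smaller than either unisensory variance, which is the behavioral signature of near-optimal integration that the paper tests at the neural level.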

Details

Language(s): -
Dates: 2018-02
Publication Status: Published in print
Pages: -
Publishing info: -
Table of Contents: -
Rev. Method: -
Identifiers: DOI: 10.1523/ENEURO.0315-17.2018
BibTex Citekey: RoheN2018
Degree: -

Source 1

Title: eNeuro
Source Genre: Journal
Creator(s): -
Affiliations: -
Publ. Info: -
Pages: -
Volume / Issue: 5 (1)
Sequence Number: -
Start / End Page: 1 - 20
Identifier: -