  Leading or Lagging: Temporal prediction errors are expressed in auditory and visual cortices

Lee, H., & Noppeney, U. (2012). Leading or Lagging: Temporal prediction errors are expressed in auditory and visual cortices. Poster presented at 18th Annual Meeting of the Organization for Human Brain Mapping (OHBM 2012), Beijing, China.

Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0013-B738-3
Version Permalink: http://hdl.handle.net/21.11116/0000-0001-9E79-6
Genre: Poster

Files

OHBM-2012-Lee.pdf (Abstract), 235KB
Name: OHBM-2012-Lee.pdf
Description: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: -

Creators

Lee, HL1, 2, Author
Noppeney, U1, 2, Author
Affiliations:
1Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497804              
2Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794              

Content

Free keywords: -
Abstract:
Introduction: In our natural environment our brain is exposed to a constant influx of multisensory signals that dynamically evolve at multiple timescales. Statistical regularities are important cues informing the brain whether two sensory signals are generated by a common physical process and should hence be integrated. This fMRI study investigated how the brain detects violations of these statistical regularities induced by the temporal misalignment of the visual and auditory signals. Specifically, we arbitrated between two hypotheses that make opposite predictions. Under the predictive coding framework, the brain iteratively optimizes an internal model of its multisensory environment by reducing the error between its predictions and the sensory inputs. An audiovisual misalignment that violates the natural statistical regularities should thus induce a prediction error signal. For visual leading asynchrony, we would expect a prediction error signal in the auditory cortex, because the delayed auditory signal violates the temporal predictions of the 'leading' visual system (and vice versa for auditory leading asynchrony) [2,3]. Alternatively, from the perspective of the biased competition model, the misaligned auditory and visual signals compete for processing resources. For visual leading asynchrony, we would expect an increased BOLD signal in the visual system, indexing the higher salience of the leading visual signal, which then suppresses the temporally incompatible auditory signal [1].
Methods: 37 subjects participated in this fMRI study (Siemens TimTrio 3T scanner, GE-EPI, TE = 40 ms, 42 axial slices, TR = 3 s). They passively perceived audiovisual movies of natural speech, sine-wave speech (SWS) and piano music. The audiovisual signals were synchronous, auditory leading (+240 ms) or visual leading (-240 ms). Hence, the 3 × 3 factorial design manipulated (i) temporal alignment (3 levels) and (ii) stimulus class (3 levels). The activation trials were interleaved with 8 s fixation blocks. To allow for random-effects analyses, contrast images (single condition > fixation) for each subject were entered into a 2nd-level ANOVA, which modelled the 9 effects in our 3 × 3 design. 1. Using a conjunction analysis testing the conjunction null, we identified differences between auditory and visual leading conditions that are common to speech, SWS and music. 2. We tested for asynchrony effects (i.e. auditory leading > synchronous, visual leading > synchronous) separately for each stimulus class. Results are reported at p < .05, corrected for multiple comparisons at the cluster level using a height threshold of p < .001 uncorrected.
Results: 1. Common to all stimulus classes, auditory leading relative to visual leading signals increased activations in bilateral V5/hMT+. In contrast, visual leading relative to auditory leading signals increased activations in bilateral Heschl's gyri (Fig. 1). 2. Auditory leading relative to synchronous AV signals increased activations in the auditory system, extending from Heschl's gyrus into the posterior superior temporal sulcus/gyrus (STS/STG) bilaterally. Conversely, visual leading relative to synchronous signals increased activations in bilateral occipito-temporal cortices, predominantly in V5/hMT+ (Fig. 2).
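The 2nd-level analysis described in the Methods can be illustrated with a small sketch: the 9 cells of the 3 × 3 design become 9 regressor columns, and each reported comparison is a contrast vector over those columns. The condition ordering, the variable names, and the `contrast` helper below are hypothetical illustrations, not taken from the authors' analysis.

```python
from itertools import product

# Hypothetical encoding of the 3 x 3 factorial design from the abstract:
# temporal alignment (3 levels) x stimulus class (3 levels) = 9 conditions.
alignments = ["sync", "aud_lead", "vis_lead"]   # 0 ms, +240 ms, -240 ms
stimuli = ["speech", "sws", "music"]

# Assumed ordering of the 9 condition columns in the 2nd-level ANOVA.
conditions = list(product(alignments, stimuli))

def contrast(positive, negative):
    """Build a contrast vector over the 9 conditions:
    +1 for cells listed in `positive`, -1 for cells in `negative`, else 0."""
    return [1 if c in positive else -1 if c in negative else 0
            for c in conditions]

# Asynchrony effect for one stimulus class,
# e.g. auditory leading > synchronous for speech:
aud_gt_sync_speech = contrast(
    positive=[("aud_lead", "speech")],
    negative=[("sync", "speech")],
)

# Auditory leading > visual leading pooled over all stimulus classes;
# the per-class versions of this contrast would feed the conjunction
# analysis across speech, SWS and music:
aud_gt_vis_all = contrast(
    positive=[("aud_lead", s) for s in stimuli],
    negative=[("vis_lead", s) for s in stimuli],
)
```

Each contrast sums to zero, so it tests a difference between cells rather than an overall activation level.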

Details

Language(s):
 Dates: 2012-06
 Publication Status: Published in print
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Method: -
 Identifiers: URI: http://ww4.aievolution.com/hbm1201/index.cfm?do=abs.viewAbs&abs=6296
BibTex Citekey: LeeN2012
 Degree: -

Event

Title: 18th Annual Meeting of the Organization for Human Brain Mapping (OHBM 2012)
Place of Event: Beijing, China
Start-/End Date: -

Source 1

Title: 18th Annual Meeting of the Organization for Human Brain Mapping (OHBM 2012)
Source Genre: Proceedings
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: 997
Start / End Page: -
Identifier: -