  Modulation of early auditory processing by visual information: Prediction or bimodal integration?

Stuckenberg, M., Schröger, E., & Widmann, A. (2021). Modulation of early auditory processing by visual information: Prediction or bimodal integration? Attention, Perception & Psychophysics, 83(4), 1538-1551. doi:10.3758/s13414-021-02240-1.


Files

Stuckenberg_2021.pdf (Publisher version), 2MB
Name: Stuckenberg_2021.pdf
Description: -
OA-Status: Hybrid
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -


Creators

Creators:
Stuckenberg, Maria (1, 2), Author
Schröger, Erich (1), Author
Widmann, Andreas (1, 3), Author
Affiliations:
1. Institute of Psychology, University of Leipzig, Germany
2. International Max Planck Research School on Neuroscience of Communication: Function, Structure, and Plasticity, MPI for Human Cognitive and Brain Sciences, Max Planck Society
3. Leibniz Institute for Neurobiology, Magdeburg, Germany

Content

Free keywords: Audition; MEG; Methods: EEG; Multisensory processing
Abstract: What happens if a visual cue misleads auditory expectations? Previous studies revealed an early visuo–auditory incongruency effect, the so-called incongruency response (IR) of the auditory event-related brain potential (ERP), occurring 100 ms after the onset of a sound that is incongruent with the preceding visual cue. So far, this effect has been taken to reflect the mismatch between the auditory sensory expectation activated by visual predictive information and the actual sensory input. Thus, an IR should be confined to an asynchronous presentation of visual cue and sound. Alternatively, one could argue that frequently presented congruent visual-cue–sound combinations are integrated into a bimodal representation, whereby a violation of the visual–auditory relationship results in a bimodal feature mismatch (in which case the IR should be obtained with both asynchronous and synchronous presentation). In an asynchronous condition, either a high-pitched or a low-pitched sound was preceded by a visual note symbol presented above or below a fixation cross (90% congruent; 10% incongruent), while in a synchronous condition, both were presented simultaneously. High-pitched and low-pitched sounds were presented with different probabilities (83% vs. 17%) to form a strong association between bimodal stimuli. In both conditions, tones with pitch incongruent with the location of the note symbols elicited incongruency effects in the N2 and P3 ERPs; however, the IR was elicited only in the asynchronous condition. This finding supports the sensorial prediction error hypothesis, which states that the amplitude of the auditory ERP 100 ms after sound onset is enhanced in response to unexpected compared with expected but otherwise identical sounds.

Details

Language(s): eng - English
Dates: 2020-12-29 / 2021-01-27 / 2021-05
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: DOI: 10.3758/s13414-021-02240-1
Other: epub 2021
PMID: 33506354
 Degree: -


Project information

Project name : -
Grant ID : -
Funding program : -
Funding organization : International Max Planck Research School for Neuroscience of Communication (IMPRS NeuroCom)
Project name : -
Grant ID : -
Funding program : -
Funding organization : Projekt DEAL

Source 1

Title: Attention, Perception & Psychophysics
Abbreviation: Atten Percept Psychophys
Source Genre: Journal
Creator(s): -
Affiliations: -
Publ. Info: Psychonomic Society
Pages: -
Volume / Issue: 83 (4)
Sequence Number: -
Start / End Page: 1538 - 1551
Identifier: ISSN: 1943-3921
CoNE: https://pure.mpg.de/cone/journals/resource/1943-3921