
Released

Journal Article

Early visual and auditory processing rely on modality-specific attentional resources

MPS-Authors

Maess, Burkhard
Methods and Development Unit MEG and EEG: Signal Analysis and Modelling, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

Citation

Keitel, C., Maess, B., Schröger, E., & Müller, M. M. (2013). Early visual and auditory processing rely on modality-specific attentional resources. NeuroImage, 70, 240-249. doi:10.1016/j.neuroimage.2012.12.046.


Cite as: https://hdl.handle.net/11858/00-001M-0000-000E-F40F-F
Abstract
Many everyday situations require focusing on visual or auditory information while ignoring the other modality. Previous findings suggest an attentional mechanism that operates between sensory modalities and governs such states. To date, evidence is equivocal as to whether this ‘intermodal’ attention relies on a distribution of resources either common or specific to sensory modalities. We provide new insights by investigating consequences of a shift from simultaneous (‘bimodal’) attention to vision and audition to unimodal selective attention. Concurrently presented visual and auditory stimulus streams were frequency-tagged to elicit steady-state responses (SSRs) recorded simultaneously in electro- and magnetoencephalograms (EEG/MEG). After the shift, decreased amplitudes of the SSR corresponding to the unattended sensory stream indicated reduced processing. We did not observe an amplitude increase of the SSR corresponding to the attended sensory stream. These findings are incompatible with a common-resources account. A redistribution of attentional resources between vision and audition would result in simultaneous processing gain in the attended sensory modality and reduction in the unattended sensory modality. Our results favor a modality-specific-resources account, which allows for independent modulation of early cortical processing in each sensory modality.