  Attention drives visual processing and audiovisual integration during multimodal communication

Seijdel, N., Schoffelen, J.-M., Hagoort, P., & Drijvers, L. (2024). Attention drives visual processing and audiovisual integration during multimodal communication. The Journal of Neuroscience, 44(10): e0870232023. doi:10.1523/JNEUROSCI.0870-23.2023.


Files

Name: Seijdel_etal_2024_attention drives visual processing ....pdf (Publisher version, 3MB)
Description: -
OA-Status: Hybrid
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: 2024
Copyright Info: -
License: -

Locators

Locator: link to preprint (Supplementary material)
Description: -
OA-Status: Not specified

Creators

Creators:
Seijdel, Noor (1, 2), Author
Schoffelen, Jan-Mathijs (3), Author
Hagoort, Peter (2, 3), Author
Drijvers, Linda (1, 2, 3), Author
Affiliations:
1 The Communicative Brain, MPI for Psycholinguistics, Max Planck Society, Wundtlaan 1, 6525 XD Nijmegen, NL, ou_3275695
2 Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_792551
3 Donders Institute for Brain, Cognition and Behaviour, External Organizations, ou_55236

Content

Free keywords: -
 Abstract: During communication in real-life settings, our brain often needs to integrate auditory and visual information, and at the same time actively focus on the relevant sources of information, while ignoring interference from irrelevant events. The interaction between integration and attention processes remains poorly understood. Here, we use rapid invisible frequency tagging (RIFT) and magnetoencephalography (MEG) to investigate how attention affects auditory and visual information processing and integration, during multimodal communication. We presented human participants (male and female) with videos of an actress uttering action verbs (auditory; tagged at 58 Hz) accompanied by two movie clips of hand gestures on both sides of fixation (attended stimulus tagged at 65 Hz; unattended stimulus tagged at 63 Hz). Integration difficulty was manipulated by a lower-order auditory factor (clear/degraded speech) and a higher-order visual semantic factor (matching/mismatching gesture). We observed an enhanced neural response to the attended visual information during degraded speech compared to clear speech. For the unattended information, the neural response to mismatching gestures was enhanced compared to matching gestures. Furthermore, signal power at the intermodulation frequencies of the frequency tags, indexing non-linear signal interactions, was enhanced in left frontotemporal and frontal regions. Focusing on LIFG (Left Inferior Frontal Gyrus), this enhancement was specific for the attended information, for those trials that benefitted from integration with a matching gesture. Together, our results suggest that attention modulates audiovisual processing and interaction, depending on the congruence and quality of the sensory input.
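
The core readout described in the abstract — spectral power at the stimulus tag frequencies (58, 63, 65 Hz) and at their intermodulation frequencies as an index of non-linear audiovisual interaction — can be sketched in a few lines of Python. The sketch below is purely illustrative: the simulated signal, sampling rate, noise level, and the simple multiplicative interaction term are assumptions for demonstration, not the authors' MEG/RIFT analysis pipeline.

import numpy as np

# Two inputs tagged at 58 Hz (auditory) and 65 Hz (attended visual),
# plus a multiplicative interaction that creates power at the
# intermodulation frequencies. Sampling rate, duration, noise and the
# interaction strength are illustrative assumptions.
fs = 1000                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)   # 10 s of simulated signal
f_audio, f_visual = 58, 65     # tag frequencies taken from the abstract

audio = np.sin(2 * np.pi * f_audio * t)
visual = np.sin(2 * np.pi * f_visual * t)

# A non-linear (here: multiplicative) interaction between the tagged inputs
# produces spectral peaks at f_visual - f_audio and f_visual + f_audio.
signal = audio + visual + 0.2 * audio * visual + 0.5 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

def power_at(f_hz):
    # Power in the FFT bin closest to f_hz.
    return spectrum[np.argmin(np.abs(freqs - f_hz))]

for label, f_hz in [("auditory tag", f_audio),
                    ("attended visual tag", f_visual),
                    ("intermodulation f_v - f_a", f_visual - f_audio),
                    ("intermodulation f_v + f_a", f_visual + f_audio)]:
    print(f"{label:28s} {f_hz:3d} Hz  power = {power_at(f_hz):.1f}")

Running the sketch shows clear peaks at 58 and 65 Hz and smaller peaks at 7 and 123 Hz, which is the same logic used in the study to index audiovisual integration, there applied to source-reconstructed MEG signals rather than a simulated time series.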

Details

Language(s): eng - English
Dates: 2024-01-10, 2024
Publication Status: Issued
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: Peer
Identifiers: DOI: 10.1523/JNEUROSCI.0870-23.2023
Degree: -

Source 1

Title: The Journal of Neuroscience
Source Genre: Journal
Creator(s): -
Affiliations: -
Publ. Info: Washington, DC : Society for Neuroscience
Pages: -
Volume / Issue: 44 (10)
Sequence Number: e0870232023
Start / End Page: -
Identifier: ISSN: 0270-6474
CoNE: https://pure.mpg.de/cone/journals/resource/954925502187_1