
Released

Journal Article

Attention drives visual processing and audiovisual integration during multimodal communication

MPS-Authors

Seijdel, Noor
The Communicative Brain, MPI for Psycholinguistics, Max Planck Society;
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society;


Hagoort, Peter
Donders Institute for Brain, Cognition and Behaviour, External Organizations;
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society;


Drijvers, Linda
The Communicative Brain, MPI for Psycholinguistics, Max Planck Society;
Donders Institute for Brain, Cognition and Behaviour, External Organizations;
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society;

External Resource

link to preprint (supplementary material)

Citation

Seijdel, N., Schoffelen, J.-M., Hagoort, P., & Drijvers, L. (2024). Attention drives visual processing and audiovisual integration during multimodal communication. The Journal of Neuroscience, 44(10): e0870232023. doi:10.1523/JNEUROSCI.0870-23.2023.


Cite as: https://hdl.handle.net/21.11116/0000-000D-4672-4
Abstract
During communication in real-life settings, our brain often needs to integrate auditory and visual information and, at the same time, actively focus on the relevant sources of information while ignoring interference from irrelevant events. The interaction between integration and attention processes remains poorly understood. Here, we use rapid invisible frequency tagging (RIFT) and magnetoencephalography (MEG) to investigate how attention affects auditory and visual information processing and integration during multimodal communication. We presented human participants (male and female) with videos of an actress uttering action verbs (auditory; tagged at 58 Hz) accompanied by two movie clips of hand gestures on both sides of fixation (attended stimulus tagged at 65 Hz; unattended stimulus tagged at 63 Hz). Integration difficulty was manipulated by a lower-order auditory factor (clear/degraded speech) and a higher-order visual semantic factor (matching/mismatching gesture). We observed an enhanced neural response to the attended visual information during degraded speech compared to clear speech. For the unattended information, the neural response to mismatching gestures was enhanced compared to matching gestures. Furthermore, signal power at the intermodulation frequencies of the frequency tags, indexing non-linear signal interactions, was enhanced in left frontotemporal and frontal regions. In the left inferior frontal gyrus (LIFG), this enhancement was specific to the attended information, for those trials that benefited from integration with a matching gesture. Together, our results suggest that attention modulates audiovisual processing and interaction depending on the congruence and quality of the sensory input.
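
As a minimal illustration of why intermodulation frequencies index non-linear signal interactions (the tag frequencies are taken from the design above; the purely multiplicative interaction is a textbook simplification, not the authors' neural model), consider two tagged inputs combined multiplicatively. The product identity

\[
\sin(2\pi f_1 t)\,\sin(2\pi f_2 t) = \tfrac{1}{2}\cos\bigl(2\pi(f_1 - f_2)\,t\bigr) - \tfrac{1}{2}\cos\bigl(2\pi(f_1 + f_2)\,t\bigr)
\]

shows that the output contains power at the sum and difference of the input frequencies, components that are absent from any purely linear mixture of the two signals. For the auditory tag \(f_2 = 58\) Hz and the attended visual tag \(f_1 = 65\) Hz, intermodulation components are thus expected at \(f_1 - f_2 = 7\) Hz and \(f_1 + f_2 = 123\) Hz (and, for the unattended tag at 63 Hz, at 5 Hz and 121 Hz). Because power at these frequencies can only arise where the two inputs interact non-linearly, it serves as an index of audiovisual integration in the MEG signal.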