

Journal Article

Left motor delta oscillations reflect asynchrony detection in multisensory speech perception

MPS-Authors

Gunter,  Thomas C.
Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;


Kotz,  Sonja A.
Basic and Applied NeuroDynamics Lab, Department of Neuropsychology and Psychopharmacology, Maastricht University, the Netherlands;
Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

Citation

Biau, E., Schultz, B. G., Gunter, T. C., & Kotz, S. A. (2022). Left motor delta oscillations reflect asynchrony detection in multisensory speech perception. The Journal of Neuroscience, 42(11), 2313-2326. doi:10.1523/JNEUROSCI.2965-20.2022.


Cite as: http://hdl.handle.net/21.11116/0000-000A-4B90-F
Abstract
During multisensory speech perception, slow δ oscillations (∼1–3 Hz) in the listener's brain synchronize with the speech signal, likely engaging in speech signal decomposition. Notable fluctuations in the speech amplitude envelope, reflecting speaker prosody, temporally align with articulatory and body gestures, and both provide complementary sensory information that temporally structures speech. Further, δ oscillations in the left motor cortex appear to align with speech and musical beats, suggesting a possible role in the temporal structuring of (quasi-)rhythmic stimulation. We extended the role of δ oscillations to audiovisual asynchrony detection as a test case of the temporal analysis of multisensory prosodic fluctuations in speech. We recorded electroencephalography (EEG) responses in an audiovisual asynchrony detection task while participants watched videos of a speaker. We filtered the speech signal to remove verbal content and examined how visual and auditory prosodic features temporally (mis-)align. Results confirmed that (1) participants accurately detected audiovisual asynchrony; (2) δ power in the left motor cortex increased in response to audiovisual asynchrony, and the difference in δ power between asynchronous and synchronous conditions predicted behavioral performance; and (3) δ–β coupling in the left motor cortex decreased when listeners could not accurately map visual and auditory prosodies. Finally, both behavioral and neurophysiological effects were altered when the speaker's face was degraded by a visual mask. Together, these findings suggest that motor δ oscillations support asynchrony detection of multisensory prosodic fluctuations in speech.