  Left motor delta oscillations reflect asynchrony detection in multisensory speech perception

Biau, E., Schultz, B. G., Gunter, T. C., & Kotz, S. A. (2022). Left motor delta oscillations reflect asynchrony detection in multisensory speech perception. The Journal of Neuroscience, 42(11), 2313-2326. doi:10.1523/JNEUROSCI.2965-20.2022.


Files

Biau_2022.pdf (Publisher version), 2MB
Name: Biau_2022.pdf
Description: -
OA-Status: Green
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -

Creators

Creators:
Biau, Emmanuel (1, 2), Author
Schultz, Benjamin G. (2), Author
Gunter, Thomas C. (3), Author
Kotz, Sonja A. (2, 3), Author
Affiliations:
1. Department of Psychological Sciences, University of Liverpool, United Kingdom, ou_persistent22
2. Basic and Applied NeuroDynamics Lab, Department of Neuropsychology and Psychopharmacology, Maastricht University, the Netherlands, ou_persistent22
3. Department of Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society, ou_634551

Content

Free keywords: Audio-visual asynchrony; Delta oscillations; Motor cortex; Multisensory speech; Prosody
Abstract: During multisensory speech perception, slow δ oscillations (∼1-3 Hz) in the listener's brain synchronize with the speech signal, likely engaging in speech signal decomposition. Notable fluctuations in the speech amplitude envelope, reflecting speaker prosody, temporally align with articulatory and body gestures, and both provide complementary sensations that temporally structure speech. Further, δ oscillations in the left motor cortex appear to align with speech and musical beats, suggesting a possible role in the temporal structuring of (quasi-)rhythmic stimulation. We extended the role of δ oscillations to audiovisual asynchrony detection as a test case of the temporal analysis of multisensory prosodic fluctuations in speech. We recorded electroencephalography (EEG) responses during an audiovisual asynchrony detection task while participants watched videos of a speaker. We filtered the speech signal to remove verbal content and examined how visual and auditory prosodic features temporally (mis-)align. Results confirm that (1) participants accurately detected audiovisual asynchrony; (2) δ power increased in the left motor cortex in response to audiovisual asynchrony, and the difference in δ power between asynchronous and synchronous conditions predicted behavioral performance; and (3) δ-β coupling in the left motor cortex decreased when listeners could not accurately map visual and auditory prosodies. Finally, both behavioral and neurophysiological evidence was altered when the speaker's face was degraded by a visual mask. Together, these findings suggest that motor δ oscillations support asynchrony detection of multisensory prosodic fluctuations in speech.
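Note: to make the reported measures concrete, below is a minimal, illustrative sketch of how δ-band (1-3 Hz) power and a simple δ-β phase-amplitude coupling index could be computed from a single EEG channel. This is not the authors' analysis pipeline; the sampling rate, the β band limits, the synthetic signal, and the mean-vector-length coupling metric are assumptions for illustration only.

    # Illustrative sketch only (assumptions noted above); not the published pipeline.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    fs = 500.0                                  # assumed EEG sampling rate (Hz)
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal(int(60 * fs))     # placeholder for one EEG channel (60 s)

    def bandpass(x, lo, hi, fs, order=4):
        # Zero-phase Butterworth band-pass (second-order sections for numerical stability).
        sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, x)

    delta = bandpass(eeg, 1.0, 3.0, fs)         # delta band (~1-3 Hz), as in the abstract
    beta = bandpass(eeg, 15.0, 25.0, fs)        # beta band (illustrative range)

    delta_power = np.mean(delta ** 2)           # mean delta-band power

    # Delta-beta coupling as a normalized mean vector length: beta amplitude
    # weighted by delta phase, collapsed into one complex mean (Canolty-style index).
    delta_phase = np.angle(hilbert(delta))
    beta_amp = np.abs(hilbert(beta))
    coupling = np.abs(np.mean(beta_amp * np.exp(1j * delta_phase))) / np.mean(beta_amp)

    print(f"delta power: {delta_power:.4f}  delta-beta coupling: {coupling:.4f}")

In a real analysis, eeg would be a preprocessed motor-cortex channel or source time course, and such measures would be contrasted between synchronous and asynchronous trials.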

Details

Language(s): eng - English
Dates: 2022-01-12, 2020-11-25, 2022-01-14, 2022-01-27, 2022-03-16
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: DOI: 10.1523/JNEUROSCI.2965-20.2022
Other: epub 2022
PMID: 35086905
PMC: PMC8936602
 Degree: -


Project information

Project name: -
Grant ID: 707727
Funding program: Horizon 2020
Funding organization: European Union

Source 1

Title: The Journal of Neuroscience
Other: The Journal of Neuroscience: the Official Journal of the Society for Neuroscience
Abbreviation: J. Neurosci.
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: Washington, DC: Society for Neuroscience
Pages: -
Volume / Issue: 42 (11)
Sequence Number: -
Start / End Page: 2313 - 2326
Identifier: ISSN: 0270-6474
CoNE: https://pure.mpg.de/cone/journals/resource/954925502187_1