  Lower beta: A central coordinator of temporal prediction in multimodal speech

Biau, E., & Kotz, S. A. (2018). Lower beta: A central coordinator of temporal prediction in multimodal speech. Frontiers in Human Neuroscience, 12: 434. doi:10.3389/fnhum.2018.00434.

Basic

Item Permalink: http://hdl.handle.net/21.11116/0000-0002-9740-B
Version Permalink: http://hdl.handle.net/21.11116/0000-0003-8ADA-C
Genre: Journal Article

Files

Name: Biau_2018.pdf (Publisher version), 2MB
Description: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: -


Creators

Creators:
Biau, Emmanuel 1, Author
Kotz, Sonja A. 1, 2, Author
Affiliations:
1 Basic and Applied NeuroDynamics Lab, Department of Neuropsychology and Psychopharmacology, Maastricht University, the Netherlands
2 Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society

Content

Free keywords: Temporal predictions; Beta oscillations; Multimodal speech perception; Prosody; Biological motion
Abstract: How the brain decomposes and integrates information in multimodal speech perception is linked to oscillatory dynamics. However, how speech takes advantage of redundancy between different sensory modalities, and how this translates into specific oscillatory patterns, remains unclear. We address the role of lower beta activity (~20 Hz), generally associated with motor functions, as an amodal central coordinator that receives bottom-up delta-theta copies from specific sensory areas and generates top-down temporal predictions for auditory entrainment. Dissociating temporal prediction from entrainment may explain how and why visual input benefits speech processing rather than adding cognitive load in multimodal speech perception. On the one hand, body movements convey prosodic and syllabic features at delta and theta rates (i.e., 1–3 Hz and 4–7 Hz). On the other hand, the natural precedence of visual input before auditory onsets may prepare the brain to anticipate and facilitate the integration of auditory delta-theta copies of the prosodic-syllabic structure. Here, we identify three fundamental criteria, based on recent evidence and hypotheses, that support the notion that lower motor beta frequency may play a central and generic role in temporal prediction during speech perception. First, beta activity must respond to rhythmic stimulation across modalities. Second, beta power must respond to biological motion and speech-related movements conveying temporal information in multimodal speech processing. Third, temporal prediction may recruit a communication loop between motor and primary auditory cortices (PACs) via delta-to-beta cross-frequency coupling. We discuss evidence related to each criterion and extend these concepts to a beta-motivated framework of multimodal speech processing.

Details

Language(s): eng - English
Dates: 2018-05-08, 2018-10-03, 2018-10-24
 Publication Status: Published online
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Method: Peer
 Identifiers: DOI: 10.3389/fnhum.2018.00434
PMID: 30405383
PMC: PMC6207805
Other: eCollection 2018
 Degree: -


Project information

Project name : -
Grant ID : 707727
Funding program : Horizon 2020
Funding organization : European Union (EU)

Source 1

Title: Frontiers in Human Neuroscience
Abbreviation: Front Hum Neurosci
Source Genre: Journal
Publ. Info: Lausanne, Switzerland : Frontiers Research Foundation
Pages: -
Volume / Issue: 12
Sequence Number: 434
Start / End Page: -
Identifier: ISSN: 1662-5161
CoNE: https://pure.mpg.de/cone/journals/resource/1662-5161