  Visual speech speeds up the neural processing of auditory speech

van Wassenhove, V., Grant, K. W., & Poeppel, D. (2005). Visual speech speeds up the neural processing of auditory speech. Proceedings of the National Academy of Sciences of the United States of America, 102(4), 1181-1186. doi:10.1073/pnas.0408949102.



Creators

Creators:
van Wassenhove, Virginie (1), Author
Grant, Ken W., Author
Poeppel, David (1), Author
Affiliations:
(1) University of Maryland, College Park, USA

Content

Free keywords: EEG, multisensory, predictive coding, multisensory convergence, crossmodal binding, audiovisual speech, association cortex, macaque monkey, integration, information, stimuli, neurons, brain, Multidisciplinary Sciences
 Abstract: Synchronous presentation of stimuli to the auditory and visual systems can modify the formation of a percept in either modality. For example, perception of auditory speech is improved when the speaker's facial articulatory movements are visible. Neural convergence onto multisensory sites exhibiting supra-additivity has been proposed as the principal mechanism for integration. Recent findings, however, have suggested that putative sensory-specific cortices are responsive to inputs presented through a different modality. Consequently, when and where audiovisual representations emerge remain unsettled. In combined psychophysical and electroencephalography experiments we show that visual speech speeds up the cortical processing of auditory signals early (within 100 ms of signal onset). The auditory-visual interaction is reflected as an articulator-specific temporal facilitation (as well as a nonspecific amplitude reduction). The latency facilitation systematically depends on the degree to which the visual signal predicts possible auditory targets. The observed auditory-visual data support the view that there exist abstract internal representations that constrain the analysis of subsequent speech inputs. This is evidence for the existence of an "analysis-by-synthesis" mechanism in auditory-visual speech perception.

Details

Language(s): eng - English
Dates: 2005-01-12, 2005-01-25
 Publication Status: Issued
Identifiers: WOS: 000226617900041
DOI: 10.1073/pnas.0408949102


Source 1

Title: Proceedings of the National Academy of Sciences of the United States of America
Abbreviation: Proc. Natl. Acad. Sci. USA
Source Genre: Journal
Publ. Info: Washington, DC : National Academy of Sciences
Volume / Issue: 102 (4)
Start / End Page: 1181 - 1186
Identifier: ISSN: 0027-8424
CoNE: https://pure.mpg.de/cone/journals/resource/954925427230