  Audiovisual asynchrony detection in human speech

Maier, J., Di Luca, M., & Noppeney, U. (2011). Audiovisual asynchrony detection in human speech. Journal of Experimental Psychology: Human Perception and Performance, 37(1), 245-256. doi:10.1037/a0019952.


Creators

Creators:
Maier, J. X. (1, 2), Author
Di Luca, M. (1, 2, 3), Author
Noppeney, U. (2, 4), Author
Affiliations:
1. Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
2. Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794
3. Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497806
4. Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497804

Content

Free keywords: -
Abstract: Combining information from the visual and auditory senses can greatly enhance the intelligibility of natural speech. Integration of audiovisual speech signals is robust even when temporal offsets are present between the component signals. In the present study, we characterized the temporal integration window for speech and nonspeech stimuli with similar spectrotemporal structure to investigate to what extent humans have adapted to the specific characteristics of natural audiovisual speech. We manipulated the spectrotemporal structure of the auditory signal, the stimulus length, and the task context. Results indicate that the temporal integration window is narrower and more asymmetric for speech than for nonspeech signals. When perceiving audiovisual speech, subjects tolerate visual-leading asynchronies, but are nevertheless very sensitive to auditory-leading asynchronies, which are less likely to occur in natural speech. Thus, speech perception may be fine-tuned to the natural statistics of audiovisual speech, in which facial movements always precede acoustic speech articulation.

Details

Language(s): -
Dates: 2011-02
Publication Status: Issued
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: DOI: 10.1037/a0019952
BibTex Citekey: 6313
Degree: -


Source 1

Title: Journal of Experimental Psychology: Human Perception and Performance
Source Genre: Journal
Publ. Info: Washington : American Psychological Association (PsycARTICLES)
Pages: -
Volume / Issue: 37 (1)
Sequence Number: -
Start / End Page: 245 - 256
Identifier: ISSN: 0096-1523
CoNE: https://pure.mpg.de/cone/journals/resource/954927546243