  The role of synchrony and ambiguity in speech–gesture integration during comprehension

Habets, B., Kita, S., Shao, Z., Ozyurek, A., & Hagoort, P. (2011). The role of synchrony and ambiguity in speech–gesture integration during comprehension. Journal of Cognitive Neuroscience, 23, 1845-1854. doi:10.1162/jocn.2010.21462.

Files

Habets_2011_The Role of Synchrony and Ambiguity_JOCN.pdf (Publisher version), 232KB
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]

Creators

Creators:
Habets, Boukje (1), Author
Kita, Sotaro (2), Author
Shao, Zeshu (2, 3), Author
Ozyurek, Asli (4, 5, 6, 7), Author
Hagoort, Peter (5, 6, 8), Author
Affiliations:
1. University of Hamburg, Germany
2. University of Birmingham, United Kingdom
3. Individual Differences in Language Processing Department, MPI for Psycholinguistics, Max Planck Society
4. Language in our Hands: Sign and Gesture, MPI for Psycholinguistics, Max Planck Society
5. Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society
6. Radboud University Nijmegen
7. Multimodal Language and Cognition, Radboud University Nijmegen
8. Donders Institute for Brain, Cognition and Behaviour

Content

Abstract: During face-to-face communication, one not only hears speech but also sees a speaker's communicative hand movements. It has been shown that such hand gestures play an important role in communication, with the two modalities influencing each other's interpretation. A gesture typically overlaps temporally with coexpressive speech, but the gesture is often initiated before (not after) the coexpressive speech. The present ERP study investigated what degree of asynchrony between speech and gesture onsets is optimal for semantic integration of concurrent gesture and speech. Videos of a person gesturing were combined with speech segments that were either semantically congruent or incongruent with the gesture. Although gesture and speech always overlapped in time, they were presented at three degrees of asynchrony: in the SOA 0 condition, gesture onset and speech onset were simultaneous; in the SOA 160 and SOA 360 conditions, speech was delayed by 160 and 360 msec, respectively. ERPs time-locked to speech onset showed a significant difference between semantically congruent and incongruent gesture–speech combinations on the N400 for the SOA 0 and 160 conditions; no significant difference was found for the SOA 360 condition. These results imply that speech and gesture are integrated most efficiently when the difference in onsets does not exceed a certain time span, because iconic gestures need speech to be disambiguated in a way relevant to the speech context.

Details

Language(s): eng - English
Dates: 2011
Publication Status: Issued
Rev. Type: Peer
Identifiers: DOI: 10.1162/jocn.2010.21462

Source 1

Title: Journal of Cognitive Neuroscience
Source Genre: Journal
Publ. Info: Cambridge, MA : MIT Press Journals
Volume / Issue: 23
Start / End Page: 1845 - 1854
Identifier: ISSN: 0898-929X
CoNE: https://pure.mpg.de/cone/journals/resource/991042752752726