  What iconic gesture fragments reveal about gesture-speech integration: When synchrony is lost, memory can help

Obermeier, C., Holle, H., & Gunter, T. C. (2011). What iconic gesture fragments reveal about gesture-speech integration: When synchrony is lost, memory can help. Journal of Cognitive Neuroscience, 23(7), 1648-1663. doi:10.1162/jocn.2010.21498.

Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0010-EA87-9 Version Permalink: http://hdl.handle.net/11858/00-001M-0000-002B-FF95-5
Genre: Journal Article

Creators:
Obermeier, Christian1, Author
Holle, Henning2, Author
Gunter, Thomas C.1, Author
Affiliations:
1Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society, ou_634551
2University of Sussex, Brighton, United Kingdom, ou_persistent22

Content

Abstract: The present series of experiments explores several issues related to gesture–speech integration and synchrony during sentence processing. To manipulate gesture–speech synchrony more precisely, we used gesture fragments instead of complete gestures, thereby avoiding the usual long temporal overlap of gestures with their coexpressive speech. In a pretest, the minimal duration of an iconic gesture fragment needed to disambiguate a homonym (i.e., the disambiguation point) was identified. In three subsequent ERP experiments, we then investigated whether the gesture information available at the disambiguation point has immediate as well as delayed consequences for the processing of a temporarily ambiguous spoken sentence, and whether these gesture–speech integration processes are susceptible to temporal synchrony. Experiment 1, which used asynchronous stimuli as well as an explicit task, showed clear N400 effects at the homonym as well as at the target word presented further downstream, suggesting that asynchrony does not prevent integration under explicit task conditions. No such effects were found when asynchronous stimuli were presented with a shallower task (Experiment 2). Finally, when gesture fragment and homonym were synchronous, results similar to those of Experiment 1 were found, even under shallow task conditions (Experiment 3). We conclude that when iconic gesture fragments and speech are in synchrony, their interaction is more or less automatic. When they are not, more controlled, active memory processes are necessary to combine the gesture fragment and speech context in such a way that the homonym is disambiguated correctly.

Details

Language(s): eng - English
Dates: 2011-05-10, 2011-07
 Publication Status: Published in print
Identifiers: eDoc: 512157
DOI: 10.1162/jocn.2010.21498
Other: P11557


Source 1

Title: Journal of Cognitive Neuroscience
Source Genre: Journal
Publ. Info: Cambridge, MA : MIT Press Journals
Volume / Issue: 23 (7)
Start / End Page: 1648 - 1663
Identifier: ISSN: 0898-929X
CoNE: /journals/resource/991042752752726