  Multisensory integration: The case of a time window of gesture-speech integration

Obermeier, C., & Gunter, T. C. (2015). Multisensory integration: The case of a time window of gesture-speech integration. Journal of Cognitive Neuroscience, 27(2), 292-307. doi:10.1162/jocn_a_00688.

Basic
Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0014-7868-F
Version Permalink: http://hdl.handle.net/21.11116/0000-0003-796F-A
Genre: Journal Article


Creators

Creators:
Obermeier, Christian (1), Author
Gunter, Thomas C. (1), Author
Affiliations:
(1) Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society

Content

Free keywords: Gestures; Time window of integration
Abstract: This experiment investigates the integration of gesture and speech from a multisensory perspective. In a disambiguation paradigm, participants were presented with short videos of an actress uttering sentences like “She was impressed by the BALL, because the GAME/DANCE….” The ambiguous noun (BALL) was accompanied by an iconic gesture fragment containing information to disambiguate the noun toward its dominant or subordinate meaning. We used four different temporal alignments between noun and gesture fragment: the identification point (IP) of the noun was either prior to (+120 msec), synchronous with (0 msec), or lagging behind the end of the gesture fragment (−200 and −600 msec). ERPs triggered to the IP of the noun showed significant differences for the integration of dominant and subordinate gesture fragments in the −200, 0, and +120 msec conditions. The outcome of this integration was revealed at the target words. These data suggest a time window for direct semantic gesture–speech integration ranging from at least −200 up to +120 msec. Although the −600 msec condition did not show any signs of direct integration at the homonym, significant disambiguation was found at the target word. An explorative analysis suggested that gesture information was directly integrated at the verb, indicating that there are multiple positions in a sentence where direct gesture–speech integration takes place. Ultimately, this would imply that in natural communication, where a gesture lasts for some time, several aspects of that gesture will have their specific and possibly distinct impact on different positions in an utterance.

Details

Language(s): eng - English
Dates: 2013-07-07, 2014-07-25, 2015-02
 Publication Status: Published in print
 Rev. Method: Peer
 Identifiers: DOI: 10.1162/jocn_a_00688
PMID: 25061929


Source 1

Title: Journal of Cognitive Neuroscience
Source Genre: Journal
Publ. Info: Cambridge, MA : MIT Press Journals
Volume / Issue: 27 (2)
Start / End Page: 292 - 307
Identifier: ISSN: 0898-929X
CoNE: /journals/resource/991042752752726