  On-line integration of semantic information from speech and gesture: Insights from event-related brain potentials

Ozyurek, A., Willems, R. M., Kita, S., & Hagoort, P. (2007). On-line integration of semantic information from speech and gesture: Insights from event-related brain potentials. Journal of Cognitive Neuroscience, 19(4), 605-616. doi:10.1162/jocn.2007.19.4.605.

Files

ozyurek_2007_on-line.pdf (Publisher version), 223KB
File Permalink: -
Name: ozyurek_2007_on-line.pdf
Description: -
OA-Status: -
Visibility: Public
MIME-Type / Checksum: application/pdf
Technical Metadata:
Copyright Date: -
Copyright Info: eDoc_access: USER
License: -

Creators

Creators:
Ozyurek, Asli (1, 2, 3), Author
Willems, Roel M. (4), Author
Kita, Sotaro (1), Author
Hagoort, Peter (1, 2, 4), Author
Affiliations:
1. Language in our Hands: Sign and Gesture, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_789545
2. Neurobiology of Language Group, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_102880
3. Language in Action, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_55214
4. FC Donders Centre for Cognitive Neuroimaging, external, ou_55235

Content

Free keywords: -
Abstract: During language comprehension, listeners use the global semantic representation from the previous sentence or discourse context to immediately integrate the meaning of each upcoming word into the unfolding message-level representation. Here we investigate whether communicative gestures that often spontaneously co-occur with speech are processed in a similar fashion and integrated with the previous sentence context in the same way as lexical meaning. Event-related potentials were measured while subjects listened to spoken sentences containing a critical verb (e.g., knock) accompanied by an iconic co-speech gesture (i.e., KNOCK). The verbal and/or gestural semantic content matched or mismatched the content of the preceding part of the sentence. Despite the differences in modality and in the specificity of meaning conveyed by spoken words and gestures, the latency, amplitude, and topographical distribution of the word and gesture mismatch effects were found to be similar, indicating that the brain integrates both types of information simultaneously. This provides evidence for the claim that neural processing in language comprehension involves the simultaneous incorporation of information from a broader domain of cognition than verbal semantics alone. The neural evidence for similar integration of information from speech and gesture underscores the tight interconnection between speech and co-speech gestures.

Details

Language(s): eng - English
Dates: 2007
Publication Status: Issued
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: Peer
Identifiers: eDoc: 320764
DOI: 10.1162/jocn.2007.19.4.605
Degree: -

Source 1

Title: Journal of Cognitive Neuroscience
Source Genre: Journal
Creator(s): -
Affiliations: -
Publ. Info: -
Pages: -
Volume / Issue: 19 (4)
Sequence Number: -
Start / End Page: 605 - 616
Identifier: -