  Hearing and seeing meaning in speech and gesture: Insights from brain and behaviour

Ozyurek, A. (2014). Hearing and seeing meaning in speech and gesture: Insights from brain and behaviour. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 369(1651): 20130296. doi:10.1098/rstb.2013.0296.

Files
ozyurek_2014.pdf (Publisher version), 803KB
Name: ozyurek_2014.pdf
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]


Creators
Creators:
Ozyurek, Asli (Author) 1, 2
Affiliations:
1 Center for Language Studies, External organization, ou_55238
2 Research Associates, MPI for Psycholinguistics, Max Planck Society, Wundtlaan 1, 6525 XD Nijmegen, NL, ou_2344700

Content
Abstract: As we speak, we use not only the arbitrary form–meaning mappings of the speech channel but also motivated form–meaning correspondences, i.e. iconic gestures that accompany speech (e.g. an inverted V-shaped hand wiggling across gesture space to demonstrate walking). This article reviews what we know about the processing of semantic information from speech and iconic gestures in spoken languages during comprehension of such composite utterances. Several studies have shown that comprehension of iconic gestures involves brain activations known to be involved in semantic processing of speech: i.e. modulation of the electrophysiological recording component N400, which is sensitive to the ease of semantic integration of a word into the previous context, and recruitment of the left-lateralized frontal–posterior temporal network (left inferior frontal gyrus (IFG), middle temporal gyrus (MTG) and superior temporal gyrus/sulcus (STG/S)). Furthermore, we integrate the information coming from both channels, recruiting brain areas such as left IFG, posterior superior temporal sulcus (STS)/MTG and even motor cortex. Finally, this integration is flexible: the temporal synchrony between the iconic gesture and the speech segment, as well as the perceived communicative intent of the speaker, modulates the integration process. Whether these findings are special to gestures or are shared with actions or other visual accompaniments to speech (e.g. lips) or other visual symbols such as pictures is discussed, as well as the implications for a multimodal view of language.

Details
Language(s): eng - English
Dates: 2014
Publication Status: Published online
Rev. Type: Peer
Identifiers: DOI: 10.1098/rstb.2013.0296


Source 1
Title: Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences
Source Genre: Journal
Publ. Info: London : Royal Society
Volume / Issue: 369 (1651)
Sequence Number: 20130296
Identifier: ISSN: 0962-8436
CoNE: https://pure.mpg.de/cone/journals/resource/963017382021_1