  Hearing and seeing meaning in speech and gesture: Insights from brain and behaviour

Ozyurek, A. (2014). Hearing and seeing meaning in speech and gesture: Insights from brain and behaviour. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 369(1651): 20130296. doi:10.1098/rstb.2013.0296.


Files

ozyurek_2014.pdf (Publisher version), 803KB
Name: ozyurek_2014.pdf
Description: -
OA status: -
Visibility: Public
MIME type / checksum: application/pdf / [MD5]
Technical metadata: -
Copyright date: -
Copyright info: -
License: -


Creators

Creator:
Ozyurek, Asli 1, 2, Author
Affiliations:
1 Center for Language Studies, External organization, ou_55238
2 Research Associates, MPI for Psycholinguistics, Max Planck Society, Wundtlaan 1, 6525 XD Nijmegen, NL, ou_2344700

Content

Keywords: -
Abstract: As we speak, we use not only the arbitrary form–meaning mappings of the speech channel but also motivated form–meaning correspondences, i.e. iconic gestures that accompany speech (e.g. an inverted V-shaped hand wiggling across gesture space to demonstrate walking). This article reviews what we know about the processing of semantic information from speech and iconic gestures in spoken languages during comprehension of such composite utterances. Several studies have shown that comprehension of iconic gestures involves brain activations known to be involved in semantic processing of speech: i.e. modulation of the electrophysiological component N400, which is sensitive to the ease of semantic integration of a word into previous context, and recruitment of the left-lateralized frontal–posterior temporal network (left inferior frontal gyrus (IFG), middle temporal gyrus (MTG) and superior temporal gyrus/sulcus (STG/S)). Furthermore, we integrate the information coming from both channels, recruiting brain areas such as left IFG, posterior superior temporal sulcus (STS)/MTG and even motor cortex. Finally, this integration is flexible: the temporal synchrony between the iconic gesture and the speech segment, as well as the perceived communicative intent of the speaker, modulate the integration process. Whether these findings are specific to gestures or are shared with actions, other visual accompaniments to speech (e.g. lips) or other visual symbols such as pictures is discussed, as well as the implications for a multimodal view of language.

Details

Language(s): eng - English
Date: 2014
Publication status: Published online
Pages: -
Place, publisher, edition: -
Table of contents: -
Review method: Peer review
Identifiers: DOI: 10.1098/rstb.2013.0296
Degree: -


Source 1

Title: Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences
Source genre: Journal
Creator(s): -
Affiliations: -
Place, publisher, edition: London : Royal Society
Pages: -
Volume / Issue: 369 (1651)
Article number: 20130296
Start / End page: -
Identifier: ISSN: 0962-8436
CoNE: https://pure.mpg.de/cone/journals/resource/963017382021_1