  Multimodal encoding of motion events in speech, gesture, and cognition

Ünal, E., Mamus, E., & Özyürek, A. (2023). Multimodal encoding of motion events in speech, gesture, and cognition. Language and Cognition. Advance online publication. doi:10.1017/langcog.2023.61.


Files

Unal_Mamus_Ozyurek_2023_multimodal encodiing of....pdf (Publisher version), 217KB
Name: Unal_Mamus_Ozyurek_2023_multimodal encodiing of....pdf
Description: -
OA-Status: Hybrid
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: 2023
Copyright Info: © The Author(s), 2023. Published by Cambridge University Press. This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.


Creators

Creators:
Ünal, Ercenur (1), Author
Mamus, Ezgi (2, 3, 4), Author
Özyürek, Asli (2, 4, 5), Author
Affiliations:
1. Ozyegin University, Istanbul, Türkiye, ou_persistent22
2. Multimodal Language Department, MPI for Psycholinguistics, Max Planck Society, ou_3398547
3. International Max Planck Research School for Language Sciences, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_1119545
4. Center for Language Studies, External Organizations, ou_55238
5. Donders Institute for Brain, Cognition and Behaviour, External Organizations, ou_55236

Content

Free keywords: -
 Abstract: How people communicate about motion events and how this is shaped by language typology are mostly studied with a focus on linguistic encoding in speech. Yet, human communication typically involves an interactional exchange of multimodal signals, such as hand gestures that have different affordances for representing event components. Here, we review recent empirical evidence on multimodal encoding of motion in speech and gesture to gain a deeper understanding of whether and how language typology shapes linguistic expressions in different modalities, and how this changes across different sensory modalities of input and interacts with other aspects of cognition. Empirical evidence strongly suggests that Talmy’s typology of event integration predicts multimodal event descriptions in speech and gesture and visual attention to event components prior to producing these descriptions. Furthermore, variability within the event itself, such as type and modality of stimuli, may override the influence of language typology, especially for expression of manner.

Details

Language(s): eng - English
Dates: 2023, 2023-12-15
 Publication Status: Published online
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: DOI: 10.1017/langcog.2023.61
 Degree: -

Source 1

Title: Language and Cognition. Advance online publication
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: Cambridge : Cambridge University Press
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: -
Identifier: Other: ISSN
CoNE: https://pure.mpg.de/cone/journals/resource/1866-9808