
Released

Paper

Multiscale kinematic analysis reveals structural properties of change in evolving manual languages in the lab

MPS-Authors

Ozyurek, Asli
Donders Institute for Brain, Cognition and Behaviour, External Organizations;
Center for Language Studies, External Organizations;
Multimodal Language and Cognition, Radboud University Nijmegen, External Organizations;
Research Associates, MPI for Psycholinguistics, Max Planck Society;

External Resource

Open data
(Supplementary material)

Citation

Pouw, W., Dingemanse, M., Motamedi, Y., & Ozyurek, A. (2020). Multiscale kinematic analysis reveals structural properties of change in evolving manual languages in the lab. OSF Preprints. doi:10.31219/osf.io/heu24.


Cite as: http://hdl.handle.net/21.11116/0000-0006-B338-1
Abstract
Reverse engineering how language emerged is a daunting interdisciplinary project. Experimental cognitive science has contributed to this effort by eliciting in the lab constraints likely to play a role in language emergence, such as iterated transmission of communicative tokens between agents. Since such constraints played out over long phylogenetic time and involved vast populations, a crucial challenge for iterated language learning paradigms is to extend their limits. In the current approach we perform a multiscale quantification of kinematic changes in an evolving silent gesture system. Silent gestures consist of complex multi-articulatory movements that have so far proven elusive to quantify in a structured and reproducible way; they are primarily studied through human coders meticulously interpreting the referential content of gestures. Here we reanalyzed video data from a silent gesture iterated learning experiment (Motamedi et al., 2019), which originally showed increases in the systematicity of gestural form over language transmissions. We applied a signal-based approach, first utilizing computer vision techniques to quantify kinematics from video data. We then performed a multiscale kinematic analysis showing that, over generations of language users, silent gestures became more efficient and less complex in their kinematics. We further detected systematicity in the interrelations of communicative tokens, which proved to be a proxy for the systematicity obtained via human observation data. Thus we demonstrate the potential of a signal-based approach to language evolution in complex multi-articulatory gestures.
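To make the signal-based idea concrete, the sketch below shows one way kinematic descriptors could be derived from an already-extracted keypoint trajectory (e.g., wrist positions obtained with a pose-estimation tool from the video frames). This is a minimal illustration, not the paper's actual pipeline; the function name, the choice of descriptors (speed and a jerk-based smoothness index), and the toy trajectory are all assumptions for demonstration.

```python
import numpy as np

def kinematic_features(positions, fps=25.0):
    """Compute simple kinematic descriptors from a 2D keypoint trajectory.

    positions: array of shape (T, 2) with pixel coordinates of one tracked
    articulator (e.g., a wrist) across T video frames.
    Returns the per-frame speed profile plus two scalar summaries:
    mean speed and a jerk-based smoothness index (lower = smoother).
    This is an illustrative stand-in, not the method used in the paper.
    """
    positions = np.asarray(positions, dtype=float)
    dt = 1.0 / fps
    velocity = np.gradient(positions, dt, axis=0)   # (T, 2), px/s
    speed = np.linalg.norm(velocity, axis=1)        # (T,),   px/s
    accel = np.gradient(velocity, dt, axis=0)       # (T, 2), px/s^2
    jerk = np.gradient(accel, dt, axis=0)           # (T, 2), px/s^3
    smoothness = float(np.mean(np.linalg.norm(jerk, axis=1)))
    return speed, float(speed.mean()), smoothness

# Toy trajectory: a straight-line movement at constant velocity,
# so the jerk-based smoothness index should be (near) zero.
t = np.arange(50)
traj = np.stack([t * 2.0, np.zeros_like(t, dtype=float)], axis=1)
speed, mean_speed, smoothness = kinematic_features(traj, fps=25.0)
```

Descriptors like these could then be compared across transmission generations to test, on the raw signal, whether gestures become kinematically simpler and more efficient over time.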