

Journal Article

Many morphs: Parsing gesture signals from the noise

MPS-Authors

Piel, Alex K. (/persons/resource/persons294264)
Department of Human Origins, Max Planck Institute for Evolutionary Anthropology, Max Planck Society

Stewart, Fiona (/persons/resource/persons294266)
Department of Human Origins, Max Planck Institute for Evolutionary Anthropology, Max Planck Society

Supplementary Material (public)

Mielke_Many_BehavResMeth_2024.pdf (Supplementary material), 863 KB

Citation

Mielke, A., Badihi, G., Graham, K. E., Grund, C., Hashimoto, C., Piel, A. K., et al. (2024). Many morphs: Parsing gesture signals from the noise. Behavior Research Methods. doi:10.3758/s13428-024-02368-6.


Cite as: https://hdl.handle.net/21.11116/0000-000F-1AC0-B
Abstract
Parsing signals from noise is a general problem for signallers and recipients, and for researchers studying communicative systems. Substantial efforts have been invested in comparing how other species encode information and meaning, and how signalling is structured. However, research depends on identifying and discriminating signals that represent meaningful units of analysis. Early approaches to defining signal repertoires were top-down, classifying cases into predefined signal types. Recently, more labour-intensive methods have taken a bottom-up approach, describing detailed features of each signal and clustering cases based on previously undetectable patterns of similarity in multi-dimensional feature space. Nevertheless, it remains essential to assess whether the resulting repertoires are composed of relevant units from the perspective of the species using them, and to redefine repertoires when additional data become available. In this paper, we provide a framework that takes data from the largest set of wild chimpanzee (Pan troglodytes) gestures currently available, splits gesture types at a fine scale based on modifying features of gesture expression using latent class analysis (a model-based cluster-detection algorithm for categorical variables), and then determines whether this splitting process reduces uncertainty about the goal or community of the gesture. Our method allows different features of interest to be incorporated into the splitting process, providing substantial future flexibility across, for example, species, populations, and levels of signal granularity. In doing so, we provide a powerful tool that allows researchers interested in gestural communication to establish repertoires of relevant units for subsequent analyses within and between systems of communication.
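
For readers who want to experiment with the general approach, the sketch below illustrates the two ideas named in the abstract: latent class analysis over categorical modifier features to split one gesture type into candidate morphs, and an entropy-based check of whether that split reduces uncertainty about the signaller's goal. This is a minimal, self-contained Python illustration, not the authors' code or analysis pipeline; the feature names, goal labels, data, and the use of conditional entropy as the uncertainty measure are all assumptions made for demonstration.

```python
# Minimal sketch of the two ideas named in the abstract:
#  (1) latent class analysis (EM over categorical modifier features) to split
#      one gesture type into finer candidate "morphs", and
#  (2) a conditional-entropy check of whether the split reduces uncertainty
#      about the signaller's goal.
# All feature names, goals, and data below are invented for illustration.

import numpy as np

def lca_em(X, k, n_iter=200, seed=0):
    """Fit a k-class latent class model to categorical data X by EM.

    X: (n_cases, n_features) array of integer-coded categories.
    Returns the (n_cases, k) matrix of class responsibilities.
    """
    rng = np.random.default_rng(seed)
    n_cases, n_feats = X.shape
    n_levels = X.max(axis=0) + 1
    pi = np.full(k, 1.0 / k)                                  # class weights
    theta = [rng.dirichlet(np.ones(n_levels[j]), size=k)     # per-class category probs
             for j in range(n_feats)]
    for _ in range(n_iter):
        # E-step: log p(case, class) = log pi + sum_j log theta_j[class, x_j]
        log_resp = np.tile(np.log(pi), (n_cases, 1))
        for j in range(n_feats):
            log_resp += np.log(theta[j][:, X[:, j]]).T
        log_resp -= log_resp.max(axis=1, keepdims=True)
        resp = np.exp(log_resp)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate class weights and category probabilities
        pi = resp.mean(axis=0)
        for j in range(n_feats):
            counts = np.stack([resp[X[:, j] == lvl].sum(axis=0)
                               for lvl in range(n_levels[j])], axis=1) + 1e-6
            theta[j] = counts / counts.sum(axis=1, keepdims=True)
    return resp

def entropy_bits(labels):
    """Shannon entropy (bits) of a discrete label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Toy data: 300 cases of one gesture type, three categorical modifier features
# (e.g. body part used, laterality, repetition), plus a hypothetical goal label.
rng = np.random.default_rng(1)
X = np.column_stack([rng.integers(0, 3, 300),
                     rng.integers(0, 2, 300),
                     rng.integers(0, 2, 300)])
goals = (X[:, 0] == 2).astype(int)                            # goal loosely tied to body part
goals = np.where(rng.random(300) < 0.2, 1 - goals, goals)     # observation noise

resp = lca_em(X, k=2)
morph = resp.argmax(axis=1)                                   # hard-assign each case to a morph

h_goal = entropy_bits(goals)
h_goal_given_morph = sum((morph == m).mean() * entropy_bits(goals[morph == m])
                         for m in np.unique(morph))
print(f"H(goal) = {h_goal:.3f} bits, H(goal | morph) = {h_goal_given_morph:.3f} bits")
```

In practice, the number of latent classes would typically be chosen by comparing model fit (for example via information criteria such as BIC) across candidate values of k, and the uncertainty comparison would be repeated for each gesture type in the repertoire; the paper and its supplementary material describe the authors' actual procedure.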