Released

Journal Article

A multi-scale investigation of the human communication system's response to visual disruption

MPS-Authors
/persons/resource/persons197919

Trujillo,  James P.
Communication in Social Interaction, Radboud University Nijmegen, External Organizations;
Other Research, MPI for Psycholinguistics, Max Planck Society;
Donders Institute for Brain, Cognition and Behaviour, External Organizations;

/persons/resource/persons116

Levinson,  Stephen C.
Emeriti, MPI for Psycholinguistics, Max Planck Society;
Language and Cognition Department, MPI for Psycholinguistics, Max Planck Society;

/persons/resource/persons4512

Holler,  Judith
Communication in Social Interaction, Radboud University Nijmegen, External Organizations;
Other Research, MPI for Psycholinguistics, Max Planck Society;
Donders Institute for Brain, Cognition and Behaviour, External Organizations;

Citation

Trujillo, J. P., Levinson, S. C., & Holler, J. (2022). A multi-scale investigation of the human communication system's response to visual disruption. Royal Society Open Science, 9(4): 211489. doi:10.1098/rsos.211489.


Cite as: https://hdl.handle.net/21.11116/0000-000A-50DD-3
Abstract
In human communication, when speech is disrupted, the visual channel (e.g. manual gestures) can compensate to ensure successful communication. Whether speech also compensates when the visual channel is disrupted is an open question, and one that significantly bears on the status of the gestural modality. We test whether gesture and speech are dynamically co-adapted to meet communicative needs. To this end, we parametrically reduce visibility during casual conversational interaction and measure the effects on speakers' communicative behaviour using motion tracking and manual annotation for kinematic and acoustic analyses. We found that visual signalling effort was flexibly adapted in response to a decrease in visual quality (especially motion energy, gesture rate, size, velocity and hold-time). Interestingly, speech was also affected: speech intensity increased in response to reduced visual quality (particularly in speech-gesture utterances, but independently of kinematics). Our findings highlight that multi-modal communicative behaviours are flexibly adapted at multiple scales of measurement, and question the notion that gesture plays an inferior role to speech.