The quantification of gesture–speech synchrony: A tutorial and validation of multimodal data acquisition using device-based and video-based motion tracking

Pouw, W., Trujillo, J. P., & Dixon, J. A. (2019). The quantification of gesture–speech synchrony: A tutorial and validation of multimodal data acquisition using device-based and video-based motion tracking. Behavior Research Methods. Advance online publication. doi:10.3758/s13428-019-01271-9.


Files

Name: Pouw2019_Article_TheQuantificationOfGestureSpee.pdf (Publisher version), 3 MB
Visibility: Public
MIME-Type: application/pdf
Copyright Date: 2019
Copyright Info:
© The Author(s) 2019 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Creators

Pouw, Wim (1, 2), Author
Trujillo, James P. (3, 4), Author
Dixon, James A. (1), Author
Affiliations:
1 University of Connecticut, Storrs, CT, USA
2 Erasmus University Rotterdam, Rotterdam, The Netherlands
3 Donders Institute for Brain, Cognition and Behaviour, External Organizations
4 Center for Language Studies, External Organizations

Content

Free keywords: -
 Abstract: There is increasing evidence that hand gestures and speech synchronize their activity on multiple dimensions and timescales. For example, gesture’s kinematic peaks (e.g., maximum speed) are coupled with prosodic markers in speech. Such coupling operates on very short timescales at the level of syllables (200 ms), and therefore requires high-resolution measurement of gesture kinematics and speech acoustics. High-resolution speech analysis is common for gesture studies, given that field’s classic ties with (psycho)linguistics. However, the field has lagged behind in the objective study of gesture kinematics (e.g., as compared to research on instrumental action). Often kinematic peaks in gesture are measured by eye, where a “moment of maximum effort” is determined by several raters. In the present article, we provide a tutorial on more efficient methods to quantify the temporal properties of gesture kinematics, in which we focus on common challenges and possible solutions that come with the complexities of studying multimodal language. We further introduce and compare, using an actual gesture dataset (392 gesture events), the performance of two video-based motion-tracking methods (deep learning vs. pixel change) against a high-performance wired motion-tracking system (Polhemus Liberty). We show that the videography methods perform well in the temporal estimation of kinematic peaks, and thus provide a cheap alternative to expensive motion-tracking systems. We hope that the present article incites gesture researchers to embark on the widespread objective study of gesture kinematics and their relation to speech.
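The pixel-change idea mentioned in the abstract, using gross frame-to-frame image change as a cheap proxy for movement and locating its temporal peak, can be sketched in a few lines. The following is an illustrative sketch only (hypothetical function names, synthetic data, not the authors' released pipeline), assuming video frames are available as a NumPy array:

```python
import numpy as np

def pixel_change_series(frames):
    """Summed absolute pixel difference between consecutive frames,
    a coarse proxy for quantity of movement in video-based tracking."""
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))          # (n_frames-1, H, W)
    return diffs.reshape(len(diffs), -1).sum(axis=1)  # one value per frame pair

def smooth(x, window=5):
    """Moving-average smoothing to suppress frame-level noise before peak-picking."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def peak_time(series, fps):
    """Time (in seconds) at which the movement series is maximal."""
    return int(np.argmax(series)) / fps

# Synthetic example: an 8x8 'video' whose pixel change peaks around frame 30.
rng = np.random.default_rng(0)
n_frames, fps = 60, 30
frames = np.zeros((n_frames, 8, 8))
for t in range(n_frames):
    amplitude = np.exp(-((t - 30) ** 2) / 50)  # movement envelope peaking at frame 30
    frames[t] = amplitude * rng.random((8, 8))

series = pixel_change_series(frames)
estimated_peak = peak_time(smooth(series), fps)  # close to 1.0 s (frame 30 at 30 fps)
```

On real recordings one would first convert frames to grayscale and restrict the analysis to a region of interest; the paper validates temporal peak estimates from such video-based methods against a wired Polhemus Liberty motion-tracking system.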

Details

Language(s): eng - English
Dates: 2019-10-28
Publication Status: Published online
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: Peer
Identifiers: DOI: 10.3758/s13428-019-01271-9
Degree: -

Source 1

Title: Behavior Research Methods. Advance online publication
Source Genre: Journal
Publ. Info: Austin, TX : Psychonomic Society
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: -
Identifier: ISSN: 1554-3528
CoNE: https://pure.mpg.de/cone/journals/resource/1554-3528