Record

 
 
  EMOKINE: A software package and computational framework for scaling up the creation of highly controlled emotional full-body movement datasets

Christensen, J. F., Fernández, A., Smith, R. A., Michalareas, G., Yazdi, S. H. N., Farahi, F., et al. (2024). EMOKINE: A software package and computational framework for scaling up the creation of highly controlled emotional full-body movement datasets. Behavior Research Methods. doi:10.3758/s13428-024-02433-0.

Basic data

Genre: Journal article

Files

s13428-024-02433-0.pdf (publisher version), 8 MB
Name:
s13428-024-02433-0.pdf
Description:
OA
OA status:
Hybrid
Visibility:
Public
MIME type / checksum:
application/pdf / [MD5]
Technical metadata:
Copyright date:
2024
Copyright Info:
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. Open Access funding enabled and organized by Projekt DEAL. International Max Planck Research School for Intelligent Systems (IMPRS-IS), Max-Planck-Gesellschaft, Economic and Social Research Council.

Creators

Creators:
Christensen, Julia F. (1), Author
Fernández, Andrés (2, 3), Author
Smith, Rebecca A. (4), Author
Michalareas, Giorgos (1), Author
Yazdi, Sina H. N. (5, 6), Author
Farahi, Fahima (6, 7), Author
Schmidt, Eva-Madeleine (8), Author
Roig, Gemma (9, 10), Author
Bahmanian, Nasimeh (11), Author
Affiliations:
(1) Department of Cognitive Neuropsychology, Max Planck Institute for Empirical Aesthetics, Max Planck Society, ou_3351901
(2) Methods of Machine Learning, University of Tübingen, Tübingen, Germany, ou_persistent22
(3) International Max Planck Research School for Intelligent Systems, Tübingen, Germany, ou_persistent22
(4) Department of Psychology, University of Glasgow, Glasgow, Scotland, ou_persistent22
(5) Max Planck School of Cognition, Leipzig, Germany, ou_persistent22
(6) WiseWorld.AI, Porto, Portugal, ou_persistent22
(7) Department of Language and Literature, Max Planck Institute for Empirical Aesthetics, Max Planck Society, Grüneburgweg 14, 60322 Frankfurt am Main, DE, ou_2421695
(8) Department of Modern Languages, Goethe University, Frankfurt/M, Germany, ou_persistent22
(9) Computer Science Department, Goethe University, Frankfurt/M, Germany, ou_persistent22
(10) The Hessian Center for Artificial Intelligence (hessian.AI), Darmstadt, Germany, ou_persistent22
(11) Center for Humans and Machines, Max Planck Institute for Human Development, Berlin, Germany, ou_persistent22

Content

Keywords: Emotion, Motion capture, Computer vision, Affective neuroscience, Aesthetics, Dance, Dataset, Open science
Abstract: EMOKINE is a software package and dataset creation suite for emotional full-body movement research in experimental psychology, affective neuroscience, and computer vision. A computational framework, comprehensive instructions, a pilot dataset, observer ratings, and kinematic feature extraction code are provided to facilitate future dataset creation at scale. In addition, the EMOKINE framework outlines how complex sequences of movements may advance emotion research. Traditionally, such research has often used emotional 'action'-based stimuli, such as hand-waving or walking motions. Here, instead, a pilot dataset is provided with short dance choreographies, repeated several times by a dancer who expressed a different emotional intention at each repetition: anger, contentment, fear, joy, neutrality, and sadness. The dataset was filmed professionally and simultaneously recorded with XSENS® motion-capture technology (17 sensors, 240 frames/second). Thirty-two statistics from 12 kinematic features were extracted offline, for the first time in a single dataset: speed, acceleration, angular speed, angular acceleration, limb contraction, distance to center of mass, quantity of motion, dimensionless jerk (integral), head angle (with regard to the vertical axis and to the back), and space (convex hull 2D and 3D). Average, median absolute deviation (MAD), and maximum value were computed as applicable. The EMOKINE software is applicable to other motion-capture systems and is openly available on the Zenodo Repository. Releases on GitHub include (i) the code to extract the 32 statistics, (ii) a Python rigging plugin for converting MVNX files (the output format of the XSENS® system) to Blender format, and (iii) custom Python-based software to assist with blurring faces; the latter two are released under GPLv3 licenses.
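As an illustration of how the average, median absolute deviation (MAD), and maximum of a kinematic feature such as speed can be computed from motion-capture data, the following minimal Python sketch operates on a generic joint-position array sampled at 240 frames/second. It is not the EMOKINE code itself; the array layout, the function name speed_stats, and the synthetic demo data are assumptions made purely for illustration.

import numpy as np

FPS = 240  # the EMOKINE pilot dataset was recorded at 240 frames/second

def speed_stats(positions, fps=FPS):
    """Return average, MAD, and maximum of whole-body speed.

    positions: hypothetical array of shape (n_frames, n_joints, 3) with
    joint positions in metres, e.g. parsed from an MVNX export.
    """
    velocity = np.diff(positions, axis=0) * fps      # per-joint velocity, m/s
    joint_speed = np.linalg.norm(velocity, axis=-1)  # per-joint speed per frame
    speed = joint_speed.mean(axis=1)                 # whole-body speed per frame

    average = speed.mean()
    mad = np.median(np.abs(speed - np.median(speed)))
    maximum = speed.max()
    return average, mad, maximum

if __name__ == "__main__":
    # Synthetic stand-in for real data: 2 seconds, 17 sensors, 3D positions
    rng = np.random.default_rng(0)
    demo = np.cumsum(rng.normal(scale=1e-3, size=(2 * FPS, 17, 3)), axis=0)
    print(speed_stats(demo))

The released EMOKINE code on Zenodo and GitHub covers all 12 features and parses MVNX files directly; this sketch only illustrates the statistical reduction step for a single feature.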

Details

Language(s): eng - English
Date: 2024-04-15, 2024-06-25
Publication status: Published online
Pages: -
Place, publisher, edition: -
Table of contents: -
Review method: Peer review
Identifiers: DOI: 10.3758/s13428-024-02433-0
Degree: -

Source 1

Title: Behavior Research Methods
Source genre: Journal
Creators:
Affiliations:
Place, publisher, edition: Austin, TX : Psychonomic Society
Pages: -
Volume / issue: -
Article number: -
Start / end page: -
Identifier: ISSN: 1554-3528
CoNE: https://pure.mpg.de/cone/journals/resource/1554-3528