  The Variably Intense Vocalizations of Affect and Emotion (VIVAE) Corpus prompts new perspective on nonspeech perception

Holz, N., Larrouy-Maestri, P., & Poeppel, D. (2022). The Variably Intense Vocalizations of Affect and Emotion (VIVAE) Corpus prompts new perspective on nonspeech perception. Emotion, 22(1), 213-225. doi:10.1037/emo0001048.

Creators

Creators:
Holz, Natalie (1), Author
Larrouy-Maestri, Pauline (1, 2), Author
Poeppel, David (1, 2, 3), Author
Affiliations:
1 Department of Neuroscience, Max Planck Institute for Empirical Aesthetics, Max Planck Society
2 Center for Language, Music, and Emotion (CLaME), New York, New York, United States
3 Ernst Struengmann Institute for Neuroscience, Frankfurt, Germany

Content

Free keywords: voice, nonverbal vocalizations, emotion perception, emotion intensity, database
Abstract: The human voice is a potent source of information to signal emotion. Nonspeech vocalizations (e.g., laughter, crying, moans, or screams), in particular, can elicit compelling affective experiences. Consensus exists that the emotional intensity of such expressions matters; however, how intensity affects such signals and their perception remains controversial and poorly understood. One reason is the lack of appropriate data sets. We have developed a comprehensive stimulus set of nonverbal vocalizations, the first corpus to represent emotion intensity from one extreme to the other, in order to resolve the empirically underdetermined basis of emotion intensity. The full set, comprising 1085 stimuli, features eleven speakers expressing three positive (achievement/triumph, sexual pleasure, surprise) and three negative (anger, fear, physical pain) affective states, each varying from low to peak emotion intensity. The smaller core set of 480 files represents a fully crossed subsample (6 emotions × 4 intensities × 10 speakers × 2 items) selected based on judged authenticity. Perceptual validation and acoustic characterization of the stimuli are provided; the expressed emotional intensity, like expressed emotion, is reflected in listener evaluations and in the signal properties of nonverbal vocalizations. These carefully curated new materials can help disambiguate foundational questions on the communication of affect and emotion in the psychological and neural sciences and strengthen our theoretical understanding of this domain of emotional experience.
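
The fully crossed composition of the core set (6 emotions × 4 intensities × 10 speakers × 2 items = 480 files) can be enumerated directly. The Python sketch below is illustrative only: the factor counts come from the abstract, while the intensity labels and the speaker/item numbering are assumptions and do not reflect the corpus's actual file naming.

from itertools import product

# Factor levels of the fully crossed VIVAE core set described in the abstract.
# Only the counts (6 x 4 x 10 x 2 = 480) are taken from the record; the
# intensity labels and speaker/item indices below are illustrative assumptions.
emotions = ["achievement/triumph", "sexual pleasure", "surprise",
            "anger", "fear", "physical pain"]
intensities = ["low", "moderate", "strong", "peak"]  # assumed labels for "low to peak"
speakers = range(1, 11)  # 10 of the 11 recorded speakers appear in the core set
items = range(1, 3)      # 2 items per emotion-intensity-speaker cell

core_set = [
    {"emotion": e, "intensity": i, "speaker": s, "item": n}
    for e, i, s, n in product(emotions, intensities, speakers, items)
]

assert len(core_set) == 6 * 4 * 10 * 2 == 480  # matches the reported core-set size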

Details

Language(s): eng - English
Dates: 2021-07-21, 2020-12-30, 2021-09-08, 2022-02-01
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: DOI: 10.1037/emo0001048
 Degree: -

Source 1

Title: Emotion
Source Genre: Journal
Publ. Info: Washington, DC : American Psychological Association
Pages: -
Volume / Issue: 22 (1)
Sequence Number: -
Start / End Page: 213 - 225
Identifier: ISSN: 1528-3542
CoNE: https://pure.mpg.de/cone/journals/resource/1528-3542