  Multisensory integration of musical emotion perception in singing

Lange, E. B., Fünderich, J., & Grimm, H. (2022). Multisensory integration of musical emotion perception in singing. Psychological Research, 86, 2099-2114. doi:10.1007/s00426-021-01637-9.

Files

mus-22-lan-01-multisensory.pdf (Copyright transfer agreement), 2MB
Name: mus-22-lan-01-multisensory.pdf
Description: OA
OA-Status: Hybrid
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: 2022
Copyright Info: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Creators

Creators:
Lange, Elke B. (1), Author
Fünderich, Jens (1, 2), Author
Grimm, Hartmut (1), Author
Affiliations:
(1) Department of Music, Max Planck Institute for Empirical Aesthetics, Max Planck Society, ou_2421696
(2) University of Erfurt, Erfurt, Germany, ou_persistent22

Content

Free keywords: -
Abstract: We investigated how visual and auditory information contributes to emotion communication during singing. Classically trained singers applied two different facial expressions (expressive/suppressed) to pieces from their song and opera repertoire. Recordings of the singers were evaluated by laypersons or experts, presented to them in three different modes: auditory, visual, and audio–visual. A manipulation check confirmed that the singers succeeded in manipulating the face while keeping the sound highly expressive. Analyses focused on whether the visual difference or the auditory concordance between the two versions determined perception of the audio–visual stimuli. When evaluating expressive intensity or emotional content, a clear effect of visual dominance emerged. Experts made more use of the visual cues than laypersons. Consistency measures between uni-modal and multimodal presentations did not explain the visual dominance. The evaluation of seriousness was applied as a control. The uni-modal stimuli were rated as expected, but multisensory evaluations converged without visual dominance. Our study demonstrates that long-term knowledge and task context affect multisensory integration. Even though singers' orofacial movements are dominated by sound production, their facial expressions can communicate emotions composed into the music, and observers do not rely on audio information instead. Studies such as ours are important for understanding multisensory integration in applied settings.

Details

Language(s): eng - English
Dates: 2021-04-15, 2021-12-16, 2022-01-10
 Publication Status: Published online
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: DOI: 10.1007/s00426-021-01637-9
 Degree: -

Source 1

Title: Psychological Research
Source Genre: Journal
Creator(s): -
Affiliations: -
Publ. Info: Berlin : Springer-Verlag
Pages: -
Volume / Issue: 86
Sequence Number: -
Start / End Page: 2099 - 2114
Identifier: ISSN: 0340-0727
CoNE: https://pure.mpg.de/cone/journals/resource/954925518603_1