  How bodies and voices interact in early emotion perception

Jessen, S., Obleser, J., & Kotz, S. A. (2012). How bodies and voices interact in early emotion perception. PLoS One, 7(4): e36070. doi:10.1371/journal.pone.0036070.

Files:
Jessen_2012_Bodies.pdf (Publisher version), 571 KB
Name: Jessen_2012_Bodies.pdf
Description: -
OA-Status: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: -

Creators:
Jessen, Sarah (1, 2), Author
Obleser, Jonas (3), Author
Kotz, Sonja A. (1), Author
Affiliations:
1. Minerva Research Group Neurocognition of Rhythm in Communication, MPI for Human Cognitive and Brain Sciences, Max Planck Society (ou_634560)
2. Cluster Languages of Emotion, FU Berlin, Germany (ou_persistent22)
3. Max Planck Research Group Auditory Cognition, MPI for Human Cognitive and Brain Sciences, Max Planck Society (ou_751545)

Content:
Free keywords: -
 Abstract: Successful social communication draws strongly on the correct interpretation of others' body and vocal expressions. Both can provide emotional information and often occur simultaneously, yet their interplay has hardly been studied. Using electroencephalography, we investigated the temporal development underlying their neural interaction in auditory and visual perception. In particular, we tested whether this interaction qualifies as true integration following multisensory integration principles such as inverse effectiveness. Emotional vocalizations were embedded in either low or high levels of noise and presented with or without video clips of matching emotional body expressions. In both high and low noise conditions, a reduction in auditory N100 amplitude was observed for audiovisual stimuli. However, only under high noise did the N100 peak earlier in the audiovisual than in the auditory condition, suggesting facilitatory effects as predicted by the inverse effectiveness principle. Similarly, we observed earlier N100 peaks in response to emotional compared to neutral audiovisual stimuli. This was not the case in the unimodal auditory condition. Furthermore, suppression of beta-band oscillations (15-25 Hz), primarily reflecting biological motion perception, was modulated 200-400 ms after the vocalization. While for emotional stimuli the suppression difference between audiovisual and audio-only stimuli was larger under high than under low noise, no such difference was observed for neutral stimuli. This observation is in accordance with the inverse effectiveness principle and suggests a modulation of integration by emotional content. Overall, the results show that ecologically valid, complex stimuli such as combined body and vocal expressions are effectively integrated very early in processing.

Details:
Language(s): eng - English
 Dates: 2011-11-22 / 2012-03-24 / 2012-04-30
 Publication Status: Published online
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: DOI: 10.1371/journal.pone.0036070
PMID: 22558332
PMC: PMC3340409
Other: Epub 2012
 Degree: -

Source 1:
Title: PLoS One
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: San Francisco, CA : Public Library of Science
Pages: -
Volume / Issue: 7 (4)
Sequence Number: e36070
Start / End Page: -
Identifier: ISSN: 1932-6203
CoNE: https://pure.mpg.de/cone/journals/resource/1000000000277850