
Released

Journal Article

How bodies and voices interact in early emotion perception

MPS-Authors

Jessen, Sarah
Minerva Research Group Neurocognition of Rhythm in Communication, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
Cluster Languages of Emotion, FU Berlin, Germany;


Obleser, Jonas
Max Planck Research Group Auditory Cognition, MPI for Human Cognitive and Brain Sciences, Max Planck Society;


Kotz, Sonja A.
Minerva Research Group Neurocognition of Rhythm in Communication, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

Fulltext (public)

Jessen_2012_Bodies.pdf
(Publisher version), 571KB

Citation

Jessen, S., Obleser, J., & Kotz, S. A. (2012). How bodies and voices interact in early emotion perception. PLoS ONE, 7(4): e36070. doi:10.1371/journal.pone.0036070


Cite as: http://hdl.handle.net/11858/00-001M-0000-000F-A1C7-D
Abstract
Successful social communication draws strongly on the correct interpretation of others' body and vocal expressions. Both can provide emotional information and often occur simultaneously, yet their interplay has hardly been studied. Using electroencephalography, we investigated the temporal development underlying their neural interaction in auditory and visual perception. In particular, we tested whether this interaction qualifies as true integration following multisensory integration principles such as inverse effectiveness. Emotional vocalizations were embedded in either low or high levels of noise and presented with or without video clips of matching emotional body expressions. In both high and low noise conditions, a reduction in auditory N100 amplitude was observed for audiovisual stimuli. However, only under high noise did the N100 peak earlier in the audiovisual than in the auditory condition, suggesting facilitatory effects as predicted by the inverse effectiveness principle. Similarly, we observed earlier N100 peaks in response to emotional compared to neutral audiovisual stimuli; this was not the case in the unimodal auditory condition. Furthermore, suppression of beta-band oscillations (15–25 Hz), primarily reflecting biological motion perception, was modulated 200–400 ms after the vocalization. While larger differences in suppression between audiovisual and auditory stimuli under high compared to low noise levels were found for emotional stimuli, no such difference was observed for neutral stimuli. This observation is in accordance with the inverse effectiveness principle and suggests a modulation of integration by emotional content. Overall, the results show that ecologically valid, complex stimuli such as joined body and vocal expressions are effectively integrated very early in processing.