  Affect differentially modulates brain activation in uni- and multisensory body-voice perception

Jessen, S., & Kotz, S. A. (2015). Affect differentially modulates brain activation in uni- and multisensory body-voice perception. Neuropsychologia, 66, 134-143. doi:10.1016/j.neuropsychologia.2014.10.038.


Creators:
Jessen, Sarah (1), Author
Kotz, Sonja A. (2, 3), Author
Affiliations:
(1) Max Planck Research Group Early Social Development, MPI for Human Cognitive and Brain Sciences, Max Planck Society
(2) Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society
(3) School of Psychological Sciences, University of Manchester, United Kingdom

Content

Free keywords: Audiovisual; Crossmodal prediction; Emotion; fMRI; Voice; Body
 Abstract: Emotion perception naturally entails multisensory integration. It is also assumed that multisensory emotion perception is characterized by enhanced activation of brain areas implicated in multisensory integration, such as the superior temporal gyrus and sulcus (STG/STS). However, most previous studies have employed designs and stimuli that preclude other forms of multisensory interaction, such as crossmodal prediction, leaving open the question of whether classical integration is the only relevant process in multisensory emotion perception. Here, we used video clips containing emotional and neutral body and vocal expressions to investigate the role of crossmodal prediction in multisensory emotion perception.

While emotional multisensory expressions increased activation in the bilateral fusiform gyrus (FFG), neutral expressions compared to emotional ones enhanced activation in the bilateral middle temporal gyrus (MTG) and posterior STS. Hence, while neutral stimuli activate classical multisensory areas, emotional stimuli invoke areas linked to unisensory visual processing. Emotional stimuli may therefore trigger a prediction of upcoming auditory information based on prior visual information. Such prediction may be stronger for highly salient emotional than for less salient neutral information. We therefore suggest that multisensory emotion perception involves at least two distinct mechanisms: classical multisensory integration, as shown for neutral expressions, and crossmodal prediction, as evident for emotional expressions.

Details

Language(s): eng - English
 Dates: 2014-07-21, 2014-10-30, 2014-11-04, 2015-01
 Publication Status: Issued
 Rev. Type: Peer
 Identifiers: DOI: 10.1016/j.neuropsychologia.2014.10.038
PMID: 25445782
Other: Epub 2014


Source 1

Title: Neuropsychologia
Source Genre: Journal
Publ. Info: Oxford : Pergamon
Pages: 10
Volume / Issue: 66
Start / End Page: 134 - 143
Identifier: ISSN: 0028-3932
CoNE: https://pure.mpg.de/cone/journals/resource/954925428258