  On the role of crossmodal prediction in audiovisual emotion perception

Jessen, S., & Kotz, S. A. (2013). On the role of crossmodal prediction in audiovisual emotion perception. Frontiers in Human Neuroscience, 7: 369. doi:10.3389/fnhum.2013.00369.

Basic

Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0013-FA46-9
Version Permalink: http://hdl.handle.net/21.11116/0000-0003-ACF0-C
Genre: Journal Article

Files

Jessen_OnTheRole.pdf (Publisher version), 2MB
Name:
Jessen_OnTheRole.pdf
Description:
-
Visibility:
Public
MIME-Type / Checksum:
application/pdf / [MD5]
Technical Metadata:
Copyright Date:
-
Copyright Info:
-
License:
-

Creators

 Creators:
Jessen, Sarah (1), Author
Kotz, Sonja A. (2, 3), Author
Affiliations:
(1) Max Planck Research Group Early Social Development, MPI for Human Cognitive and Brain Sciences, Max Planck Society, Leipzig, DE, ou_1356545
(2) Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society, ou_634551
(3) School of Psychological Sciences, University of Manchester, United Kingdom, ou_persistent22

Content

Free keywords: Cross-modal prediction; Emotion; Multisensory; EEG; Audiovisual
Abstract: Humans rely on multiple sensory modalities to determine the emotional state of others. In fact, such multisensory perception may be one of the mechanisms explaining the ease and efficiency with which others' emotions are recognized. But how and when exactly do the different modalities interact? One aspect of multisensory perception that has received increasing interest in recent years is the concept of cross-modal prediction. In emotion perception, as in most other settings, visual information precedes auditory information, and this visual lead can facilitate subsequent auditory processing. While this mechanism has often been described in audiovisual speech perception, it has not yet been addressed in audiovisual emotion perception. Based on the current state of the art in (a) cross-modal prediction and (b) multisensory emotion perception research, we propose that it is essential to consider the former in order to fully understand the latter. Focusing on electroencephalographic (EEG) and magnetoencephalographic (MEG) studies, we provide a brief overview of the current research in both fields. In discussing these findings, we suggest that emotional visual information may allow more reliable prediction of auditory information than non-emotional visual information. In support of this hypothesis, we present a re-analysis of a previous data set showing an inverse correlation between the N1 EEG response and the duration of visual emotional, but not non-emotional, information. If the assumption that emotional content allows more reliable prediction can be corroborated in future studies, cross-modal prediction will prove a crucial factor in our understanding of multisensory emotion perception.

Details

Language(s): eng - English
 Dates: 2013-04-04, 2013-06-25, 2013-07-18
 Publication Status: Published online
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: DOI: 10.3389/fnhum.2013.00369
PMID: 23882204
PMC: PMC3714569
Other: eCollection 2013
 Degree: -

Source 1

Title: Frontiers in Human Neuroscience
  Abbreviation: Front Hum Neurosci
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: Lausanne, Switzerland : Frontiers Research Foundation
Pages: -
Volume / Issue: 7
Sequence Number: 369
Start / End Page: -
Identifier: ISSN: 1662-5161
CoNE: https://pure.mpg.de/cone/journals/resource/1662-5161