
Record

 
 
  On the role of crossmodal prediction in audiovisual emotion perception

Jessen, S., & Kotz, S. A. (2013). On the role of crossmodal prediction in audiovisual emotion perception. Frontiers in Human Neuroscience, 7: 369. doi:10.3389/fnhum.2013.00369.


Basic data

Genre: Journal article

Files

Jessen_OnTheRole.pdf (Publisher version), 2MB
Name: Jessen_OnTheRole.pdf
Description: -
OA status:
Visibility: Public
MIME type / checksum: application/pdf / [MD5]
Technical metadata:
Copyright date: -
Copyright info: -
License: -

External references


Creators

Creators:
Jessen, Sarah 1, Author
Kotz, Sonja A. 2, 3, Author
Affiliations:
1 Max Planck Research Group Early Social Development, MPI for Human Cognitive and Brain Sciences, Max Planck Society, Leipzig, DE, ou_1356545
2 Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society, ou_634551
3 School of Psychological Sciences, University of Manchester, United Kingdom, ou_persistent22

Content

Keywords: Cross-modal prediction; Emotion; Multisensory; EEG; Audiovisual
Abstract: Humans rely on multiple sensory modalities to determine the emotional state of others. In fact, such multisensory perception may be one of the mechanisms explaining the ease and efficiency with which others' emotions are recognized. But how and when exactly do the different modalities interact? One aspect of multisensory perception that has received increasing interest in recent years is the concept of cross-modal prediction. In emotion perception, as in most other settings, visual information precedes auditory information; this leading visual information can facilitate subsequent auditory processing. While this mechanism has often been described in audiovisual speech perception, it has so far not been addressed in audiovisual emotion perception. Based on the current state of the art in (a) cross-modal prediction and (b) multisensory emotion perception research, we propose that it is essential to consider the former in order to fully understand the latter. Focusing on electroencephalographic (EEG) and magnetoencephalographic (MEG) studies, we provide a brief overview of the current research in both fields. In discussing these findings, we suggest that emotional visual information may allow more reliable prediction of auditory information than non-emotional visual information. In support of this hypothesis, we present a re-analysis of a previous data set, which shows an inverse correlation between the N1 EEG response and the duration of visual emotional, but not non-emotional, information. If the assumption that emotional content allows more reliable prediction can be corroborated in future studies, cross-modal prediction is a crucial factor in our understanding of multisensory emotion perception.
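The re-analysis mentioned in the abstract boils down to correlating an N1 amplitude measure with the duration of the visual information preceding voice onset. The following minimal Python sketch illustrates such a correlation; the variable names, the made-up values, and the choice of a Pearson correlation are illustrative assumptions, not the authors' actual analysis pipeline.

import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-stimulus values (not from the paper): duration of visual
# emotional information before voice onset in ms, and N1 amplitude in µV
# (a negative-going component).
visual_lead_ms = np.array([250, 310, 400, 480, 550, 620])
n1_amplitude = np.array([-5.0, -4.6, -3.9, -3.3, -2.8, -2.4])

# A longer visual lead paired with a smaller (less negative) N1, as in these
# toy values, would be consistent with the prediction account summarized above.
r, p = pearsonr(visual_lead_ms, n1_amplitude)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")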

Details

Language(s): eng - English
Dates: 2013-04-04, 2013-06-25, 2013-07-18
Publication status: Published online
Pages: -
Place, publisher, edition: -
Table of contents: -
Review method: Peer review
Identifiers: DOI: 10.3389/fnhum.2013.00369
PMID: 23882204
PMC: PMC3714569
Other: eCollection 2013
Degree: -

Event


Decision


Project information


Source 1

Title: Frontiers in Human Neuroscience
Abbreviated title: Front Hum Neurosci
Source genre: Journal
Creators:
Affiliations:
Place, publisher, edition: Lausanne, Switzerland : Frontiers Research Foundation
Pages: -
Volume / issue: 7
Article number: 369
Start / end page: -
Identifier: ISSN: 1662-5161
CoNE: https://pure.mpg.de/cone/journals/resource/1662-5161