
Released

Poster

ERP evidence on the processing of emotional prosody

MPS-Authors

Paulmann,  Silke
Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;


Kotz,  Sonja A.
Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

Citation

Paulmann, S., & Kotz, S. A. (2005). ERP evidence on the processing of emotional prosody. Poster presented at ISCA Workshop on Plasticity in Speech Perception, London, UK.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0010-B23F-5
Abstract
Emotional encoding is central to human communication. Emotional states are communicated, among other means, through affective prosody and the accompanying semantics (i.e., the verbal content of the message). The two channels must interact for a listener to evaluate the emotional state of a message (or the messenger). However, it is not yet fully understood how, and at which point in time, emotional prosody and (emotional) semantics interact at the sentence level. Previous evidence (Kotz et al., 2001; Kotz & Paulmann, in prep.) has shown that the time courses of emotional prosodic and semantic processing differ. To further investigate the two channels, we carried out two ERP experiments in which we isolated the emotional prosody channel from the semantic content channel using a cross-splicing method. The ERP evidence shows that violations of the emotional prosodic contour (with neutral semantic content in Experiment 1 and pseudo-sentences in Experiment 2) mainly elicit a positivity, whereas violations of the semantic content mainly elicit an N400-like negativity. This pattern appears to be independent of speaker voice (male or female). Furthermore, it holds for all basic emotions investigated in these experiments (fear, sadness, anger, disgust, happiness, and pleasant surprise, with neutral serving as a baseline). Only the results for happiness deviate from this pattern. We conclude that prosodic information is processed differently from semantic information, and that each channel (prosody and semantics) contributes individually to the perception of emotional speech. Furthermore, the data suggest that semantic information can override prosody when the two channels interact in time, that is, when the emotional prosodic contour agrees with the semantic content of a sentence.