
Poster

You not me: others' emotional facial expressions capture attention automatically: but only for empathic people

MPG Authors
There are no MPG authors in this publication
External Resources

Link
(any full text)

Full texts (freely accessible)
No freely accessible full texts are available
Supplementary material (freely accessible)
No freely accessible supplementary material is available
Citation

Wallraven, C., & Kang, J. (2016). You not me: others' emotional facial expressions capture attention automatically: but only for empathic people. Poster presented at 16th Annual Meeting of the Vision Sciences Society (VSS 2016), St. Pete Beach, FL, USA.


Citation link: http://hdl.handle.net/21.11116/0000-0000-7B36-A
Abstract
Facial expressions are processed effortlessly and quickly, allowing us to assess a person's mood and emotion. In addition, emotional facial expressions are known to preferentially attract attention for visual processing. This preferential attention, however, may be modulated by individual differences in traits; in particular, we speculated that highly empathic people may process emotional expressions more efficiently than non-empathic people. To study preferential attention in facial expression processing, we used the attentional blink paradigm, in which identification of a first target (T1) transiently impairs detection of a second target (T2) during rapid serial visual presentation of a stimulus stream. If emotional expressions are processed preferentially by empathic people, then the impairment for T2 stimuli should be smaller for them than for non-empathic people. Crucially, however, this effect should occur only for the faces of others, not for one's own face. To test this, we recruited 100 participants and split them into low- (N=34), medium- (N=47), and high-empathy (N=19) groups based on self-reported levels of emotional empathy. Other-face stimuli consisted of happy, sad, and neutral expressions from the Korean Facial Expressions of Emotion resource. In addition, we recorded and validated these three expressions for all participants for use as own-face stimuli. A standard attentional-blink paradigm was implemented in Psychtoolbox-3, with neutral faces as T1 stimuli and emotional faces as T2 stimuli. In accordance with our hypothesis, the amount of impairment correlated significantly with the self-reported empathy score only in the other-face condition, not in the own-face condition. Overall, the high-empathy group showed significantly less impairment for other-face emotional stimuli than the other two groups. These results clearly show that emotional expressions preferentially capture the attention of empathic people.
Our findings also provide support for a previously untested component of the Perception-Action Model of empathy, which posits automatic, preferential processing of emotionally charged stimuli.
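The rapid-serial-presentation trial structure described in the abstract can be sketched as follows. This is a minimal illustration only: the study used Psychtoolbox-3 (MATLAB), whereas this sketch is Python, and the stream length, T1 position, lag, and the `"distractor"` label are hypothetical values chosen for illustration, not parameters reported by the authors.

```python
import random

def build_rsvp_trial(t1_pos=4, lag=2, stream_len=15, rng=None):
    """Build one attentional-blink trial as a list of stimulus labels.

    Following the abstract, a neutral face serves as T1 and an emotional
    face (happy or sad) as T2, embedded in a stream of distractors.
    All numeric defaults here are illustrative assumptions.
    """
    rng = rng or random.Random()
    t2_pos = t1_pos + lag          # T2 follows T1 at the given lag
    assert t2_pos < stream_len, "T2 must fall inside the stream"
    stream = ["distractor"] * stream_len
    stream[t1_pos] = "T1:neutral"
    stream[t2_pos] = "T2:" + rng.choice(["happy", "sad"])
    return stream

# Example: one trial with T1 at slot 4 and T2 two slots later
trial = build_rsvp_trial(t1_pos=4, lag=2, rng=random.Random(0))
print(trial[4], trial[6])
```

Varying `lag` across trials is what produces the characteristic attentional-blink curve: T2 detection is worst at short lags after T1 and recovers as the lag grows, and the abstract's "amount of impairment" summarizes that dip per participant.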