Journal Article

Perceptual integration of faces and voices depends on the interaction of emotional content and spatial frequency


Tavano, Alessandro
Department of Neuroscience, Max Planck Institute for Empirical Aesthetics, Max Planck Society;
Institute of Psychology, University of Leipzig, Germany;

Kotz, Sonja A.
Department of Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
Department of Neuropsychology and Psychopharmacology, Maastricht University, the Netherlands;


Kokinous, J., Tavano, A., Kotz, S. A., & Schröger, E. (2017). Perceptual integration of faces and voices depends on the interaction of emotional content and spatial frequency. Biological Psychology, 123, 155-165. doi:10.1016/j.biopsycho.2016.12.007.

Cite as: https://hdl.handle.net/11858/00-001M-0000-002D-F271-D
The role of spatial frequencies (SFs) in emotion perception is highly debated, but previous work suggests that low SFs are important for detecting emotion in faces. Furthermore, emotion perception essentially relies on the rapid integration of multimodal information from faces and voices. We used electroencephalography (EEG) to test the functional relevance of SFs in the integration of emotional and non-emotional audiovisual stimuli. Participants viewed dynamic face-voice pairs and were asked to identify auditory interjections while the EEG was recorded. Audiovisual integration was measured as auditory facilitation, indexed by the extent of auditory N1 amplitude suppression in the audiovisual compared to an auditory-only condition. We found an interaction of SF filtering and emotion in the auditory response suppression. For neutral faces, larger N1 suppression ensued in the unfiltered and high SF conditions than in the low SF condition, whereas angry face perception led to larger N1 suppression in the low SF condition. While the results for neutral faces indicate that perceptual quality in terms of SF content plays a major role in audiovisual integration, the results for angry faces suggest that early multisensory integration of emotional information favors low SF neural processing pathways, overruling the predictive value of the visual signal per se.