Free keywords:
-
Abstract:
The perception of facial expressions can be modulated by affective body postures. For instance, humans are more likely to perceive disgusted facial expressions as ‘angry’ when paired with an angry body. Interestingly, the influence of body context is highly variable across individuals, offering an opportunity to study the mechanisms underlying integrated whole-person perception. Using psychophysical tasks in combination with computational modelling, we indexed the precision of representations of isolated facial expression and body
posture cues, as well as the influence of each cue on the integrated whole-person emotion percept. The results indicate that the perceptual integration leading to the whole-person representation is determined by the precision of the individual cues. These results provide the basis for
developing a mechanistic model of how facial expression and body posture cues are combined to create integrated whole-person percepts of emotion, and have important
implications for our understanding of real-world individual differences in social perception.