Conference Paper

Psychophysical investigation of facial expressions using computer animated faces

MPS-Authors

Griesser, RT
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Wallraven, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Fulltext (public)

APGV-2007-Griesser.pdf (Any fulltext), 541 KB

Supplementary Material (public)
There is no public supplementary material available
Citation

Griesser, R., Cunningham, D., Wallraven, C., & Bülthoff, H. (2007). Psychophysical investigation of facial expressions using computer animated faces. In C. Wallraven & V. Sundstedt (Eds.), APGV '07: 4th Symposium on Applied Perception in Graphics and Visualization (pp. 11-18). New York, NY, USA: ACM Press.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-CCC7-3
Abstract
The human face is capable of producing a large variety of facial expressions that supply important information for communication. As previous studies using unmanipulated video sequences have shown, movements of single regions such as the mouth, eyes, and eyebrows, as well as rigid head motion, play a decisive role in the recognition of conversational facial expressions. Here, flexible yet realistic computer-animated faces were used to systematically investigate the spatiotemporal coaction of facial movements. In three psychophysical experiments, spatiotemporal properties were manipulated in a highly controlled manner. First, single regions (mouth, eyes, and eyebrows) of a computer-animated face performing seven basic facial expressions were selected. These single regions, as well as combinations of these regions, were animated for each of the seven chosen facial expressions. Participants were then asked to recognize these animated expressions in the experiments. The findings show that the animated avatar is, in general, a useful tool for the investigation of facial expressions, although improvements are needed to reach higher recognition accuracy for certain expressions. Furthermore, the results shed light on the importance and interplay of individual facial regions for recognition. With this knowledge, the perceptual quality of computer animations can be improved to reach a higher level of realism and effectiveness.
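
For illustration, the following minimal Python sketch shows how the stimulus conditions described in the abstract, single facial regions and their combinations crossed with seven expressions, might be enumerated. The region names and expression labels are hypothetical placeholders, not the paper's actual stimulus set.

# Illustrative sketch only: enumerates stimulus conditions in the spirit of the
# abstract (single regions and their combinations, crossed with expressions).
# Region names and expression labels are assumptions, not the paper's stimuli.
from itertools import combinations, product

REGIONS = ("mouth", "eyes", "eyebrows")              # animated facial regions
EXPRESSIONS = ("happy", "sad", "angry", "surprised",
               "disgusted", "fearful", "agreeing")   # seven placeholder labels

# All non-empty subsets of regions: single regions plus their combinations.
region_sets = [
    frozenset(c)
    for r in range(1, len(REGIONS) + 1)
    for c in combinations(REGIONS, r)
]

# Each candidate condition pairs one region subset with one expression.
conditions = list(product(region_sets, EXPRESSIONS))

print(f"{len(region_sets)} region combinations x {len(EXPRESSIONS)} expressions "
      f"= {len(conditions)} candidate stimuli")

Under these placeholder labels the sketch yields 7 region combinations and 49 candidate stimuli; it says nothing about the animation or timing parameters used in the actual experiments.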