
Conference Paper

Psychophysical evaluation of animated facial expressions

MPS-Authors

Wallraven, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Breidt, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society; Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society

Cunningham, D
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society
Citation

Wallraven, C., Breidt, M., Cunningham, D., & Bülthoff, H. (2005). Psychophysical evaluation of animated facial expressions. In H. Bülthoff, & T. Troscianko (Eds.), APGV '05: 2nd Symposium on Applied Perception in Graphics and Visualization (pp. 17-24). New York, NY, USA: ACM Press.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-D4C3-1
Abstract
The human face is capable of producing an astonishing variety of expressions, expressions for which sometimes the smallest difference changes the perceived meaning noticeably. Producing realistic-looking facial animations that are able to convey this degree of complexity continues to be a challenging research topic in computer graphics. One important question that remains to be answered is: When are facial animations good enough? Here we present an integrated framework in which psychophysical experiments are used in a first step to systematically evaluate the perceptual quality of computer-generated animations with respect to real-world video sequences. The result of the first experiment is an evaluation of several animation techniques in which we expose specific animation parameters that are important for perceptual fidelity. In a second experiment we then use these benchmarked animations in the context of perceptual research in order to systematically investigate the spatio-temporal characteristics of expressions. Using such an integrated approach, we are able to provide insights into facial expressions for both the perception and computer graphics communities.
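
To make the evaluation idea concrete, here is a minimal sketch, not taken from the paper, of the kind of rating comparison such a psychophysical evaluation could involve: observers rate the perceived realism of clips produced by different animation techniques and of the original video, and each technique's mean rating is compared against the real-video baseline. The technique names, rating scale, and all numbers below are hypothetical placeholders.

```python
# Hedged illustration only: hypothetical realism ratings (1 = artificial, 7 = real)
# from a handful of observers, compared against a real-video baseline.
import statistics

ratings = {
    "real_video":  [6.5, 6.8, 6.2, 6.9, 6.4],
    "technique_A": [5.1, 5.4, 4.9, 5.6, 5.0],
    "technique_B": [3.8, 4.1, 3.5, 4.0, 3.9],
}

# Mean rating of the real-world video serves as the reference point.
baseline = statistics.mean(ratings["real_video"])

for condition, values in ratings.items():
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    # Perceptual fidelity expressed as the gap to the real-video baseline.
    print(f"{condition:12s} mean={mean:.2f} sd={sd:.2f} gap_to_video={baseline - mean:+.2f}")
```

In an actual study the comparison would of course rest on proper statistical testing across many observers and expressions; the snippet only illustrates the benchmarking logic of measuring animations against real-world video.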