Released

Book Chapter

Recognition of Dynamic Facial Action Probed by Visual Adaptation

MPS-Authors
/persons/resource/persons83871

Curio, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83829

Breidt, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84016

Kleiner, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

Curio, C., Giese, M., Breidt, M., Kleiner, M., & Bülthoff, H. (2010). Recognition of Dynamic Facial Action Probed by Visual Adaptation. In C. Curio, H. Bülthoff, & M. Giese (Eds.), Dynamic Faces: Insights from Experiments and Computation (pp. 47-65). Cambridge, MA, USA: MIT Press.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-BD42-8
Abstract
This chapter presents a psychophysical experiment in which 3D computer graphics methods were used to generate close-to-reality facial expressions in order to examine how humans recognize dynamic facial expressions. The study shows that dynamic faces produce high-level aftereffects similar to those shown earlier for static faces. The findings indicate that the aftereffects obtained after adaptation with dynamic anti-expressions are highly expression-specific. The chapter also highlights how computer graphics-generated expressions can be used to rule out low-level motion aftereffects. Dynamic face stimuli were created using a three-dimensional face model based on the Facial Action Coding System (FACS).
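To make the stimulus construction concrete, the sketch below illustrates one generic way such FACS-based dynamic expressions and anti-expressions can be formed: a linear blendshape model in which each animation frame is the neutral face plus a weighted sum of action-unit displacement fields, and an anti-expression is obtained by negating the action-unit weights (extrapolating through the neutral face in the opposite direction of expression space). This is a minimal, hypothetical illustration of that general idea, not the authors' implementation; all function names, array shapes, and the toy data are assumptions for the example.

```python
import numpy as np

def expression_frame(neutral, action_units, weights):
    """Blend one face mesh: neutral shape plus weighted AU displacements.

    neutral      : (V, 3) neutral-face vertex positions
    action_units : (K, V, 3) per-action-unit displacement fields
    weights      : (K,) activation level of each action unit
    """
    # Sum over the K action units, weighted by their activations.
    return neutral + np.tensordot(weights, action_units, axes=1)

def dynamic_expression(neutral, action_units, peak_weights, n_frames=30, anti=False):
    """Generate a dynamic expression (or its anti-expression) as a mesh sequence,
    ramping the action-unit activations linearly from zero to their peak values."""
    sign = -1.0 if anti else 1.0          # anti-expression = negated AU weights
    ramp = np.linspace(0.0, 1.0, n_frames)
    return [expression_frame(neutral, action_units, sign * t * peak_weights)
            for t in ramp]

# Toy usage with placeholder geometry (random data, not a real face model):
rng = np.random.default_rng(0)
neutral = rng.standard_normal((500, 3))                  # dummy 500-vertex mesh
action_units = 0.05 * rng.standard_normal((5, 500, 3))   # 5 dummy action units
peak_weights = np.array([1.0, 0.6, 0.0, 0.3, 0.0])       # hypothetical expression

expression_frames = dynamic_expression(neutral, action_units, peak_weights)
anti_frames = dynamic_expression(neutral, action_units, peak_weights, anti=True)
```

In this formulation, adapting to the anti-expression sequence and testing with weak versions of the original expression is what would probe the expression-specific aftereffects described in the abstract; the chapter itself details the actual model, motion capture, and adaptation procedure.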