Item Details


Released

Book

Behavioral and Neural Mechanisms Underlying Dynamic Face Perception

MPS-Authors
/persons/resource/persons83890

Dobs,  K
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (restricted access)
There are currently no full texts shared for your IP range.
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Dobs, K. (2015). Behavioral and Neural Mechanisms Underlying Dynamic Face Perception. Berlin, Germany: Logos Verlag.


Cite as: https://hdl.handle.net/11858/00-001M-0000-002A-47A7-B
Abstract
Dynamic faces are highly complex, ecologically and socially relevant stimuli that we encounter almost every day. When and what we extract from this rich source of information needs to be well coordinated by the face perception system. The current thesis investigates how this coordination is achieved.
Part I comprises two psychophysical experiments examining the mechanisms underlying facial motion processing. Facial motion is represented as high-dimensional spatio-temporal data specifying which part of the face moves in which direction over time. Previous studies suggest that facial motion can be adequately represented using simple approximations. I argue against the use of synthetic facial motion by showing that the face perception system is highly sensitive to manipulations of the natural spatio-temporal characteristics of facial motion. The neural processes coordinating facial motion processing may rely on two mechanisms: first, a sparse but meaningful spatio-temporal code representing facial motion; second, a mechanism that extracts distinctive motion characteristics. Evidence for the latter hypothesis is provided by the observation that facial motion, when performed in unconstrained contexts, facilitates identity judgments.
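To make the notion of a high-dimensional spatio-temporal motion representation more concrete, the sketch below shows one generic way such data might be organized as per-frame displacements of tracked facial landmarks, with a crude magnitude-based sparsification. The frame count, landmark set, and thresholding are illustrative assumptions, not the representation or method used in the thesis.

import numpy as np

# Illustrative sketch (assumed values, not from the thesis):
n_frames = 120      # e.g. a 4 s clip at 30 fps
n_landmarks = 68    # e.g. a standard 68-point facial landmark set

# positions[t, k] = (x, y) location of landmark k at frame t
positions = np.random.rand(n_frames, n_landmarks, 2)

# Facial motion as frame-to-frame displacements: which part of the face
# moves in which direction over time.
motion = np.diff(positions, axis=0)          # shape: (n_frames - 1, n_landmarks, 2)

# A "sparse but meaningful" code could, for instance, keep only the
# strongest displacements and zero out the rest.
magnitude = np.linalg.norm(motion, axis=-1)  # per-landmark motion energy per frame
threshold = np.percentile(magnitude, 90)
sparse_code = np.where(magnitude[..., None] >= threshold, motion, 0.0)

print(motion.shape, sparse_code.shape)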
Part II presents a functional magnetic resonance imaging (fMRI) study investigating the neural processing of expression and identity information in dynamic faces. Previous studies proposed a distributed neural system for face perception that distinguishes between invariant (e.g., identity) and changeable (e.g., expression) aspects of faces. Attention is a candidate mechanism for coordinating the processing of these two facial aspects. Two findings support this hypothesis: first, attention to expression versus identity of dynamic faces dissociates cortical areas assumed to process changeable aspects from those involved in discriminating invariant aspects of faces; second, attention leads to a more precise neural representation of the attended facial feature. Interactions between these two representations may be mediated by a part of the inferior occipital gyrus and the superior temporal sulcus; this is supported by the observation that the latter area represented both expression and identity, while the former represented identity information irrespective of the attended feature.