
Record


Released

Conference Paper

Testing the effect of depth on the perception of faces in an online study

MPG Authors
/persons/resource/persons227300

Hofmann, Simon
Department Neurology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

/persons/resource/persons280857

Koushik, Abhay
Department Neurology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
Max Planck School of Cognition;

/persons/resource/persons215680

Klotzsche, Felix
Department Neurology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

/persons/resource/persons201758

Nikulin, Vadim V.
Department Neurology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

/persons/resource/persons20065

Villringer, Arno
Department Neurology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

/persons/resource/persons134458

Gaebler, Michael
Department Neurology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

External Resources
No external resources are available
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
Supplementary material (freely accessible)
No freely accessible supplementary materials are available
Citation

Hofmann, S., Koushik, A., Klotzsche, F., Nikulin, V. V., Villringer, A., & Gaebler, M. (2022). Testing the effect of depth on the perception of faces in an online study. In Proceedings of the 2022 Conference on Cognitive Computational Neuroscience.


Citation link: https://hdl.handle.net/21.11116/0000-000B-06D3-0
Abstract
Faces are socially relevant stimuli that can be distinguished by the spatial arrangement of their visual features. However, face perception has mostly been investigated with static 2D images, a setting that differs from everyday experience. In an online study, we investigate face perception in two viewing conditions (2D & 3D). We compare the cognitive face space for these conditions by modeling the acquired human similarity ratings with similarity matrices computed from physical face attributes and from feature maps of deep learning-based face recognition models. Lastly, we fit these models to the human similarity judgments to explore which facial features are relevant in each viewing condition. Unveiling differences between 2D and 3D face perception will further our understanding of the role of stimulus presentation in face processing.
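The comparison described in the abstract follows a representational-similarity style analysis: a pairwise similarity matrix is built from model feature maps (or physical face attributes) and related to the matrix of human similarity ratings. The sketch below illustrates this idea only in outline; the array shapes, the cosine-similarity measure, and the randomly generated placeholder data are assumptions for illustration, since the abstract does not specify the face recognition models or the fitting procedure used.

```python
# Minimal sketch (not the authors' code): correlate a human similarity matrix
# with one derived from model feature vectors, RSA-style. All data here are
# random placeholders; shapes and the cosine measure are illustrative choices.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_faces = 25

# Hypothetical stand-ins: per-face embeddings from a face recognition model
# and a human similarity matrix aggregated from pairwise ratings.
model_features = rng.normal(size=(n_faces, 128))                        # placeholder embeddings
human_sim = squareform(rng.uniform(size=n_faces * (n_faces - 1) // 2))  # placeholder ratings

# Model-based similarity matrix: 1 minus cosine distance between embeddings.
model_sim = 1.0 - squareform(pdist(model_features, metric="cosine"))

# Compare only the upper triangles, so each face pair is counted once.
iu = np.triu_indices(n_faces, k=1)
rho, p = spearmanr(human_sim[iu], model_sim[iu])
print(f"Spearman correlation between human and model similarity: rho={rho:.3f}, p={p:.3f}")
```

In the study itself, such model similarity matrices would presumably be computed separately for the 2D and 3D rating conditions and from several feature sets (physical face attributes and network feature maps), so that their fits to the human judgments can be compared across viewing conditions.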