Item Details


Released

Journal Article

Deep learning models for webcam eye tracking in online experiments

MPS-Authors
Saxena, Shreshth (/persons/resource/persons274611)
Department of Music, Max Planck Institute for Empirical Aesthetics, Max Planck Society

Fink, Lauren (/persons/resource/persons255421)
Department of Music, Max Planck Institute for Empirical Aesthetics, Max Planck Society

Lange, Elke B. (/persons/resource/persons185905)
Department of Music, Max Planck Institute for Empirical Aesthetics, Max Planck Society

Fulltext (public)

mus-23-sax-01-deep.pdf
(Publisher version), 3MB

Supplementary Material (public)
There is no public supplementary material available
Citation

Saxena, S., Fink, L., & Lange, E. B. (2023). Deep learning models for webcam eye tracking in online experiments. Behavior Research Methods. doi:10.3758/s13428-023-02190-6.


Cite as: https://hdl.handle.net/21.11116/0000-000D-ACF3-F
Abstract
Eye tracking is prevalent in scientific and commercial applications. Recent computer vision and deep learning methods enable eye tracking with off-the-shelf webcams and reduce dependence on expensive, restrictive hardware. However, such deep learning methods have not yet been applied and evaluated for remote, online psychological experiments. In this study, we tackle critical challenges faced in remote eye tracking setups and systematically evaluate appearance-based deep learning methods for gaze tracking and blink detection. From their own homes and laptops, 65 participants performed a battery of eye tracking tasks including (i) fixation, (ii) zone classification, (iii) free viewing, (iv) smooth pursuit, and (v) blink detection. Webcam recordings of the participants performing these tasks were processed offline through appearance-based models of gaze and blink detection. The task battery elicited different eye movements, allowing gaze and blink prediction accuracy to be characterized over a comprehensive list of measures. We find a best gaze accuracy of 2.4° and a precision of 0.47°, which outperforms previous online eye tracking studies and narrows the gap between laboratory-based and online eye tracking performance. We release the experiment template, recorded data, and analysis code with the motivation to advance affordable, accessible, and scalable eye tracking, which has the potential to accelerate research in psychological science, cognitive neuroscience, user experience design, and human–computer interfaces.
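For context on the accuracy and precision figures quoted above, these quantities follow the standard definitions in the eye-tracking literature: accuracy is the mean angular offset between estimated gaze and a known target, and precision is the root-mean-square of successive sample-to-sample angular distances. The Python sketch below illustrates these computations under an assumed screen geometry; the function names, array layout, and parameter values (px_per_cm, viewing_distance_cm) are illustrative assumptions and are not taken from the authors' released analysis code.

import numpy as np

def to_visual_angle_deg(xy_px, px_per_cm, viewing_distance_cm):
    """Convert on-screen pixel coordinates (relative to screen center)
    to degrees of visual angle, per axis, using small-angle geometry."""
    xy_cm = xy_px / px_per_cm
    return np.degrees(np.arctan2(xy_cm, viewing_distance_cm))

def accuracy_deg(gaze_deg, target_deg):
    """Accuracy: mean Euclidean angular offset between gaze samples
    (N x 2 array, degrees) and the fixation target (2-vector, degrees)."""
    return np.linalg.norm(gaze_deg - target_deg, axis=1).mean()

def precision_rms_deg(gaze_deg):
    """Precision: RMS of successive sample-to-sample angular distances."""
    diffs = np.diff(gaze_deg, axis=0)
    return np.sqrt((np.linalg.norm(diffs, axis=1) ** 2).mean())

# Hypothetical gaze estimates for one fixation trial (pixels, origin at
# screen center offset removed for simplicity).
gaze_px = np.array([[2.0, -2.0], [-2.0, 1.0], [1.0, -4.0]])
target_px = np.array([0.0, 0.0])

gaze = to_visual_angle_deg(gaze_px, px_per_cm=38.0, viewing_distance_cm=50.0)
target = to_visual_angle_deg(target_px, px_per_cm=38.0, viewing_distance_cm=50.0)
print(f"accuracy: {accuracy_deg(gaze, target):.2f} deg, "
      f"precision: {precision_rms_deg(gaze):.2f} deg")

In a webcam setup like the one described in the abstract, viewing distance and pixel density are not controlled, so any such conversion to visual angle rests on per-participant estimates or assumptions; the reported 2.4° accuracy and 0.47° precision should be read with that in mind.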