Record

Released

Conference Paper

Towards Pervasive Gaze Tracking with Low-level Image Features

MPG Authors
There are no MPG authors in this publication
External Resources
There are no external resources available
Full Texts (publicly accessible)
There are no publicly accessible full texts available
Supplementary Material (publicly accessible)
There is no publicly accessible supplementary material available
Citation

Zhang, Y., Bulling, A., & Gellersen, H. (2012). Towards Pervasive Gaze Tracking with Low-level Image Features. In S. N. Spencer (Ed.), Proceedings ETRA 2012 (pp. 261-264). New York, NY: ACM.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0017-9BDD-9
Abstract
We contribute a novel gaze estimation technique, which is adaptable for person-independent applications. In a study with 17 participants, using a standard webcam, we recorded the subjects' left eye images for different gaze locations. From these images, we extracted five types of basic visual features. We then sub-selected a set of features with minimum Redundancy Maximum Relevance (mRMR) as the input of a 2-layer regression neural network for estimating the subjects' gaze. We investigated the effect of different visual features on the accuracy of gaze estimation. Using machine learning techniques, by combining different features, we achieved an average gaze estimation error of 3.44° horizontally and 1.37° vertically in the person-dependent setting.