
Released

Conference Paper

Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering

MPS-Authors

Flad, N
Project group: Cognition & Control in Human-Machine Systems, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Ditz, JC
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Bülthoff, HH
Project group: Cybernetics Approach to Perception & Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Chuang, LL
Project group: Cognition & Control in Human-Machine Systems, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Flad, N., Ditz, J., Schmidt, A., Bülthoff, H., & Chuang, L. (2017). Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering. In Second Workshop on Eye Tracking and Visualization (ETVIS 2016) (pp. 1-5). Piscataway, NJ, USA: IEEE.


Cite as: https://hdl.handle.net/21.11116/0000-0000-C3BD-F
Abstract
Unrestricted gaze tracking that allows for head and body movements can enable us to understand interactive gaze behavior with large-scale visualizations. Approaches that support this, by simultaneously recording eye and user movements, can be based on either geometric or data-driven regression models. A data-driven approach can be implemented more flexibly, but its performance can suffer from poor-quality training data. In this paper, we introduce a pre-processing procedure that removes training data recorded during periods when the gaze is not fixating the presented target stimuli. Our procedure is based on a velocity-based filter for rapid eye movements (i.e., saccades). Our results show that this additional procedure improved the accuracy of our unrestricted gaze-tracking model by as much as 56%. Future improvements to data-driven approaches for unrestricted gaze tracking are proposed, in order to allow for more complex dynamic visualizations.
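The velocity-based saccade filtering described in the abstract can be sketched as follows. This is a minimal illustration in the style of an I-VT (velocity-threshold) filter, not the paper's implementation: the 100 deg/s threshold, the input layout (timestamps plus gaze angles in degrees), and the function name are all assumptions.

```python
import numpy as np

def filter_saccades(t, x, y, velocity_threshold=100.0):
    """Flag which gaze samples to keep for training (I-VT-style sketch).

    t : timestamps in seconds
    x, y : gaze position in degrees of visual angle
    velocity_threshold : deg/s above which movement counts as a saccade
                         (an assumed value, not taken from the paper)

    Returns a boolean mask; samples adjacent to an above-threshold
    velocity are treated as saccadic and excluded.
    """
    t, x, y = map(np.asarray, (t, x, y))
    dt = np.diff(t)
    # Point-to-point angular velocity between consecutive samples (deg/s).
    v = np.hypot(np.diff(x), np.diff(y)) / dt
    saccadic = v > velocity_threshold
    # Keep a sample only if neither of its adjacent velocities is saccadic.
    keep = np.ones(len(t), dtype=bool)
    keep[1:] &= ~saccadic
    keep[:-1] &= ~saccadic
    return keep

# Example: 1 kHz samples with a large jump between samples 4 and 5;
# the two samples bounding the jump are dropped from the training set.
t = np.arange(10) / 1000.0
x = np.array([0, 0, 0, 0, 0, 10, 10, 10, 10, 10], dtype=float)
y = np.zeros(10)
mask = filter_saccades(t, x, y)
```

Training pairs (gaze sample, target position) would then be kept only where `mask` is true, so that the regression model never sees mid-saccade samples whose gaze direction does not correspond to the fixated target.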