Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering

Flad, N., Ditz, J., Schmidt, A., Bülthoff, H., & Chuang, L. (2017). Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering. In Second Workshop on Eye Tracking and Visualization (ETVIS 2016) (pp. 1-5). Piscataway, NJ, USA: IEEE.

Basic

Genre: Conference Paper


Locators

Locator: Link (Any fulltext)
Description: -
OA-Status: -

Creators

Creators:
Flad, N1, 2, 3, Author
Ditz, JC2, 3, Author
Schmidt, A, Author
Bülthoff, HH2, 3, 4, Author
Chuang, LL1, 2, 3, Author
Affiliations:
1Project group: Cognition & Control in Human-Machine Systems, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_2528703              
2Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497794              
3Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797              
4Project group: Cybernetics Approach to Perception & Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_2528701              

Content

Free keywords: -
Abstract: Unrestricted gaze tracking that allows for head and body movements can enable us to understand interactive gaze behavior with large-scale visualizations. Approaches that support this, by simultaneously recording eye and user movements, can be based on either geometric or data-driven regression models. A data-driven approach can be implemented more flexibly, but its performance can suffer with poor-quality training data. In this paper, we introduce a pre-processing procedure to remove training data for periods when the gaze is not fixating the presented target stimuli. Our procedure is based on a velocity-based filter for rapid eye movements (i.e., saccades). Our results show that this additional procedure improved the accuracy of our unrestricted gaze-tracking model by as much as 56%. Future improvements to data-driven approaches for unrestricted gaze-tracking are proposed, in order to allow for more complex dynamic visualizations.
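The velocity-based saccade filter described in the abstract can be sketched as follows. This is a minimal I-VT-style illustration, not the authors' implementation; the function name, array layout, and the 30 deg/s threshold are assumptions for the example:

```python
import numpy as np

def filter_saccades(gaze_deg, timestamps_s, velocity_threshold=30.0):
    """Velocity-based saccade filter (illustrative sketch).

    gaze_deg: (N, 2) gaze positions in degrees of visual angle.
    timestamps_s: (N,) sample timestamps in seconds.
    Returns a boolean mask, True for fixation samples, i.e. samples whose
    point-to-point angular velocity stays below the threshold (deg/s).
    """
    gaze_deg = np.asarray(gaze_deg, dtype=float)
    timestamps_s = np.asarray(timestamps_s, dtype=float)
    # Point-to-point displacement and elapsed time between successive samples.
    disp = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1)
    dt = np.diff(timestamps_s)
    velocity = disp / dt
    # Repeat the first velocity so the mask has one entry per sample.
    velocity = np.concatenate([velocity[:1], velocity])
    return velocity < velocity_threshold

# Training data for the gaze-tracking regression model would then keep
# only the fixation samples, e.g.:
#   mask = filter_saccades(gaze, t)
#   train_inputs, train_targets = inputs[mask], targets[mask]
```

Samples flagged as saccadic are dropped from the training set, so the regression model is fit only on samples where the gaze plausibly rests on the presented target.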

Details

Language(s): -
 Dates: 2017-02
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: DOI: 10.1109/ETVIS.2016.7851156
BibTex Citekey: FladDSBC2016
 Degree: -

Event

Title: Second Workshop on Eye Tracking and Visualization (ETVIS 2016)
Place of Event: Baltimore, MD, USA
Start-/End Date: -

Source 1

Title: Second Workshop on Eye Tracking and Visualization (ETVIS 2016)
Source Genre: Proceedings
Creator(s): -
Affiliations: -
Publ. Info: Piscataway, NJ, USA : IEEE
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 1 - 5
Identifier: ISBN: 978-1-5090-4731-4