Record


Released

Conference Paper

Towards Using Gaze Properties to Detect Language Proficiency

MPG Authors

Chuang, LL
Project group: Cognition & Control in Human-Machine Systems, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources

Link
(any fulltext)

Fulltexts (publicly accessible)
There are no publicly accessible fulltexts available
Supplementary Material (publicly accessible)
There is no publicly accessible supplementary material available
Citation

Karolus, J., Woźniak, P., & Chuang, L. (2016). Towards Using Gaze Properties to Detect Language Proficiency. In S. Björk, & E. Eriksson (Eds.), 9th Nordic Conference on Human-Computer Interaction (NordiCHI '16) (pp. 118). New York, NY, USA: ACM Press.


Cite as: http://hdl.handle.net/21.11116/0000-0000-7A6C-F
Abstract
Humans are inherently skilled at using subtle physiological cues from other persons, for example gaze direction in a conversation. Personal computers have yet to explore this implicit input modality. In a study with 14 participants, we investigate how a user's gaze can be leveraged in adaptive computer systems. In particular, we examine the impact of different languages on eye movements by presenting simple questions in multiple languages to our participants. We found that fixation duration is sufficient to ascertain whether a user is highly proficient in a given language. We propose how these findings could be used to implement adaptive visualizations that react implicitly to the user's gaze.