Conference Paper

EyeMote - Towards Context‐Aware Gaming Using Eye Movements Recorded From Wearable Electrooculography


Bulling, A., Roggen, D., & Tröster, G. (2008). EyeMote - Towards Context‐Aware Gaming Using Eye Movements Recorded From Wearable Electrooculography. In P. Markopoulos, B. de Ruyter, W. IJsselsteijn, & D. Rowland (Eds.), Fun and Games (pp. 33-45). Berlin: Springer.

Cite as: http://hdl.handle.net/11858/00-001M-0000-0023-D1BC-D
Physical activity has emerged as a novel input modality for so-called active video games. Input devices such as music instruments, dance mats or the Wii accessories allow for novel ways of interaction and a more immersive gaming experience. In this work we describe how eye movements recognised from electrooculographic (EOG) signals can be used for gaming purposes in three different scenarios. In contrast to common video-based systems, EOG can be implemented as a wearable and light-weight system which allows for long-term use with unconstrained simultaneous physical activity. In a stationary computer game we show that eye gestures of varying complexity can be recognised online with equal performance to a state-of-the-art video-based system. For pervasive gaming scenarios, we show how eye movements can be recognised in the presence of signal artefacts caused by physical activity such as walking. Finally, we describe possible future context-aware games which exploit unconscious eye movements and show which possibilities this new input modality may open up.