



Conference Paper

Development of a virtual laboratory for the study of complex human behavior


von der Heyde, M.
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Pelz, J., Hayhoe, M., Ballard, D., Shrivastava, A., Bayliss, J., & von der Heyde, M. (1999). Development of a virtual laboratory for the study of complex human behavior. In J. Merritt, M. Bolas, & S. Fisher (Eds.), Stereoscopic Displays and Virtual Reality Systems VI (pp. 416-426). Bellingham, WA, USA: SPIE.

Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-E757-9
The study of human perception has evolved from examining simple tasks executed under reduced laboratory conditions to the examination of complex, real-world behaviors. Virtual environments represent the next evolutionary step by allowing full stimulus control and repeatability for human subjects, and by providing a testbed for evaluating models of human behavior. Visual resolution varies dramatically across the visual field, dropping orders of magnitude from central to peripheral vision. Humans move their gaze about a scene several times every second, projecting task-critical areas of the scene onto the central retina. These eye movements are made even when the immediate task does not require high spatial resolution. Such “attentionally-driven” eye movements are important because they provide an externally observable marker of the way subjects deploy their attention while performing complex, real-world tasks. Tracking subjects’ eye movements while they perform complex tasks in virtual environments thus provides a window into perception. In addition to tracking subjects’ eyes in virtual environments, concurrent EEG recording provides a further indicator of cognitive state. We have developed a virtual reality laboratory in which head-mounted displays (HMDs) are instrumented with infrared video-based eyetrackers to monitor subjects’ eye movements while they perform a range of complex tasks, such as driving and manual tasks requiring careful eye-hand coordination. A go-kart mounted on a 6-DOF motion platform provides kinesthetic feedback to subjects as they drive through a virtual town; a dual-haptic interface consisting of two SensAble Phantom extended-range devices allows free motion and realistic force feedback within a 1 m³ volume.