  Self-motion perception and spatial orientation in virtual environments

Riecke, B. (2006). Self-motion perception and spatial orientation in virtual environments. Talk presented at York University: Centre for Vision Research. Toronto, ON, Canada. 2006-08-04.


Files


Creators

 Creators:
Riecke, BE¹, ², Author
Affiliations:
1. Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
2. Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
 Abstract: Despite recent technological advances, convincing self-motion simulation in Virtual Reality (VR) is difficult to achieve, and users often suffer from motion sickness and/or disorientation in the simulated world. Instead of trying to simulate self-motion with physical realism (as is often done for, e.g., driving or flight simulators), we propose a perceptually oriented approach to self-motion simulation. Following this paradigm, we performed a series of psychophysical experiments to determine essential visual, auditory, and vestibular/tactile parameters for an effective and perceptually convincing self-motion simulation. These studies are a first step towards our overall goal of achieving lean and elegant self-motion simulation in VR without physically moving the observer. In a series of psychophysical experiments on the self-motion illusion (circular/linear vection), we found that (i) vection as well as presence in the simulated environment is increased by a consistent, naturalistic visual scene compared to a sliced, inconsistent version of the identical scene; (ii) barely noticeable marks on the projection screen can increase vection as well as presence in an unobtrusive manner; (iii) physical vibrations of the observer's seat as well as inaudible subsonic cues can enhance the vection illusion; (iv) for the first time, HRTF-based spatial audio cues were shown to reliably induce vection in up to 80% of blindfolded observers; (v) spatialized 3D audio cues embedded in the simulated environment increase the sensation of self-motion and presence; (vi) small physical motions (jerks of just a few centimeters or degrees) that accompany the onset of the visually simulated motion enhance vection; and (vii) even the mere knowledge that one might potentially be moved physically significantly increased the convincingness of the self-motion illusion, especially when additional vibrations supported the interpretation that one was really moving. We conclude that providing consistent cues about self-motion to multiple sensory modalities can enhance vection, even if physical motion cues are absent. We propose that the spatial reference frame evoked by a naturalistic and cross-modally consistent virtual environment increases the believability of the stimulus, such that it is more easily accepted as a stable reference frame with respect to which visual or auditory motion is more likely to be judged as self-motion rather than object motion. Compared to more traditional approaches to enhancing self-motion perception (e.g., motion platforms, free-walking areas, or treadmills), the current, perceptually oriented approach has only minimal requirements in terms of overall cost, required space, safety features, and technical effort and expertise. Thus, our approach may be promising for a wide range of low-cost applications.

Details

Language(s): -
 Dates: 2006-08
 Publication Status: Published online
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: -
 Degree: -

Event

Title: York University: Centre for Vision Research
Place of Event: Toronto, ON, Canada
Start-/End Date: 2006-08-04
Invited: Yes

Legal Case


Project information


Source
