
Item details


Released

Report

Spatialized auditory cues enhance the visually-induced self-motion illusion (circular vection) in Virtual Reality

MPS-Authors

Riecke,  BE
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Schulte-Pelkum,  J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Caniard,  F
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full text (public)

MPIK-TR-138.pdf
(Publisher version), 997KB

Citation

Riecke, B. E., Schulte-Pelkum, J., Caniard, F., & Bülthoff, H. H. (2005). Spatialized auditory cues enhance the visually-induced self-motion illusion (circular vection) in Virtual Reality (138). Tübingen, Germany: Max Planck Institute for Biological Cybernetics.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-D407-9
Abstract
“Circular vection” refers to the illusion of self-motion induced by rotating visual or auditory stimuli. Visually induced vection can be quite compelling, and the illusion has been investigated extensively for over a century. Rotating auditory cues can also induce vection, but only in about 25-60% of blindfolded participants (Lackner, 1977; Larsson et al., 2004). Furthermore, auditory vection is much weaker and far less compelling than visual vection, which can be indistinguishable from real motion. Here, we investigated whether an additional auditory cue (the sound of a fountain that is also visible in the visual stimulus) can be utilized to enhance visually induced self-motion perception. To the best of our knowledge, this is the first study directly addressing audio-visual contributions to vection. Twenty observers viewed rotating photorealistic pictures of a natural scene projected onto a curved projection screen (FOV: 54°×45°). Three conditions were randomized in a repeated measures within-subject design: no sound, mono sound, and spatialized sound using a generic head-related transfer function (HRTF). Adding mono sound to the visual vection stimulus increased convincingness ratings marginally, but did not affect vection onset time, vection buildup time, vection intensity, or rated presence. Spatializing the fountain sound such that it moved in accordance with the fountain in the visual scene, however, improved vection significantly in terms of convincingness, vection buildup time, and presence ratings. The effect size for the vection measures was, however, rather small (<16%). This might be related to a ceiling effect, as visually induced vection was already quite strong without the spatialized sound (10 s vection onset time). Despite the small effect size, this study shows that HRTF-based auralization using headphones can be employed to improve visual VR simulations, both in terms of self-motion perception and overall presence. Note that facilitation was found even though the visual stimulus was of high quality and realism, and known to be quite powerful in inducing vection. These findings have important implications both for the understanding of cross-modal cue integration and for optimizing VR simulations.