  Animated self-avatars in immersive virtual reality for studying body perception and distortions

Paul, S., & Mohler, B. J. (2015). Animated self-avatars in immersive virtual reality for studying body perception and distortions. In IEEE VR Doctoral Consortium 2015 (pp. 1-3).


Basic

Genre: Conference Paper

Creators

 Creators:
Paul, Soumya (1, 2), Author
Mohler, Betty J. (1, 2), Author
Affiliations:
1: Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497794
2: Research Group Space and Body Perception, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_2528693

Content

Free keywords: -
 Abstract: So far, my research in virtual reality has focused on using body and hand motion tracking systems to animate different 3D self-avatars in immersive virtual reality environments (head-mounted displays or desktop virtual reality). We are using self-avatars to explore a basic research question: what sensory information is used to perceive one's body dimensions? We also address the applied question of how best to create a calibrated self-avatar for efficient use in first-person immersive head-mounted display interaction scenarios. The self-avatar used for such research questions and applications has to be precise, easy to use, and able to let the virtual hand and body interact with physical objects. This is what my research has focused on thus far and what I am developing toward the completion of the first year of my graduate studies. We plan to use the Leap Motion for hand and arm tracking, the Moven inertial measurement suit for full-body tracking, and the Oculus DK2 head-mounted display. This paper describes a several-step process for setting up and calibrating an animated self-avatar with full-body motion and hand tracking. First, the user's dimensions are measured and a self-avatar with these dimensions is assigned to them; the user is then asked to perform pre-determined actions (e.g. touching objects, walking a specific trajectory); we then estimate in real time how precise the animated body and body parts are relative to real-world reference objects; and finally the avatar is scaled, or the motion retargeted, to meet a specific minimum error requirement.
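
The calibration procedure in the abstract can be read as an iterative measure-act-estimate-correct loop. The Python sketch below is only an illustration of that loop under stated assumptions: the 2 cm error threshold, the mean end-effector error metric, and the helper callables measure_user and run_trial are hypothetical and are not taken from the paper or from the Leap Motion, Moven, or Oculus SDKs.

    import numpy as np

    def endpoint_error(avatar_points, reference_points):
        """Mean Euclidean distance (m) between tracked avatar end effectors
        (fingertips, feet, ...) and the real-world reference objects they touch."""
        avatar_points = np.asarray(avatar_points, dtype=float)
        reference_points = np.asarray(reference_points, dtype=float)
        return float(np.mean(np.linalg.norm(avatar_points - reference_points, axis=1)))

    def calibrate_avatar(measure_user, run_trial, max_error_m=0.02, max_iters=10):
        """Iterative calibration loop sketched after the abstract.

        `measure_user` returns the user's body dimensions; `run_trial`
        performs one pre-determined action with an avatar built from those
        dimensions at the given uniform scale and returns (avatar_points,
        reference_points) as Nx3 arrays. Both are hypothetical stand-ins.
        """
        dims = measure_user()
        scale, error = 1.0, float("inf")
        for _ in range(max_iters):
            avatar_pts, ref_pts = run_trial(dims, scale)
            error = endpoint_error(avatar_pts, ref_pts)
            if error <= max_error_m:
                break  # avatar meets the minimum error requirement
            # Simple stand-in for the scaling / retargeting step: adjust the
            # uniform scale so the avatar's mean reach (end-effector distance
            # from the body origin) matches that of the reference objects.
            avatar_reach = np.mean(np.linalg.norm(np.asarray(avatar_pts, float), axis=1))
            ref_reach = np.mean(np.linalg.norm(np.asarray(ref_pts, float), axis=1))
            scale *= ref_reach / avatar_reach
        return scale, error

A real implementation would drive run_trial from the live tracking data and could retarget individual limbs rather than applying a single uniform scale.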

Event

Title: IEEE VR Doctoral Consortium 2015
Place of Event: Arles, France
Start-/End Date: -

Source 1

Title: IEEE VR Doctoral Consortium 2015
Source Genre: Proceedings
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 1 - 3
Identifier: -