  Multimodal multi-user human-robot interface for virtual collaboration

Song, Y., Niitsuma, M., Kubota, T., Hashimoto, H., & Son, H. (2012). Multimodal multi-user human-robot interface for virtual collaboration. In 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI) (pp. 18-22). Piscataway, NJ, USA: IEEE.

Basic

Item Permalink: http://hdl.handle.net/21.11116/0000-0001-8AD9-F
Version Permalink: http://hdl.handle.net/21.11116/0000-0001-8ADA-E
Genre: Conference Paper

Files


Locators

Description:
-

Creators

Creators:
Song, YE, Author
Niitsuma, M, Author
Kubota, T, Author
Hashimoto, H, Author
Son, HI (1, 2), Author
Affiliations:
(1) Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
(2) Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
Abstract: We present an intuitive teleoperation scheme using human gestures and a multimodal multi-user human-robot interface. To cope with dynamic daily environments, we apply haptic point-cloud rendering and virtual collaboration; all of these functions are achieved with portable hardware called the "mobile iSpace". First, the surrounding environment of a teleoperated robot is captured and reconstructed as a 3D point cloud using a depth camera. A virtual world is then generated from the 3D point cloud, and a virtual model of the teleoperated robot is placed in it. A user teleoperates the humanoid-type robot with their own whole-body gestures, which are captured in real time by a depth camera on the user side. The user simultaneously receives visual and vibrotactile feedback through a head-mounted display and a vibrotactile glove. All system components, e.g. the human user, the teleoperated robot, and the feedback devices, are connected through an Internet-based virtual collaboration system to support flexible accessibility, so that users can access the remotely placed robot whenever and wherever they want.

Details

Language(s):
 Dates: 2012-11
 Publication Status: Published in print
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Method: -
 Identifiers: DOI: 10.1109/URAI.2012.6462920
 Degree: -

Event

Title: 9th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI 2012)
Place of Event: Daejeon, South Korea
Start-/End Date: 2012-11-26 - 2012-11-28

Legal Case


Project information


Source 1

Title: 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)
Source Genre: Proceedings
 Creator(s):
Affiliations:
Publ. Info: Piscataway, NJ, USA : IEEE
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 18 - 22
Identifier: ISBN: 978-1-4673-3111-1