
Released

Conference Paper

Multimodal multi-user human-robot interface for virtual collaboration

MPG Authors

Son, H. I.
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full texts (restricted access)
There are currently no full texts shared for your IP range.
Full texts (freely accessible)
There are no freely accessible full texts available in PuRe
Supplementary material (freely accessible)
There are no freely accessible supplementary materials available
Citation

Song, Y., Niitsuma, M., Kubota, T., Hashimoto, H., & Son, H. (2012). Multimodal multi-user human-robot interface for virtual collaboration. In 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI) (pp. 18-22). Piscataway, NJ, USA: IEEE.


Citation link: https://hdl.handle.net/21.11116/0000-0001-8AD9-F
Abstract
We present an intuitive teleoperation scheme based on human gestures and a multimodal multi-user human-robot interface. To cope with dynamic everyday environments, we further apply haptic point cloud rendering and virtual collaboration; all of these functions are realized on portable hardware called “mobile iSpace”. First, the environment surrounding the teleoperated robot is captured with a depth camera and reconstructed as a 3D point cloud. A virtual world is then generated from the point cloud, and a virtual model of the teleoperated robot is placed in it. The user teleoperates the humanoid robot with whole-body gestures, which are captured in real time by a depth camera on the user's side. At the same time, the user receives visual and vibrotactile feedback through a head-mounted display and a vibrotactile glove. All system components, i.e., the human user, the teleoperated robot, and the feedback devices, are connected through an Internet-based virtual collaboration system to support flexible access, so users can reach the remotely placed robot whenever and wherever they want.
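The record reproduces only the abstract, but the first step it describes, reconstructing the robot's surroundings as a 3D point cloud from a depth camera, can be sketched concisely. The snippet below is a minimal illustration under an assumed pinhole camera model, not the paper's actual "mobile iSpace" implementation; the intrinsics (fx, fy, cx, cy) and the depth scale are hypothetical placeholders.

```python
# Illustrative sketch only: back-projects a depth image into a 3D point cloud
# under an assumed pinhole camera model. Intrinsics and depth scale are
# made-up values, not taken from the paper.
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy, depth_scale=0.001):
    """Convert an HxW depth image (raw sensor units) into an Nx3 point cloud in meters."""
    h, w = depth.shape
    # Pixel coordinate grids matching the depth image layout.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64) * depth_scale  # raw units -> meters
    valid = z > 0                               # drop missing depth readings
    # Pinhole back-projection: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=-1)

# Example with synthetic data and Kinect-like (hypothetical) intrinsics.
depth_image = np.random.randint(500, 4000, size=(480, 640), dtype=np.uint16)
cloud = depth_to_point_cloud(depth_image, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)  # (N, 3) points in camera coordinates
```

In a pipeline like the one the abstract outlines, such a cloud would then be rendered into the virtual world and used for haptic point cloud rendering on the operator's side.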