  Physically plausible full-body hand-object interaction synthesis

Braun, J., Christen, S., Kocabas, M., Aksan, E., & Hilliges, O. (2024). Physically plausible full-body hand-object interaction synthesis. In 2024 International Conference on 3D Vision (3DV) (pp. 464-473). New York, NY: IEEE. doi:10.1109/3DV62453.2024.00109.

Basic

Genre: Conference Paper

Locators

Description:
-
OA-Status:
Green
Locator:
https://doi.org/10.1109/3DV62453.2024.00109 (Publisher version)
Description:
-
OA-Status:
Closed Access

Creators

 Creators:
Braun, Jona1, Author
Christen, Sammy1, Author
Kocabas, Muhammed2, Author
Aksan, Emre1, Author
Hilliges, Otmar1, Author
Affiliations:
1 External Organizations, ou_persistent22
2 Dept. Perceiving Systems, Max Planck Institute for Intelligent Systems, Max Planck Society, ou_1497642

Content

Free keywords: Dept. Black
 Abstract: We propose a physics-based method for synthesizing dexterous hand-object interactions in a full-body setting. While recent advancements have addressed specific facets of human-object interactions, a comprehensive physics-based approach remains a challenge. Existing methods often focus on isolated segments of the interaction process and rely on data-driven techniques that may result in artifacts. In contrast, our proposed method embraces reinforcement learning (RL) and physics simulation to mitigate the limitations of data-driven approaches. Through a hierarchical framework, we first learn skill priors for both body and hand movements in a decoupled setting. The generic skill priors learn to decode a latent skill embedding into the motion of the underlying part. A high-level policy then controls hand-object interactions in these pretrained latent spaces, guided by task objectives of grasping and 3D target trajectory following. It is trained using a novel reward function that combines an adversarial style term with a task reward, encouraging natural motions while fulfilling the task incentives. Our method successfully accomplishes the complete interaction task, from approaching an object to grasping and subsequent manipulation. We compare our approach against kinematics-based baselines and show that it leads to more physically plausible motions.
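The following is a minimal, illustrative Python sketch (not the authors' code) of the reward structure described in the abstract: the high-level policy, which acts in the pretrained latent skill spaces, is trained on a weighted combination of an adversarial style term scored by a motion discriminator and task terms for grasping and 3D target-trajectory following. All function names, weights, the contact-count grasp proxy, and the squashing of the discriminator output are assumptions made for illustration; the paper's exact formulation may differ.

import numpy as np

def task_reward(obj_pos, target_pos, grasp_contacts, w_traj=0.7, w_grasp=0.3):
    """Task term: follow a 3D target trajectory while keeping the object grasped."""
    traj_err = np.linalg.norm(obj_pos - target_pos)
    r_traj = np.exp(-5.0 * traj_err)        # shaped trajectory-following reward (assumed scale)
    r_grasp = float(grasp_contacts >= 2)    # crude proxy: at least two finger contacts
    return w_traj * r_traj + w_grasp * r_grasp

def style_reward(disc_logit):
    """Adversarial style term: a motion discriminator scores state transitions;
    higher scores mean the motion looks more like the reference data.
    The squashing into [0, 1] follows common adversarial-motion-prior practice
    and is an assumption, not necessarily the paper's exact formulation."""
    return float(np.clip(1.0 - 0.25 * (disc_logit - 1.0) ** 2, 0.0, 1.0))

def combined_reward(obj_pos, target_pos, grasp_contacts, disc_logit,
                    w_task=0.5, w_style=0.5):
    """Reward for training the high-level policy: task term plus adversarial style term.
    The 0.5/0.5 weighting is illustrative."""
    return (w_task * task_reward(obj_pos, target_pos, grasp_contacts)
            + w_style * style_reward(disc_logit))

# Example: reward for one hypothetical simulation step.
r = combined_reward(obj_pos=np.array([0.10, 0.00, 0.90]),
                    target_pos=np.array([0.12, 0.00, 0.95]),
                    grasp_contacts=3,
                    disc_logit=0.8)
print(f"combined reward: {r:.3f}")

Bounding the style term keeps the adversarial signal on a comparable scale to the shaped task terms, which is a common design choice in such setups; the actual weights and shaping used in the paper are not specified here.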

Details

Language(s): eng - English
 Dates: 2024-06-12, 2024
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: BibTex Citekey: dfbgrasp2024braun
arXiv: 2309.07907
DOI: 10.1109/3DV62453.2024.00109
 Degree: -

Event

Title: 2024 International Conference on 3D Vision (3DV)
Place of Event: Davos
Start-/End Date: 2024-03-18 - 2024-03-21

Source 1

Title: 2024 International Conference on 3D Vision (3DV)
Source Genre: Proceedings
 Creator(s):
Affiliations:
Publ. Info: New York, NY : IEEE
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 464 - 473
Identifier: ISBN: 979-8-3503-6245-9
ISBN: 979-8-3503-6246-6

Source 2

Title: International Conference on 3D Vision (3DV)
Source Genre: Series
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: -
Identifier: ISSN: 2475-7888
ISSN: 2378-3826