
Released

Conference Paper

Real-time Hand Tracking Using a Sum of Anisotropic Gaussians Model

MPS-Authors

Sridhar, Srinath
Computer Graphics, MPI for Informatics, Max Planck Society

Rhodin, Helge
Computer Graphics, MPI for Informatics, Max Planck Society

Seidel, Hans-Peter
Computer Graphics, MPI for Informatics, Max Planck Society

Theobalt, Christian
Computer Graphics, MPI for Informatics, Max Planck Society

External Resource
No external resources are shared
Fulltext (restricted access)
There are currently no full texts shared for your IP range.
Fulltext (public)

arXiv:1602.03860.pdf
(Preprint), 3MB

Supplementary Material (public)
There is no public supplementary material available
Citation

Sridhar, S., Rhodin, H., Seidel, H.-P., Oulasvirta, A., & Theobalt, C. (2014). Real-time Hand Tracking Using a Sum of Anisotropic Gaussians Model. In Proceedings of the 2nd International Conference on 3D Vision (pp. 319-326). Piscataway, NJ: IEEE. doi:10.1109/3DV.2014.37.


Cite as: https://hdl.handle.net/11858/00-001M-0000-002B-9878-6
Abstract
Real-time marker-less hand tracking is of increasing importance in human-computer interaction. Robust and accurate tracking of arbitrary hand motion is a challenging problem due to the many degrees of freedom, frequent self-occlusions, fast motions, and uniform skin color. In this paper, we propose a new approach that tracks the full skeleton motion of the hand from multiple RGB cameras in real-time. The main contributions include a new generative tracking method which employs an implicit hand shape representation based on a Sum of Anisotropic Gaussians (SAG), and a pose fitting energy that is smooth and analytically differentiable, making fast gradient-based pose optimization possible. This shape representation, together with a full perspective projection model, enables more accurate hand modeling than a related baseline method from the literature. Our method achieves better accuracy than previous methods and runs at 25 fps. We show these improvements both qualitatively and quantitatively on publicly available datasets.
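
The abstract's key property is that a Sum of Anisotropic Gaussians is smooth everywhere, so an energy built on it admits analytic gradients for fast optimization. The sketch below is only an illustration of that property, not the paper's actual model, energy, or code; all function names and parameter conventions here are assumptions. It evaluates a SAG density at 3D points and computes its analytic gradient, which can be checked against finite differences.

```python
import numpy as np

def sag_density(points, means, covs, weights):
    """Evaluate a Sum of Anisotropic Gaussians at 3D points.

    points:  (N, 3) query positions
    means:   (K, 3) Gaussian centers
    covs:    (K, 3, 3) anisotropic covariance matrices
    weights: (K,) per-Gaussian weights
    Returns an (N,) array of density values.
    """
    density = np.zeros(len(points))
    for mu, cov, w in zip(means, covs, weights):
        inv = np.linalg.inv(cov)
        d = points - mu
        # Batched quadratic form d_n^T * inv * d_n for each point n.
        q = np.einsum('ni,ij,nj->n', d, inv, d)
        density += w * np.exp(-0.5 * q)
    return density

def sag_gradient(points, means, covs, weights):
    """Analytic gradient of sag_density w.r.t. the point positions.

    Each Gaussian term w*exp(-0.5 d^T A d) contributes -e * A d,
    where A is the inverse covariance and e is the term's value,
    so the total gradient is smooth everywhere.
    """
    grad = np.zeros_like(points)
    for mu, cov, w in zip(means, covs, weights):
        inv = np.linalg.inv(cov)
        d = points - mu
        q = np.einsum('ni,ij,nj->n', d, inv, d)
        e = w * np.exp(-0.5 * q)
        grad += -e[:, None] * (d @ inv.T)
    return grad
```

Because the gradient is available in closed form, a pose-fitting energy built from such terms can be minimized with standard gradient-based solvers instead of derivative-free search, which is what makes real-time rates plausible.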