Released

Conference Paper

Silhouette Based Generic Model Adaptation for Marker-less Motion Capturing

MPS-Authors
/persons/resource/persons45577

Sunkel, Martin
Computer Graphics, MPI for Informatics, Max Planck Society;
International Max Planck Research School, MPI for Informatics, Max Planck Society;

/persons/resource/persons45312

Rosenhahn, Bodo
Computer Graphics, MPI for Informatics, Max Planck Society;

/persons/resource/persons45449

Seidel, Hans-Peter
Computer Graphics, MPI for Informatics, Max Planck Society;
External Resource

https://rdcu.be/dIMML
(Publisher version)

Citation

Sunkel, M., Rosenhahn, B., & Seidel, H.-P. (2007). Silhouette Based Generic Model Adaptation for Marker-less Motion Capturing. In A. Elgammal, B. Rosenhahn, & R. Klette (Eds.), Human Motion - Understanding, Modeling, Capture and Animation (pp. 119-135). Berlin, Germany: Springer.


Cite as: https://hdl.handle.net/11858/00-001M-0000-000F-20B1-F
Abstract
This work presents a marker-less motion capture system that incorporates an approach to smoothly adapt a generic model mesh to the individual shape of a tracked person, relying on extracted silhouettes only. The 3D model of the tracked person is thus learned during the capture process itself.
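
Since extracted silhouettes are the system's only input, each camera frame must first be reduced to a silhouette contour. The following is a minimal sketch of that step, assuming OpenCV and an already-segmented foreground mask; the function and its details are illustrative, not taken from the paper.

```python
import cv2
import numpy as np

def silhouette_contour(mask: np.ndarray) -> np.ndarray:
    """Extract the outer silhouette contour from a binary foreground mask.

    mask: uint8 array, 255 where the tracked person is, 0 elsewhere.
    Returns an (N, 2) array of 2D contour points.
    """
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    # Keep the largest contour; smaller ones are segmentation noise.
    largest = max(contours, key=cv2.contourArea)
    return largest.reshape(-1, 2)

# Toy usage with a synthetic mask (a filled rectangle standing in for a person).
mask = np.zeros((480, 640), dtype=np.uint8)
cv2.rectangle(mask, (200, 100), (400, 440), 255, thickness=-1)
print(silhouette_contour(mask).shape)
```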

Based on a sparse set of 2D-3D correspondences, computed along silhouette normal directions from the image sequences of several cameras, a Laplacian mesh editing tool generates the final adapted model.
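
The abstract does not spell out the optimization behind the Laplacian mesh editing step, but such tools are commonly posed as a sparse linear least-squares problem: preserve the template's differential (Laplacian) coordinates while softly pulling a sparse set of vertices toward their 3D correspondence targets. A sketch under those assumptions follows; the uniform graph Laplacian, the soft-constraint weight, and all names are illustrative choices, not the authors' implementation.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

def uniform_laplacian(n_verts, edges):
    # L = D - A with uniform (graph) weights; cotangent weights are the
    # usual refinement but are omitted here for brevity.
    i, j = np.array(edges).T
    A = sp.coo_matrix((np.ones(len(edges)), (i, j)), shape=(n_verts, n_verts))
    A = (A + A.T).tocsr()
    D = sp.diags(np.asarray(A.sum(axis=1)).ravel())
    return (D - A).tocsr()

def laplacian_edit(V, edges, targets, weight=10.0):
    """Deform vertex positions V (n, 3) so that vertices in `targets`
    (dict: vertex index -> 3D position) move toward their targets while
    the Laplacian (differential) coordinates of the template are kept.
    """
    n = V.shape[0]
    L = uniform_laplacian(n, edges)
    delta = L @ V                          # differential coords of the template
    idx = np.array(sorted(targets))
    S = sp.coo_matrix((np.full(len(idx), weight), (np.arange(len(idx)), idx)),
                      shape=(len(idx), n)).tocsr()
    A = sp.vstack([L, S])
    C = weight * np.array([targets[k] for k in sorted(targets)])
    V_new = np.empty_like(V)
    for c in range(3):                     # the system is separable per axis
        V_new[:, c] = lsqr(A, np.concatenate([delta[:, c], C[:, c]]))[0]
    return V_new

# Toy usage: a unit tetrahedron with one vertex pulled outward.
V = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(laplacian_edit(V, edges, {3: np.array([0.0, 0.0, 2.0])}))
```

Because the Laplacian constraints are linear and act identically on each axis, the three coordinates can be solved independently.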
As the number of processed frames grows, a temporal coherence scheme reduces the effect of insufficient correspondence data to a minimum and guarantees smooth adaptation results.
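
The temporal coherence scheme itself is not described in the abstract. A simple stand-in with the same intent is a per-vertex blend that lets well-constrained vertices update quickly while vertices lacking correspondences in the current frame keep their accumulated shape; the confidence weighting below is an assumption, not the paper's method.

```python
import numpy as np

def blend_adaptation(prev_V, frame_V, confidence, alpha=0.2):
    """Per-vertex exponential blend of the per-frame adaptation into the
    accumulated model. confidence in [0, 1]: ~0 for vertices with no
    silhouette correspondences in this frame, ~1 for well-constrained ones.
    """
    w = alpha * np.clip(confidence, 0.0, 1.0)[:, None]
    return (1.0 - w) * prev_V + w * frame_V
```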
Furthermore, we present experiments on non-optimal data that demonstrate the robustness of our algorithm.