

Released

Journal Article

Live User-Guided Intrinsic Video for Static Scenes

MPS-Authors

Meka, Abhimitra
Computer Graphics, MPI for Informatics, Max Planck Society

Zollhöfer, Michael
Computer Graphics, MPI for Informatics, Max Planck Society

Theobalt, Christian
Computer Graphics, MPI for Informatics, Max Planck Society

External Resource
No external resources are shared
Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

Meka, A., Fox, G., Zollhöfer, M., Richardt, C., & Theobalt, C. (2017). Live User-Guided Intrinsic Video for Static Scenes. IEEE Transactions on Visualization and Computer Graphics, 23(11), 2447-2454. doi:10.1109/TVCG.2017.2734425.


Cite as: http://hdl.handle.net/11858/00-001M-0000-002E-05CC-6
Abstract
We present a novel real-time approach for user-guided intrinsic decomposition of static scenes captured by an RGB-D sensor. In the first step, we acquire a three-dimensional representation of the scene using a dense volumetric reconstruction framework. The obtained reconstruction serves as a proxy to densely fuse reflectance estimates and to store user-provided constraints in three-dimensional space. User constraints, in the form of constant-shading and constant-reflectance strokes, can be placed directly on the real-world geometry using an intuitive touch-based interaction metaphor, or using interactive mouse strokes. Fusing the decomposition results and constraints in three-dimensional space allows for robust propagation of this information to novel views by re-projection. We leverage this information to improve the decomposition quality of existing intrinsic video decomposition techniques by further constraining the ill-posed decomposition problem. In addition to improved decomposition quality, we show a variety of live augmented reality applications, such as recoloring of objects, relighting of scenes, and editing of material appearance.
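To make the two building blocks in the abstract concrete, below is a minimal, hypothetical Python sketch, not the authors' implementation: the multiplicative intrinsic image model (image = reflectance × shading, solved per pixel) and the fusion of per-frame reflectance estimates into a voxel grid via a running weighted average, so that fused values can later constrain the decomposition of a novel view. All names (decompose_shading, ReflectanceVolume, fuse) are illustrative assumptions; the real system additionally solves a constrained optimization over the whole image, which this sketch omits.

```python
# Hypothetical sketch of the abstract's two building blocks; not the paper's code.
# Only numpy is assumed. All function and class names are illustrative.
import numpy as np

def decompose_shading(image, reflectance):
    """Intrinsic image model: image = reflectance * shading (per pixel).
    Given an observed color and a reflectance estimate, recover shading."""
    eps = 1e-6  # guard against division by zero in dark pixels
    return image / (reflectance + eps)

class ReflectanceVolume:
    """Toy voxel grid that fuses per-frame reflectance samples with a
    running weighted average, in the spirit of volumetric fusion."""

    def __init__(self, resolution):
        self.reflectance = np.zeros((resolution,) * 3 + (3,))  # RGB per voxel
        self.weight = np.zeros((resolution,) * 3)              # fusion weight

    def fuse(self, voxel_idx, reflectance_sample, w=1.0):
        """Incrementally average a new reflectance sample into one voxel."""
        i, j, k = voxel_idx
        total = self.weight[i, j, k] + w
        self.reflectance[i, j, k] = (
            self.weight[i, j, k] * self.reflectance[i, j, k]
            + w * reflectance_sample
        ) / total
        self.weight[i, j, k] = total

# Usage: fuse two noisy observations of the same surface point, then use the
# fused reflectance to constrain the decomposition of a pixel in a novel view.
vol = ReflectanceVolume(resolution=4)
vol.fuse((1, 2, 3), np.array([0.8, 0.2, 0.2]))
vol.fuse((1, 2, 3), np.array([0.6, 0.2, 0.2]))
fused_r = vol.reflectance[1, 2, 3]           # approx. [0.7, 0.2, 0.2]

pixel = np.array([0.35, 0.10, 0.10])         # observed color in a novel view
shading = decompose_shading(pixel, fused_r)  # approx. [0.5, 0.5, 0.5]
print(fused_r, shading)
```

Because the fused reflectance lives on the reconstructed geometry rather than in any single image, re-projecting it into a new camera view directly yields per-pixel reflectance constraints there, which is what makes the propagation to novel views robust.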