

Conference Paper

Hardware-accelerated Dynamic Light Field Rendering

MPS-Authors

Goldluecke, Bastian
International Max Planck Research School, MPI for Informatics, Max Planck Society;
Graphics - Optics - Vision, MPI for Informatics, Max Planck Society;

Magnor, Marcus
Graphics - Optics - Vision, MPI for Informatics, Max Planck Society;

Seidel, Hans-Peter
Computer Graphics, MPI for Informatics, Max Planck Society;

Citation

Goldluecke, B., Magnor, M., & Wilburn, B. (2002). Hardware-accelerated Dynamic Light Field Rendering. In Proceedings Vision, Modeling and Visualization VMV 2002 (pp. 455-462). Berlin, Germany: aka.


Cite as: https://hdl.handle.net/11858/00-001M-0000-000F-2FA6-2
Abstract
We present a system capable of interactively displaying a dynamic scene from novel viewpoints by warping and blending images recorded from multiple synchronized video cameras. It is tuned for streamed data and achieves 20 frames per second on modern consumer-class hardware when rendering a 3D movie from an arbitrary eye point within the convex hull of the recording cameras' positions. The quality of the prediction depends largely on the accuracy of the disparity maps, which are reconstructed off-line and provided together with the images. We generalize known algorithms for estimating disparities between two images to the case of multiple image streams, aiming to minimize warping artifacts and to exploit temporal coherence.
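The core rendering step described above — forward-warping each source image toward the novel viewpoint according to its disparity map, then blending the warped images — can be illustrated with a minimal sketch. This is a hypothetical two-camera, grayscale, rectified-stereo simplification for intuition only, not the paper's GPU implementation; `warp_blend` and the blending weights are assumptions.

```python
import numpy as np

def warp_blend(img_a, disp_a, img_b, disp_b, alpha):
    """Disparity-based view interpolation between two rectified
    grayscale camera images (illustrative sketch, not the paper's code).

    alpha in [0, 1]: 0 -> camera A's viewpoint, 1 -> camera B's.
    Each pixel is forward-warped horizontally by a fraction of its
    disparity; the warped images are blended with weights that favor
    the closer source camera, as in typical view interpolation.
    """
    h, w = disp_a.shape
    out = np.zeros((h, w), dtype=np.float64)
    weight = np.zeros((h, w), dtype=np.float64)

    for img, disp, shift, wgt in (
        (img_a, disp_a, alpha, 1.0 - alpha),   # warp A forward by alpha * d
        (img_b, disp_b, alpha - 1.0, alpha),   # warp B back by (1 - alpha) * d
    ):
        ys, xs = np.mgrid[0:h, 0:w]
        xt = np.clip(np.round(xs + shift * disp).astype(int), 0, w - 1)
        # np.add.at accumulates correctly when several pixels land
        # on the same target (plain fancy-index += would not)
        np.add.at(out, (ys, xt), wgt * img)
        np.add.at(weight, (ys, xt), wgt)

    valid = weight > 0
    out[valid] /= weight[valid]  # normalize wherever any source pixel landed
    return out
```

In the real system the warps run per camera on graphics hardware and the eye point moves freely inside the convex hull of all camera positions, but the warp-then-normalize structure is the same.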