Released

Conference Paper

Interactive Volume Caustics in Single-scattering Media

MPS-Authors
/persons/resource/persons44343

Dong, Zhao
Computer Graphics, MPI for Informatics, Max Planck Society;

/persons/resource/persons45449

Seidel, Hans-Peter
Computer Graphics, MPI for Informatics, Max Planck Society;

Citation

Hu, W., Dong, Z., Ihrke, I., Grosch, T., Yuan, G., & Seidel, H.-P. (2010). Interactive Volume Caustics in Single-scattering Media. In A. Varshney, C. Wyman, D. Aliaga, & M. M. Oliveira (Eds.), Proceedings I3D 2010 (pp. 109-117). New York, NY: ACM. doi:10.1145/1730804.1730822.


Cite as: https://hdl.handle.net/11858/00-001M-0000-000F-175B-9
Abstract
Volume caustics are intricate illumination patterns formed by light first
interacting with a specular surface and subsequently being scattered inside a
participating medium. Although this phenomenon can be simulated by existing
techniques, image synthesis is usually non-trivial and time-consuming.

Motivated by interactive applications, we propose a novel volume caustics
rendering method for single-scattering participating media. Our method is based
on the observation that line rendering of illumination rays into the screen
buffer establishes a direct light path between the viewer and the light source.
This connection is introduced via a single scattering event for every pixel
affected by the line primitive. Since the GPU is a parallel processor, the
radiance contributions of these light paths to each of the pixels can be
computed and accumulated independently. The implementation of our method is
straightforward and we show that it can be seamlessly integrated with existing
methods for rendering participating media.

We achieve high-quality results at real-time frame rates for large and dynamic
scenes containing homogeneous participating media. For inhomogeneous media, our
method achieves interactive performance close to real-time. Our method is based
on a simplified physical model and can thus be used to quickly generate
physically plausible previews of expensive lighting simulations.
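
To make the accumulation step described in the abstract concrete, the following is a minimal CPU-side Python sketch, not the paper's GPU implementation: it samples one refracted light ray inside a homogeneous medium, projects each sample through an assumed pinhole camera, and additively blends the single-scattering radiance into the corresponding pixel. The Beer-Lambert transmittance, the Henyey-Greenstein phase function, and all parameter and helper names are illustrative assumptions; the paper instead rasterizes the rays as GPU line primitives rather than looping explicitly.

import numpy as np

def transmittance(sigma_t, dist):
    # Beer-Lambert attenuation; a homogeneous medium is assumed here.
    return np.exp(-sigma_t * dist)

def henyey_greenstein(cos_theta, g):
    # Phase function choice is an assumption; the abstract does not name one.
    return (1.0 - g * g) / (4.0 * np.pi * (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5)

def splat_light_ray(frame, eye, world_to_cam, focal, light_pos, light_intensity,
                    ray_origin, ray_dir, ray_len, sigma_s, sigma_t, g, n_samples=256):
    """Sample one refracted light ray inside the medium and additively blend its
    single-scattering contribution into every pixel the ray projects to."""
    h, w = frame.shape[:2]
    step = ray_len / n_samples
    for t in np.linspace(0.0, ray_len, n_samples):
        p = ray_origin + t * ray_dir                     # scattering point on the light ray
        pc = world_to_cam @ (p - eye)                    # point in camera space (3x3 rotation)
        if pc[2] <= 0.0:
            continue                                     # behind the camera
        x = int(w / 2 + focal * pc[0] / pc[2])           # pinhole projection to pixel coords
        y = int(h / 2 + focal * pc[1] / pc[2])
        if not (0 <= x < w and 0 <= y < h):
            continue
        d_light = np.linalg.norm(p - light_pos)          # light source -> scattering point
        d_eye = np.linalg.norm(p - eye)                  # scattering point -> eye
        cos_theta = np.dot(ray_dir, (eye - p) / d_eye)   # scattering angle toward the eye
        radiance = (light_intensity
                    * transmittance(sigma_t, d_light)
                    * sigma_s * henyey_greenstein(cos_theta, g)
                    * transmittance(sigma_t, d_eye)
                    * step)                              # segment-length weighting
        frame[y, x] += radiance                          # independent per-pixel accumulation

Because each sample adds only to its own pixel, the contributions are independent of one another, which is what allows this loop to map onto GPU line rasterization with additive blending, as the abstract describes.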