Interactive global illumination in dynamic environments using commodity graphics hardware
We present a system based on commodity graphics hardware for computing global illumination in dynamic scenes at interactive rates. We designed a progressive global illumination algorithm specifically to take advantage of current graphics hardware features. Our algorithm simulates the transport of light in synthetic environments by following the light emitted from the light source(s) through its multiple bounces on the surfaces of the scene. The entire algorithm runs on ATI’s Radeon 9700 using vertex and fragment shaders, allowing us to compute and display a global illumination solution for reasonably complex scenes with moving objects and moving lights in approximately 250 milliseconds (4 frames per second).
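The abstract describes following emitted light through multiple bounces over the scene's surfaces. As a rough illustration (not the paper's GPU shader implementation), the per-bounce gather can be sketched as a toy radiosity-style iteration; the patch count, form factors, and reflectances below are invented for the example:

```python
# Progressive multi-bounce light transport on a toy two-patch "scene":
# iterate B_{k+1} = E + rho * F * B_k, where each bounce redistributes
# light between patches. The paper performs the analogous per-surface
# gather in vertex/fragment shaders each frame.

def bounce_light(E, F, rho, n_bounces):
    """Accumulate n_bounces of indirect light.

    E   : emitted radiosity per patch
    F   : form factors; F[i][j] = fraction of light leaving j that reaches i
    rho : diffuse reflectance per patch
    """
    B = E[:]  # bounce 0: direct emission only
    for _ in range(n_bounces):
        gathered = [sum(F[i][j] * B[j] for j in range(len(B)))
                    for i in range(len(B))]
        B = [E[i] + rho[i] * gathered[i] for i in range(len(B))]
    return B

# Two facing patches, only patch 0 emitting:
E   = [1.0, 0.0]
F   = [[0.0, 0.5], [0.5, 0.0]]
rho = [0.7, 0.7]
B = bounce_light(E, F, rho, 4)  # patch 0 regains some of its own light
```

After a few bounces patch 0's radiosity exceeds its emission (light reflected back from patch 1), which is exactly the indirect contribution a progressive algorithm accumulates over successive frames.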
A graphics processing unit based method for dynamic real-time global illumination
Real-time realistic image synthesis for virtual environments has been one of the most actively researched
areas in computer graphics for over a decade. Images that display physically correct illumination of an
environment can be simulated by evaluating a multi-dimensional integral equation, called the rendering
equation, over the surfaces of the environment. Many global illumination algorithms, such as path tracing,
photon mapping and distributed ray tracing, can produce realistic images but are generally unable
to cope with dynamic lighting and objects at interactive rates. Simulating physically correctly
illuminated dynamic environments without a substantial preprocessing step remains one of the most
challenging problems.
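The rendering equation referred to above is, in its standard hemispherical form (Kajiya, 1986):

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i
```

Here $L_e$ is emitted radiance, $f_r$ is the BRDF, $L_i$ is incoming radiance, and the integral runs over the hemisphere $\Omega$ about the surface normal $n$; the multi-dimensionality arises because $L_i$ itself is defined recursively by the same equation.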
In this thesis we present a rendering system for dynamic environments, implemented as a customized
rasterizer for global illumination that runs entirely on the graphics hardware, the Graphics Processing
Unit (GPU). Our research focuses on a parameterization of a discrete visibility field for efficient
indirect-illumination computation. To generate the visibility field, we propose a CUDA-based (Compute
Unified Device Architecture) rasterizer which builds Layered Hit Buffers (LHB) by rasterizing polygons
into multi-layered structural buffers in parallel. The LHB provides a fast visibility function for any direction
at any point. We propose a cone-approximation solution to resolve the aliasing problem caused by the
limited directional discretization. We also demonstrate how to remove structured noise by adopting an
interleaved sampling scheme and a discontinuity buffer. We show that a gathering method, amortized with
a multi-level Quasi-Monte Carlo method, can evaluate the rendering equation in real time.
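The Quasi-Monte Carlo gathering mentioned above replaces pseudo-random directions with low-discrepancy sample points. A minimal sketch, assuming a Halton/van der Corput sequence and a constant incoming radiance so the exact answer (the hemisphere integral of the cosine term, i.e. pi) is known; the thesis's system would instead fetch per-direction radiance from the Layered Hit Buffers:

```python
import math

def van_der_corput(i, base):
    """Radical inverse of integer i in the given base (one Halton component)."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def qmc_irradiance(n):
    """Estimate the hemisphere integral of cos(theta) (exact value: pi)
    with uniform hemisphere sampling driven by a low-discrepancy sequence.
    In a full gather, each sample direction would index the LHB for radiance."""
    total = 0.0
    for i in range(1, n + 1):
        u1 = van_der_corput(i, 2)    # cos(theta): uniform over the hemisphere
        # u2 = van_der_corput(i, 3)  # azimuth; unused for a constant integrand
        total += u1 / (1.0 / (2.0 * math.pi))  # estimator: f(omega) / pdf(omega)
    return total / n
```

With a few thousand Halton samples the estimate converges noticeably faster than independent random sampling at the same count, which is what makes an amortized real-time gather feasible.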
The method enables real-time walk-throughs of a complex virtual environment containing a mixture
of diffuse and glossy reflection, computing multiple indirect bounces on the fly. We show that our method
is capable of simulating fully dynamic environments, including changes of view, materials, lighting and
objects, at interactive rates on commodity-level graphics hardware.