1,970 research outputs found

    Implementation and Analysis of an Image-Based Global Illumination Framework for Animated Environments

    Get PDF
    We describe a new framework for efficiently computing and storing global illumination effects for complex, animated environments. The new framework allows the rapid generation of sequences representing any arbitrary path in a view space within an environment in which both the viewer and objects move. The global illumination is stored as time sequences of range-images at base locations that span the view space. We present algorithms for determining locations for these base images, and the time steps required to adequately capture the effects of object motion. We also present algorithms for computing the global illumination in the base images that exploit spatial and temporal coherence by considering direct and indirect illumination separately. We discuss an initial implementation using the new framework. Results and analysis of our implementation demonstrate the effectiveness of the individual phases of the approach; we conclude with an application of the complete framework to a complex environment that includes object motion.
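    A minimal sketch of the storage layout described above, assuming the base images can be modeled as time sequences of per-pixel depth-plus-radiance samples; the type and function names (RangePixel, BaseLocation, selectFrame) are illustrative assumptions, not the paper's actual data structures.

    #include <cstddef>
    #include <vector>

    struct RangePixel { float depth; float radiance[3]; };   // per-pixel depth plus RGB radiance

    struct RangeImage {                                       // one base image at one time step
        int width = 0, height = 0;
        std::vector<RangePixel> pixels;                       // width * height entries
    };

    struct BaseLocation {                                     // a sample point spanning the view space
        float position[3];
        std::vector<RangeImage> timeSequence;                 // one range image per time step
    };

    // Pick the stored range image whose time step is closest to the requested time.
    // Assumes a non-empty time sequence sampled at a fixed interval dt; a full system
    // would also blend between neighbouring base locations along the viewer's path.
    const RangeImage& selectFrame(const BaseLocation& base, float time, float dt) {
        std::size_t idx = static_cast<std::size_t>(time / dt + 0.5f);
        if (idx >= base.timeSequence.size()) idx = base.timeSequence.size() - 1;
        return base.timeSequence[idx];
    }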

    Real-time Cinematic Design Of Visual Aspects In Computer-generated Images

    Get PDF
    Creation of visually-pleasing images has always been one of the main goals of computer graphics. Two important components are necessary to achieve this goal --- artists who design visual aspects of an image (such as materials or lighting) and sophisticated algorithms that render the image. Traditionally, rendering has been of greater interest to researchers, while the design part has been deemed secondary. This has led to many inefficiencies, as artists, in order to create a stunning image, are often forced to resort to traditional, creativity-barring pipelines consisting of repeated rendering and parameter tweaking. Our work shifts the attention away from the rendering problem and focuses on the design. We propose to combine non-physical editing with real-time feedback and provide artists with efficient ways of designing complex visual aspects such as global illumination or all-frequency shadows. We conform to existing pipelines by inserting our editing components into existing stages, thereby making editing of visual aspects an inherent part of the design process. Many of the examples shown in this work have been, until now, extremely hard to achieve. The non-physical aspect of our work enables artists to express themselves in more creative ways, not limited by the physical parameters of current renderers. Real-time feedback allows artists to immediately see the effects of applied modifications, and compatibility with existing workflows enables easy integration of our algorithms into production pipelines.

    Doctor of Philosophy

    Get PDF
    Real-time global illumination is the next frontier in real-time rendering. In an attempt to generate realistic images, games have followed the film industry into physically based shading and will soon begin integrating global illumination techniques. Traditional methods require too much memory and too much time to compute for real-time use. With Modular and Delta Radiance Transfer we precompute a scene-independent, low-frequency basis that allows us to perform complex indirect lighting calculations in a much lower dimensional subspace with a reduced memory footprint and real-time execution. The results are then applied as a light map on many different scenes. To improve the low-frequency results, we also introduce a novel screen space ambient occlusion technique that allows us to generate a smoother result with fewer samples. Used together, these three low- and high-frequency techniques provide a viable indirect lighting solution that can be run in milliseconds on today's hardware, providing a useful new technique for indirect lighting in real-time graphics.
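    A minimal sketch of the subspace relighting idea above, assuming the precomputed operators can be treated as dense matrices: a transfer matrix projects a lighting vector into a low-dimensional subspace, and a basis matrix expands the resulting coefficients into light-map texels. The matrix names and dimensions are assumptions, not the dissertation's actual operators.

    #include <cstddef>
    #include <vector>

    // Dense row-major matrix-vector product: y = A * x, with A of size rows x cols.
    std::vector<float> matVec(const std::vector<float>& A, std::size_t rows,
                              std::size_t cols, const std::vector<float>& x) {
        std::vector<float> y(rows, 0.0f);
        for (std::size_t r = 0; r < rows; ++r)
            for (std::size_t c = 0; c < cols; ++c)
                y[r] += A[r * cols + c] * x[c];
        return y;
    }

    // Reconstruct per-texel indirect lighting from an m-dimensional lighting vector.
    // 'transfer' (k x m) maps the lighting into k subspace coefficients; 'basis' (n x k)
    // expands those coefficients into n light-map texels.
    std::vector<float> indirectLightMap(const std::vector<float>& basis, std::size_t n,
                                        const std::vector<float>& transfer, std::size_t k,
                                        const std::vector<float>& lighting, std::size_t m) {
        std::vector<float> coeffs = matVec(transfer, k, m, lighting);  // project into the subspace
        return matVec(basis, n, k, coeffs);                            // expand to the light map
    }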

    Interactive display of isosurfaces with global illumination

    Get PDF
    In many applications, volumetric data sets are examined by displaying isosurfaces, surfaces where the data, or some function of the data, takes on a given value. Interactive applications typically use local lighting models to render such surfaces. This work introduces a method to precompute or lazily compute global illumination to improve interactive isosurface renderings. The precomputed illumination resides in a separate volume and includes direct light, shadows, and interreflections. Using this volume, interactive globally illuminated renderings of isosurfaces become feasible while still allowing dynamic manipulation of lighting, viewpoint, and isovalue.
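    A minimal sketch of the lookup step implied above, assuming the precomputed illumination is stored as a scalar 3D grid and sampled with trilinear interpolation while shading the isosurface; the grid layout and names are assumptions, not the paper's implementation.

    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <vector>

    struct IlluminationVolume {
        int nx = 0, ny = 0, nz = 0;
        std::vector<float> data;                      // nx * ny * nz precomputed illumination values

        float at(int x, int y, int z) const {         // clamped voxel fetch
            x = std::clamp(x, 0, nx - 1);
            y = std::clamp(y, 0, ny - 1);
            z = std::clamp(z, 0, nz - 1);
            return data[(static_cast<std::size_t>(z) * ny + y) * nx + x];
        }

        // Trilinear interpolation at a position given in voxel coordinates,
        // evaluated at each shading point on the extracted isosurface.
        float sample(float px, float py, float pz) const {
            int x0 = static_cast<int>(std::floor(px));
            int y0 = static_cast<int>(std::floor(py));
            int z0 = static_cast<int>(std::floor(pz));
            float fx = px - x0, fy = py - y0, fz = pz - z0;
            float c00 = at(x0, y0,     z0    ) * (1 - fx) + at(x0 + 1, y0,     z0    ) * fx;
            float c10 = at(x0, y0 + 1, z0    ) * (1 - fx) + at(x0 + 1, y0 + 1, z0    ) * fx;
            float c01 = at(x0, y0,     z0 + 1) * (1 - fx) + at(x0 + 1, y0,     z0 + 1) * fx;
            float c11 = at(x0, y0 + 1, z0 + 1) * (1 - fx) + at(x0 + 1, y0 + 1, z0 + 1) * fx;
            float c0 = c00 * (1 - fy) + c10 * fy;
            float c1 = c01 * (1 - fy) + c11 * fy;
            return c0 * (1 - fz) + c1 * fz;
        }
    };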

    Flux-Limited Diffusion for Multiple Scattering in Participating Media

    Full text link
    For the rendering of multiple scattering effects in participating media, methods based on the diffusion approximation are an extremely efficient alternative to Monte Carlo path tracing. However, in sufficiently transparent regions, the classical diffusion approximation suffers from non-physical radiative fluxes, which lead to a poor match with correct light transport. In particular, this prevents the application of the classical diffusion approximation to heterogeneous media, where opaque material is embedded within transparent regions. To address this limitation, we introduce flux-limited diffusion, a technique from the astrophysics domain. This method provides a better approximation to light transport than the classical diffusion approximation, particularly when applied to heterogeneous media, and hence broadens the applicability of diffusion-based techniques. We provide an algorithm for flux-limited diffusion, which is validated using the transport theory for a point light source in an infinite homogeneous medium. We further demonstrate that our implementation of flux-limited diffusion produces more accurate renderings of multiple scattering in various heterogeneous datasets than classical diffusion approximation, by comparing both methods to ground truth renderings obtained via volumetric path tracing.
    Comment: Accepted in Computer Graphics Forum
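    As an illustration of the flux-limiting idea, the sketch below evaluates the Levermore-Pomraning limiter known from the astrophysics literature and the resulting flux-limited diffusion coefficient; whether the paper uses this particular limiter is an assumption, not a claim about its method.

    #include <cmath>

    // Levermore-Pomraning flux limiter lambda(R) = (coth R - 1/R) / R.
    // Classical diffusion corresponds to the constant 1/3 (the R -> 0 limit); for large R
    // the limiter falls off like 1/R, which keeps the radiative flux physically bounded.
    double fluxLimiter(double R) {
        if (R < 1e-6) return 1.0 / 3.0;               // series limit, avoids division by zero
        return (1.0 / std::tanh(R) - 1.0 / R) / R;
    }

    // Flux-limited diffusion coefficient D = lambda(R) / sigma_t, where
    // R = |grad E| / (sigma_t * E) grows large in transparent regions (small sigma_t),
    // exactly where the classical approximation produces non-physical fluxes.
    double diffusionCoefficient(double gradE, double E, double sigma_t) {
        double R = std::abs(gradE) / (sigma_t * E + 1e-12);
        return fluxLimiter(R) / sigma_t;
    }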