329 research outputs found

    Towards Probe-Less Augmented Reality: A Position Paper


    Real-Time View-Dependent Visualization of Real World Glossy Surfaces


    A Scalable GPU-Based Approach to Shading and Shadowing for Photo-Realistic Real-Time Augmented Reality


    Implementation and Analysis of an Image-Based Global Illumination Framework for Animated Environments

    We describe a new framework for efficiently computing and storing global illumination effects for complex, animated environments. The framework allows the rapid generation of sequences representing an arbitrary path in a view space within an environment in which both the viewer and objects move. The global illumination is stored as time sequences of range-images at base locations that span the view space. We present algorithms for determining the locations of these base images and the time steps required to adequately capture the effects of object motion. We also present algorithms for computing the global illumination in the base images that exploit spatial and temporal coherence by considering direct and indirect illumination separately. We discuss an initial implementation using the new framework. Results and analysis of our implementation demonstrate the effectiveness of the individual phases of the approach; we conclude with an application of the complete framework to a complex environment that includes object motion.
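    The storage scheme described in this abstract, time sequences of range-images kept at base locations across a view space, can be sketched minimally as follows. This is not the paper's implementation; the class name, keys, and nearest-time-step lookup are illustrative assumptions.

    ```python
    # Hedged sketch (not the paper's code): a minimal store for global
    # illumination held as time sequences of range-images at base locations.
    # Payloads here are strings standing in for actual range-image data.

    class IlluminationStore:
        def __init__(self):
            # (base_location_id, time_step) -> range-image payload
            self._images = {}

        def add(self, base_id, time_step, range_image):
            """Record a range-image for one base location at one time step."""
            self._images[(base_id, time_step)] = range_image

        def nearest(self, base_id, t):
            """Return the stored range-image at the time step closest to t."""
            steps = [ts for (b, ts) in self._images if b == base_id]
            if not steps:
                raise KeyError(base_id)
            ts = min(steps, key=lambda s: abs(s - t))
            return self._images[(base_id, ts)]

    store = IlluminationStore()
    store.add("base0", 0.0, "frame_t0")
    store.add("base0", 1.0, "frame_t1")
    print(store.nearest("base0", 0.4))  # frame_t0
    ```

    A real system would interpolate between adjacent time steps and base locations rather than snapping to the nearest sample, which is where the paper's temporal-coherence algorithms come in.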

    Using GIS databases to simulate night light imagery

    The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model and other image simulators provide the ability to use detailed, artificial scenes to generate spectrally and spatially realistic simulated imagery. Simulated imagery is useful in many ways, such as sensor modeling and algorithm performance assessment. Actually building synthetic scenes, however, is often a time-consuming process, requiring the manual placement of the many objects that define the scene. This is particularly true for scenes of large spatial extent. We propose a technique to generate large-area night scenes for DIRSIG by using freely available Geographic Information System (GIS) data to inform the placement of street light sources. Results to this point have demonstrated that this technique is a feasible way to model the radiance of large urban areas. This determination was made through comparison to real nighttime data collected by the Visible Infrared Imaging Radiometer Suite (VIIRS). The methodology is presented as a modular framework, so that future researchers can recreate the work done to this point, with the ability to easily substitute components of the workflow, such as using an alternate source of GIS data or a different simulation environment.
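    The core step this abstract describes, using GIS point data to place street-light sources in a synthetic scene, can be sketched as below. This is a hedged illustration, not the authors' DIRSIG workflow: the coordinates, projection, lamp height, and power values are all assumptions, and a real pipeline would use a proper map projection and measured radiometry.

    ```python
    # Hedged sketch: turn GIS street-lamp coordinates into simple
    # scene light-source records. All numeric values are illustrative.

    def latlon_to_scene(lat, lon, origin=(43.084, -77.674),
                        meters_per_deg=111_320.0):
        """Project lat/lon to local x/y meters (crude flat-Earth approximation)."""
        dy = (lat - origin[0]) * meters_per_deg
        dx = (lon - origin[1]) * meters_per_deg * 0.73  # rough cos(latitude)
        return dx, dy

    def build_light_sources(lamp_points, height_m=9.0, power_w=100.0):
        """Convert GIS lamp points into light-source records for a scene."""
        sources = []
        for lat, lon in lamp_points:
            x, y = latlon_to_scene(lat, lon)
            sources.append({"x": x, "y": y, "z": height_m, "power_w": power_w})
        return sources

    # Three hypothetical street lamps, e.g. from an OpenStreetMap query
    lamps = [(43.0851, -77.6745), (43.0853, -77.6750), (43.0855, -77.6755)]
    scene_lights = build_light_sources(lamps)
    print(len(scene_lights))  # 3
    ```

    The modularity the abstract emphasizes maps naturally onto this shape: swapping the GIS source or the simulation environment only changes how `lamps` is produced or how the `scene_lights` records are consumed.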

    Probeless Illumination Estimation for Outdoor Augmented Reality


    Performance Comparison of Techniques for Approximating Image-Based Lighting by Directional Light Sources
