4 research outputs found

    Dynamic Illumination for Augmented Reality with Real-Time Interaction

    Current augmented and mixed reality systems lack correct illumination modeling, in which virtual objects are rendered under the same lighting conditions as the real environment. While the entertainment industry achieves astonishing results across multiple media forms, the process is mostly performed offline. In our approach, the illumination information extracted from the physical scene is used to interactively render the virtual objects, producing a more realistic output in real time. In this paper, we present a method that detects the physical illumination of a dynamic scene and then uses the extracted illumination to render the virtual objects added to the scene. The method has three steps that are assumed to work concurrently in real time. The first is the estimation of direct illumination (incident light) from the physical scene, using computer vision techniques on a 360° live-feed camera connected to the AR device. The second is the simulation of indirect illumination (light reflected from real-world surfaces onto the virtual objects), using region capture of a 2D texture from the AR camera view. The third is rendering the virtual objects with proper lighting and shadowing characteristics using a shader language over multiple passes. Finally, we tested our work under multiple lighting conditions to evaluate the accuracy of the results, based on whether the shadows cast by the virtual objects are consistent with the shadows cast by the real objects, at a reduced performance cost.
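    The first step above, recovering incident light from a 360° feed, can be illustrated with a minimal sketch: locate the brightest pixel in an equirectangular luminance image and map its position to a unit direction vector on the sphere. This is a hypothetical simplification for illustration only; the paper's actual pipeline is not specified at this level of detail.

    ```python
    import numpy as np

    def dominant_light_direction(equirect_luma):
        """Estimate a dominant light direction from an equirectangular
        luminance image (H x W array) by taking the brightest pixel and
        converting its (row, col) position to a unit vector (y is up).

        Illustrative sketch only; real estimators would aggregate over
        a bright region and handle multiple lights.
        """
        h, w = equirect_luma.shape
        row, col = np.unravel_index(np.argmax(equirect_luma), (h, w))
        # Longitude spans [-pi, pi); latitude spans [pi/2, -pi/2] top to bottom.
        lon = (col + 0.5) / w * 2.0 * np.pi - np.pi
        lat = np.pi / 2.0 - (row + 0.5) / h * np.pi
        # Spherical to Cartesian coordinates.
        x = np.cos(lat) * np.sin(lon)
        y = np.sin(lat)
        z = np.cos(lat) * np.cos(lon)
        return np.array([x, y, z])
    ```

    The resulting direction can then drive a directional light and shadow pass in the renderer.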

    Estimating Reflectance Parameters, Light Direction, and Shape From a Single Multispectral Image

    This paper presents a novel approach for estimating the light direction, shape, and reflectance parameters from a single multispectral image. We start from a general formulation that hinges on the notion that the light reflected from an object can be deemed a linear combination of specular and diffuse reflections. This permits the recovery of the reflection parameters through an iterative optimization scheme, which we render well posed by adopting a novel reparameterization that reduces the number of degrees of freedom in the cost function. With the estimated specular reflectance parameters, we recover the single point light source position from specular highlights by applying two novel constraints, coplanarity and Kullback-Leibler divergence. Then, by integrating the knowledge of the light source and diffuse reflectance parameters, we recover the shape of the scene from the diffuse component. Our approach is quite general in nature and can be applied to a family of reflectance models that are based on the Fresnel reflection theory. We demonstrate the utility of our method on synthetic and real-world imagery. We also compare our results to several alternatives in the literature.
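    The linear diffuse-plus-specular decomposition at the core of this formulation can be sketched as a per-pixel least-squares fit under the dichromatic model, where the observed spectrum is a combination of a body (diffuse) term tinted by the surface color and an interface (specular) term with the illuminant's spectrum. The function and variable names below are illustrative assumptions; the paper's actual scheme is an iterative, reparameterized optimization rather than a closed-form fit.

    ```python
    import numpy as np

    def decompose_dichromatic(pixel, diffuse_color, illuminant):
        """Split an observed multispectral pixel into diffuse and specular
        coefficients under the dichromatic model:

            I = g_d * (diffuse_color * illuminant) + g_s * illuminant

        `pixel`, `diffuse_color`, and `illuminant` are per-band spectra
        of equal length. Returns the estimated (g_d, g_s).
        """
        # Stack the two basis spectra as columns and solve in the
        # least-squares sense for their mixing coefficients.
        A = np.column_stack([diffuse_color * illuminant, illuminant])
        coeffs, *_ = np.linalg.lstsq(A, pixel, rcond=None)
        g_d, g_s = coeffs
        return g_d, g_s
    ```

    In practice the diffuse color is itself unknown, which is why the full problem is solved iteratively with the reparameterization described in the abstract.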
