301 research outputs found

    DeLight-Net: Decomposing Reflectance Maps into Specular Materials and Natural Illumination

    In this paper we extract surface reflectance and natural environmental illumination from a reflectance map, i.e. from a single 2D image of a sphere of one material under one illumination. This is a notoriously difficult problem, yet key to various re-rendering applications. With the recent advances in estimating reflectance maps from 2D images, their further decomposition has become increasingly relevant. To this end, we propose a Convolutional Neural Network (CNN) architecture, trained solely on synthetic data, that reconstructs both material parameters (i.e. Phong) and illumination (i.e. high-resolution spherical illumination maps). We demonstrate the decomposition of synthetic as well as real photographs of reflectance maps, both in High Dynamic Range (HDR) and, for the first time, in Low Dynamic Range (LDR). Results are compared to previous approaches quantitatively as well as qualitatively in terms of re-renderings where illumination, material, view or shape are changed. Comment: Stamatios Georgoulis and Konstantinos Rematas contributed equally to this work.
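
    As a concrete reading of the forward model being inverted here, the sketch below renders a reflectance map under the Phong model from a sampled environment map. It is an illustrative assumption rather than the authors' code; the function and parameter names (phong_reflectance_map, env_dirs, env_radiance, kd, ks, alpha) are hypothetical.

        # Minimal sketch of the Phong forward model that the decomposition inverts.
        import numpy as np

        def phong_reflectance_map(normals, env_dirs, env_radiance, kd, ks, alpha,
                                  view=np.array([0.0, 0.0, 1.0])):
            """normals: (N,3) unit normals of the visible hemisphere;
            env_dirs: (M,3) unit light directions sampled from the environment map;
            env_radiance: (M,3) RGB radiance per direction, pre-weighted by solid angle.
            Returns (N,3) outgoing RGB radiance per normal, i.e. the reflectance map."""
            out = np.zeros((normals.shape[0], 3))
            for i, n in enumerate(normals):
                r = 2.0 * np.dot(n, view) * n - view              # mirror of the view direction
                diff = np.clip(env_dirs @ n, 0.0, None)           # (M,) Lambertian term
                spec = np.clip(env_dirs @ r, 0.0, None) ** alpha  # (M,) Phong specular lobe
                out[i] = (kd * diff + ks * spec) @ env_radiance   # kd, ks assumed scalar
            return out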

    Circularly polarized spherical illumination reflectometry


    What Is Around The Camera?

    How much does a single image reveal about the environment it was taken in? In this paper, we investigate how much of that information can be retrieved from a foreground object, combined with the background (i.e. the visible part of the environment). Assuming it is not perfectly diffuse, the foreground object acts as a complexly shaped and far-from-perfect mirror. An additional challenge is that its appearance confounds the light coming from the environment with the unknown materials it is made of. We propose a learning-based approach to predict the environment from multiple reflectance maps that are computed from approximate surface normals. The proposed method allows us to jointly model the statistics of environments and material properties. We train our system on synthesized training data, but demonstrate its applicability to real-world data. Interestingly, our analysis shows that the information obtained from objects made of multiple materials is often complementary and leads to better performance. Comment: Accepted to ICCV. Project: http://homes.esat.kuleuven.be/~sgeorgou/multinatillum
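
    The reflectance maps used as input here can be thought of as pixel colours pooled over approximate surface normals. The sketch below is a minimal illustration of that pooling step under simplifying assumptions (a single material region, camera-space normals); it is not the paper's pipeline, and the names are hypothetical.

        # Minimal sketch: build a reflectance map by binning observed colours over normals.
        import numpy as np

        def reflectance_map_from_normals(image, normals, mask, res=64):
            """image: (H,W,3) RGB; normals: (H,W,3) unit camera-space normals;
            mask: (H,W) bool selecting pixels of one material.
            Returns a (res,res,3) map indexed by the normal's x,y components."""
            rmap = np.zeros((res, res, 3))
            count = np.zeros((res, res, 1))
            ys, xs = np.nonzero(mask)
            for y, x in zip(ys, xs):
                nx, ny, nz = normals[y, x]
                if nz <= 0:                              # only front-facing normals are observed
                    continue
                u = int((nx * 0.5 + 0.5) * (res - 1))    # map [-1,1] to bin indices
                v = int((ny * 0.5 + 0.5) * (res - 1))
                rmap[v, u] += image[y, x]
                count[v, u] += 1
            return rmap / np.maximum(count, 1)           # average colour per normal bin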

    BxDF material acquisition, representation, and rendering for VR and design

    Photorealistic and physically-based rendering of real-world environments with high fidelity materials is important to a range of applications, including special effects, architectural modelling, cultural heritage, computer games, automotive design, and virtual reality (VR). Our perception of the world depends on lighting and surface material characteristics, which determine how light is reflected, scattered, and absorbed. In order to reproduce appearance, we must therefore understand all the ways objects interact with light, and the acquisition and representation of materials has thus been an important part of computer graphics since its early days. Nevertheless, no material model or acquisition setup is without limitations in terms of the variety of materials represented, and different approaches vary widely in terms of compatibility and ease of use. In this course, we describe the state of the art in material appearance acquisition and modelling, ranging from mathematical BSDFs to data-driven capture and representation of anisotropic materials, and volumetric/thread models for patterned fabrics. We further address the problem of material appearance constancy across different rendering platforms. We present two case studies in architectural and interior design. The first study demonstrates Yulio, a new platform for the creation, delivery, and visualization of acquired material models and reverse engineered cloth models in immersive VR experiences. The second study shows an end-to-end process of capture and data-driven BSDF representation using the physically-based Radiance system for lighting simulation and rendering.
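
    To make the span from analytic to data-driven models concrete, the sketch below evaluates a simple Phong-style BRDF alongside a tabulated, data-driven one. This is an illustrative assumption rather than code from the course; the table layout and names are hypothetical, and the lookup ignores the azimuth difference for brevity.

        # Minimal sketch: analytic vs. tabulated (data-driven) BRDF evaluation.
        import numpy as np

        def phong_brdf(n, wi, wo, kd, ks, alpha):
            """Analytic BRDF: Lambertian diffuse plus an (unnormalised) Phong lobe."""
            r = 2.0 * np.dot(n, wi) * n - wi                 # mirror direction of wi
            spec = max(np.dot(r, wo), 0.0) ** alpha
            return kd / np.pi + ks * spec

        def tabulated_brdf(table, n, wi, wo):
            """Data-driven BRDF: nearest-neighbour lookup in a (res,res,3) table
            indexed by incident and outgoing elevation angles (isotropic assumption,
            azimuth difference omitted for brevity)."""
            ti = np.arccos(np.clip(np.dot(n, wi), 0.0, 1.0))
            to = np.arccos(np.clip(np.dot(n, wo), 0.0, 1.0))
            res = table.shape[0]
            i = min(int(ti / (np.pi / 2) * res), res - 1)
            o = min(int(to / (np.pi / 2) * res), res - 1)
            return table[i, o]                               # (3,) RGB reflectance value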

    On-site surface reflectometry

    The rapid development of Augmented Reality (AR) and Virtual Reality (VR) applications over the past years has created the need to quickly and accurately scan the real world to populate immersive, realistic virtual environments for the end user to enjoy. While geometry processing has already gone a long way towards that goal, with self-contained solutions commercially available for on-site acquisition of large-scale 3D models, capturing the appearance of the materials that compose those models remains an open problem in general, uncontrolled environments. The appearance of a material is indeed a complex function of its geometry and intrinsic physical properties, and it furthermore depends on the illumination conditions under which it is observed, traditionally limiting the scope of reflectometry to highly controlled lighting conditions in a laboratory setup. With the rapid development of digital photography, especially on mobile devices, a new trend has emerged in the appearance modelling community that investigates novel acquisition methods and algorithms to relax the hard constraints imposed by laboratory-like setups, for easy use by digital artists. While arguably not as accurate, we demonstrate the ability of such self-contained methods to enable quick and easy solutions for on-site reflectometry, able to produce compelling, photo-realistic imagery. In particular, this dissertation investigates novel methods for on-site acquisition of surface reflectance based on off-the-shelf, commodity hardware. We successfully demonstrate how a mobile device can be utilised to capture high-quality reflectance maps of spatially-varying planar surfaces in general indoor lighting conditions. We further present a novel methodology for the acquisition of highly detailed reflectance maps of permanent, on-site outdoor surfaces by exploiting polarisation from reflection under natural illumination. We demonstrate the versatility of the presented approaches by scanning various surfaces from the real world and show good qualitative and quantitative agreement with existing methods for appearance acquisition employing controlled or semi-controlled illumination setups.
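
    The polarisation cue mentioned above is commonly measured by photographing the surface through a linear polariser at several angles and fitting the resulting per-pixel sinusoid I(theta) = a0 + a1*cos(2*theta) + a2*sin(2*theta). The sketch below shows that standard fitting step under simplifying assumptions; it is not the dissertation's method, and the names are hypothetical.

        # Minimal sketch: per-pixel polarisation fit from captures at several polariser angles.
        import numpy as np

        def fit_polarisation(images, angles):
            """images: (K,H,W) grayscale captures; angles: (K,) polariser angles in radians.
            Returns per-pixel Imax, Imin, angle of polarisation phi, and degree of
            linear polarisation, via a linear least-squares fit."""
            A = np.stack([np.ones_like(angles),
                          np.cos(2 * angles),
                          np.sin(2 * angles)], axis=1)           # (K,3) design matrix
            K, H, W = images.shape
            coeffs, *_ = np.linalg.lstsq(A, images.reshape(K, -1), rcond=None)
            a0, a1, a2 = coeffs                                   # each of shape (H*W,)
            amp = np.sqrt(a1 ** 2 + a2 ** 2)                      # half of (Imax - Imin)
            Imax = (a0 + amp).reshape(H, W)
            Imin = (a0 - amp).reshape(H, W)
            phi = (0.5 * np.arctan2(a2, a1)).reshape(H, W)        # angle of polarisation
            dolp = (Imax - Imin) / np.maximum(Imax + Imin, 1e-8)  # degree of linear polarisation
            return Imax, Imin, phi, dolp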