
    Review of simulating four classes of window materials for daylighting with non-standard BSDF using the simulation program Radiance

    This review describes the currently available simulation models for window materials to calculate daylighting with the program "Radiance". The review is based on four abstract and general classes of window materials, depending on their scattering and redirecting properties (bidirectional scattering distribution function, BSDF). It lists the potential and limits of the older models and includes the most recent additions to the software. All models are demonstrated using an exemplary indoor scene and two typical sky conditions. It is intended as a clarification for applying window material models in project work or teaching. The underlying algorithmic problems apply to all lighting simulation programs, so the scenarios of materials and skies are applicable to other lighting programs.
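As a hedged illustration of one of these abstract material classes (not Radiance code, and not taken from the review), the following Python sketch evaluates an ideal diffusely transmitting window BSDF, which is the constant τ/π over the transmission hemisphere, and numerically confirms that integrating it against the cosine recovers the hemispherical transmittance τ:

```python
import math

def diffuse_transmission_bsdf(cos_out, tau=0.6):
    """Ideal diffusely transmitting glazing: transmitted radiance is
    direction-independent, so the BSDF is the constant tau/pi."""
    return tau / math.pi if cos_out > 0.0 else 0.0

def hemispherical_transmittance(bsdf, n=200):
    """Midpoint-rule integration of bsdf * cos(theta) over the
    transmission hemisphere; recovers tau for the diffuse case."""
    d_theta = (math.pi / 2.0) / n
    total = 0.0
    for i in range(n):
        theta = (i + 0.5) * d_theta
        total += bsdf(math.cos(theta)) * math.cos(theta) * math.sin(theta) * d_theta
    return 2.0 * math.pi * total  # azimuthal symmetry
```

The purely specular, scattering and redirecting classes would replace the constant with a delta distribution or measured, data-driven BSDF values.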

    Real-Time Underwater Spectral Rendering

    The light field in an underwater environment is characterized by complex multiple scattering interactions and wavelength‐dependent attenuation, requiring significant computational resources for the simulation of underwater scenes. We present a novel approach that makes it possible to simulate multi‐spectral underwater scenes, in a physically‐based manner, in real time. Our key observation is the following: in the vertical direction, the steady decay in irradiance as a function of depth is characterized by the diffuse downwelling attenuation coefficient, which oceanographers routinely measure for different types of waters. We rely on a database of such real‐world measurements to obtain an analytical approximation to the Radiative Transfer Equation, allowing for real‐time spectral rendering with results comparable to Monte Carlo ground‐truth references, in a fraction of the time. We show results simulating underwater appearance for the different optical water types, including volumetric shadows and dynamic, spatially varying lighting near the water surface.
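The key observation about the vertical decay of irradiance can be sketched directly. In this minimal Python example, the per-wavelength K_d values are purely illustrative placeholders, not entries from the measured database the paper relies on:

```python
import math

def downwelling_irradiance(surface_irradiance, k_d, depth_m):
    """Beer-Lambert-style vertical decay: E(z) = E(0) * exp(-K_d * z),
    with K_d the diffuse downwelling attenuation coefficient (1/m)."""
    return surface_irradiance * math.exp(-k_d * depth_m)

# Illustrative placeholder K_d values (1/m) per wavelength (nm) for a
# clear water type; real values come from oceanographic measurements.
K_D = {450: 0.02, 550: 0.07, 650: 0.35}

# Spectral irradiance at 10 m depth, for unit surface irradiance:
spectrum_at_10m = {wl: downwelling_irradiance(1.0, k, 10.0)
                   for wl, k in K_D.items()}
```

Because K_d grows toward the red end of the spectrum, the sketch reproduces the familiar blue shift of deep-water lighting: red attenuates far faster than blue.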

    Real-time Realistic Rain Rendering

    Artistic outdoor filming and rendering need to choose specific weather conditions in order to properly trigger the audience's reaction; for instance, rain, one of the most common conditions, is usually employed to transmit a sense of unrest. Synthetic methods to recreate weather are an important avenue to simplify and cheapen filming, but simulation is a challenging problem due to the variety of phenomena that need to be computed. Rain alone involves raindrops, splashes on the ground, fog, clouds, lightning, etc. We propose a new rain rendering algorithm that uses and extends present state-of-the-art approaches in this field. The scope of our method is to achieve real-time renders of rain streaks and splashes on the ground, while considering complex illumination effects and allowing artistic direction of drop placement. Our algorithm takes as input an artist-defined rain distribution and density, and then creates particles in the scene following these indications. No restrictions are imposed on the dimensions of the rain area, so direct rendering approaches could rapidly overwhelm current computational capabilities with huge numbers of particles. To solve this, we propose techniques that, at render time, adaptively sample the generated particles in order to select only those in the regions that really need to be simulated and rendered. Particle simulation is executed entirely on the graphics hardware. The algorithm proceeds by placing the particles at their updated coordinates. It then checks whether a particle is still falling as a rain streak, has reached the ground and become a splash, or should be discarded because it has entered a solid object of the scene. Different rendering techniques are used for each case. Complex illumination parameters are computed for rain streaks to select matching textures. These textures are generated in a preprocessing step and realistically simulate the interaction of light with the optical properties of the water drops.
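The per-particle streak/splash/discard classification described above can be sketched on the CPU. This toy Python version is an illustration only: the paper runs the simulation on graphics hardware, and the axis-aligned boxes standing in for scene solids are an assumption:

```python
def inside_box(point, box):
    """Axis-aligned box test; boxes stand in for scene solids."""
    (x0, y0, z0), (x1, y1, z1) = box
    x, y, z = point
    return x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1

def step_particle(pos, fall_speed, dt, ground_y=0.0, solids=()):
    """Advance one rain particle and classify it for rendering:
    'streak' while airborne, 'splash' once it reaches the ground,
    'discard' if it has entered a solid object of the scene."""
    x, y, z = pos
    y -= fall_speed * dt            # fall_speed is positive, downwards
    if any(inside_box((x, y, z), b) for b in solids):
        return (x, y, z), "discard"
    if y <= ground_y:
        return (x, ground_y, z), "splash"
    return (x, y, z), "streak"
```

Each label would then select the corresponding rendering technique (streak texture, splash geometry, or nothing at all).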

    Polarimetric remote sensing system analysis: Digital Imaging and Remote Sensing Image Generation (DIRSIG) model validation and impact of polarization phenomenology on material discriminability

    In addition to the spectral information acquired by traditional multi-/hyperspectral systems, passive electro-optical and infrared (EO/IR) polarimetric sensors also measure the polarization response of different materials in the scene. Such an imaging modality can be useful in improving surface characterization; however, the characteristics of polarimetric systems have not been completely explored by the remote sensing community. Therefore, the main objective of this research was to advance our knowledge of polarimetric remote sensing by investigating the impact of polarization phenomenology on material discriminability. The first part of this research focuses on system validation, where the major goal was to assess the fidelity of polarimetric images simulated using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model. A theoretical framework, based on polarization vision models used in animal vision studies and industrial defect detection applications, was developed within which the major components of the polarimetric image chain were validated. In the second part of this research, a polarization-physics-based approach for improved material discriminability was proposed. This approach uses the angular variation in the polarization response to infer the physical characteristics of the observed surface by imaging the scene from three different view directions. The usefulness of the proposed approach in improving detection performance in the absence of a priori knowledge about the target geometry was demonstrated. A sensitivity analysis of the proposed system for different scene-related parameters was performed to identify the imaging conditions under which material discriminability is maximized. Furthermore, the detection performance of the proposed polarimetric system was compared to that of a hyperspectral system to identify scenarios where polarization information can be especially useful in improving target contrast.
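The abstract does not give formulas, but the standard polarimetry underlying such sensors can be sketched: three intensity measurements behind a linear polarizer at 0°, 45° and 90° recover the linear Stokes parameters, from which the degree of linear polarization follows. The names and angle choices below are textbook conventions, not taken from this thesis:

```python
import math

def stokes_from_polarizer(i0, i45, i90):
    """Linear Stokes parameters from intensities measured behind a
    linear polarizer oriented at 0, 45 and 90 degrees."""
    s0 = i0 + i90          # total intensity
    s1 = i0 - i90          # horizontal vs vertical preference
    s2 = 2.0 * i45 - s0    # +45 vs -45 preference
    return s0, s1, s2

def degree_of_linear_polarization(s0, s1, s2):
    """DoLP = sqrt(S1^2 + S2^2) / S0, in [0, 1] for linear states."""
    return math.hypot(s1, s2) / s0
```

A smooth dielectric surface tends to raise the DoLP relative to a rough one, which is the kind of angular cue a multi-view polarimetric system can exploit for material discrimination.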

    A Survey of Ocean Simulation and Rendering Techniques in Computer Graphics

    This paper presents a survey of ocean simulation and rendering methods in computer graphics. To model and animate the ocean surface, these methods rely on two main approaches. On the one hand, methods that approximate ocean dynamics with parametric, spectral or hybrid models and use empirical laws from oceanographic research; we will see that this type of method essentially allows the simulation of ocean scenes in the deep-water domain, without breaking waves. On the other hand, physically based methods use the Navier-Stokes equations (NSE) to represent breaking waves and, more generally, the ocean surface near the shore. We also describe ocean rendering methods in computer graphics, with special interest in the simulation of phenomena such as foam and spray, and light's interaction with the ocean surface.
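As a hedged sketch of the first (spectral) family of methods, the following Python fragment synthesizes a 1-D height field by superposing sinusoids whose amplitudes come from a simplified Phillips-style spectrum and whose frequencies follow the deep-water dispersion relation ω = √(gk). The constants, patch size and mode count are illustrative, not values from any surveyed paper:

```python
import math
import random

def phillips_1d(k, wind_speed, g=9.81, a=0.0081):
    """Simplified 1-D Phillips-style spectrum: energy falls off for
    wavenumbers shorter than the largest wind-driven wave."""
    if k <= 0.0:
        return 0.0
    l = wind_speed ** 2 / g      # largest wave for a sustained wind
    return a * math.exp(-1.0 / (k * l) ** 2) / k ** 4

def ocean_height(x, t, n_modes=32, patch=200.0, wind_speed=10.0,
                 g=9.81, seed=1):
    """Superpose sinusoidal modes with spectrum-derived amplitudes and
    the deep-water dispersion relation omega = sqrt(g * k)."""
    rng = random.Random(seed)    # fixed phases -> repeatable surface
    dk = 2.0 * math.pi / patch
    h = 0.0
    for i in range(1, n_modes + 1):
        k = i * dk
        amp = math.sqrt(phillips_1d(k, wind_speed) * dk)
        omega = math.sqrt(g * k)
        phase = rng.uniform(0.0, 2.0 * math.pi)
        h += amp * math.cos(k * x - omega * t + phase)
    return h
```

Production spectral methods evaluate a 2-D version of this sum with an FFT; the NSE-based family replaces the closed-form dispersion with a fluid solver to capture breaking waves.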

    Exploration of Mouth Shading and Lighting in CG Production

    The lighting and shading of human teeth in current computer animation features and live-action movies with effects are often intentionally avoided or handled by simple methods, since teeth interact with light in complex ways through their intricate layered structure. The semi-translucent appearance of natural human teeth, which results from subsurface scattering, is difficult to replicate in synthetic scenes, though two techniques are often implemented. The first is to create an anatomically correct layered model and render the teeth with both theoretically and empirically derived optical parameters of human teeth using physical subsurface materials. The second largely takes advantage of visual cheating, achieved by irradiance blending of finely painted textures. The result visually confirms that for most situations, non-physically based shading can yield believable rendered teeth by finely controlling contribution layers. In particular situations, however, such as an extremely close shot of a mouth, a physically correct shading model is necessary to produce highly translucent and realistic teeth.
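The second, non-physical technique can be caricatured in a few lines: a weighted blend of painted texture layers whose contributions the artist controls per shot. This Python sketch illustrates the idea of controllable contribution layers only; it is not the production shader described in the work:

```python
def blend_layers(layers):
    """Weighted blend of painted texture-layer colours; the weights
    are the per-shot contribution controls the artist adjusts."""
    total = sum(w for _, w in layers)
    r = sum(c[0] * w for c, w in layers) / total
    g = sum(c[1] * w for c, w in layers) / total
    b = sum(c[2] * w for c, w in layers) / total
    return (r, g, b)
```

Raising the weight of, say, a warm dentine layer relative to a bright enamel layer shifts the blended result without any physical scattering computation.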

    ReLiShaft: realistic real-time light shaft generation taking sky illumination into account

    © 2018 The Author(s). Rendering atmospheric phenomena has its basis in the fields of atmospheric optics and meteorology and is increasingly used in games and movies. Although many researchers have focused on generating and enhancing realistic light shafts, there is still room for improvement in both qualitative and quantitative terms. In this paper, a new technique, called ReLiShaft, is presented to generate realistic light shafts for outdoor rendering. In the first step, a realistic light shaft with respect to the sun position and sky colour at any specific location, date and time is constructed in real time. Then, hemicube visibility-test radiosity is employed to reveal the effect of the generated sky colour on the environment. Two different methods are considered for indoor and outdoor rendering: ray marching based on epipolar sampling for indoor environments, and filtering on regular epipolar lines with z-partitioning for outdoor environments. Shadow maps and shadow volumes are integrated to limit computational costs. Through this technique, the light shaft colour is adjusted according to the sky colour at any specific location, date and time. The results show different light shaft colours at different times of day, in real time.
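The ray-marching step common to such techniques can be sketched generically: march along the view ray and add in-scattered sun light only at samples a shadow query reports as lit. This Python sketch uses a single scattering coefficient and a caller-supplied visibility function as stand-ins for the paper's epipolar sampling and shadow-map/shadow-volume machinery, so it illustrates the principle rather than ReLiShaft itself:

```python
import math

def light_shaft_radiance(max_dist, lit, sun_radiance,
                         scattering=0.02, n_steps=64):
    """Ray-march the view ray: accumulate in-scattered sun light at
    samples the shadow query 'lit' reports as visible to the sun,
    attenuated by the transmittance accumulated so far."""
    dt = max_dist / n_steps
    radiance = 0.0
    transmittance = 1.0
    for i in range(n_steps):
        t = (i + 0.5) * dt           # midpoint sample along the ray
        if lit(t):
            radiance += transmittance * scattering * sun_radiance * dt
        transmittance *= math.exp(-scattering * dt)
    return radiance
```

Colouring `sun_radiance` per channel from the generated sky model is what ties the shaft colour to location, date and time.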

    Photorealistic physically based render engines: a comparative study

    Pérez Roig, F. (2012). Photorealistic physically based render engines: a comparative study. http://hdl.handle.net/10251/14797.