
    Real-Time Underwater Spectral Rendering

    The light field in an underwater environment is characterized by complex multiple-scattering interactions and wavelength‐dependent attenuation, requiring significant computational resources to simulate underwater scenes. We present a novel approach that makes it possible to simulate multi‐spectral underwater scenes, in a physically‐based manner, in real time. Our key observation is the following: in the vertical direction, the steady decay in irradiance as a function of depth is characterized by the diffuse downwelling attenuation coefficient, which oceanographers routinely measure for different types of waters. We rely on a database of such real‐world measurements to obtain an analytical approximation to the Radiative Transfer Equation, allowing for real‐time spectral rendering with results comparable to Monte Carlo ground‐truth references, in a fraction of the time. We show results simulating underwater appearance for the different optical water types, including volumetric shadows and dynamic, spatially varying lighting near the water surface.
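    The vertical decay the abstract describes is an exponential law, E(z, λ) = E(0, λ) · exp(-K_d(λ) · z). A minimal Python sketch of that relation, with illustrative attenuation coefficients standing in for the paper's measured database:

```python
import numpy as np

# Illustrative diffuse downwelling attenuation coefficients K_d(lambda) [1/m].
# Real values come from oceanographic measurements (e.g. Jerlov water types)
# and vary by water type; these three samples are assumptions for the sketch.
WAVELENGTHS_NM = np.array([450.0, 550.0, 650.0])   # blue, green, red samples
K_D = np.array([0.02, 0.07, 0.35])

def downwelling_irradiance(surface_irradiance, depth_m):
    """Exponential decay of spectral irradiance with depth:
    E(z, lambda) = E(0, lambda) * exp(-K_d(lambda) * z)."""
    return surface_irradiance * np.exp(-K_D * depth_m)

# Example: spectral irradiance remaining at 20 m for a unit surface spectrum;
# red attenuates fastest, which is why deep scenes shift toward blue-green.
print(downwelling_irradiance(np.ones(3), 20.0))
```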

    GPU-Based Volume Rendering of Noisy Multi-Spectral Astronomical Data

    Traditional analysis techniques may not be sufficient for astronomers to make the best use of the data sets that current and future instruments, such as the Square Kilometre Array and its Pathfinders, will produce. By utilizing the incredible pattern-recognition ability of the human mind, scientific visualization provides an excellent opportunity for astronomers to gain valuable new insight and understanding of their data, particularly when used interactively in 3D. The goal of our work is to establish the feasibility of a real-time 3D monitoring system for data going into the Australian SKA Pathfinder archive. Based on CUDA, an increasingly popular development tool, our work utilizes the massively parallel architecture of modern graphics processing units (GPUs) to provide astronomers with interactive 3D volume rendering for multi-spectral data sets. Unlike other approaches, we are targeting real-time interactive visualization of datasets larger than GPU memory while giving special attention to data with a low signal-to-noise ratio - two critical aspects for astronomy that are missing from most existing scientific visualization software packages. Our framework enables the astronomer to interact with the geometrical representation of the data, and to control the volume rendering process to generate a better representation of their datasets.
    Comment: 4 pages, 1 figure, to appear in the proceedings of ADASS XIX, Oct 4-8 2009, Sapporo, Japan (ASP Conf. Series)
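    Rendering datasets larger than GPU memory typically means partitioning the cube into bricks and streaming them through device memory one at a time. A minimal NumPy sketch of that bricking idea (the actual system uses CUDA kernels; the brick size and the maximum-intensity projection transfer function are assumptions for illustration):

```python
import numpy as np

def mip_render_bricked(volume, brick_depth=32):
    """Maximum-intensity projection along the z axis, processed brick by
    brick so that only one sub-volume needs to reside in (GPU) memory at a
    time - a stand-in for streaming host-to-device uploads."""
    nz = volume.shape[0]
    image = np.full(volume.shape[1:], -np.inf, dtype=np.float32)
    for z0 in range(0, nz, brick_depth):
        brick = volume[z0:z0 + brick_depth]   # would be uploaded to the GPU
        image = np.maximum(image, brick.max(axis=0))
    return image

# Example: a noisy synthetic spectral cube (velocity, dec, ra).
cube = np.random.normal(0.0, 1.0, size=(128, 64, 64)).astype(np.float32)
print(mip_render_bricked(cube).shape)  # -> (64, 64)
```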

    Practical acquisition and rendering of diffraction effects in surface reflectance

    We propose two novel contributions for measurement-based rendering of diffraction effects in the surface reflectance of planar homogeneous diffractive materials. First, as a general solution for commonly manufactured materials, we propose a practical data-driven rendering technique and a measurement approach to efficiently render complex diffraction effects in real time. Our measurement step simply involves photographing a planar diffractive sample illuminated with an LED flash. Here, we directly record the resultant diffraction pattern on the sample surface due to narrow-band point-source illumination. Furthermore, we propose an efficient rendering method that exploits the measurement in conjunction with the Huygens-Fresnel principle to fit relevant diffraction parameters based on a first-order approximation. Our proposed data-driven rendering method requires the precomputation of a single diffraction lookup table for accurate spectral rendering of complex diffraction effects. Second, for sharp specular samples, we propose a novel method for practical measurement of the underlying diffraction grating using out-of-focus “bokeh” photography of the specular highlight. We demonstrate how the measured bokeh can be employed as a height field to drive a diffraction shader based on a first-order approximation for efficient real-time rendering. Finally, we also derive analytic solutions for a few special cases of diffraction from our measurements and demonstrate realistic rendering results under complex light sources and environments.
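    The paper's shader fits parameters to a photographed pattern; a sketch of the textbook first-order forward model it builds on, relating a reflective height field to its far-field diffraction pattern (the grating geometry and wavelength below are assumptions, not the paper's measurements):

```python
import numpy as np

def far_field_intensity(height_field_um, wavelength_um):
    """First-order (Fraunhofer) approximation: reflection off a height field
    h imparts a phase delay of 4*pi*h/lambda on the wavefront (round trip at
    normal incidence), and the far-field pattern is the squared magnitude of
    its Fourier transform."""
    phase = 4.0 * np.pi * height_field_um / wavelength_um
    wavefront = np.exp(1j * phase)
    pattern = np.abs(np.fft.fftshift(np.fft.fft2(wavefront))) ** 2
    return pattern / pattern.sum()

# Example: a sinusoidal grating with a 2 um period, 0.1 um sample pitch,
# lit at 550 nm; energy concentrates in discrete diffraction orders.
x_um = np.arange(256) * 0.1
h = 0.05 * np.sin(2.0 * np.pi * x_um / 2.0)
pattern = far_field_intensity(np.tile(h, (256, 1)), wavelength_um=0.55)
```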

    The development of local solar irradiance for outdoor computer graphics rendering

    Atmospheric effects are approximated by solving the light transfer equation (LTE) along a given viewing path. The accumulated spectral energy (in the visible band) arriving at the observer’s eye defines the colour of the object currently on the line of sight. Due to the convenience of using a single rendering equation to solve the LTE for the daylight sky and distant objects (aerial perspective), recent methods have opted for this kind of unified approach. However, the computational burden of evaluating the LTE in real time has forced these methods to make simplifications that are not in line with real-world observation. Consequently, the results of these methods are laden with visual errors. The two most common simplifications were: i) treating the atmosphere as a scattering-only medium and ii) assuming a single-density atmosphere profile. This research explored the possibility of replacing the real-time calculation involved in solving the LTE with an analytical approach, so that the two simplifications made by previous real-time methods can be avoided. The model was implemented on top of a flight simulator prototype system, since the requirements of such a system match the objectives of this study. Results were verified against actual images of daylight skies. Comparison was also made with the previous methods’ results to showcase the proposed model’s strengths and advantages over its peers.
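    For context, a minimal sketch of the per-frame numeric integration that such real-time methods perform and that this thesis replaces with an analytic model; it deliberately embodies the simplifications criticized above (homogeneous single-density medium, single scattering), and all coefficients are illustrative:

```python
import numpy as np

def view_path_radiance(object_radiance, sun_radiance, distance_m,
                       sigma_s, sigma_a, steps=64):
    """March along a homogeneous viewing path: attenuate the object's
    radiance by the medium's extinction while accumulating in-scattered
    sunlight (single scattering only)."""
    sigma_t = sigma_s + sigma_a          # extinction = scattering + absorption
    dt = distance_m / steps
    transmittance, inscattered = 1.0, 0.0
    for _ in range(steps):
        step_t = np.exp(-sigma_t * dt)
        # Energy scattered toward the eye within this path segment.
        inscattered += transmittance * (1.0 - step_t) * (sigma_s / sigma_t) * sun_radiance
        transmittance *= step_t
    return transmittance * object_radiance + inscattered

# Example: grey haze over 5 km, unit sun radiance, dark distant object.
print(view_path_radiance(0.05, 1.0, 5000.0, sigma_s=2e-4, sigma_a=5e-5))
```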

    Refined Methods for Creating Realistic Haptic Virtual Textures from Tool-Mediated Contact Acceleration Data

    Dragging a tool across a textured object creates rich high-frequency vibrations that distinctly convey the physical interaction between the tool tip and the object surface. Varying one’s scanning speed and normal force alters these vibrations, but it does not change the perceived identity of the tool or the surface. Previous research developed a promising data-driven approach to embedding this natural complexity in a haptic virtual environment: the approach centers on recording and modeling the tool contact accelerations that occur during real texture interactions at a limited set of force-speed combinations. This paper aims to optimize these prior methods of texture modeling and rendering to improve system performance and enable potentially higher levels of haptic realism. The key elements of our approach are drawn from time series analysis, speech processing, and discrete-time control. We represent each recorded texture vibration with a low-order auto-regressive moving-average (ARMA) model, and we optimize this set of models for a specific tool-surface pairing (plastic stylus and textured ABS plastic) using metrics that depend on spectral match, final prediction error, and model order. For rendering, we stably resample the texture models at the desired output rate, and we derive a new texture model at each time step using bilinear interpolation on the line spectral frequencies of the resampled models adjacent to the user’s current force and speed. These refined processes enable our TexturePad system to generate a stable and spectrally accurate vibration waveform in real time, moving us closer to the goal of virtual textures that are indistinguishable from their real counterparts.
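    A minimal sketch of the fit-then-resynthesize pipeline: the paper fits full ARMA models selected by spectral match, prediction error, and order, and interpolates them on line spectral frequencies; the all-pole (AR) Yule-Walker fit below is a simplified stand-in for that modeling step.

```python
import numpy as np

def fit_ar(signal, order=8):
    """Fit an all-pole model to a recorded vibration via the Yule-Walker
    equations (autocorrelation method)."""
    x = signal - signal.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order] / len(x)
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    coeffs = np.linalg.solve(R, r[1:order + 1])
    noise_var = r[0] - coeffs @ r[1:order + 1]
    return coeffs, max(noise_var, 0.0)

def synthesize(coeffs, noise_var, n_samples, seed=0):
    """Drive the fitted filter with white noise to generate a texture
    vibration waveform at render time."""
    rng = np.random.default_rng(seed)
    order = len(coeffs)
    out = np.zeros(n_samples + order)
    noise = rng.normal(0.0, np.sqrt(noise_var), n_samples + order)
    for t in range(order, n_samples + order):
        out[t] = coeffs @ out[t - order:t][::-1] + noise[t]
    return out[order:]

# Example: fit a synthetic 2000-sample "recording", then resynthesize it.
t = np.arange(2000) / 10000.0
recording = np.sin(2 * np.pi * 150 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
a, v = fit_ar(recording, order=8)
wave = synthesize(a, v, n_samples=2000)
```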

    A Survey of Ocean Simulation and Rendering Techniques in Computer Graphics

    This paper presents a survey of ocean simulation and rendering methods in computer graphics. To model and animate the ocean’s surface, these methods rely on two main approaches. On the one hand, methods that approximate ocean dynamics with parametric, spectral or hybrid models and use empirical laws from oceanographic research; we will see that this type of method essentially allows the simulation of ocean scenes in the deep-water domain, without breaking waves. On the other hand, physically-based methods use the Navier-Stokes equations (NSE) to represent breaking waves and, more generally, the ocean surface near the shore. We also describe ocean rendering methods in computer graphics, with a special interest in the simulation of phenomena such as foam and spray, and the interaction of light with the ocean surface.
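    A sketch of the best-known member of the spectral family the survey describes: a Tessendorf-style height field sampled from the Phillips spectrum and inverse-FFT'd to the spatial domain. All constants are illustrative, and taking the real part is a shortcut in place of enforcing the Hermitian symmetry a fully real surface requires.

```python
import numpy as np

def ocean_heightfield(n=64, length_m=100.0, wind_speed=8.0, g=9.81, seed=0):
    """Sample the Phillips wave spectrum with Gaussian random amplitudes and
    inverse-FFT to get one frame of a deep-water ocean height field."""
    rng = np.random.default_rng(seed)
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=length_m / n)
    kx, ky = np.meshgrid(k, k)
    k_len = np.hypot(kx, ky)
    k_len[0, 0] = 1e-6                      # avoid division by zero at k = 0
    L = wind_speed ** 2 / g                 # largest wave driven by the wind
    cos_f = kx / k_len                      # alignment with wind along +x
    phillips = np.exp(-1.0 / (k_len * L) ** 2) / k_len ** 4 * cos_f ** 2
    phillips[0, 0] = 0.0                    # no DC (mean sea level) component
    amp = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) \
          * np.sqrt(phillips / 2.0)
    return np.real(np.fft.ifft2(amp)) * n * n

print(ocean_heightfield().shape)  # -> (64, 64) height samples
```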

    The Iray Light Transport Simulation and Rendering System

    While ray tracing has become increasingly common and path tracing is well understood by now, a major challenge lies in crafting an easy-to-use and efficient system implementing these technologies. Following a purely physically-based paradigm while still allowing for artistic workflows, the Iray light transport simulation and rendering system allows for rendering complex scenes at the push of a button and thus makes accurate light transport simulation widely available. In this document we discuss the challenges and implementation choices that follow from our primary design decisions, demonstrating that such a rendering system can be made a practical, scalable, and efficient real-world application that has been adopted by various companies across many fields and is in use by many industry professionals today.

    Interactive Visualization of the Largest Radioastronomy Cubes

    3D visualization is an important data analysis and knowledge discovery tool; however, interactive visualization of large 3D astronomical datasets poses a challenge for many existing data visualization packages. We present a solution to interactively visualize larger-than-memory 3D astronomical data cubes by utilizing a heterogeneous cluster of CPUs and GPUs. The system partitions the data volume into smaller sub-volumes that are distributed over the rendering workstations. GPU-based ray-casting volume rendering is performed to generate images for each sub-volume, which are composited into the whole-volume output and returned to the user. Datasets including the HI Parkes All Sky Survey (HIPASS - 12 GB) southern sky and the Galactic All Sky Survey (GASS - 26 GB) data cubes were used to demonstrate our framework's performance. The framework can render the GASS data cube with a maximum render time of < 0.3 seconds at an output resolution of 1024 x 1024 pixels using 3 rendering workstations and 8 GPUs. Our framework will scale to visualize larger datasets, even of terabyte order, if proper hardware infrastructure is available.
    Comment: 15 pages, 12 figures, Accepted New Astronomy July 201
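    The merge step the abstract describes is standard front-to-back compositing of the per-sub-volume images in depth order. A minimal sketch, assuming each workstation returns a premultiplied-alpha RGBA image (the input format is an assumption for illustration):

```python
import numpy as np

def composite_front_to_back(subvolume_rgba):
    """Combine per-sub-volume renderings into the final image with the
    front-to-back 'over' operator: each brick is ray-cast independently,
    then partial images are merged nearest-first. Inputs are premultiplied
    RGBA arrays ordered from nearest to farthest."""
    out = np.zeros_like(subvolume_rgba[0])
    for img in subvolume_rgba:
        remaining = 1.0 - out[..., 3:4]   # transparency not yet filled
        out += remaining * img            # C += (1 - A_dst) * C_src
    return out

# Example: three 4x4 partial renders with premultiplied alpha.
parts = [np.random.rand(4, 4, 4) * 0.3 for _ in range(3)]
final = composite_front_to_back(parts)
```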