
    Towards Fully Dynamic Surface Illumination in Real-Time Rendering using Acceleration Data Structures

    Get PDF
    Improvements in GPU hardware, including hardware-accelerated ray tracing, and the push for fully dynamic, realistic-looking video games have been driving more research into the use of ray tracing in real-time applications. The work described in this thesis covers multiple aspects, such as optimisations, adapting existing offline methods to real-time constraints, and adding effects that were hard to simulate without the new hardware, all working towards fully dynamic surface illumination rendered in real time. Our first main area of research concerns photon-based techniques, commonly used to render caustics. As many photons can be required for good coverage of the scene, an efficient approach for detecting which ones contribute to a pixel is essential. We improve that process by adapting and extending an existing acceleration data structure; if performance is paramount, we present an approximation which trades off some quality for a 2–3× improvement in rendering time. Tracing all the photons, especially when long paths are needed, becomes the dominant cost. As most paths do not change from frame to frame, we introduce a validation procedure allowing as many of them as possible to be reused, even in the presence of dynamic lights and objects. Previous algorithms for associating pixels and photons do not robustly handle specular materials, so we designed an approach leveraging ray tracing hardware to make caustics visible in mirrors or behind transparent objects. Our second research focus switches from a light-based perspective to a camera-based one, to improve the picking of light sources when shading: photon-based techniques are excellent for caustics, but less efficient for direct lighting estimation. When a scene has thousands of lights, only a handful can be evaluated at any given pixel due to time constraints. Current selection methods in video games are fast, but at the cost of introducing bias. By adapting an acceleration data structure from offline rendering that stochastically chooses a light source based on its importance, we provide unbiased direct lighting evaluation at about 30 fps. To support dynamic scenes, we organise it as a two-level system, making it possible to update only the parts containing moving lights, and to do so more efficiently. We built on the new ray tracing hardware to handle lighting situations that previously proved too challenging, and presented optimisations relevant for future algorithms in that space. These contributions will help reduce some artistic constraints when designing new virtual scenes for real-time applications.
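
    The light-selection structure mentioned above is, at its core, a hierarchy over the scene's light sources that is traversed stochastically. The sketch below is a minimal, generic illustration of that idea rather than the thesis implementation: every inner node of a binary light tree stores an importance weight, a child is chosen with probability proportional to its weight, and the product of those probabilities yields the PDF that keeps the direct-lighting estimate unbiased. All names are illustrative, and the static power-based weights are a simplifying assumption; a practical light hierarchy would refine the importance per shading point using distance and orientation.

```cpp
// A minimal, self-contained sketch (not the thesis implementation) of
// stochastically picking one light from a binary light tree: at every inner
// node we descend into a child with probability proportional to its weight,
// and the product of those probabilities is the PDF needed to keep the
// direct-lighting estimate unbiased. Weights are assumed positive.
#include <cstdio>
#include <random>
#include <vector>

struct Node {
    float weight;   // importance of this subtree (here: summed light power)
    int left = -1;  // child indices; -1 marks a leaf
    int right = -1;
    int light = -1; // light index stored at a leaf
};

// Returns the chosen light index and writes the probability of picking it.
// 'u' is a uniform random number in [0,1), rescaled and reused at each level.
int pickLight(const std::vector<Node>& tree, float u, float& pdf) {
    int n = 0;
    pdf = 1.0f;
    while (tree[n].left >= 0) {
        float wl = tree[tree[n].left].weight;
        float wr = tree[tree[n].right].weight;
        float pl = wl / (wl + wr);
        if (u < pl) { u /= pl;                    pdf *= pl;        n = tree[n].left;  }
        else        { u = (u - pl) / (1.0f - pl); pdf *= 1.0f - pl; n = tree[n].right; }
    }
    return tree[n].light;
}

int main() {
    // Tiny tree over three lights with powers 1, 4 and 5.
    std::vector<Node> tree = {
        {10.0f, 1, 2},       // root
        { 5.0f, 3, 4},       // inner node over lights 0 and 1
        { 5.0f, -1, -1, 2},  // leaf: light 2
        { 1.0f, -1, -1, 0},  // leaf: light 0
        { 4.0f, -1, -1, 1},  // leaf: light 1
    };
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> uniform(0.0f, 1.0f);
    float pdf = 0.0f;
    int light = pickLight(tree, uniform(rng), pdf);
    std::printf("picked light %d with pdf %.3f\n", light, pdf);
}
```

    Sampling proportionally to importance and dividing by the resulting PDF is what removes the bias that fixed light-culling heuristics introduce, at the cost of noise that shrinks as the importance measure improves.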

    The Iray Light Transport Simulation and Rendering System

    Full text link
    While ray tracing has become increasingly common and path tracing is well understood by now, a major challenge lies in crafting an easy-to-use and efficient system implementing these technologies. Following a purely physically-based paradigm while still allowing for artistic workflows, the Iray light transport simulation and rendering system renders complex scenes at the push of a button and thus makes accurate light transport simulation widely available. In this document we discuss the challenges and implementation choices that follow from our primary design decisions, demonstrating that such a rendering system can be made a practical, scalable, and efficient real-world application that has been adopted by various companies across many fields and is in use by many industry professionals today.

    Interactive caustics using local precomputed irradiance

    Get PDF
    Bright patterns of light focused via reflective or refractive objects onto matte surfaces are called "caustics". We present a method for rendering dynamic scenes with moving caustics at interactive rates. This technique requires some simplifying assumptions about caustic behavior, allowing us to treat the caustic as a local spatial property which we sample in a pre-processing stage. Storing the caustic locally reduces caustic rendering to a simple lookup. We examine a number of ways to represent this data, allowing us to trade off among accuracy, storage, run time, and precomputation time.
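
    As a rough illustration of the "precompute locally, look up at run time" idea described above (a sketch under assumed choices, not the paper's actual representation), the snippet below stores caustic irradiance in a regular 3D grid in the caster's local frame during preprocessing, so that rendering reduces to a clamped cell fetch; grid resolution is the knob that trades accuracy against storage and precomputation time.

```cpp
// Illustrative sketch only (not the paper's exact data layout): caustic
// irradiance sampled on a regular grid in the caster's local space during
// preprocessing, then evaluated at render time with a clamped cell lookup.
#include <algorithm>
#include <array>
#include <cstdio>
#include <vector>

struct CausticGrid {
    int res;                       // cells per axis
    float minB[3], maxB[3];        // local-space bounds of the sampled region
    std::vector<float> irradiance; // res*res*res precomputed values

    float lookup(const std::array<float, 3>& p) const {
        int idx[3];
        for (int a = 0; a < 3; ++a) {
            float t = (p[a] - minB[a]) / (maxB[a] - minB[a]); // 0..1 inside box
            int i = static_cast<int>(t * res);
            idx[a] = std::clamp(i, 0, res - 1);               // clamp to grid
        }
        return irradiance[(idx[2] * res + idx[1]) * res + idx[0]];
    }
};

int main() {
    // 2x2x2 grid over the unit cube, filled with placeholder values.
    CausticGrid grid{2, {0, 0, 0}, {1, 1, 1}, std::vector<float>(8, 0.0f)};
    grid.irradiance[7] = 3.5f;     // bright cell near the (1,1,1) corner
    std::printf("irradiance ~ %.2f\n", grid.lookup({0.9f, 0.9f, 0.9f}));
}
```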

    Photorealistic physically based render engines: a comparative study

    Full text link
    PĂ©rez Roig, F. (2012). Photorealistic physically based render engines: a comparative study. http://hdl.handle.net/10251/14797

    Real-Time Volumetric Shadows using 1D Min-Max Mipmaps

    Get PDF
    Light scattering in a participating medium is responsible for several important effects we see in the natural world. In the presence of occluders, computing single scattering requires integrating the illumination scattered towards the eye along the camera ray, modulated by the visibility towards the light at each point. Unfortunately, incorporating volumetric shadows into this integral, while maintaining real-time performance, remains challenging. In this paper we present a new real-time algorithm for computing volumetric shadows in single-scattering media on the GPU. This computation requires evaluating the scattering integral over the intersections of camera rays with the shadow map, expressed as a 2D height field. We observe that by applying epipolar rectification to the shadow map, each camera ray only travels through a single row of the shadow map (an epipolar slice), which allows us to find the visible segments by considering only 1D height fields. At the core of our algorithm is the use of an acceleration structure (a 1D min-max mipmap) which allows us to quickly find the lit segments for all pixels in an epipolar slice in parallel. The simplicity of this data structure and its traversal allows for an efficient implementation using only pixel shaders on the GPU.
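
    To make the role of the 1D acceleration structure concrete, here is a small CPU-side sketch (an illustrative assumption, not the paper's GPU implementation): given one epipolar slice of the shadow map as a 1D depth array, it builds the min-max mipmap levels that let a whole span of texels be classified at once, fully lit if the camera-ray depth stays below the span's minimum, fully occluded if it stays above the span's maximum, with traversal recursing only into ambiguous spans.

```cpp
// A small sketch, assuming a single epipolar slice of the shadow map is given
// as a 1D array of depths: build a 1D min-max mipmap so that whole spans of
// the height field can be conservatively classified (fully lit if the camera
// ray stays below the span's minimum, fully occluded if it stays above the
// maximum) without touching every texel.
#include <algorithm>
#include <cstdio>
#include <vector>

struct MinMax { float mn, mx; };

// levels[0] has one entry per texel; each coarser level halves the count.
std::vector<std::vector<MinMax>> buildMinMaxMipmap(const std::vector<float>& depth) {
    std::vector<std::vector<MinMax>> levels;
    std::vector<MinMax> base(depth.size());
    for (size_t i = 0; i < depth.size(); ++i) base[i] = {depth[i], depth[i]};
    levels.push_back(base);
    while (levels.back().size() > 1) {
        std::vector<MinMax> prev = levels.back();
        std::vector<MinMax> next((prev.size() + 1) / 2);
        for (size_t i = 0; i < next.size(); ++i) {
            MinMax a = prev[2 * i];
            MinMax b = prev[std::min(2 * i + 1, prev.size() - 1)]; // repeat last
            next[i] = {std::min(a.mn, b.mn), std::max(a.mx, b.mx)};
        }
        levels.push_back(next);
    }
    return levels;
}

int main() {
    std::vector<float> slice = {0.2f, 0.3f, 0.9f, 0.8f, 0.1f, 0.4f, 0.7f, 0.6f};
    std::vector<std::vector<MinMax>> levels = buildMinMaxMipmap(slice);
    std::printf("levels: %zu, root min %.1f, root max %.1f\n",
                levels.size(), levels.back()[0].mn, levels.back()[0].mx);
}
```

    Testing a span against its stored minimum and maximum first is what avoids visiting every texel along every camera ray.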

    Ray Tracing Gems

    Get PDF
    This book is a must-have for anyone serious about rendering in real time. With the announcement of new ray tracing APIs and hardware to support them, developers can easily create real-time applications with ray tracing as a core component. As ray tracing on the GPU becomes faster, it will play a more central role in real-time rendering. Ray Tracing Gems provides key building blocks for developers of games, architectural applications, visualizations, and more. Experts in rendering share their knowledge by explaining everything from nitty-gritty techniques that will improve any ray tracer to mastery of the new capabilities of current and future hardware.
    What you'll learn:
    ‱ The latest ray tracing techniques for developing real-time applications in multiple domains
    ‱ Guidance, advice, and best practices for rendering applications with Microsoft DirectX Raytracing (DXR)
    ‱ How to implement high-performance graphics for interactive visualizations, games, simulations, and more
    Who this book is for:
    ‱ Developers who are looking to leverage the latest APIs and GPU technology for real-time rendering and ray tracing
    ‱ Students looking to learn about best practices in these areas
    ‱ Enthusiasts who want to understand and experiment with their new GPU

    Recreating Early Islamic Glass Lamp Lighting

    Get PDF
    Early Islamic light sources are not simple, static, uniform points, and the fixtures themselves are often combinations of glass, water, fuel and flame. Various physically based renderers such as Radiance are widely used for modeling ancient architectural scenes; however, they rarely capture the true ambiance of the environment due to subtle lighting effects. Specifically, these renderers often fail to correctly model the complex caustics produced by glass fixtures, water level, and fuel sources. While the original fixtures of the 8th- through 10th-century Mosque of Córdoba in Spain have not survived, we have drawn on information gathered from earlier and contemporary sites and artifacts, including those from Byzantium, to assume that it was illuminated either by single jar lamps or by lamps supported in polycandela, which cast unique downward caustic lighting patterns that helped individuals navigate and read. To re-synthesize such lighting, we gathered experimental archaeological data and investigated and validated how various water levels and glass fixture shapes, likely used during early Islamic times, changed the overall light patterns and downward caustics. In this paper, we propose a technique called Caustic Cones, a novel data-driven method to 'shape' the light emanating from the lamps to better recreate the downward lighting without resorting to computationally expensive photon-mapping renderers. Additionally, we demonstrate on a rendering of the Mosque of Córdoba how our approach greatly benefits archaeologists and architectural historians by providing a more authentic visual simulation of early Islamic glass lamp lighting.

    Artistic Path Space Editing of Physically Based Light Transport

    Get PDF
    Generating realistic images is an important goal of computer graphics, with applications in the feature film industry, architecture, and medicine, among others. Physically based image synthesis, which has recently found broad acceptance across applications, relies on numerically simulating light transport along propagation paths prescribed by geometric optics, a model that suffices to achieve photorealism for typical scenes. On the whole, the computer-aided creation of images and animations with well-crafted, theoretically well-founded shading has become much simpler today. In practice, however, attention to details such as the characteristics of the output device remains important, and subproblems such as efficient physically based rendering in participating media are still far from being considered solved. Furthermore, image synthesis should be seen as part of a wider context: the effective communication of ideas and information. Whether it is the form and function of a building, the medical visualization of a CT scan, or the mood of a film sequence, messages in the form of digital images are omnipresent today. Unfortunately, the spread of the simulation-oriented methodology of physically based image synthesis has generally led to a loss of the intuitive, finely crafted, and local artistic control over the final image content that was available in earlier, less strict paradigms. The contributions of this dissertation cover different aspects of image synthesis: first, fundamental subpixel image synthesis as well as efficient rendering methods for participating media. At the centre of the work, however, are approaches to effectively understanding light propagation visually, which enable local artistic intervention while achieving consistent and plausible results at the global level. The key idea here is to perform the visualization and editing of light directly in the "path space" that encompasses all possible light paths. This stands in contrast to state-of-the-art methods, which either operate in image space or are tailored to specific, isolated lighting effects such as perfect reflections, shadows, or caustics. Evaluating the presented methods has shown that they can solve real-world image-generation problems in film production.

    Visually pleasing real-time global illumination rendering for fully-dynamic scenes

    Get PDF
    Global illumination (GI) rendering plays a crucial role in the photo-realistic rendering of virtual scenes. With the rapid development of graphics hardware, GI has become increasingly attractive even for real-time applications nowadays. However, the computation of physically-correct global illumination is time-consuming and cannot achieve real-time, or even interactive, performance. Although real-time GI is possible using a solution based on precomputation, such a solution cannot deal with fully-dynamic scenes. This dissertation focuses on solving these problems by introducing visually pleasing real-time global illumination rendering for fully-dynamic scenes. To this end, we develop a set of novel algorithms and techniques for rendering global illumination effects using the graphics hardware. All these algorithms not only achieve real-time or interactive performance, but also generate quality comparable to previous work in offline rendering. First, we present a novel implicit visibility technique to circumvent expensive visibility queries in hierarchical radiosity by evaluating the visibility implicitly. Thereafter, we focus on rendering visually plausible soft shadows, the most important GI effect caused by visibility determination. Based on pre-filtering shadow-mapping theory, we successively propose two real-time soft shadow mapping methods: "convolution soft shadow mapping" (CSSM) and "variance soft shadow mapping" (VSSM). Furthermore, we successfully apply our CSSM method to computing the shadow effects for indirect lighting. Finally, to explore GI rendering in participating media, we investigate a novel technique to interactively render volume caustics in single-scattering participating media.
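
    Both methods build on pre-filtering shadow-map theory. As a point of reference, the sketch below shows the generic variance-shadow-map estimator that VSSM extends (not the dissertation's full algorithms): from the two prefiltered moments E[z] and E[z^2], Chebyshev's one-tailed inequality bounds the fraction of the filter footprint that is lit; the numbers in main are made-up example values.

```cpp
// A hedged sketch of the standard variance-shadow-map estimator that VSSM
// extends (not the dissertation's full method): the shadow map stores the
// prefiltered moments E[z] and E[z^2]; Chebyshev's one-tailed inequality then
// gives an upper bound on the fraction of the filter footprint that is lit.
#include <algorithm>
#include <cstdio>

// moment1 = E[z], moment2 = E[z^2] over the filter footprint;
// receiverDepth = depth of the shaded point in the same (light) space.
float chebyshevVisibility(float moment1, float moment2, float receiverDepth) {
    if (receiverDepth <= moment1) return 1.0f;            // fully lit
    float variance = std::max(moment2 - moment1 * moment1, 1e-5f);
    float d = receiverDepth - moment1;
    return variance / (variance + d * d);                 // one-tailed bound
}

int main() {
    // Made-up example: blockers around depth 0.4 with some spread; receiver at 0.6.
    std::printf("visibility ~ %.3f\n", chebyshevVisibility(0.4f, 0.17f, 0.6f));
}
```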
    • 
