    BxDF material acquisition, representation, and rendering for VR and design

    Photorealistic and physically-based rendering of real-world environments with high-fidelity materials is important to a range of applications, including special effects, architectural modelling, cultural heritage, computer games, automotive design, and virtual reality (VR). Our perception of the world depends on lighting and surface material characteristics, which determine how light is reflected, scattered, and absorbed. To reproduce appearance, we must therefore understand all the ways objects interact with light, and the acquisition and representation of materials have thus been an important part of computer graphics from its early days. Nevertheless, no material model or acquisition setup is without limitations in terms of the variety of materials represented, and different approaches vary widely in compatibility and ease of use. In this course, we describe the state of the art in material appearance acquisition and modelling, ranging from mathematical BSDFs to data-driven capture and representation of anisotropic materials, and volumetric/thread models for patterned fabrics. We further address the problem of material appearance constancy across different rendering platforms. We present two case studies in architectural and interior design. The first demonstrates Yulio, a new platform for the creation, delivery, and visualization of acquired material models and reverse-engineered cloth models in immersive VR experiences. The second shows an end-to-end process of capture and data-driven BSDF representation using the physically-based Radiance system for lighting simulation and rendering.
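
    As a hedged illustration of the "mathematical BSDF" end of the spectrum mentioned above, the sketch below evaluates a simple analytic, isotropic BRDF (Lambertian diffuse plus normalized Blinn-Phong specular). The function and parameter names are illustrative assumptions, not taken from the course material.

        # Minimal sketch: evaluating a simple analytic BRDF (illustrative only).
        import numpy as np

        def normalize(v):
            return v / np.linalg.norm(v)

        def brdf_lambert_blinn(n, wi, wo, albedo=0.7, ks=0.3, shininess=64.0):
            """Isotropic BRDF: Lambertian diffuse + normalized Blinn-Phong specular."""
            n, wi, wo = map(normalize, (n, wi, wo))
            diffuse = albedo / np.pi                      # standard Lambertian lobe
            h = normalize(wi + wo)                        # half vector
            norm = (shininess + 8.0) / (8.0 * np.pi)      # common Blinn-Phong normalization
            specular = ks * norm * max(np.dot(n, h), 0.0) ** shininess
            return diffuse + specular

        # Reflected radiance for one directional light: L_o = f_r * L_i * max(dot(n, wi), 0)
        n  = np.array([0.0, 0.0, 1.0])
        wi = normalize(np.array([0.3, 0.0, 1.0]))    # light direction
        wo = normalize(np.array([-0.2, 0.1, 1.0]))   # view direction
        L_o = brdf_lambert_blinn(n, wi, wo) * 1.0 * max(np.dot(n, wi), 0.0)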

    Evaluation of climate-based daylighting techniques for complex fenestration and shading systems

    The latest advancements in glazing technology are driving facade design towards complex and adaptive fenestration systems. Accurate simulation of their optical properties and operational controls for building daylight performance evaluation requires advanced modelling techniques, such as climate-based daylight modelling (CBDM). At the same time, computational efficiency is key to quickly simulating this complex performance over a full year. Over the years, several CBDM techniques have been developed to address these two main challenges, but they have never been systematically benchmarked against each other. This paper compares state-of-the-art RADIANCE-based simulation techniques in terms of the annual daylight performance metrics required by national guidelines and international green building rating schemes. The comparison is performed on three different shading systems: diffuse Venetian blinds, specular Venetian blinds, and perforated solar screens. Findings show that the simulation methods are characterised by significant differences in their implementation and visual rendering, but most annual daylight metrics result in consistent values (within ± 20%). A notable exception is Annual Sunlight Exposure, which is highly sensitive to the chosen simulation method, with differences of up to 47 percentage points. Additional outcomes from the present work are used to compile a list of generalised recommendations for designers and policy makers.
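
    As context for the Annual Sunlight Exposure sensitivity noted above, the sketch below tallies an ASE-style metric from hourly direct-illuminance results. The 1000 lx / 250 h thresholds follow the commonly cited IES LM-83 definition; the array names and synthetic data are illustrative assumptions, not part of the paper's workflow.

        # Hedged sketch of an ASE-style tally from hourly direct illuminance.
        import numpy as np

        def annual_sunlight_exposure(direct_lux, lux_threshold=1000.0, hour_threshold=250):
            """direct_lux: array (n_hours, n_points) of direct-only illuminance in lux."""
            hours_over = (direct_lux > lux_threshold).sum(axis=0)   # hours above threshold per point
            failing = hours_over > hour_threshold                   # points exceeding the hour limit
            return 100.0 * failing.mean()                           # % of analysis points

        # Example with synthetic data: 8760 hours, 200 sensor points
        rng = np.random.default_rng(0)
        ase = annual_sunlight_exposure(rng.uniform(0, 3000, size=(8760, 200)))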

    Efficient multi-bounce lightmap creation using GPU forward mapping

    Computer graphics can nowadays produce images in real time that are hard to distinguish from photographs of a real scene. One of the most important aspects of achieving this is the interaction of light with materials in the virtual scene. The lighting computation can be separated into two parts. The first is concerned with the direct illumination applied to all surfaces lit by a light source; the related algorithms have been greatly improved over the last decades and, together with improvements in graphics hardware, can now produce realistic effects. The second is the indirect illumination, which describes the multiple reflections of light from each surface. In reality, light that hits a surface is never fully absorbed, but instead reflected back into the scene, and even this reflected light is then reflected again and again until its energy is depleted. These multiple reflections make indirect illumination very computationally expensive. The first problem regarding indirect illumination is therefore how to simplify it so it can be computed faster. Another question is where to compute it: it can either be computed in the fixed image created when rendering the scene, or it can be stored in a light map. The drawback of the first approach is that the results need to be recomputed for every frame in which the camera changes. The second approach, on the other hand, has been in use for a long time. Once a static scene has been set up, the lighting situation is computed regardless of the time it takes, and the result is stored in a light map. This is a texture atlas for the scene in which each surface point in the virtual scene has exactly one corresponding point in the 2D texture atlas. When displaying the scene with this approach, the indirect illumination does not need to be recomputed, but is simply sampled from the light map. The main contribution of this thesis is a technique that computes the indirect illumination solution for a scene at interactive rates and stores the result in a light atlas for visualization. To achieve this, we overcome two main obstacles. First, we need to be able to quickly project data from any given camera configuration into the parts of the texture that are currently used for visualizing the 3D scene. Since our approach for computing and storing indirect illumination requires a huge number of these projections, the projection needs to be as fast as possible; we therefore introduce a technique that performs it entirely on the graphics card with a single draw call. Second, the reflections of light into the scene need to be computed quickly. We therefore separate the computation into two steps: one that quickly approximates the spreading of light into the scene, and a second one that computes the visually smooth final result using the aforementioned projection technique. The final technique computes the indirect illumination at interactive rates even for large scenes. It is furthermore flexible enough to let the user choose between high-quality results and fast computations, which allows the method to be used for quickly editing the lighting situation with high-speed previews and then computing the final result in full quality, still at interactive rates. The technique introduced for projecting data into the texture atlas is in itself highly flexible and also allows for fast painting onto objects and projecting data onto them, taking all perspective distortions and self-occlusions into account.
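
    To make the light-map idea concrete, the sketch below shows a simplified CPU-side lookup of stored indirect illumination from a light atlas, assuming each surface point carries a unique (u, v) coordinate. Bilinear filtering is shown; the atlas layout and names are illustrative assumptions, not the thesis' GPU forward-mapping implementation.

        # Simplified CPU sketch of sampling indirect illumination from a light atlas.
        import numpy as np

        def sample_light_atlas(atlas, u, v):
            """atlas: (H, W, 3) float array of stored indirect radiance; u, v in [0, 1]."""
            h, w, _ = atlas.shape
            x, y = u * (w - 1), v * (h - 1)
            x0, y0 = int(np.floor(x)), int(np.floor(y))
            x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
            fx, fy = x - x0, y - y0
            top = (1 - fx) * atlas[y0, x0] + fx * atlas[y0, x1]
            bot = (1 - fx) * atlas[y1, x0] + fx * atlas[y1, x1]
            return (1 - fy) * top + fy * bot            # interpolated indirect radiance

        # At display time (conceptually):
        # final_color = direct_lighting(p) + albedo * sample_light_atlas(atlas, u, v)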

    A Novel Framework for Highlight Reflectance Transformation Imaging

    We propose a novel pipeline and related software tools for processing the multi-light image collections (MLICs) acquired in different application contexts to obtain shape and appearance information of captured surfaces, as well as to derive compact relightable representations of them. Our pipeline extends the popular Highlight Reflectance Transformation Imaging (H-RTI) framework, which is widely used in the Cultural Heritage domain. We support, in particular, perspective camera modeling, per-pixel interpolated light direction estimation, and light normalization correcting vignetting and uneven non-directional illumination. Furthermore, we propose two novel easy-to-use software tools to simplify all processing steps. The tools, in addition to supporting easy processing and encoding of pixel data, implement a variety of visualizations as well as multiple reflectance-model-fitting options. Experimental tests on synthetic and real-world MLICs demonstrate the usefulness of the novel algorithmic framework and the potential benefits of the proposed tools for end-user applications. Terms: "European Union (EU)" & "Horizon 2020" / Action: H2020-EU.3.6.3. - Reflective societies - cultural heritage and European identity / Acronym: Scan4Reco / Grant number: 665091. DSURF project (PRIN 2015) funded by the Italian Ministry of University and Research. Sardinian Regional Authorities under projects VIGEC and Vis&VideoLa
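
    As a hedged sketch of the kind of per-pixel model fitting such pipelines perform, the code below fits a biquadratic Polynomial Texture Map (PTM) to one pixel's intensities under the calibrated light directions of an MLIC and relights it for a new direction. This illustrates a generic reflectance-model-fitting option, not the paper's specific implementation.

        # Hedged sketch: per-pixel PTM fit from a multi-light image collection.
        import numpy as np

        def fit_ptm_pixel(light_dirs, intensities):
            """light_dirs: (N, 3) unit light vectors; intensities: (N,) pixel values.
            Returns the 6 PTM coefficients a0..a5."""
            lu, lv = light_dirs[:, 0], light_dirs[:, 1]
            A = np.stack([lu**2, lv**2, lu*lv, lu, lv, np.ones_like(lu)], axis=1)  # (N, 6)
            coeffs, *_ = np.linalg.lstsq(A, intensities, rcond=None)               # least squares
            return coeffs

        def relight_pixel(coeffs, lu, lv):
            """Evaluate the fitted PTM for a new light direction (lu, lv)."""
            return coeffs @ np.array([lu**2, lv**2, lu*lv, lu, lv, 1.0])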

    Towards Predictive Rendering in Virtual Reality

    The pursuit of predictive images, i.e., images representing radiometrically correct renditions of reality, has been a longstanding problem in computer graphics. The exactness of such images is extremely important for Virtual Reality applications like Virtual Prototyping, where users need to make decisions impacting large investments based on the simulated images. Unfortunately, generation of predictive imagery is still an unsolved problem for manifold reasons, especially if real-time restrictions apply. First, existing scenes used for rendering are not modeled accurately enough to create predictive images. Second, even with huge computational effort, existing rendering algorithms are not able to produce radiometrically correct images. Third, current display devices need to convert rendered images into some low-dimensional color space, which prohibits display of radiometrically correct images. Overcoming these limitations is the focus of current state-of-the-art research, and this thesis also contributes to this task. First, it briefly introduces the necessary background and identifies the steps required for real-time predictive image generation. Then, existing techniques targeting these steps are presented and their limitations are pointed out. To solve some of the remaining problems, novel techniques are proposed. They cover various steps in the predictive image generation process, ranging from accurate scene modeling over efficient data representation to high-quality, real-time rendering. A special focus of this thesis lies on real-time generation of predictive images using bidirectional texture functions (BTFs), i.e., very accurate representations for spatially varying surface materials. The techniques proposed by this thesis enable efficient handling of BTFs by compressing the huge amount of data contained in this material representation, applying them to geometric surfaces using texture and BTF synthesis techniques, and rendering BTF-covered objects in real time. Further approaches proposed in this thesis target the inclusion of real-time global illumination effects and more efficient rendering using novel level-of-detail representations for geometric objects. Finally, this thesis assesses the rendering quality achievable with BTF materials, indicating a significant increase in realism but also confirming the problems that remain to be solved to achieve truly predictive image generation.
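
    The following generic sketch conveys why compression makes BTFs tractable: the measured data is arranged as a matrix (texels x view/light samples) and approximated by a low-rank factorization. The thesis proposes its own compression schemes; this truncated-SVD example is only an illustrative assumption of the general idea.

        # Hedged sketch: low-rank compression of a BTF data matrix.
        import numpy as np

        def compress_btf(btf_matrix, rank=16):
            """btf_matrix: (n_texels, n_view_light_samples). Returns low-rank factors."""
            U, S, Vt = np.linalg.svd(btf_matrix, full_matrices=False)
            return U[:, :rank] * S[:rank], Vt[:rank]      # (n_texels, r), (r, n_samples)

        def reconstruct_sample(texel_factors, basis, texel, sample):
            """Decompress a single (texel, view/light sample) value at render time."""
            return texel_factors[texel] @ basis[:, sample]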

    Radiometric modeling of mechanical draft cooling towers to assist in the extraction of their absolute temperature from remote thermal imagery

    Determination of the internal temperature of a mechanical draft cooling tower (MDCT) from remotely-sensed thermal imagery is important for many applications that provide input to energy-related process models. The problem of determining the temperature of an MDCT is unique due to the geometry of the tower and due to the exhausted water vapor plume. The radiance leaving the tower depends on the optical and thermal properties of the tower materials (i.e., emissivity, BRDF, temperature, etc.) as well as the internal geometry of the tower. The tower radiance is then propagated through the exhaust plume and through the atmosphere to arrive at the sensor. The expelled effluent from the tower consists of a warm plume with a higher water vapor concentration than the ambient atmosphere. Given that a thermal image has been atmospherically compensated, the remaining sources of error in the extracted tower temperature due to the exhausted plume and the tower geometry must be accounted for. A temperature correction factor due to these error sources is derived through the use of three-dimensional radiometric modeling. A range of values for each important parameter is modeled to create a target space (i.e., look-up table) that predicts the internal MDCT temperature for every combination of parameter values. The look-up table provides data for the creation of a fast-running parameterized model. This model, along with user knowledge of the scene, provides a means to convert the image-derived apparent temperature into the estimated absolute temperature of an MDCT.
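
    As a hedged sketch of how such a look-up table can back a fast-running parameterized model, the code below interpolates a temperature correction over a few scene parameters and applies it to the image-derived apparent temperature. The parameter axes, table contents, and names are illustrative assumptions, not values from the work.

        # Hedged sketch: look-up table interpolation as a parameterized correction model.
        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Axes of the modeled target space (example values, not from the study)
        plume_water_vapor = np.linspace(0.0, 1.0, 11)    # relative plume concentration
        view_angle_deg    = np.linspace(0.0, 60.0, 7)
        ambient_temp_K    = np.linspace(270.0, 310.0, 9)

        # correction_table[i, j, k] = delta T (K) from 3D radiometric modeling runs
        correction_table = np.zeros((11, 7, 9))          # placeholder for modeled results

        correction = RegularGridInterpolator(
            (plume_water_vapor, view_angle_deg, ambient_temp_K), correction_table)

        def estimate_internal_temperature(apparent_temp_K, vapor, angle, ambient):
            """Apparent (atmospherically compensated) temperature -> estimated MDCT temperature."""
            return apparent_temp_K + correction([[vapor, angle, ambient]])[0]

        est = estimate_internal_temperature(apparent_temp_K=295.0, vapor=0.4, angle=30.0, ambient=290.0)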

    Radiation techniques for urban thermal simulation with the Finite Element Method

    Modern societies are increasingly organized in cities. At present, more than half of the world's population lives in urban settlements. In this context, work at the architectural and building scales needs to extend its scope to the urban environment. One of the main challenges of these times is understanding all the thermal exchanges that happen in the city. The radiative part appears as the least developed one; its characterization and interaction with built structures has gained attention from building physics, architecture, and environmental engineering. Providing a linkage between these areas, the emerging field of urban physics has become important for tackling studies of this nature. Urban thermal studies are intrinsically linked to multidisciplinary work. Performing full-scale measurements is hard, and prototype models are difficult to develop. Therefore, computational simulations are essential in order to understand how the city behaves and to evaluate projected modifications. The methodological and algorithmic improvement of simulation is one of the main lines of work for computational physics and many areas of computer science. The field of computer graphics has addressed the adaptation of rendering algorithms to daylighting using physically based radiation models on architectural scenes. The Finite Element Method (FEM) has been widely used for thermal analysis. The maturity achieved by FEM software allows for treating very large models with high geometric detail and complexity. However, computing radiation exchanges in this context poses a hard computational challenge and forces us to push the limits of existing physical models. Computer graphics techniques can be adapted to FEM to estimate solar loads. In the thermal radiation range, the memory required to store the interaction between the elements grows because all urban surfaces become radiation sources. In this thesis, a FEM-based methodology for urban thermal analysis is presented. A set of radiation techniques (for both solar and thermal radiation) is developed and integrated into the FEM software Cast3m. Radiosity and ray tracing are used as the main algorithms for radiation computations. Several studies are performed for different city scenes. The FEM simulation results are compared with measured temperatures obtained by means of urban thermography. Post-processing techniques are used to obtain rendered thermograms, showing that the proposed methodology produces accurate results for the cases analyzed. Moreover, its good computational performance allows this kind of study to be performed using regular desktop PCs.
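
    As a generic sketch of the radiosity balance such simulations rely on (not the Cast3m implementation), the code below solves B = E + rho * F B with simple Jacobi iteration; the form-factor matrix F is the object whose storage drives the memory cost mentioned above. The names and toy data are illustrative assumptions.

        # Generic radiosity sketch: B = E + rho * F @ B, solved by Jacobi iteration.
        import numpy as np

        def solve_radiosity(F, E, rho, iterations=100):
            """F: (n, n) form factors, E: (n,) emitted radiosity, rho: (n,) reflectances."""
            B = E.copy()
            for _ in range(iterations):
                B = E + rho * (F @ B)      # gather reflected radiosity from all patches
            return B

        # Example with 3 mutually visible patches (toy form factors)
        F   = np.array([[0.0, 0.3, 0.2], [0.3, 0.0, 0.1], [0.2, 0.1, 0.0]])
        E   = np.array([100.0, 0.0, 0.0])   # only patch 0 emits
        rho = np.array([0.5, 0.7, 0.3])
        B   = solve_radiosity(F, E, rho)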

    Photorealistic physically based render engines: a comparative study

    Pérez Roig, F. (2012). Photorealistic physically based render engines: a comparative study. http://hdl.handle.net/10251/14797