8 research outputs found

    A Radiance Model for Predicting Radio Wave Propagation in Irregular Dense Urban Areas

    We present a deterministic model of radio wave propagation based on radiance transfers. Our model uses radiosity techniques to determine facet-to-facet specular reflections according to a three-dimensional building description. The model contains two main components. First, visibility between elements is determined and used to establish links that represent radiance transfers, including diffraction and free-space losses based on geometric approximations. Second, the links are used to define a transfer equation whose solution provides the interreflections. The solution is obtained using hierarchical techniques. The results of the model show good agreement with measurements made in urban areas.
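    The abstract does not spell out the transfer equation, but the radiosity-style system it describes has the familiar fixed-point form B = E + R F B. The sketch below solves such a system by plain Jacobi iteration (the paper itself uses hierarchical techniques, which this does not reproduce); all names are illustrative.

```python
def solve_transfer(E, F, rho, iters=200):
    """Fixed-point (Jacobi) solution of a radiosity-style transfer equation
    B_i = E_i + rho_i * sum_j F[i][j] * B_j, where E is emitted radiance per
    facet, F holds facet-to-facet transfer factors (visibility, diffraction
    and free-space losses folded in), and rho holds facet reflectivities."""
    n = len(E)
    B = list(E)
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B
```

    For two mutually visible facets with transfer factor 0.5 and reflectivity 0.5, the iteration converges to the exact solution of the 2x2 linear system.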

    An Overview of BRDF Models

    This paper focuses on the Bidirectional Reflectance Distribution Function (BRDF) in the context of algorithms for the computational production of realistic synthetic images. We provide a review of the most relevant analytical BRDF models proposed in the literature that have been used for realistic rendering. We also show different approaches used for obtaining efficient models from acquired reflectance data, and the related function-fitting techniques, suitable for using that data in efficient rendering algorithms. We consider algorithms for the computation of BRDF integrals using Monte Carlo-based numerical integration. In this context, we review known techniques to design efficient BRDF sampling schemes for both analytical and measured BRDF models. The authors have been partially supported by the Spanish Research Program under project TIN2004-07672-C03-02 and the Andalusian Research Program under project P08-TIC-03717.
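    As a small generic illustration of the Monte Carlo BRDF integration the survey covers (not code from the paper), the sketch below estimates outgoing radiance for a Lambertian BRDF under constant incident radiance using uniform hemisphere sampling; the closed-form answer is albedo × L_i, which the estimator converges to.

```python
import math, random

def reflected_radiance(albedo, L_i, n=200000, seed=1):
    """Monte Carlo estimate of L_o = integral of (albedo/pi) * L_i * cos(theta)
    over the hemisphere, sampling directions uniformly (pdf = 1/(2*pi)).
    For a Lambertian BRDF the exact result is albedo * L_i."""
    rng = random.Random(seed)
    f = albedo / math.pi                 # constant Lambertian BRDF value
    total = 0.0
    for _ in range(n):
        cos_theta = rng.random()         # uniform hemisphere: cos(theta) ~ U[0,1]
        total += f * L_i * cos_theta * 2.0 * math.pi   # divide by the pdf
    return total / n
```

    Importance sampling, as reviewed in the paper, would instead draw directions proportionally to cos(theta) (or to the BRDF lobe), making each sample's weight constant and reducing variance.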

    Offset Surface Light Fields

    For producing realistic images, reflection is an important visual effect. Reflections of the environment matter not only for highly reflective objects, such as mirrors, but also for more common materials such as brushed metals and glossy plastics. Generating these reflections accurately at real-time rates for interactive applications, however, is a difficult problem. Previous work in this area has made assumptions that sacrifice accuracy in order to preserve interactivity. I will present an algorithm that tries to handle reflection accurately in the general case for real-time rendering. The algorithm uses a database of prerendered environment maps to render both the original object itself and an additional bidirectional reflection distribution function (BRDF). The algorithm performs image-based rendering in reflection space in order to achieve accurate results. It also uses graphics processing unit (GPU) features to accelerate rendering.
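    A basic building block of any environment-map reflection scheme like the one described is the mirror-reflection direction used to index the map. A minimal sketch, not the thesis's actual code:

```python
def reflect(d, n):
    """Mirror direction r = d - 2*(d.n)*n for an incoming direction d and a
    unit surface normal n; the result is used to look up the prerendered
    environment map."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))
```

    For example, a ray arriving straight down onto an upward-facing normal bounces straight back up.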

    Doctor of Philosophy

    While boundary representations, such as nonuniform rational B-spline (NURBS) surfaces, have traditionally served the needs of the modeling community well, they have not seen widespread adoption in the wider engineering discipline. There is a common perception that NURBS are slow to evaluate and complex to implement. Whereas computer-aided design commonly deals with surfaces, the engineering community must deal with materials that have thickness. Traditional visualization techniques have avoided NURBS, and there has been little cross-talk between the rich spline approximation community and the larger engineering field. Recently there has been a strong desire to marry the modeling and analysis phases of the iterative design cycle, be it in car design, turbulent flow simulation around an airfoil, or lighting design. Research has demonstrated that employing a single representation throughout the cycle has key advantages. Furthermore, novel manufacturing techniques employing heterogeneous materials require the introduction of volumetric modeling representations. There is little question that fields such as scientific visualization and mechanical engineering could benefit from the powerful approximation properties of splines. In this dissertation, we remove several hurdles to the application of NURBS to problems in engineering and demonstrate how their unique properties can be leveraged to solve problems of interest.
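    On the perceived hurdle that NURBS are slow to evaluate: the core evaluation is a short recurrence. A hedged sketch of the standard de Boor algorithm for a (non-rational) B-spline curve with scalar control points, not taken from the dissertation:

```python
def de_boor(k, x, t, c, p):
    """Evaluate a degree-p B-spline curve at parameter x via the de Boor
    recurrence. k is the knot span with t[k] <= x < t[k+1], t the knot
    vector, and c the (scalar) control points."""
    d = [c[j + k - p] for j in range(p + 1)]          # local control points
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]
```

    With a clamped knot vector [0, 0, 0, 1, 1, 1] and degree 2, this reproduces a quadratic Bezier curve; rational (NURBS) evaluation adds only a division by a similarly evaluated weight spline.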

    Image based surface reflectance remapping for consistent and tool independent material appearance

    Physically-based rendering in computer graphics requires knowledge of material properties in addition to 3D shapes, textures and colors in order to solve the rendering equation. A number of material models have been developed, since no single model is currently able to reproduce the full range of available materials. Although only a few material models have been widely adopted in current rendering systems, the lack of standardisation causes several issues in the 3D modelling workflow, leading to a heavy tool dependency of material appearance. In industry, final decisions about products are often based on a virtual prototype, a crucial step in the production pipeline, usually developed through a collaboration among several departments that exchange data. Unfortunately, exchanged data often differs from the original when imported into a different application. As a result, delivering consistent visual results requires time, labour and computational cost. This thesis begins with an examination of the current state of the art in material appearance representation and capture, in order to identify a suitable strategy to tackle material appearance consistency. Automatic solutions to this problem are suggested in this work, accounting for the constraints of real-world scenarios, where the only available information is a reference rendering and the renderer used to obtain it, with no access to the implementation of the shaders. In particular, two image-based frameworks working under these constraints are proposed. The first, validated by means of perceptual studies, is aimed at remapping BRDF parameters and is useful when the parameters used for the reference rendering are available. The second provides consistent material appearance across different renderers even when the parameters used for the reference are unknown: it allows the selection of an arbitrary reference rendering tool and manipulates the output of other renderers to be consistent with the reference.
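    The black-box setting the thesis describes, where only a reference rendering is available, can be caricatured as follows: treat each renderer as a function from a material parameter to per-pixel intensities and search for the target parameter whose output best matches the reference. The toy "renderer" below is a 1D glossy lobe invented for this sketch, not the thesis's actual shaders or method.

```python
import math

def render(exponent, angles):
    """Toy black-box renderer: a cosine-lobe intensity per sample angle."""
    return [math.cos(a) ** exponent for a in angles]

def remap(reference, angles, candidates):
    """Pick the candidate parameter whose rendering is closest to the
    reference image in the least-squares sense."""
    def err(e):
        img = render(e, angles)
        return sum((p - q) ** 2 for p, q in zip(img, reference))
    return min(candidates, key=err)
```

    A grid search suffices here; a practical system would use a proper optimizer and a perceptual rather than plain L2 error.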

    Interactive illumination and navigation control in an image-based environment.

    Fu Chi-wing. Thesis (M.Phil.), Chinese University of Hong Kong, 1999. Includes bibliographical references. Contents:
    Chapter 1, Introduction: image-based rendering; the scene-complexity-independent property; applications of this research; organization of the thesis.
    Chapter 2, Illumination Control: apparent BRDF of a pixel; sampling illumination information; re-rendering (light direction, light intensity, multiple light sources, types of light sources); data compression via intra-pixel and inter-pixel coherence; implementation and results (an interactive viewer, lazy re-rendering).
    Chapter 3, Navigation Control with a Triangle-based Warping Rule: related works; epipolar geometry on the perspective projection manifold; drawing order for pixel-sized entities; triangle-based image warping (image-based triangulation, image-based visibility sorting, topological sorting); results.
    Chapter 4, Panoramic Projection Manifold: epipolar geometry on the spherical projection manifold; image triangulation via optical flow, image gradient, and a potential function; image-based visibility sorting (mapping criteria, ordering of two triangles, graph construction and topological sort); results.
    Chapter 5, Panoramic-based Navigation using Real Photos: system overview; correspondence matching (epipolar geometry between neighboring panoramic nodes, line and patch correspondence matching); layered triangle-based warping (patch and layer construction, triangulation and mesh subdivision); implementation (image sampler and panoramic stitcher, interactive correspondence matcher, basic and web-based panoramic viewers, walkthrough control); results.
    Chapter 6, Compositing Warped Images for Object-based Viewing: modeling object-based viewing; triangulation and convex-hull criteria; three methods for warping multiple views; results.
    Chapter 7, Complete Rendering Pipeline: reviews of the illumination and navigation pipelines; combining them at the architectural level; ensuring physical correctness; generalizing the apparent BRDF and related encoding work.
    Chapter 8, Conclusion.
    Appendices: spherical harmonics; a proof that cycles rarely exist in the drawing-order graph; derivation of the epipolar line formula on a cylindrical projection manifold (shown to be a sine-curve segment); publications related to this research work.
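    The triangle-based warping the thesis builds on (Chapters 3 to 5) transfers each pixel by its barycentric coordinates inside a source-image triangle to the corresponding destination triangle, i.e., a per-triangle affine map. A generic sketch of that building block, with names invented here:

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates (u, v, w) of 2D point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return u, v, 1.0 - u - v

def warp_point(p, src_tri, dst_tri):
    """Map p from the source triangle to the destination triangle by
    preserving its barycentric coordinates (a per-triangle affine warp)."""
    u, v, w = barycentric(p, *src_tri)
    (ax, ay), (bx, by), (cx, cy) = dst_tri
    return (u * ax + v * bx + w * cx, u * ay + v * by + w * cy)
```

    The visibility-sorting machinery of Chapters 3 and 4 then decides the order in which warped triangles are drawn so that occlusion is resolved without a depth buffer.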

    Towards Predictive Rendering in Virtual Reality

    The quest for generating predictive images, i.e., images representing radiometrically correct renditions of reality, has been a longstanding problem in computer graphics. The exactness of such images is extremely important for Virtual Reality applications like Virtual Prototyping, where users need to make decisions impacting large investments based on the simulated images. Unfortunately, the generation of predictive imagery is still an unsolved problem for manifold reasons, especially if real-time restrictions apply. First, existing scenes used for rendering are not modeled accurately enough to create predictive images. Second, even with huge computational effort, existing rendering algorithms are not able to produce radiometrically correct images. Third, current display devices need to convert rendered images into some low-dimensional color space, which prohibits the display of radiometrically correct images. Overcoming these limitations is the focus of current state-of-the-art research, and this thesis contributes to that task. First, it briefly introduces the necessary background and identifies the steps required for real-time predictive image generation. Then, existing techniques targeting these steps are presented and their limitations are pointed out. To solve some of the remaining problems, novel techniques are proposed. They cover various steps in the predictive image generation process, ranging from accurate scene modeling over efficient data representation to high-quality, real-time rendering. A special focus of this thesis lies on the real-time generation of predictive images using bidirectional texture functions (BTFs), i.e., very accurate representations for spatially varying surface materials.
The techniques proposed by this thesis enable efficient handling of BTFs by compressing the huge amount of data contained in this material representation, applying it to geometric surfaces using texture and BTF synthesis techniques, and rendering BTF-covered objects in real time. Further approaches proposed in this thesis target the inclusion of real-time global illumination effects and more efficient rendering using novel level-of-detail representations for geometric objects. Finally, this thesis assesses the rendering quality achievable with BTF materials, indicating a significant increase in realism but also confirming the problems that remain to be solved to achieve truly predictive image generation.
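    One common way to compress the huge BTF data volume the abstract mentions, though not necessarily the thesis's exact scheme, is a truncated SVD: arrange the measurements as a matrix whose rows are texels and whose columns are view/light configurations, then keep only the leading rank-k factors. A hedged sketch with invented names:

```python
import numpy as np

def compress_btf(data, k):
    """data: (num_texels, num_view_light_configs) matrix of reflectance
    samples. Returns rank-k factors whose product approximates the data."""
    U, s, Vt = np.linalg.svd(data, full_matrices=False)
    return U[:, :k] * s[:k], Vt[:k]      # (texel basis, per-config weights)

def decompress_btf(texel_factors, config_factors):
    """Reconstruct the (approximate) BTF matrix from the rank-k factors."""
    return texel_factors @ config_factors
```

    At render time, only the k per-texel coefficients and the small per-configuration basis need to be stored and combined on the GPU, which is what makes real-time BTF rendering feasible.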