Rendering Deformable Surface Reflectance Fields
Animation of photorealistic computer graphics models is an important goal for many applications. Image-based modeling has emerged as a promising approach to capture and visualize real-world objects. Animating image-based models, however, is still a largely unsolved problem. In this paper, we extend a popular image-based representation called surface reflectance field to animate and render deformable real-world objects under arbitrary illumination. Deforming the surface reflectance field is achieved by modifying the underlying impostor geometry. We augment the impostor by a local parameterization that allows the correct evaluation of acquired reflectance images, preserving the original light model on the deformed surface. We present a deferred shading scheme to handle the increased amount of data involved in shading the deformable surface reflectance field. We show animations of various objects that were acquired with 3D photography.
An Overview of BRDF Models
This paper is focused on the Bidirectional Reflectance Distribution Function (BRDF) in the context of algorithms for computational production of realistic synthetic images. We provide a review of the most relevant analytical BRDF models proposed in the literature which have been used for realistic rendering. We also show different approaches used for obtaining efficient models from acquired reflectance data, and the related function fitting techniques, suitable for using that data in efficient rendering algorithms. We consider algorithms for computation of BRDF integrals, using Monte Carlo based numerical integration. In this context, we review known techniques to design efficient BRDF sampling schemes for both analytical and measured BRDF models.
The authors have been partially supported by the Spanish Research Program under project TIN2004-07672-C03-02 and the Andalusian Research Program under project P08-TIC-03717.
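The Monte Carlo BRDF integration and importance sampling that this survey reviews can be sketched as follows. This is a minimal illustration, not code from the paper: it estimates the reflected-radiance integral for a Lambertian BRDF under constant incident radiance using cosine-weighted hemisphere sampling, and all function names are invented for the example.

```python
import numpy as np

def cosine_sample_hemisphere(rng, n):
    """Cosine-weighted hemisphere directions; pdf(w) = cos(theta) / pi."""
    u1, u2 = rng.random(n), rng.random(n)
    r, phi = np.sqrt(u1), 2.0 * np.pi * u2
    # Local frame with z along the surface normal; z = cos(theta).
    return np.stack([r * np.cos(phi), r * np.sin(phi), np.sqrt(1.0 - u1)], axis=1)

def estimate_reflected_radiance(albedo, incident_radiance, n=10_000, seed=0):
    """Monte Carlo estimate of the hemisphere integral of
    f_r * L_i * cos(theta), for a Lambertian BRDF f_r = albedo / pi and
    constant L_i, using cosine-weighted importance sampling."""
    rng = np.random.default_rng(seed)
    w = cosine_sample_hemisphere(rng, n)
    cos_theta = w[:, 2]
    pdf = cos_theta / np.pi          # sampling density of the directions
    f_r = albedo / np.pi             # constant (Lambertian) BRDF value
    return np.mean(f_r * incident_radiance * cos_theta / pdf)
```

For this particular combination the per-sample weight is constant (albedo times L_i), so sampling proportionally to the integrand yields a zero-variance estimator; the sampling schemes the paper reviews aim to approximate this ideal for glossy analytical and measured BRDFs.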
Data-Driven Reflectance Estimation Under Natural Lighting
Bidirectional Reflectance Distribution Functions (BRDFs) describe how light is reflected off of a material. BRDFs are captured so that materials can be re-lit under new lighting while maintaining accuracy. BRDF models can approximate the reflectance of a material, but are unable to accurately represent its full BRDF. Acquisition setups for BRDFs trade accuracy for speed, with the most accurate devices, gonioreflectometers, being the slowest. Image-based BRDF acquisition approaches range from complicated controlled lighting setups, to uncontrolled but known lighting, to settings where the lighting is unknown. We propose a data-driven method for recovering BRDFs under known but uncontrolled lighting. This approach utilizes a dataset of 100 measured BRDFs to accurately reconstruct the BRDF from a single photograph. We model the BRDFs as Gaussian Mixture Models (GMMs) and use an Expectation Maximization (EM) approach to determine cluster membership. We apply this approach to captured data as well as synthetic data. We continue this work by relaxing assumptions about lighting, material, or geometry. This work was supported in part by NSF grant IIS-1350323 and gifts from Google, Activision, and Nvidia.
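The EM-for-GMM machinery the abstract mentions can be illustrated with a toy one-dimensional mixture; the actual method clusters high-dimensional measured BRDF data, so this sketch only shows the E-step/M-step structure, and the function name and initialization are illustrative assumptions.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100):
    """Toy EM for a 1-D Gaussian mixture: returns soft cluster memberships
    (responsibilities), component means, variances, and mixing weights."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread-out initial means
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: r[i, j] = P(component j | x_i) under current parameters.
        sq = (x[:, None] - mu[None, :]) ** 2
        r = pi * np.exp(-0.5 * sq / var) / np.sqrt(2.0 * np.pi * var)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities.
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu[None, :]) ** 2).sum(axis=0) / nk + 1e-12
        pi = nk / len(x)
    return r, mu, var, pi
```

On well-separated data the responsibilities converge to near-hard cluster assignments, which is the "cluster membership" role EM plays in the described reconstruction pipeline.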
Intuitive and Accurate Material Appearance Design and Editing
Creating and editing high-quality materials for photorealistic rendering can be a difficult task due to the diversity and complexity of material appearance. Material design is the process by which artists specify the reflectance properties of a surface, such as its diffuse color and specular roughness. Even with the support of commercial software packages, material design can be a time-consuming trial-and-error task due to the counter-intuitive nature of the complex reflectance models. Moreover, many material design tasks require the physical realization of virtually designed materials as the final step, which makes the process even more challenging due to rendering artifacts and the limitations of fabrication. In this dissertation, we propose a series of studies and novel techniques to improve the intuitiveness and accuracy of material design and editing. Our goal is to understand how humans visually perceive materials, simplify user interaction in the design process, and improve the accuracy of the physical fabrication of designs. Our first work focuses on understanding the perceptual dimensions of measured material data. We build a perceptual space based on a low-dimensional reflectance manifold that is computed from crowd-sourced data using a multi-dimensional scaling model. Our analysis shows that the proposed perceptual space is consistent with the physical interpretation of the measured data. We also put forward a new material editing interface that takes advantage of the proposed perceptual space. We visualize each dimension of the manifold to help users understand how it changes the material appearance. Our second work investigates the relationship between translucency and glossiness in material perception. We conduct two human subject studies to test if subsurface scattering impacts gloss perception and examine how the shape of an object influences this perception.
Based on our results, we discuss why it is necessary to include transparent and translucent media in future research on gloss perception and material design. Our third work addresses user interaction in the material design system. We present a novel Augmented Reality (AR) material design prototype, which allows users to visualize their designs against a real environment and lighting. We believe introducing AR technology can make the design process more intuitive and improve the authenticity of the results for both novice and experienced users. To test this assumption, we conduct a user study to compare our prototype with a traditional material design system with a gray-scale background and synthetic lighting. The results demonstrate that with the help of AR techniques, users perform better in terms of objectively measured accuracy and time, and they are subjectively more satisfied with their results. Finally, our last work turns to the challenge presented by the physical realization of designed materials. We propose a learning-based solution to map the virtually designed appearance to a meso-scale geometry that can be easily fabricated. Essentially, this is a fitting problem, but compared with previous solutions, our method can provide the fabrication recipe with higher reconstruction accuracy over a large fitting gamut. We demonstrate the efficacy of our solution by comparing the reconstructions with existing solutions and comparing fabrication results with the original design. We also provide an application of bi-scale material editing using the proposed method.
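The perceptual-space step described above (multi-dimensional scaling of crowd-sourced similarity judgments) can be sketched with classical MDS, which embeds a pairwise dissimilarity matrix into a low-dimensional space. This is a generic illustration under the assumption of classical (Torgerson) MDS, not the dissertation's actual pipeline; names and the synthetic input are invented.

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed an (n, n) pairwise-dissimilarity matrix D (e.g. crowd-sourced
    perceptual distances between materials) into k dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, v = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # keep the k largest
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

When the dissimilarities are exactly Euclidean distances in k dimensions, this recovers the original configuration up to rotation and translation; for noisy human judgments it yields the best low-rank approximation in the least-squares sense.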
A Data-Driven Reflectance Model
We present a generative model for isotropic bidirectional reflectance distribution functions (BRDFs) based on acquired reflectance data. Instead of using analytical reflectance models, we represent each BRDF as a dense set of measurements. This allows us to interpolate and extrapolate in the space of acquired BRDFs to create new BRDFs. We treat each acquired BRDF as a single high-dimensional vector taken from a space of all possible BRDFs. We apply both linear (subspace) and non-linear (manifold) dimensionality reduction tools in an effort to discover a lower-dimensional representation that characterizes our measurements. We let users define perceptually meaningful parametrization directions to navigate in the reduced-dimension BRDF space. On the low-dimensional manifold, movement along these directions produces novel but valid BRDFs.
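The linear (subspace) part of this approach can be sketched as follows: stack each measured BRDF as one row of a matrix, find a low-dimensional basis with PCA, and interpolate between two materials in the reduced coordinates. Function names and the synthetic data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def brdf_subspace(brdfs, k):
    """brdfs: (n_materials, n_samples), one flattened measured BRDF per row.
    Returns the data mean, a k-vector PCA basis, and per-BRDF coordinates."""
    mean = brdfs.mean(axis=0)
    # SVD of the centered data: rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(brdfs - mean, full_matrices=False)
    basis = vt[:k]
    coords = (brdfs - mean) @ basis.T
    return mean, basis, coords

def interpolate_brdfs(mean, basis, c0, c1, t):
    """Blend two BRDFs by moving between their subspace coordinates."""
    return mean + ((1.0 - t) * c0 + t * c1) @ basis
```

Interpolating in the reduced space rather than averaging raw measurements is what lets intermediate points remain plausible BRDFs; the paper's manifold tools extend this idea beyond a single linear subspace.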