    An intuitive control space for material appearance

    Many different techniques for measuring material appearance have been proposed in the last few years. These have produced large public datasets, which have been used for accurate, data-driven appearance modeling. However, although these datasets have allowed us to reach an unprecedented level of realism in visual appearance, editing the captured data remains a challenge. In this paper, we present an intuitive control space for predictable editing of captured BRDF data, which allows for artistic creation of plausible novel material appearances, bypassing the difficulty of acquiring novel samples. We first synthesize novel materials, extending the existing MERL dataset up to 400 mathematically valid BRDFs. We then design a large-scale experiment, gathering 56,000 subjective ratings on the high-level perceptual attributes that best describe our extended dataset of materials. Using these ratings, we build and train networks of radial basis functions to act as functionals mapping the perceptual attributes to an underlying PCA-based representation of BRDFs. We show that our functionals are excellent predictors of the perceived attributes of appearance. Our control space enables many applications, including intuitive material editing of a wide range of visual properties, guidance for gamut mapping, analysis of the correlation between perceptual attributes, and novel appearance similarity metrics. Moreover, our methodology can be used to derive functionals applicable to classic analytic BRDF representations. We release our code and dataset publicly, in order to support and encourage further research in this direction.
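
    The mapping described in this abstract can be illustrated with a minimal radial-basis-function sketch: given per-material perceptual attribute ratings and the PCA coefficients of the corresponding BRDFs, a Gaussian RBF interpolant predicts PCA coefficients for new attribute settings. The function names, kernel width, and regularization below are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def fit_rbf(attrs, pca_coeffs, sigma=0.5):
    """attrs: (N, A) perceptual ratings; pca_coeffs: (N, K) BRDF PCA weights."""
    d2 = ((attrs[:, None, :] - attrs[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    phi = np.exp(-d2 / (2.0 * sigma ** 2))                        # Gaussian kernel matrix
    # A small ridge term keeps the solve well conditioned (illustrative choice).
    weights = np.linalg.solve(phi + 1e-6 * np.eye(len(attrs)), pca_coeffs)
    return attrs, weights, sigma

def predict_pca(model, query_attrs):
    centers, weights, sigma = model
    d2 = ((query_attrs[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    phi = np.exp(-d2 / (2.0 * sigma ** 2))
    return phi @ weights        # predicted PCA coefficients, one row per query
```

    Under this reading, editing amounts to moving a material's attribute vector (for example, raising a glossiness rating) and reconstructing a BRDF from the predicted PCA coefficients.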

    Toward Evaluating Progressive Rendering Methods in Appearance Design Tasks

    Progressive rendering is becoming a popular alternative to precomputation approaches for appearance design tasks. Images created by different progressive algorithms exhibit various kinds of visual artifacts at the early stages of computation. We present a user study that investigates the effects of these artifacts on user performance in appearance design tasks. Specifically, we ask both novice and expert subjects to perform lighting and material editing tasks with the following algorithms: random path tracing, quasi-random path tracing, progressive photon mapping, and virtual point light (VPL) rendering. Data collected from the experiments suggest that path tracing is strongly preferred to progressive photon mapping and VPL rendering by both experts and novices. There is no indication that quasi-random path tracing is systematically preferred to random path tracing or vice versa; the same holds between progressive photon mapping and VPL rendering. Interestingly, we did not observe any significant difference in user workflow for the different algorithms. As can be expected, experts are faster and more accurate than novices, but surprisingly both groups have similar subjective preferences and workflow.
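
    For context, the compared methods share the same progressive structure: each iteration adds a sample per pixel and the running average is displayed immediately, so the artifacts in the early, noisy estimates are what subjects react to. The sketch below is a generic illustration of that loop, not any specific renderer from the study; sample_fn and on_update are hypothetical names.

```python
import numpy as np

def progressive_render(sample_fn, width, height, iterations, on_update=None):
    """sample_fn(x, y, i) -> one RGB radiance estimate for pixel (x, y);
    on_update receives the current running average so it can be shown while editing."""
    accum = np.zeros((height, width, 3))
    for i in range(1, iterations + 1):
        for y in range(height):
            for x in range(width):
                accum[y, x] += sample_fn(x, y, i)
        if on_update is not None:
            on_update(accum / i)    # early, noisy image the user edits against
    return accum / iterations
```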

    A Precomputed Polynomial Representation for Interactive BRDF Editing with Global Illumination

    The ability to interactively edit BRDFs in their final placement within a computer graphics scene is vital to making informed choices for material properties. We significantly extend previous work on BRDF editing for static scenes (with fixed lighting and view) by developing a precomputed polynomial representation that enables interactive BRDF editing with global illumination. Unlike previous precomputation-based rendering techniques, the image is not linear in the BRDF when considering interreflections. We introduce a framework for precomputing a multi-bounce tensor of polynomial coefficients that encapsulates the nonlinear nature of the task. Significant reductions in complexity are achieved by leveraging the low-frequency nature of indirect light. We use a high-quality representation for the BRDFs at the first bounce from the eye, and lower-frequency (often diffuse) versions for further bounces. This approximation correctly captures the general global illumination in a scene, including color-bleeding, near-field object reflections, and even caustics. We adapt Monte Carlo path tracing for precomputing the tensor of coefficients for BRDF basis functions. At runtime, the high-dimensional tensors can be reduced to a simple dot product at each pixel for rendering. We present a number of examples of editing BRDFs in complex scenes, with interactive feedback rendered with global illumination.
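
    The runtime step described here can be sketched as follows: per pixel, the precomputed coefficients are dotted with the monomials of the current BRDF basis weights, so editing the BRDF only changes the monomial vector while the expensive multi-bounce integration stays in the precomputation. The monomial ordering, degree, and names are illustrative assumptions, not the paper's actual data layout.

```python
import numpy as np

def shade_pixel(poly_coeffs, brdf_weights, degree=2):
    """poly_coeffs: precomputed coefficients for this pixel, one per monomial;
    brdf_weights: the currently edited BRDF basis weights."""
    monomials = [1.0] + list(brdf_weights)          # constant and linear terms
    if degree >= 2:
        n = len(brdf_weights)
        for i in range(n):                          # quadratic cross terms
            for j in range(i, n):
                monomials.append(brdf_weights[i] * brdf_weights[j])
    return float(np.dot(poly_coeffs, np.asarray(monomials)))
```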

    Bidirectional Appearance Distribution Function for Stylized Shading

    We define a new shading tool called a Bidirectional Appearance Distribution Function (BADF) tailored to the direct control of stylized appearance. A BADF can be thought of as defining the appearance of a sphere from all possible illumination directions. Our BADF formulation generalizes and improves upon previous stylized shading techniques by enabling the direct control of shading profiles in screen space, exaggerating surface features in a flexible manner, and letting users control stylized appearance from multiple lighting or viewing directions. This allows users to start from a simple shading behavior and refine from there towards greater stylization. Our GPU implementation works in real-time, which benefits both editing and rendering in interactive systems. These features make BADFs an efficient tool for many applications in artistic and scientific illustration domains.
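
    One way to picture a BADF, following the sphere analogy in the abstract, is as a stack of lit-sphere images, one per sampled illumination direction: shading a point looks up the texel whose sphere normal matches the surface normal in the image of the nearest light direction. The sketch below follows that reading; the storage layout and nearest-neighbour lookup are illustrative assumptions rather than the paper's GPU implementation.

```python
import numpy as np

def eval_badf(badf_images, light_dirs, normal, light):
    """badf_images: (L, H, W, 3) sphere renderings; light_dirs: (L, 3) unit vectors;
    normal, light: unit vectors at the shaded point."""
    l_idx = int(np.argmax(light_dirs @ light))       # nearest sampled light direction
    H, W = badf_images.shape[1:3]
    u = int((normal[0] * 0.5 + 0.5) * (W - 1))       # map normal.xy onto the sphere image
    v = int((normal[1] * 0.5 + 0.5) * (H - 1))
    return badf_images[l_idx, v, u]
```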

    Bimodal perception of audio-visual material properties for virtual environments

    INRIA Research Report 6687. High-quality rendering of both audio and visual material properties is very important in interactive virtual environments, since convincingly rendered materials increase realism and the sense of immersion. We studied how the levels of detail of auditory and visual stimuli interact in the perception of audio-visual material rendering quality. Our study is based on perception of material discrimination, when varying the levels of detail of modal synthesis for sound, and Bidirectional Reflectance Distribution Functions for graphics. We performed an experiment for two different models (a Dragon and a Bunny model) and two material types (Plastic and Gold). The results show a significant interaction between auditory and visual level of detail in the perception of material similarity, when comparing approximate levels of detail to a high-quality audio-visual reference rendering. We show how this result can contribute to significant savings in computation time in an interactive audio-visual rendering system. To our knowledge, this is the first study which shows interaction of audio and graphics representation in a material perception task.
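
    The reported interaction suggests a simple budgeting strategy: among combinations of audio and visual level of detail, pick the cheapest one whose rated similarity to the reference rendering stays above a target. The cost and similarity tables in the sketch below are entirely hypothetical placeholders for measured data.

```python
def pick_lod(costs, similarity, target=0.8):
    """costs, similarity: dicts keyed by (audio_lod, visual_lod) pairs."""
    feasible = [combo for combo, s in similarity.items() if s >= target]
    return min(feasible, key=lambda combo: costs[combo]) if feasible else None

# Example with made-up numbers: a coarse audio level paired with mid visual detail
# is selected because it meets the similarity target at the lowest cost.
best = pick_lod(
    costs={("low", "low"): 1, ("low", "mid"): 2, ("high", "high"): 6},
    similarity={("low", "low"): 0.6, ("low", "mid"): 0.85, ("high", "high"): 0.98},
)
```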
