    Modelling and Visualisation of the Optical Properties of Cloth

    Cloth and garment visualisations are widely used in fashion and interior design, in entertainment, and in the automotive and nautical industries, and they are indispensable elements of visual communication. Modern appearance models attempt to offer a complete solution for the visualisation of complex cloth properties. The review part of the chapter presents advanced methods that enable visualisation at micron resolution, methods used in three-dimensional (3D) visualisation workflows, and methods used for research purposes. Within the review, methods offering a comprehensive approach, as well as experiments on explicit cloth attributes that exhibit specific optical phenomena, are analysed. The review of appearance models includes surface and image-based models, volumetric models and explicit models. Each group is presented together with a representative research group and with the applications and limitations of the methods. In the final part of the chapter, the visualisation of cloth specularity and porosity with an uneven surface is studied. The study and visualisation were performed using image data obtained with photography. The acquisition of structure information on a large scale enables the recording of structural irregularities that are very common in historical textiles and laces, and also in artistic and experimental pieces of cloth. The contribution ends with the presentation of cloth visualised with the use of specular and alpha maps, which are the result of the image-processing workflow.
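    The specular and alpha maps mentioned above can, in the simplest case, be derived from photographs by thresholding image intensity: bright pixels under direct lighting suggest specular highlights, while pixels that transmit light in a back-lit shot mark pores. The minimal sketch below illustrates this idea only; the thresholds, file names and back-lit capture setup are assumptions made for illustration and are not the workflow described in the chapter.

```python
# Minimal sketch (not the chapter's actual workflow): derive a specular map
# and an alpha (porosity) map for a cloth sample from two photographs.
# Thresholds and file names are illustrative assumptions.
import numpy as np
from PIL import Image

def specular_map(photo: np.ndarray, highlight_thresh: float = 0.85) -> np.ndarray:
    """Treat very bright pixels under direct lighting as specular highlights."""
    gray = photo.mean(axis=2) / 255.0
    return np.clip((gray - highlight_thresh) / (1.0 - highlight_thresh), 0.0, 1.0)

def alpha_map(backlit_photo: np.ndarray, hole_thresh: float = 0.6) -> np.ndarray:
    """Treat pixels that transmit light in a back-lit shot as pores (alpha = 0)."""
    gray = backlit_photo.mean(axis=2) / 255.0
    return np.where(gray > hole_thresh, 0.0, 1.0)

if __name__ == "__main__":
    # Hypothetical input images of the same cloth sample.
    photo = np.asarray(Image.open("cloth_frontlit.jpg").convert("RGB"), dtype=np.float32)
    backlit = np.asarray(Image.open("cloth_backlit.jpg").convert("RGB"), dtype=np.float32)
    Image.fromarray((specular_map(photo) * 255).astype(np.uint8)).save("specular_map.png")
    Image.fromarray((alpha_map(backlit) * 255).astype(np.uint8)).save("alpha_map.png")
```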

    Visual Prototyping of Cloth

    Realistic visualization of cloth has many applications in computer graphics. An ongoing research problem is how to best represent and capture appearance models of cloth, especially when considering computer-aided design of cloth. Previous methods can be used to produce highly realistic images; however, possibilities for cloth editing are either restricted or require the measurement of large material databases to capture all variations of cloth samples. We propose a pipeline for designing the appearance of cloth directly based on those elements that can be changed within the production process. These are the optical properties of fibers, the geometrical properties of yarns and compositional elements such as weave patterns. We introduce a geometric yarn model, integrating state-of-the-art textile research. We further present an approach to reverse engineer cloth and estimate parameters for a procedural cloth model from single images. This includes the automatic estimation of yarn paths, yarn widths, their variation and the weave pattern. We demonstrate that we are able to match the appearance of original cloth samples in an input photograph for several examples. Parameters of our model are fully editable, enabling intuitive appearance design. Unfortunately, such explicit fiber-based models can only be used to render small cloth samples, due to large storage requirements. Recently, bidirectional texture functions (BTFs) have become popular for efficient photo-realistic rendering of materials. We present a rendering approach combining the strength of a procedural model of micro-geometry with the efficiency of BTFs. We propose a method for the computation of synthetic BTFs using Monte Carlo path tracing of micro-geometry. We observe that BTFs usually consist of many similar apparent bidirectional reflectance distribution functions (ABRDFs). By exploiting structural self-similarity, we can reduce rendering times by one order of magnitude. This is done in a process we call non-local image reconstruction, which has been inspired by non-local means filtering. Our results indicate that synthesizing BTFs is highly practical and may currently take only a few minutes for small BTFs. We finally propose a novel and general approach to physically accurate rendering of large cloth samples. By using a statistical volumetric model approximating the distribution of yarn fibers, a prohibitively costly explicit geometric representation is avoided. As a result, accurate rendering of even large pieces of fabric becomes practical without sacrificing much generality compared to fiber-based techniques.
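    To make the notion of a geometric yarn model concrete, the following minimal sketch builds a procedural centre line for one warp yarn of a plain weave, where a binary weave-pattern matrix decides whether the yarn passes over or under each weft. The sinusoidal undulation, the spacing and the crimp parameters are illustrative assumptions and not the authors' actual model.

```python
# Minimal sketch of a procedural yarn centre line for a woven cloth model:
# a binary pattern matrix (1 = warp over weft) determines whether the warp
# bulges above or below the weave plane at each crossing. Spacing, crimp
# amplitude and the sinusoidal profile are illustrative assumptions.
import numpy as np

def warp_yarn_path(column: int, pattern: np.ndarray, spacing: float = 1.0,
                   crimp: float = 0.3, samples_per_pick: int = 16) -> np.ndarray:
    """Return (N, 3) points on the centre line of one warp yarn.

    pattern[i, j] == 1 means warp j passes over weft i at that crossing.
    """
    n_wefts = pattern.shape[0]
    t = np.linspace(0.0, n_wefts, n_wefts * samples_per_pick, endpoint=False)
    # Which weft is being crossed at parameter t, and whether the warp is over it.
    over = pattern[np.floor(t).astype(int) % n_wefts, column]
    x = np.full_like(t, column * spacing)   # constant position across the fabric width
    y = t * spacing                         # travel along the warp direction
    # Smooth bulge over each weft; its sign (up/down) follows the weave pattern,
    # and it returns to the weave plane between crossings.
    z = crimp * np.where(over == 1, 1.0, -1.0) * np.abs(np.sin(np.pi * (t % 1.0)))
    return np.stack([x, y, z], axis=1)

if __name__ == "__main__":
    plain_weave = np.array([[1, 0], [0, 1]])          # 2x2 plain-weave pattern
    path = warp_yarn_path(column=0, pattern=plain_weave)
    print(path.shape)                                 # (32, 3) centre-line samples
```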

    Woven Fabric Model Creation from a Single Image

    We present a fast, novel image-based technique for reverse engineering woven fabrics at the yarn level. These models can be used in a wide range of interior design and visual special effects applications. In order to recover our pseudo-BTF, we estimate the 3D structure and a set of yarn parameters (e.g. yarn width, yarn crossovers) from spatial and frequency domain cues. Drawing inspiration from previous work [Zhao et al. 2012], we solve for the woven fabric pattern and from this build a data set. In contrast, however, we use a combination of image space analysis, frequency domain analysis and, in challenging cases, matching of image statistics with those from previously captured known patterns. Our method determines, from a single digital image captured with a DSLR camera under controlled uniform lighting, the woven cloth structure, depth and albedo, thus removing the need for separately measured depth data. The focus of this work is on the rapid acquisition of woven cloth structure, and therefore we use standard approaches to render the results. Our pipeline first estimates the weave pattern, yarn characteristics and noise statistics using a novel combination of low-level image processing and Fourier analysis. Next, we estimate a 3D structure for the fabric sample using a first-order Markov chain and our estimated noise model as input, also deriving a depth map and an albedo. Our volumetric textile model includes information about the 3D path of the center of the yarns, their variable width and hence the volume occupied by the yarns, and colors. We demonstrate the efficacy of our approach through comparison images of test scenes rendered using: (a) the original photograph, (b) the segmented image, (c) the estimated weave pattern and (d) the rendered result.
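    The frequency-domain cue can be illustrated with a minimal sketch that estimates warp and weft yarn spacing from the dominant peaks of the image's Fourier spectrum. This reflects only the spirit of the paper's Fourier-analysis step; the function name, the single-peak heuristic and the input file name are assumptions made for illustration.

```python
# Minimal sketch of a frequency-domain cue: estimate warp/weft yarn spacing
# (in pixels) from the dominant axis-aligned peaks of the Fourier spectrum
# of a woven-cloth photograph. Illustrative only; not the paper's full pipeline.
import numpy as np
from PIL import Image

def estimate_yarn_spacing(image_path: str) -> tuple[float, float]:
    gray = np.array(Image.open(image_path).convert("L"), dtype=np.float64)
    gray -= gray.mean()                                   # suppress the DC term
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray)))
    cy, cx = np.array(spectrum.shape) // 2

    # Strongest non-DC peak along each image axis gives the dominant repeat frequency.
    horiz = spectrum[cy, cx + 1:]                         # variation across columns
    vert = spectrum[cy + 1:, cx]                          # variation across rows
    fx = np.argmax(horiz) + 1                             # cycles over the image width
    fy = np.argmax(vert) + 1                              # cycles over the image height

    # Spacing in pixels = image extent divided by the number of repeats.
    return gray.shape[1] / fx, gray.shape[0] / fy

if __name__ == "__main__":
    warp_px, weft_px = estimate_yarn_spacing("woven_sample.jpg")   # hypothetical input
    print(f"estimated warp spacing: {warp_px:.1f} px, weft spacing: {weft_px:.1f} px")
```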

    Mechanics-Aware Modeling of Cloth Appearance


    The material image: Artists’ approaches to reproducing texture in art

    Since the introduction of computers, there has been a desire to improve the appearance of computer-generated objects in virtual spaces and to display the objects within complex scenes exactly as they appear in reality. This is a straightforward process for artists, who through the medium of paint or silver halide are able to observe directly from nature and to interpret and capture the world in a highly convincing way. For computer-generated images, however, the process is more complex: computers have no capability to judge whether a rendering looks right or wrong; only humans can make the final subjective decision. The evolving question is: what are the elements of paintings and drawings produced by artists that capture the qualities, texture, grain, reflection, translucency and absorption of a material and that, through the application of coloured brush marks, demonstrate a convincing likeness of the material qualities of, e.g., wood, metal, glass and fabric? This paper considers the relationship between texture, objects and artists' approaches to reproducing texture in art. Texture is problematic, however, as our visual system is able to discriminate between natural and patterned texture, and incorrectly rendered surfaces can hinder understanding. Furthermore, rendering surfaces that have no discernible pattern structure and comprise unlimited variations can result, as demonstrated by computer-generated rendering, in exceptionally large file sizes. The paper explores the relationship between imaging, artists' approaches to reproducing representations of the attributes of material qualities, the fluid dynamics of a painterly mark, and 2.5D relief in printing. The objective is not to reproduce existing paintings or prints, but to build the surface using a deposition of pigments, paints and inks that explores the relationship between image and surface.