    Digital 3D reconstruction of historical textile fragment

    This paper presents a new methodology for reproducing a historic textile fragment in 3D with realistic behaviour, providing users with a feel for the fragment's detailing. The fragment originates from the English National Trust archive held in the collection at Claydon House. The aim is to utilize a combination of 2D pattern software and state-of-the-art 3D technology to recreate a compelling and highly realistic representation of the historic fragment. The process starts with an investigation of the textile construction. Textile fragments are often incomplete and/or show some degree of deterioration; therefore various recording techniques are explored. A combination of photography and 3D scanning technology is utilized throughout the methodology to record the digital data accurately. The equipment settings are analyzed in order to produce an accurate working method. This paper, forming part of a larger study, focuses specifically on the methodology for recording data from one fragment piece.

    A Multi-scale Yarn Appearance Model with Fiber Details

    Rendering realistic cloth has always been a challenge due to its intricate structure. Cloth is made up of fibers, plies, and yarns, and previous curve-based models, while detailed, were computationally expensive and inflexible for large cloth. To address this, we propose a simplified approach. We introduce a geometric aggregation technique that reduces ray-tracing computation by using fewer curves, focusing only on yarn curves. Our model generates ply and fiber shapes implicitly, compensating for the lack of explicit geometry with a novel shadowing component. We also present a shading model that simplifies light interactions among fibers by categorizing them into four components, accurately capturing specular and scattered light in both forward and backward directions. To render large cloth efficiently, we propose a multi-scale solution based on pixel coverage. Our yarn shading model outperforms previous methods, achieving rendering speeds 3-5 times faster with less memory in near-field views. Additionally, our multi-scale solution offers a 20% speed boost for distant cloth observation.
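
    As a rough illustration of the multi-scale selection described in this abstract (a hedged sketch, not the authors' implementation), the Python fragment below switches between a detailed near-field shading path and an aggregated far-field path based on a yarn's projected width in pixels; the pinhole projection, the one-pixel threshold and all function names are assumptions made for illustration.

        # Illustrative level-of-detail selection for yarn shading based on
        # pixel coverage. The threshold and projection model are assumptions.
        PIXEL_COVERAGE_THRESHOLD = 1.0  # projected yarn width (pixels) below which we aggregate

        def projected_yarn_width_px(yarn_radius_m, distance_m, focal_length_px):
            """Approximate on-screen width of a yarn under a pinhole camera model."""
            return (2.0 * yarn_radius_m * focal_length_px) / distance_m

        def shade_yarn_segment(yarn_radius_m, distance_m, focal_length_px):
            """Pick near-field shading (implicit ply/fiber detail plus a
            multi-component light model) or an aggregated far-field model."""
            width_px = projected_yarn_width_px(yarn_radius_m, distance_m, focal_length_px)
            if width_px >= PIXEL_COVERAGE_THRESHOLD:
                return "near-field path (implicit ply/fiber detail)", width_px
            return "far-field path (aggregated yarn appearance)", width_px

        if __name__ == "__main__":
            # A 0.5 mm yarn seen from 0.3 m and from 5 m with a 1000 px focal length.
            for distance in (0.3, 5.0):
                print(distance, shade_yarn_segment(0.0005, distance, 1000.0))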

    Visual Prototyping of Cloth

    Realistic visualization of cloth has many applications in computer graphics. An ongoing research problem is how to best represent and capture appearance models of cloth, especially when considering computer-aided design of cloth. Previous methods can be used to produce highly realistic images; however, possibilities for cloth-editing are either restricted or require the measurement of large material databases to capture all variations of cloth samples. We propose a pipeline for designing the appearance of cloth directly based on those elements that can be changed within the production process. These are optical properties of fibers, geometrical properties of yarns and compositional elements such as weave patterns. We introduce a geometric yarn model, integrating state-of-the-art textile research. We further present an approach to reverse engineer cloth and estimate parameters for a procedural cloth model from single images. This includes the automatic estimation of yarn paths, yarn widths, their variation and a weave pattern. We demonstrate that we are able to match the appearance of original cloth samples in an input photograph for several examples. Parameters of our model are fully editable, enabling intuitive appearance design. Unfortunately, such explicit fiber-based models can only be used to render small cloth samples, due to large storage requirements. Recently, bidirectional texture functions (BTFs) have become popular for efficient photo-realistic rendering of materials. We present a rendering approach combining the strength of a procedural model of micro-geometry with the efficiency of BTFs. We propose a method for the computation of synthetic BTFs using Monte Carlo path tracing of micro-geometry. We observe that BTFs usually consist of many similar apparent bidirectional reflectance distribution functions (ABRDFs). By exploiting structural self-similarity, we can reduce rendering times by one order of magnitude. This is done in a process we call non-local image reconstruction, which has been inspired by non-local means filtering. Our results indicate that synthesizing BTFs is highly practical and may currently only take a few minutes for small BTFs. We finally propose a novel and general approach to physically accurate rendering of large cloth samples. By using a statistical volumetric model, approximating the distribution of yarn fibers, a prohibitively costly, explicit geometric representation is avoided. As a result, accurate rendering of even large pieces of fabrics becomes practical without sacrificing much generality compared to fiber-based techniques.
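
    The order-of-magnitude saving from the non-local reconstruction mentioned above comes from reusing fully computed ABRDFs across texels that look alike. The sketch below is a minimal, assumption-laden illustration of that reuse, not the thesis pipeline: each query texel receives the ABRDF of the reference texel whose sparsely sampled signature is closest. The signature size, the L2 distance and the random stand-in data are all illustrative choices.

        import numpy as np

        rng = np.random.default_rng(0)

        def nonlocal_reconstruct(reference_abrdfs, reference_sigs, query_sigs):
            """Assign each query texel the fully computed ABRDF whose sparse
            signature is closest in the L2 sense (illustrative reuse step)."""
            out = np.empty((query_sigs.shape[0], reference_abrdfs.shape[1]))
            for i, sig in enumerate(query_sigs):
                nearest = np.argmin(np.linalg.norm(reference_sigs - sig, axis=1))
                out[i] = reference_abrdfs[nearest]
            return out

        if __name__ == "__main__":
            n_dirs, n_reference, n_query = 64, 32, 256
            # A few sampled (view, light) direction pairs act as a cheap signature.
            sample_idx = rng.choice(n_dirs, size=8, replace=False)
            # Stand-ins for path-traced ABRDFs; real ones come from Monte Carlo rendering.
            reference = rng.random((n_reference, n_dirs))
            queries = reference[rng.integers(0, n_reference, n_query)]
            queries = queries + 0.01 * rng.standard_normal(queries.shape)
            reconstructed = nonlocal_reconstruct(reference,
                                                 reference[:, sample_idx],
                                                 queries[:, sample_idx])
            print("mean absolute reconstruction error:",
                  float(np.abs(reconstructed - queries).mean()))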

    State of Being

    My work speaks to the processes of adaptation and assimilation, phenomena that explain the way in which we transform life experience and incorporate the effects of such experience into the daily workings of our psyche. To this extent my work is a self-analysis, an autobiographical reckoning, a non-verbal representation of collective experiences rendered in forms upon which images are spontaneously drawn or painted with fiber. The process of making art as a means of accessing creative instincts is a manifestation of the way in which I experience life. Adapting and assimilating to our human condition is an art, a form of survival that allows for self-expression as a technique of understanding, a way of translating beauty into collective consciousness, a means of transforming atrocity too enormous for words, an offer of conversation that transcends human reason, a sharing of imagination that embraces the past, the present and the future. As the world grows increasingly complex, our very existence is threatened by terrorist attacks, natural disasters, and socioeconomic confusion. A culture driven by consumerism responds to global competition for technology that races against the speed of light. Human misunderstanding is relegated to war, courts of law and bi-partisan politics. Adapting and assimilating life circumstances and experiences with a sensitivity to the interplay of intensely colorful fiber in my hands effects an optimistic and energetic reinterpretation of life's complexity. In a time of uncertainty, art is a reason for hope.

    Mechanics-Aware Modeling of Cloth Appearance

    HAPTIC VISUALIZATION USING VISUAL TEXTURE INFORMATION

    Haptics enables users to interact with and manipulate virtual objects. Although haptic research has influenced many areas, the inclusion of computer haptics in computer vision, especially content-based image retrieval (CBIR), is still rare and limited. The purpose of this research is to design and validate a haptic texture search framework that allows texture retrieval to be performed not just visually but also haptically. Hence, this research addresses the gap between the computer haptics and CBIR fields. The focus of this research is on cloth textures. The design of the proposed framework involves a haptic texture rendering algorithm and a query algorithm. The proposed framework integrates computer haptics and CBIR, where haptic texture rendering is performed based on extracted cloth data. For query purposes, the data are characterized and texture similarity is calculated. Wavelet decomposition is utilized to extract information from the texture data. In the search process, data are retrieved based on their distribution. Experiments to validate the framework have shown that haptic texture rendering can be performed using techniques based on either a simple waveform or visual texture information. Unstable forces generated during the rendering process were due to limitations of the device. In the query process, accuracy is determined by the number of feature vector elements, the data extraction, and the similarity measurement algorithm. User testing to validate the framework shows that users' perception of haptic feedback differs depending on the type of rendering algorithm. A simple rendering algorithm, i.e. a sine wave, produces more stable force feedback, yet lacks surface detail compared to the visual texture information approach.
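
    As a hedged sketch of the two ingredients described above, and not the thesis implementation, the fragment below computes a wavelet-style feature vector for a cloth texture using a single-level Haar-style decomposition written directly in NumPy, runs a nearest-neighbour similarity query, and evaluates a simple sine-wave force profile. The band-energy features, the Euclidean distance, and the force-model parameters are all illustrative assumptions.

        import numpy as np

        def haar_level1(img):
            """Single-level Haar-style 2D decomposition: approximation + 3 detail bands
            (unnormalised averages/differences of 2x2 pixel blocks)."""
            img = img[:img.shape[0] // 2 * 2, :img.shape[1] // 2 * 2].astype(float)
            a, b = img[0::2, 0::2], img[0::2, 1::2]
            c, d = img[1::2, 0::2], img[1::2, 1::2]
            return ((a + b + c + d) / 4.0,   # approximation
                    (a - b + c - d) / 4.0,   # horizontal detail
                    (a + b - c - d) / 4.0,   # vertical detail
                    (a - b - c + d) / 4.0)   # diagonal detail

        def texture_features(img):
            """Feature vector: mean absolute energy of each decomposition band."""
            return np.array([np.abs(band).mean() for band in haar_level1(img)])

        def query(database_features, probe_features):
            """Database indices sorted by Euclidean distance to the probe features."""
            return np.argsort(np.linalg.norm(database_features - probe_features, axis=1))

        def sine_wave_force(position_m, amplitude_n=1.0, spatial_period_m=0.01):
            """Simple waveform rendering: texture force as a sine of probe position."""
            return amplitude_n * np.sin(2.0 * np.pi * position_m / spatial_period_m)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            # Stand-in cloth textures: diagonal gratings of increasing frequency.
            xs = np.linspace(0.0, 1.0, 64)
            grid = np.add.outer(xs, xs)
            cloth_db = [0.5 + 0.5 * np.sin(2.0 * np.pi * f * grid) for f in (2, 4, 8, 16, 32)]
            db_feats = np.stack([texture_features(t) for t in cloth_db])
            probe = cloth_db[2] + 0.05 * rng.standard_normal((64, 64))
            print("closest match index:", int(query(db_feats, texture_features(probe))[0]))
            print("force at 3 mm:", float(sine_wave_force(0.003)))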