An investigation on the framework of dressing virtual humans
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Realistic human models are widely used in a variety of applications. Much research has been carried out on improving the realism of virtual humans in various respects, such as body shape, hair and facial expression. On most occasions, these virtual humans need to wear garments. However, it is time-consuming and tedious to dress a human model using current software packages [Maya2004]. Several methods for dressing virtual humans have been proposed recently [Bourguignon2001, Turquin2004, Turquin2007, Wang2003B]. The method of Bourguignon et al. [Bourguignon2001] can only generate a 3D garment contour rather than a 3D surface. The method of Turquin et al. [Turquin2004, Turquin2007] can generate various kinds of garments from sketches, but the garments follow the shape of the body, and the sides of a garment look unconvincing because simple linear interpolation is used. The method of Wang et al. [Wang2003B] lacks user interactivity, so users have very limited control over the garment shape. This thesis proposes a framework for dressing virtual humans that produces convincing results and overcomes the problems of the work mentioned above by using nonlinear interpolation, level-set-based shape modification, feature constraints and other techniques. The human models used in this thesis are reconstructed from real human body data obtained with a body scanning system. Semantic information is then extracted from the human models to assist in the generation of three-dimensional (3D) garments. The proposed framework allows users to dress virtual humans using garment patterns and sketches. The dressing method is based on semantic virtual humans: a semantic human model is a human body with semantic information represented by a certain structure and body features.
The semantic human body is reconstructed from scanned data of a real human body. After the human model is segmented into six parts, some key features are extracted. These key features are used as constraints for garment construction. Simple 3D garment patterns are generated using sweep and offset techniques. To dress a virtual human, the user simply chooses a garment pattern, which is automatically put on the human body at a default position with a default size. Users can then change simple parameters to specify garment sizes by sketching the desired positions on the human body. To let users dress virtual humans in their own design styles in an intuitive way, this thesis proposes an approach for garment generation from user-drawn sketches. Users can draw sketches directly around reconstructed human bodies, and the system then generates 3D garments from the drawn strokes. Several techniques for generating 3D garments and dressing virtual humans are proposed. The specific focus of the research lies in the generation of 3D geometric garments, garment shape modification, local shape modification, garment surface processing and decoration creation. A sketch-based interface has been developed that allows users to draw a garment contour representing the front-view shape of a garment, from which the system generates a 3D geometric garment surface. To improve the realism of a garment surface, this thesis presents three methods. Firstly, the generation of garment vertices takes key body features as constraints. Secondly, an optimisation algorithm is run after vertex generation to refine the positions of the garment vertices. Finally, mesh processing schemes are applied to further process the garment surface. An elaborate 3D geometric garment surface can be obtained through this series of steps. The thesis also proposes some modification and editing methods.
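The sweep-and-offset idea described above can be pictured in a few lines. This is an illustrative toy, not the thesis implementation: the body cross-section is a plain ring of 2D points, `offset_ring` and `sweep` are hypothetical names, and the ease value is arbitrary.

```python
import math

def offset_ring(ring, ease):
    """Offset a closed ring of (x, y) points radially from its centroid."""
    cx = sum(p[0] for p in ring) / len(ring)
    cy = sum(p[1] for p in ring) / len(ring)
    out = []
    for x, y in ring:
        dx, dy = x - cx, y - cy
        d = math.hypot(dx, dy) or 1.0   # guard against a degenerate point
        out.append((x + ease * dx / d, y + ease * dy / d))
    return out

def sweep(rings, heights):
    """Stack cross-section rings at given heights into a list of 3D vertices."""
    return [(x, y, h) for ring, h in zip(rings, heights) for x, y in ring]

# Usage: a circular "waist" section of radius 10, offset by 2 units of ease.
waist = [(math.cos(a) * 10, math.sin(a) * 10)
         for a in (2 * math.pi * i / 8 for i in range(8))]
loose = offset_ring(waist, 2.0)
```

In a real system the rings would come from the scanned body's cross-sections and the ease would vary per body feature.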
The user-drawn sketches are processed into spline curves, which allow users to modify an existing garment shape by dragging the control points to desired positions. This makes it easy to obtain a more satisfactory garment shape than the existing one. Three decoration tools, a 3D pen, a brush and an embroidery tool, let users decorate the garment surface by adding small 3D details such as brand names and symbols. The prototype of the framework is developed using Microsoft Visual Studio C++, OpenGL and GPU programming.
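The control-point editing described above can be illustrated with a cubic Bezier segment, a minimal stand-in for the thesis's spline curves; `bezier` and `drag_control_point` are hypothetical names chosen for this sketch.

```python
def bezier(ctrl, t):
    """Evaluate a Bezier curve at parameter t by de Casteljau subdivision."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def drag_control_point(ctrl, index, new_pos):
    """Return a new control polygon with one control point moved."""
    out = list(ctrl)
    out[index] = new_pos
    return out

# Usage: a straight hem line, then bulged outward by dragging an inner point.
hem = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
bulged = drag_control_point(hem, 1, (1.0, 2.0))
```

Dragging the inner control point bulges the evaluated contour outward while the curve endpoints stay fixed, which is the interaction the editing tool relies on.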
A virtual garment design and simulation system
In this paper, a 3D graphics environment for virtual garment design and simulation is presented. The proposed system enables the three-dimensional construction of a garment from its cloth panels, whose underlying structure is a mass-spring model. The garment construction process is performed through automatic pattern generation, posterior correction and seaming. Afterwards, it is possible to fit the garment on virtual mannequins as if in a real-life tailor's workshop. The system gives users the flexibility to design their own garment patterns and to make changes to the garment even after the model has been dressed. Furthermore, rendering alternatives for the visualization of knitted and woven fabric are presented. © 2007 IEEE
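As a rough illustration of the mass-spring structure underlying such cloth panels, the following is a minimal 2D sketch assuming Hookean springs, unit masses, gravity, simple velocity damping and explicit Euler integration; the paper's actual integrator and constants are not specified here, so all values are placeholders.

```python
G = -9.8      # gravity along the y-axis (m/s^2)
K = 50.0      # spring stiffness (assumed)
DT = 0.01     # time step
DAMP = 0.98   # velocity damping per step (assumed)

def step(pos, vel, springs, pinned):
    """Advance unit-mass particle positions one explicit Euler step."""
    forces = [[0.0, G] for _ in pos]          # gravity on every particle
    for i, j, rest in springs:                # Hooke force along each spring
        dx = pos[j][0] - pos[i][0]
        dy = pos[j][1] - pos[i][1]
        length = (dx * dx + dy * dy) ** 0.5 or 1e-9
        f = K * (length - rest)
        fx, fy = f * dx / length, f * dy / length
        forces[i][0] += fx; forces[i][1] += fy
        forces[j][0] -= fx; forces[j][1] -= fy
    for i in range(len(pos)):
        if i in pinned:
            continue                          # pinned particles never move
        vel[i][0] = (vel[i][0] + forces[i][0] * DT) * DAMP
        vel[i][1] = (vel[i][1] + forces[i][1] * DT) * DAMP
        pos[i][0] += vel[i][0] * DT
        pos[i][1] += vel[i][1] * DT

# Usage: two particles joined by a spring of rest length 1, top one pinned.
pos = [[0.0, 1.0], [0.0, 0.0]]
vel = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(100):
    step(pos, vel, [(0, 1, 1.0)], pinned={0})
```

The pinned particle stays put while the free one sags until the spring's Hooke force balances gravity, the same mechanism that produces drape in a full panel of particles.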
AN INVESTIGATION ON THE VIRTUAL PROTOTYPING VALIDITY - SIMULATION OF GARMENT DRAPE
Achieving the desired garment form is essential in clothing design and depends on the properties of the raw material, mainly the fabric. Virtual prototyping can serve as a tool for assessing the form and fit of garments before real production and for deciding whether to change ease values, pattern cut or fabric parameters. The aim of this study is to investigate the reliability of virtual prototyping with Modaris 3D (Lectra) under the influence of varying fabric parameters on garment drape, verified against three-dimensional (3D) scans (Vitus Smart XXL®) of the real products. For the research, a half-circle cut skirt was designed in an appropriate size for a standard-figure dummy. The skirt was virtually simulated on a mannequin that had previously been scanned and imported into the system. The properties of three different types of fabric were examined in a material testing laboratory according to the requirements of the relevant standards. The skirt was virtually tried on with fabric properties defined by the test results, and was afterwards made from the real fabrics, put on the dummy and scanned. The drape of the various virtual prototypes and of the real product scans was compared, both in the CAD system and in the scanning system (Anthroscan), using cross-sections and their measurements (depths and diameters of folds, circumferences). Fabric parameters influence the reliability of virtual prototyping results in terms of the accuracy of the parameters determined and entered into the system. Cross-sections with measurements reveal differences between the virtually sewn and the real skirt drape configurations.
Virtual Garments: A Fully Geometric Approach for Clothing Design
Modeling dressed characters is known to be a very tedious process. It usually requires specifying 2D fabric patterns, positioning and assembling them in 3D, and then performing a physically-based simulation. The latter accounts for gravity and collisions to compute the rest shape of the garment, with the adequate folds and wrinkles. This paper presents a more intuitive way to design virtual clothing. We start with a 2D sketching system in which the user draws the contours and seam-lines of the garment directly on a virtual mannequin. Our system then converts the sketch into an initial 3D surface using an existing method based on a precomputed distance field around the mannequin. The system then splits the created surface into different panels delimited by the seam-lines. The generated panels are typically not developable. However, the panels of a realistic garment must be developable, since each panel must unfold into a 2D sewing pattern. Therefore our system automatically approximates each panel with a developable surface, while keeping them assembled along the seams. This process allows us to output the corresponding sewing patterns. The last step of our method computes a natural rest shape for the 3D garment, including the folds due to the collisions with the body and gravity. The folds are generated using procedural modeling of the buckling phenomena observed in real fabric. The result of our algorithm consists of a realistic-looking 3D mannequin dressed in the designed garment and the 2D patterns, which can be used for distortion-free texture mapping. The patterns we create also allow us to sew real replicas of the virtual garments.
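The distance-field step can be pictured with a deliberately simplified mannequin, a circle, whose signed distance is known in closed form. This is an assumption made only for illustration; the paper samples a precomputed field around a full 3D mannequin, and the function names here are hypothetical.

```python
import math

R = 10.0  # mannequin cross-section radius (illustrative)

def body_distance(x, y):
    """Signed distance from a point to the circular mannequin surface."""
    return math.hypot(x, y) - R

def project_to_offset(x, y, clearance):
    """Push a sketch point onto the surface at the given body clearance."""
    d = math.hypot(x, y) or 1e-9          # guard against the origin
    s = (R + clearance) / d
    return (x * s, y * s)

# Usage: a sketch point drawn too close to the body is pushed out to 2.0 away.
px, py = project_to_offset(10.5, 0.0, 2.0)
```

A sketch stroke processed this way yields an initial garment surface that hugs the body at a controlled distance, which is the role the precomputed distance field plays in the pipeline.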
Enhancing the employability of fashion students through the use of 3D CAD
The textile and apparel industry has one of the longest and most intricate supply chains within manufacturing. Advancement in technology has facilitated its globalisation, enabling companies to span geographical borders. This has led to new methods of communication using electronic data formats. Throughout the latter part of the 20th Century, 2D CAD technology established itself as an invaluable tool within design and product development. More recently 3D virtual simulation software has made small but significant steps within this market. The technological revolution has opened significant opportunities for those forward thinking companies that are beginning to utilise 3D software. This advanced technology requires designers with unique skill sets. This paper investigates the skills required by fashion graduates from an industry perspective.
To reflect current industrial working practices, it is essential for educational establishments to incorporate technologies that will enhance the employability of graduates. This study developed an adapted action research model based on the work of Kurt Lewin, which reviewed the learning and teaching of 3D CAD within higher education. It encompassed the selection of 3D CAD software, analysis of industry requirements, and the implementation of 3D CAD into the learning and teaching of a selection of fashion students over a three-year period. Six interviews were undertaken with industrial design and product development specialists to determine current working practices, opinions of virtual 3D software and graduate skill requirements.
It was found that the companies had similar working practices independent of the software utilised within their product development process. The companies which employed 3D CAD software considered that further developments were required before the technology could be fully integrated. It was further concluded that it is beneficial for graduates to be furnished with knowledge of emerging technologies that reflect industry practice and enhance their employability skills.
Historic Costume Simulation and its Application
This study highlights the potential of new technology as a means of providing new possibilities for utilising costumes in fragile condition. The aim of this study is to create accurate digital duplicates of costumes from historical sources, and to explore the possibility of developing them as an exhibitory and educational method applying 3D apparel CAD and new media. To achieve this, three attributes for qualities of effective digital costumes were suggested: faithful reproduction, virtual fabrication, and interactive and stereographic appreciation. Based on these qualities, digital costumes and a PC application were produced and evaluated.
Historical Costume Simulation
The aim of this study is to produce accurate reproductions of digital clothing from historical sources and to investigate the implications of developing them for online museum exhibits. In order to achieve this, the study goes through several stages. Firstly, the theoretical background of the main issues is established through a review of published papers on 3D apparel CAD, drape and digital curation. Next, using a 3D apparel CAD system, the study attempts a realistic visualization of the costumes based on the establishment of a valid simulation reference. This paper reports the pilot exercise carried out to scope the requirements for going forward.
Wearable performance
This is the post-print version of the article. The official published version can be accessed from the link below. Copyright @ 2009 Taylor & Francis. Wearable computing devices worn on the body provide the potential for digital interaction in the world. A new stage of computing technology at the beginning of the 21st Century links the personal and the pervasive through mobile wearables. The convergence between the miniaturisation of microchips (nanotechnology), intelligent textile or interfacial materials production, advances in biotechnology and the growth of wireless, ubiquitous computing emphasises not only mobility but integration into clothing or the human body. In artistic contexts one expects such integrated wearable devices to have the two-way function of interface instruments (e.g. sensor data acquisition and exchange) worn for particular purposes, either for communication with the environment or various aesthetic and compositional expressions. 'Wearable performance' briefly surveys the context for wearables in the performance arts and distinguishes display and performative/interfacial garments. It then focuses on the authors' experiments with 'design in motion' and digital performance, examining prototyping at the DAP-Lab which involves transdisciplinary convergences between fashion and dance, interactive system architecture, electronic textiles, wearable technologies and digital animation. The concept of an 'evolving' garment design that is materialised (mobilised) in live performance between partners originates from DAP Lab's work with telepresence and distributed media addressing the 'connective tissues' and 'wearabilities' of projected bodies through a study of shared embodiment and perception/proprioception in the wearer (tactile sensory processing). Such notions of wearability are applied both to the immediate sensory processing on the performer's body and to the processing of the responsive, animate environment.
A new automated workflow for 3D character creation based on 3D scanned data
In this paper we present a new workflow allowing the creation of 3D characters in an automated way that does not require the expertise of an animator. This workflow is based on the acquisition of real human data captured by 3D body scanners, which is then processed to generate firstly animatable body meshes, secondly skinned body meshes and finally textured 3D garments.