6 research outputs found

    Visualisation of a 3D Model of the Church of St. Andrew the First-Called in the Village of Khoruzhivka

    The aim of this bachelor's qualification thesis is to implement the project as a 3D model accessible to any user with a computer. A detailed analysis of the chosen subject area was carried out, including a review of current research and publications, an analysis of analogues, and the problem statement. The goal and objectives of the study were formulated. The design of the 3D model was also described: an IDEF0 model and its decomposition were produced, along with a use-case diagram. The result of the work is a high-quality 3D model of the historical landmark "Church of St. Andrew the First-Called in the Village of Khoruzhivka"

    The challenges in computer supported conceptual engineering design

    Computer Aided Engineering Design (CAED) supports the engineering design process during detail design, but it is not commonly used in the conceptual design stage. This article explores through the literature why this is and how the engineering design research community is responding through the development of new conceptual CAED systems and HCI (Human Computer Interface) prototypes. First, the requirements and challenges for future conceptual CAED and HCI solutions to better support conceptual design are explored and categorised. Then the prototypes developed in both areas since 2000 are discussed. Characteristics already considered, and those required for future development of CAED systems and HCIs, are proposed and discussed, one of the key ones being experience. The prototypes reviewed offer innovative solutions, but each addresses only selected requirements of conceptual design and is thus unlikely to provide a solution that fits the wider needs of the engineering design industry. More importantly, while the majority of prototypes show promising results, they are of low maturity and require further development

    User-based gesture vocabulary for form creation during a product design process

    There are inconsistencies between the nature of conceptual design and the functionalities of the computational systems supporting it, which disrupt the designers' process by focusing on technology rather than on designers' needs. A need was identified for the elicitation of hand gestures appropriate to the requirements of conceptual design, rather than gestures chosen arbitrarily or for ease of implementation. The aim of this thesis is to identify natural and intuitive hand gestures for conceptual design, performed by designers (3rd- and 4th-year product design engineering students and recent graduates) working on their own, without instruction and without limitations imposed by the facilitating technology. This was done via a user-centred study including 44 participants, in which 1785 gestures were collected. Gestures were explored as the sole means of shape creation and manipulation in virtual 3D space. Gestures were identified, described in writing, sketched, coded according to the taxonomy used, categorised by hand form and the path travelled, and variants identified. They were then statistically analysed to ascertain agreement rates between participants, the significance of the agreement, and the likelihood of the number of repetitions in each category occurring by chance. The most frequently used and statistically significant gestures formed the consensus vocabulary for conceptual design. The effect of the shape of the manipulated object on the gesture performed was also observed, as was whether the sequence of gestures participants proposed differed from established CAD solid modelling practices. The vocabulary was evaluated by non-designer participants, both theoretically and in a VR environment, and the outcomes showed that the majority of gestures were appropriate and easy to perform. Participants selected their preferred gestures for each activity, and a variant of the vocabulary for conceptual design was created as an outcome that aims to ensure extensive training is not required, extending the ability to design beyond trained designers only
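The agreement-rate analysis mentioned in the abstract is, in gesture-elicitation studies, commonly computed per referent as the sum of squared proportions of participants proposing each identical gesture (a Wobbrock-style agreement score; the exact statistic used in the thesis may differ). A minimal sketch:

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate for one referent: the sum of squared
    fractions of participants who proposed each identical gesture."""
    total = len(proposals)
    counts = Counter(proposals)
    return sum((c / total) ** 2 for c in counts.values())

# Example: 4 of 5 participants propose a 'pinch', 1 proposes a 'grab'
print(agreement_rate(["pinch"] * 4 + ["grab"]))  # 0.68
```

A rate of 1.0 means all participants agreed on one gesture; values near 1/n indicate no agreement at all.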

    Toward semantic model generation from sketch and multi-touch interactions

    Designers usually start their design process by exploring and evolving their ideas rapidly through sketching, since this helps them make numerous attempts at creating, practicing, simulating, and representing ideas. The creativity inherent in solving ill-defined problems (Eastman, 1969) often emerges when designers explore potential solutions while sketching in the design process (Schön, 1992). When using computer programs such as CAD or Building Information Modeling (BIM) tools, designers often preplan tasks prior to executing commands instead of engaging in the process of designing. Researchers argue that these programs force designers to focus on how to use a tool (i.e. how to execute a series of commands) rather than how to explore a design, and thus hinder creativity in the early stages of the design process (Goel, 1995; Dorta, 2007). Since recent design and documentation work is computer-generated using BIM software, transitions between ideas in sketches and those in digital CAD systems have become necessary. By employing sketch interactions, we argue that a computer system can provide a rapid, flexible, and iterative method to create 3D models with sufficient data for facilitating smooth transitions between designers' early sketches and BIM programs. This dissertation begins by describing modern design workflows and discussing the data that must be exchanged in the early stage of design. It then briefly introduces modern cognitive theories, including embodiment (Varela, Rosch, & Thompson, 1992), situated action (Suchman, 1986), and distributed cognition (Hutchins, 1995). It continues by identifying problems in current CAD programs used in the early stage of the design process, using these theories as lenses.
    After reviewing modern attempts, including sketch tools and design automation tools, we describe the design and implementation of a sketch and multi-touch program, SolidSketch, to facilitate and augment our ability to work on ill-defined problems in the early stage of design. SolidSketch is a parametric modeling program that enables users to construct 3D parametric models rapidly through sketch and multi-touch interactions. It combines the benefits of traditional design tools, such as physical models and pencil sketches (i.e. rapid, low-cost, and flexible methods), with the computational power offered by digital modeling tools such as CAD. To close the gap between modern BIM and traditional sketch tools, the models created with SolidSketch can be read by other BIM programs. We then evaluate the program against commercial CAD programs and other sketch programs. We also report a case study in which participants used the system for their design explorations. Finally, we conclude with the potential impacts of this new technology and the next steps for ultimately bringing greater computational power to the early stages of design
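The parametric-history idea behind a tool like SolidSketch (operations captured from sketch input and regenerated whenever a parameter changes) can be illustrated with a deliberately simplified sketch; the class and operation names here are hypothetical, not SolidSketch's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Extrude:
    """One parametric operation: extrude a rectangular profile."""
    width: float
    depth: float
    height: float

    def volume(self):
        return self.width * self.depth * self.height

@dataclass
class ParametricModel:
    """A model is an ordered history of operations; editing a
    parameter and regenerating replays the whole history."""
    history: list = field(default_factory=list)

    def add(self, op):
        self.history.append(op)

    def regenerate(self):
        # Replaying the history yields the current geometry;
        # here we total the volume as a stand-in for geometry.
        return sum(op.volume() for op in self.history)

model = ParametricModel()
model.add(Extrude(width=2.0, depth=3.0, height=1.0))
model.add(Extrude(width=1.0, depth=1.0, height=4.0))
print(model.regenerate())  # 10.0

# Edit a parameter and regenerate, as a user would after a sketch edit
model.history[0].height = 2.0
print(model.regenerate())  # 16.0
```

Because the model is a replayable history rather than frozen geometry, it remains editable when handed to a downstream BIM or CAD program, which is the interoperability property the abstract emphasises.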

    Editing 3D models on smart devices

    This study proposes a 3D CAD system available on smart devices, which are now a part of everyday life and are widely applied in various domains, such as education and the robot industry. If an engineer has a new idea while traveling or on the move, or in the case of collaboration between two or more engineers, this 3D CAD system allows modeling to be performed rapidly and simply on a smart device. The system uses the common multi-touch gestures associated with smart devices to keep the modeling operations simple and easy for users. However, it is difficult to input the precise geometric information needed to generate 3D CAD models via such gestures, and it is impractical to provide a full set of modeling operations on a smart device due to hardware limitations; for this reason, the system excludes several complicated modeling operations. This work provides a scheme to regenerate a parametric 3D model on a PC-based CAD system via a macro-parametrics approach, by transferring the 3D model created on a smart device in an editable form to the PC-based system. If fine editing is needed, the user can perform additional work on a PC after reconstruction. Through the developed system, it is possible to produce an editable 3D model swiftly and simply in the smart-device environment, reducing design time while also facilitating collaboration. This paper discusses the first system design of a 3D CAD system on a smart device, the selection of the modeling operations, the assignment of gestures to these operations, and the use of operation modes. This is followed by an introduction of the implementation methods, and finally a demonstration of case studies using a prototype system with examples
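The macro-parametrics transfer described above, recording the smart-device modeling operations in a neutral, editable form and replaying them on a PC-based CAD system, can be sketched roughly as follows; the operation names and JSON schema are illustrative assumptions, not the paper's actual format:

```python
import json

# On the smart device: record each gesture-triggered operation
# as a neutral macro entry instead of raw geometry.
macro = [
    {"op": "create_box", "params": {"w": 10, "d": 5, "h": 3}},
    {"op": "fillet_edge", "params": {"edge": 2, "radius": 0.5}},
]
payload = json.dumps(macro)  # transferred to the PC

# On the PC-based CAD system: replay the macro through that
# system's own modeling API to regenerate an editable model.
def replay(payload, api):
    for entry in json.loads(payload):
        api[entry["op"]](**entry["params"])

# Stand-in API that just logs what a real CAD kernel would build
log = []
demo_api = {
    "create_box": lambda w, d, h: log.append(f"box {w}x{d}x{h}"),
    "fillet_edge": lambda edge, radius: log.append(f"fillet e{edge} r{radius}"),
}
replay(payload, demo_api)
print(log)  # ['box 10x5x3', 'fillet e2 r0.5']
```

Replaying operations rather than transferring meshes is what keeps the reconstructed model fully parametric and editable on the PC side.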