2,388 research outputs found

    Tangible augmented reality intervention for a product dissection task

    Functional decomposition is a process used by engineers and designers to identify functions in various design tasks. The concepts of function and functional decomposition are important for engineers to understand in order to convert complex problems into abstractions. However, research results indicate that engineers and designers struggle to apply functional decomposition methods during design tasks, such as product dissection, due to the abstract nature of product functions. Augmented reality, coupled with tangible objects, provides a platform on which information about different design aspect models can be blended and presented to the user during a product dissection or other design task. Past research discusses the interrelations between function, behavior, and structure, and explains how providing information on one area contributes to understanding of the other areas. This research develops an AR application that overlays context-sensitive information about product behavior and structure during a product design task, a product dissection, to contribute to users’ understanding of function. A user study was conducted to evaluate the effectiveness of this AR-supported product dissection. Participants were asked to dissect a hair dryer and a toy dart blaster and to generate a function diagram for each to describe how it works: one dissection was performed using the AR application and the other as a purely physical dissection, to serve as a baseline. Function diagrams allow for comparison of users’ functional understanding. The function diagrams were evaluated for accuracy on several metrics, including the total number of functions generated and the number of syntactical, semantic, and relational errors present in each diagram. While no statistical or qualitative differences were found between the diagrams generated during the AR-supported and non-AR-supported tasks, examination of the users’ function diagrams and pre- and post-study test scores yielded several observations and trends about users’ understanding of function and their difficulty in applying different aspects of functional decomposition.

    Influence of Personality on Shape-Based Design Activities

    As the literature demonstrates, designers’ personality influences design activities, such as the use of different ways to represent environments and/or products, technological advances, etc. Nevertheless, an exhaustive analysis of the influence of personality on design activities involving different representations is missing. This research explores this gap by studying this influence on specific design activities, the shape-based ones (i.e., the analysis of specific shapes and the highlighting of functions suggested by them). People with different personalities undergo tests in which they carry out design activities exploiting several representations. The results confirm the influence of personality on shape-based design activities and highlight different keys to interpreting and exploiting these results. Thanks to the results of this study, researchers can increase their knowledge about subjective aspects of design as well as about how these aspects coexist with classic and emerging representations. Likewise, designers can try to maximize the effectiveness of their efforts by selecting, case by case, the best combinations of personality, representations, and characteristics of the expected design results.

    Development and Validation of a Hybrid Virtual/Physical Nuss Procedure Surgical Trainer

    With continuous advancements in and adoption of minimally invasive surgery, proficiency with the nontrivial surgical skills involved is becoming a greater concern. Consequently, the use of surgical simulation has been increasingly embraced by many for training and skill-transfer purposes. Some systems utilize haptic feedback within a high-fidelity, anatomically correct virtual environment, whereas others use manikins, synthetic components, or box trainers to mimic the primary components of a corresponding procedure. Surgical simulation development for some minimally invasive procedures is still, however, suboptimal or otherwise embryonic. This is true for the Nuss procedure, a minimally invasive surgery for correcting pectus excavatum (PE), a congenital chest wall deformity. This work aims to address this gap by exploring the challenges of developing both a purely virtual and a purely physical simulation platform of the Nuss procedure and their implications in a training context. This work then describes the development of a hybrid mixed-reality system that integrates virtual and physical constituents, as well as an augmentation of the haptic interface, to reproduce the primary steps of the Nuss procedure and satisfy clinically relevant prerequisites for its training platform. Furthermore, this work carries out a user study to investigate the system’s face, content, and construct validity to establish its faithfulness as a training platform.

    Interface Design for a Virtual Reality-Enhanced Image-Guided Surgery Platform Using Surgeon-Controlled Viewing Techniques

    An initiative has been taken to develop a VR-guided cardiac interface that will display and deliver information without affecting the surgeons’ natural workflow while yielding better accuracy and task completion time than the existing setup. This paper discusses the design process, the development of comparable user interface prototypes, and an evaluation methodology that can measure user performance and workload for each of the suggested display concepts. User-based studies and expert recommendations are used in conjunction to establish design guidelines for our VR-guided surgical platform. As a result, a better understanding of autonomous view control, depth display, and the use of virtual context is attained. In addition, three proposed interfaces have been developed to allow a surgeon to control the view of the virtual environment intra-operatively. Comparative evaluation of the three implemented interface prototypes in a simulated surgical task scenario revealed performance advantages for stereoscopic and monoscopic biplanar display conditions, as well as differences between the three types of control modalities. One particular interface prototype demonstrated a significant improvement in task performance. Design recommendations are made for this interface, as well as for the others, as we prepare for prospective development iterations.

    Mobile Augmented Reality: User Interfaces, Frameworks, and Intelligence

    Mobile Augmented Reality (MAR) integrates computer-generated virtual objects with physical environments for mobile devices. MAR systems enable users to interact with MAR devices, such as smartphones and head-worn wearables, and perform seamless transitions from the physical world to a mixed world with digital entities. These MAR systems support user experiences that use MAR devices to provide universal access to digital content. Over the past 20 years, several MAR systems have been developed; however, the studies and design of MAR frameworks have not yet been systematically reviewed from the perspective of user-centric design. This article presents the first effort to survey existing MAR frameworks (count: 37) and further discusses the latest studies on MAR through a top-down approach: (1) MAR applications; (2) MAR visualisation techniques adaptive to user mobility and contexts; (3) systematic evaluation of MAR frameworks, including supported platforms and corresponding features such as tracking, feature extraction, and sensing capabilities; and (4) underlying machine learning approaches supporting intelligent operations within MAR systems. Finally, we summarise the development of emerging research fields and the current state of the art, and discuss the important open challenges and possible theoretical and technical directions. This survey aims to benefit researchers and MAR system developers alike.

    Art and Medicine: A Collaborative Project Between Virginia Commonwealth University in Qatar and Weill Cornell Medicine in Qatar

    Four faculty researchers, two from Virginia Commonwealth University in Qatar and two from Weill Cornell Medicine in Qatar, developed a one-semester, workshop-based course in Qatar exploring the connections between art and medicine in a contemporary context. Twelve students (six from art, six from medicine) were enrolled in the course. The course included presentations by clinicians, medical engineers, artists, computing engineers, an art historian, a graphic designer, a painter, and other experts from the fields of art, design, and medicine. To measure the student experience of interdisciplinarity, the faculty researchers employed a mixed-methods approach involving psychometric tests and observational ethnography. Data instruments included pre- and post-course semi-structured audio interviews, pre-test/post-test psychometric instruments (the Budner Scale and the Torrance Tests of Creativity), observational field notes, self-reflective blogging, and videography. This book describes the course and the experience of the students. It also contains images of the interdisciplinary work they created for a culminating class exhibition. Finally, the book provides insight into how different fields in a Middle Eastern context can share critical/analytical thinking tools to refine their own professional practices.

    User Interface Design in Virtual Reality Research

    Thesis Statement: The primary objective of this research is to develop and investigate a user interface that supports learning, to be implemented in the virtual reality application Anatomy Builder VR, an ongoing project from the Department of Visualization. Through the conception of this interface, we will explore the research question “how can user interface design in virtual reality applications support learning and engagement?”.

    Theoretical Framework: Through the use of iterative design, we will develop an interface to be implemented in the virtual reality application Anatomy Builder VR. To accomplish this, we will create several prototypes to be evaluated by a focus group before implementing a high-fidelity interface in the application. The three prototypes will be used to conduct a user study that will improve the quality and functionality of the final interface as a whole.

    Project Description: Effective user interface design is extremely important when creating an application focused on learning. If the application’s interface is misleading, the user will either learn the information incorrectly or stop using the application altogether. For this reason, we will center our research on the question “how can user interface design in virtual reality applications support learning and engagement?”. Expected outcomes include designing a user interface that provides an intuitive and engaging learning experience. Our interface will be implemented into Anatomy Builder VR, an application that allows users to assemble a human or canine skeleton while learning comparative anatomy. Anatomy Builder VR is a current collaborative project between the Department of Visualization and the Department of Veterinary Integrative Biosciences. We will investigate how our design impacts the user’s anatomy learning experience.
    • …