
    Haptic Hybrid Prototyping (HHP): An AR Application for Texture Evaluation with Semantic Content in Product Design

    The manufacture of prototypes is costly in economic and temporal terms, and carrying it out requires accepting certain deviations from the final finishes. This article proposes Haptic Hybrid Prototyping (HHP), a haptic-visual product prototyping method created to help product design teams evaluate and select the semantic information conveyed between product and user through a product's texturing and ribs in the early stages of conceptualization. To evaluate this tool, an experiment was conducted in which the haptic experience of interacting with final products was compared with that of interacting through the HHP. The interviewees' answers coincided in both situations in 81% of the cases. It was concluded that the HHP reveals the semantic information transmitted through haptic-visual means between product and user and makes it possible to quantify the clarity with which this information is transmitted. This new tool therefore reduces both the manufacturing lead time of prototypes and the length of the conceptualization phase, while providing information on the product's future success in the market and its economic return.

    Exploration of Reaction Pathways and Chemical Transformation Networks

    For the investigation of chemical reaction networks, the identification of all relevant intermediates and elementary reactions is mandatory. Many algorithmic approaches exist that perform such explorations efficiently and in an automated fashion. These approaches differ in their application range, the completeness of the exploration, and the amount of heuristics and human intervention required. Here, we describe and compare the different approaches based on these criteria. Future directions leveraging the strengths of chemical heuristics, human interaction, and physical rigor are discussed.
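The automated exploration the abstract describes can be pictured as a breadth-first walk over species and elementary steps. The following is only an illustrative skeleton under assumed interfaces (the `propose_reactions` callable is a hypothetical stand-in for whatever rule-based or electronic-structure step generator a real method uses), not the algorithm of any specific package:

```python
def explore_network(seeds, propose_reactions, max_iterations=10):
    """Breadth-first exploration of a chemical reaction network.

    seeds: initial species (hashable identifiers, e.g. SMILES strings)
    propose_reactions: callable mapping a species to an iterable of
        (product, reaction_label) pairs -- a hypothetical stand-in for
        the elementary-step generator of a concrete exploration method.
    Returns the set of discovered species and the list of reactions found.
    """
    known = set(seeds)
    frontier = list(seeds)
    reactions = []
    for _ in range(max_iterations):
        next_frontier = []
        for species in frontier:
            for product, label in propose_reactions(species):
                reactions.append((species, label, product))
                if product not in known:          # new intermediate found
                    known.add(product)
                    next_frontier.append(product)
        if not next_frontier:                     # exploration converged
            break
        frontier = next_frontier
    return known, reactions
```

The loop terminates when an iteration yields no new species, which is the simplest notion of a "complete" exploration; real methods bound the search with energy thresholds and heuristics instead.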

    Disassembly Sequence Evaluation: A User Study Leveraging Immersive Computing Technologies

    As interest in product recovery, reuse, and recycling rises, planning and evaluating disassembly sequences are becoming increasingly important. The manner in which a product can be taken apart strongly influences end-of-life (EOL) operations and costs. Early disassembly planning can also inform non-EOL processes including repair and routine maintenance. Recently, research has concentrated on creating optimization algorithms which automatically generate disassembly sequences. These algorithms often require data that are unavailable or estimated with high uncertainty. Furthermore, industries often employ CAD modeling software to evaluate disassembly sequences during the design stage. The combination of these methods results in mathematically generated solutions; however, these solutions may not account for attributes that are difficult to quantify, such as human interaction. To help designers better explore and understand disassembly sequence opportunities, the research presented in this paper combines the value of mathematical modeling with the benefits of immersive computing technologies (ICT) to aid in early design decision making. For the purposes of this research, an ICT application was developed. The application displays both 3D geometry of a product and an interactive graph visualization of existing disassembly sequences. The user can naturally interact with the geometric models and explore sequences outlined in the graph visualization. The calculated optimal path can be highlighted, allowing the user to quickly compare the optimal sequence against alternatives. The application has been implemented in a three-wall immersive projection environment. A user study involving a hydraulic pump assembly was conducted. The results suggest that this approach may be a viable method of evaluating disassembly sequences early in design.
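The "calculated optimal path" over a disassembly-sequence graph can be sketched with a standard shortest-path search. This is a minimal illustration with invented states and costs, not the paper's actual model or data:

```python
import heapq

def shortest_disassembly_path(graph, start, goal):
    """Dijkstra's shortest path over a disassembly-state graph.

    graph: dict mapping a state -> list of (next_state, cost) pairs,
    where cost might encode removal time, tooling, or difficulty.
    Returns (total_cost, path) or None if the goal is unreachable.
    """
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, state, path = heapq.heappop(queue)
        if state == goal:
            return cost, path
        if state in visited:
            continue
        visited.add(state)
        for nxt, c in graph.get(state, []):
            if nxt not in visited:
                heapq.heappush(queue, (cost + c, nxt, path + [nxt]))
    return None

# Hypothetical states: "A" = assembled, "D" = fully disassembled,
# "B"/"C" = intermediate partial disassemblies.
graph = {
    "A": [("B", 2), ("C", 5)],
    "B": [("D", 4)],
    "C": [("D", 3)],
}
```

In the paper's application, an ICT front end would highlight the returned path in the graph visualization so a user can compare it against alternative sequences by hand.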

    Computer Aided Drafting Virtual Reality Interface

    Computer Aided Drafting (CAD) is pervasive in engineering fields today. It has become indispensable for planning, creating, visualizing, troubleshooting, collaborating, and communicating designs before they exist in physical form. From the beginning, CAD was designed to be used by means of a mouse, keyboard, and monitor. Along the way, other, more specialized interface devices were created specifically for CAD that allowed easier and more intuitive navigation within a 3D space, but they were at best stopgap solutions. Virtual Reality (VR) allows users to navigate and interact with digital 3D objects and environments the same way they would in the real world. For this reason, VR is a natural CAD interface solution. With VR as an interface for CAD software, creating becomes more intuitive and visualizing second nature. For this project, a prototype VR CAD program was created in Unreal Engine for use with the HTC Vive and compared against traditional WIMP (windows, icons, menus, pointer) interface CAD programs on the time required to learn each program, the time required to create similar models, and impressions of use, specifically the intuitiveness of the user interface and of model manipulation. FreeCAD, SolidWorks, and Blender were the three traditional interface modeling programs chosen for comparison because of their widespread use for modeling in 3D printing, industry, and gaming, respectively. During the course of the project, two VR modeling programs were released, Google Blocks and MakeVR Pro; because they were of a similar type to the prototype software created in Unreal Engine, they were included in the comparison as well. The comparison showed that the VR CAD programs were faster to learn, faster for creating models, and more intuitive to use than the traditional interface CAD programs.

    Trajectory Deformations from Physical Human-Robot Interaction

    Robots are finding new applications where physical interaction with a human is necessary: manufacturing, healthcare, and social tasks. Accordingly, the field of physical human-robot interaction (pHRI) has leveraged impedance control approaches, which support compliant interactions between human and robot. However, a limitation of traditional impedance control is that, despite provisions for the human to modify the robot's current trajectory, the human cannot affect the robot's future desired trajectory through pHRI. In this paper, we present an algorithm for physically interactive trajectory deformations which, when combined with impedance control, allows the human to modulate both the actual and desired trajectories of the robot. Unlike related works, our method explicitly deforms the future desired trajectory based on forces applied during pHRI, but does not require constant human guidance. We present our approach and verify that this method is compatible with traditional impedance control. Next, we use constrained optimization to derive the deformation shape. Finally, we describe an algorithm for real-time implementation and perform simulations to test the arbitration parameters. Experimental results demonstrate a reduction in the human's effort and an improvement in movement quality when compared to pHRI with impedance control alone.
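The core idea, an applied force smoothly deforming a window of the future desired trajectory, can be sketched numerically. This is a loose illustration with a hand-picked half-sine deformation profile and an assumed gain `mu`, not the constrained-optimization shape the paper derives:

```python
import numpy as np

def deform_trajectory(waypoints, force, start, window, mu=0.1):
    """Deform future desired waypoints in response to an applied force.

    waypoints: (N, d) array of desired positions
    force: (d,) human force applied at waypoint index `start`
    window: number of future waypoints over which the deformation acts
    mu: assumed scalar gain arbitrating compliance vs. trajectory fidelity

    A half-sine profile spreads the deformation over the window: it is
    zero at both ends, so the deformed path stays continuous with the
    past trajectory and rejoins the original desired trajectory.
    """
    deformed = waypoints.astype(float).copy()
    n = min(window, len(waypoints) - start)
    profile = np.sin(np.linspace(0.0, np.pi, n))      # 0 -> peak -> 0
    deformed[start:start + n] += mu * np.outer(profile, force)
    return deformed
```

Past waypoints and waypoints beyond the window are untouched, mirroring the paper's point that the human shapes the *future* desired trajectory rather than merely perturbing the current state.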

    Realistic Virtual Cuts


    A system for rapid creation and assessment of conceptual large vehicle designs using immersive virtual reality

    Currently, new product concepts are often evaluated by developing detailed virtual part and assembly models with traditional computer aided design (CAD) tools, followed by appropriate analyses (e.g., finite element analysis, computational fluid dynamics). The creation of these models and analyses is tremendously time consuming. If a number of different conceptual configurations have been determined, it may not be possible to model and analyze each of them due to the complexity of these evaluation processes. Thus, promising concepts might be eliminated based solely on insufficient time and resources for assessment. In addition, the virtual models and analyses performed are usually of much higher detail and accuracy than is needed for such early assessment. By eliminating the time-consuming complexity of a CAD environment and incorporating qualitative assessment tools, engineers could spend more time evaluating concepts that might previously have been abandoned due to time constraints. To address these issues, the Advanced Systems Design Suite (ASDS) was created. The ASDS pairs a PC user interface with an immersive virtual reality (VR) environment to ease the creation and assessment of conceptual design prototypes, individually or collaboratively. Assessment tools incorporate metamodeling approximations and immersive visualization to evaluate the feasibility of each concept. In this paper, the ASDS system and interface, along with specifically designed immersive VR assessment tools such as state saving and dynamic viewpoint creation, are presented for conceptual large vehicle design. A test case example of redesigning an airplane is presented to explore the feasibility of the proposed system.
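Metamodeling approximations of the kind mentioned above replace an expensive analysis with a cheap surrogate fitted to a handful of sample evaluations. A minimal sketch with a quadratic surrogate (the `expensive_analysis` function and the sample points are invented for illustration, not taken from the ASDS):

```python
import numpy as np

# Stand-in for an expensive analysis run (e.g., a drag computation
# as a function of one design variable); purely hypothetical.
def expensive_analysis(x):
    return 0.5 * x**2 - 2.0 * x + 10.0

# Evaluate the expensive analysis at only a few sampled design points...
samples = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
responses = expensive_analysis(samples)

# ...and fit a cheap polynomial metamodel to those responses.
coeffs = np.polyfit(samples, responses, deg=2)
surrogate = np.poly1d(coeffs)

# The surrogate can then screen many candidate designs instantly,
# e.g. picking the candidate with the lowest predicted response.
candidates = np.linspace(0.0, 4.0, 9)
best = candidates[np.argmin(surrogate(candidates))]
```

In an interactive VR setting this matters because the surrogate answers in microseconds, letting feasibility feedback keep pace with a designer manipulating a concept in real time.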

    Virtual Reality Games for Motor Rehabilitation

    This paper presents a fuzzy logic based method to track user satisfaction without the need for devices that monitor users' physiological conditions. User satisfaction is key to any product's acceptance; computer applications and video games provide a unique opportunity to tailor the environment to each user to better suit their needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in Unreal Tournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature, which suggests physiological measurements are needed: we show that it is possible to estimate user emotion with a software-only method.
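As a loose illustration of the fuzzy-logic approach (the membership functions, inputs, and rules below are hypothetical, not the FLAME model or the paper's actual rule base): in-game events are fuzzified, a few rules map them to emotion levels, and the result is defuzzified into a single estimate.

```python
def triangular(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def estimate_frustration(deaths_per_min, progress_rate):
    """Tiny Mamdani-style inference with two invented rules:
       IF deaths high AND progress low  THEN frustration high
       IF deaths low  AND progress high THEN frustration low
    Returns a frustration estimate in [0, 1]."""
    deaths_high   = triangular(deaths_per_min, 1.0, 3.0, 5.0)
    deaths_low    = triangular(deaths_per_min, -1.0, 0.0, 2.0)
    progress_low  = triangular(progress_rate, -0.5, 0.0, 0.5)
    progress_high = triangular(progress_rate, 0.3, 1.0, 1.7)

    w_high = min(deaths_high, progress_low)   # rule 1 firing strength
    w_low = min(deaths_low, progress_high)    # rule 2 firing strength
    if w_high + w_low == 0:
        return 0.5                            # no rule fires: neutral
    # Defuzzify as the weighted average of the rule outputs (1.0 = high).
    return (w_high * 1.0 + w_low * 0.0) / (w_high + w_low)
```

The appeal of this style of model for the paper's argument is exactly that both inputs are observable in software (game telemetry), so no physiological sensors are required.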