16 research outputs found

    The challenges in computer supported conceptual engineering design

    Computer Aided Engineering Design (CAED) supports the engineering design process during detail design, but it is not commonly used in the conceptual design stage. This article explores through the literature why this is and how the engineering design research community is responding through the development of new conceptual CAED systems and HCI (Human Computer Interface) prototypes. First, the requirements and challenges for future conceptual CAED and HCI solutions to better support conceptual design are explored and categorised. Then the prototypes developed in both areas since 2000 are discussed. Characteristics already considered and those required for future development of CAED systems and HCIs are proposed and discussed, one of the key ones being experience. The prototypes reviewed offer innovative solutions, but each addresses only selected requirements of conceptual design and is thus unlikely to provide a solution that would fit the wider needs of the engineering design industry. More importantly, while the majority of prototypes show promising results, they are of low maturity and require further development.

    Design of a six degree-of-freedom haptic hybrid platform manipulator

    Thesis (Master)--Izmir Institute of Technology, Mechanical Engineering, Izmir, 2010. Includes bibliographical references (leaves: 97-103). Text in English; abstract in Turkish and English. xv, 115 leaves. The word haptic, derived from the ancient Greek haptios, means related to touch. As an area of robotics, haptics technology provides the sense of touch for robotic applications that involve interaction with a human operator and the environment. The sense of touch, accompanied by visual feedback, is enough to gather most of the information about a given environment. It increases the precision of teleoperation and the sensation levels of virtual reality (VR) applications by conveying physical properties of the environment such as forces, motions, and textures. Currently, haptic devices find use in many VR and teleoperation applications. The objective of this thesis is to design a novel six degree-of-freedom (DOF) haptic desktop device with a new structure that has the potential to increase the precision of haptics technology. First, previously developed haptic devices and manipulator structures are reviewed. Following this, conceptual designs are formed, and a hybrid-structured haptic device is designed, manufactured, and tested. The developed haptic device's control algorithm and VR application are developed in Matlab© Simulink. Integration of the mechanism with mechanical, electromechanical, and electronic components and the initial tests of the system are executed, and the results are presented. Based on these results, the performance of the developed device is discussed and future work is addressed.
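
    As a rough illustration of the kind of impedance-type control law such a desktop haptic device typically renders at its end-effector, the sketch below computes a spring-damper force for a virtual wall. It is a generic Python example, not the thesis's Matlab© Simulink model; the stiffness, damping, and wall location are assumed values.

```python
# Minimal sketch of a generic impedance-type haptic rendering step (virtual wall).
# Assumed parameter values; this is not the control algorithm developed in the thesis.
import numpy as np

K_WALL = 500.0   # virtual wall stiffness [N/m] (assumed)
B_WALL = 2.0     # virtual wall damping [N*s/m] (assumed)
WALL_Z = 0.0     # wall plane at z = 0 in device coordinates

def wall_force(position, velocity):
    """Spring-damper force pushing the tool out of a wall below z = 0."""
    penetration = WALL_Z - position[2]
    if penetration <= 0.0:
        return np.zeros(3)                       # no contact, no force
    fz = K_WALL * penetration - B_WALL * velocity[2]
    return np.array([0.0, 0.0, max(fz, 0.0)])    # never pull the tool into the wall

# One servo tick (a real device would run this at roughly 1 kHz):
pos = np.array([0.01, 0.02, -0.003])   # end-effector position [m]
vel = np.array([0.0, 0.0, -0.05])      # end-effector velocity [m/s]
print(wall_force(pos, vel))
```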

    Expressive cutting, deforming, and painting of three-dimensional digital shapes through asymmetric bimanual haptic manipulation

    Practitioners of the geosciences, design, and engineering disciplines communicate complex ideas about shape by manipulating three-dimensional digital objects to match their conceptual model. However, the two-dimensional control interfaces common in software applications create a disconnect from three-dimensional manipulation. This research examines cutting, deforming, and painting manipulations for expressive three-dimensional interaction. It presents a cutting algorithm specialized for planning cuts on a triangle mesh, the extension of a deformation algorithm to inhomogeneous meshes, and the definition of inhomogeneous meshes by painting into a deformation property map. The thesis explores two-handed interactions with haptic force feedback in which each hand can fulfill an asymmetric bimanual role. These digital shape manipulations demonstrate a step toward the creation of expressive three-dimensional interactions.
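
    To illustrate the idea of defining an inhomogeneous mesh by painting into a deformation property map, the sketch below blends a stiffness value into a per-vertex property array under a spherical brush. The brush falloff, property name, and toy mesh are assumptions for illustration; the thesis's actual cutting and deformation algorithms are not reproduced here.

```python
# Hypothetical sketch of "painting" a per-vertex deformation property on a triangle
# mesh: a brush stroke raises a stiffness weight near the brush centre.
import numpy as np

def paint_property(vertices, prop, brush_center, brush_radius, value, falloff=2.0):
    """Blend `value` into per-vertex property `prop` within the brush radius."""
    d = np.linalg.norm(vertices - brush_center, axis=1)
    w = np.clip(1.0 - d / brush_radius, 0.0, 1.0) ** falloff   # smooth radial falloff
    return prop * (1.0 - w) + value * w

# Toy mesh: four vertices of a unit square, uniform stiffness to start with.
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
stiffness = np.full(len(verts), 0.5)
stiffness = paint_property(verts, stiffness, np.array([0.0, 0.0, 0.0]), 0.8, 1.0)
print(stiffness)   # vertices near the brush end up stiffer than the rest
```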

    Effectiveness of haptic feedback coupled with the use of a head-mounted display for the evaluation of virtual mechanisms

    Adequate immersion in virtual environments is key to a successful virtual simulation experience. As people have more of a sense of being there (telepresence) when they experience a virtual simulation, their experience becomes more realistic and they are therefore able to make valid assessments of their environments. This thesis presents the results of a study focused on the evaluation of participants' perceptual and preferential differences between a haptic and a non-haptic virtual experience coupled with the use and non-use of a head-mounted display (HMD). Several measurements were used to statistically compare the performance of participants from four groups: haptic with the HMD, non-haptic with the HMD, haptic without the HMD, and non-haptic without the HMD. The study found that the virtual environment (VE) display type, either HMD or desktop monitor, affected participants' ability to detect mechanism differences related to motion, arm length, and distances (mechanism length and location), and influenced the amount of time required to evaluate each mechanism design during trial one. The treatment type (haptic or non-haptic) affected participants' ability to estimate mechanism differences, influenced the detection of mechanism arm length differences, and resulted in differences in the amount of time needed to evaluate each mechanism design. Regardless of which treatment participants initially experienced, they overwhelmingly preferred the haptic treatment to the non-haptic treatment. The results of this study will help scientists make more informed decisions related to haptic device utilization, head-mounted display use, and the interaction of the two. Several recommendations for future human-factors studies related to haptic sensation, HMD use, and virtual reality are also included.
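
    A 2x2 between-subjects design of this kind (display type x haptic treatment) is commonly analysed with a two-way ANOVA. The sketch below shows such an analysis on synthetic data; the cell sizes, response variable, and values are illustrative assumptions, not the study's data or procedure.

```python
# Hypothetical sketch of a two-way ANOVA for a 2x2 between-subjects design
# (display type x haptic treatment) on synthetic evaluation-time data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
groups = [("HMD", "haptic"), ("HMD", "non-haptic"),
          ("desktop", "haptic"), ("desktop", "non-haptic")]
rows = []
for display, treatment in groups:
    for _ in range(10):                                 # 10 synthetic participants per cell
        rows.append({"display": display,
                     "treatment": treatment,
                     "eval_time": rng.normal(60, 10)})  # seconds per mechanism (made up)
df = pd.DataFrame(rows)

model = smf.ols("eval_time ~ C(display) * C(treatment)", data=df).fit()
print(anova_lm(model, typ=2))                           # main effects and interaction
```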

    A Novel Haptics-Based Interface and Sculpting System for Physics-Based Geometric Design

    Standard free-form splines such as B-splines and NURBS are employed in a wide range of CAD/CAM systems. Conventional geometric modeling and design techniques using these popular splines often require tedious control-point manipulation and/or painstaking constraint specification (for functional requirements) via unnatural mouse-based computer interfaces. In this paper, we propose a novel and natural haptic interface and present a physics-based geometric modeling approach that supports the interactive sculpting of spline-based virtual material. Our desktop modeling system permits both expert and non-expert users to interactively deform virtual materials with real properties using force feedback. Using commercially available (and low-cost) haptic devices, modelers can feel the physically realistic presence of virtual spline objects such as B-splines throughout the design process. Our haptics-based B-spline is a special case of the more powerful dynamic NURBS (D-NURBS) models. We develop various haptic sculpting tools to expedite the deformation of B-spline surfaces with haptic feedback and constraints. The most significant contribution of this paper is that point, normal, and curvature constraints can be specified interactively and modified naturally using forces. To achieve real-time sculpting performance, we devise a novel dual representation for B-spline surfaces in both physical and mathematical space: the physics-based mass-spring model is mathematically constrained by the B-spline surface throughout the sculpting session. We envision that the integration of haptics with traditional computer-aided design makes it possible to realize all the potential offered by both haptic sculpting and physics-based modeling in computer-integrated design, vir…
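
    The force-based point constraint idea can be illustrated with a minimal mass-spring sketch in which a user-specified target pulls one node of a control lattice through a spring force. This is a generic illustration under assumed stiffness, damping, and time-step values, not the paper's D-NURBS formulation or its dual physical/mathematical representation.

```python
# Hypothetical sketch of a force-based point constraint: a target point pulls one
# node of a mass-spring control lattice via a spring force (explicit integration).
import numpy as np

def step(nodes, velocities, constraint_idx, target, k=50.0, damping=0.9, dt=0.01):
    """One integration step with a point-constraint spring acting on a single node."""
    forces = np.zeros_like(nodes)
    forces[constraint_idx] = k * (target - nodes[constraint_idx])   # constraint spring
    velocities = damping * (velocities + dt * forces)               # unit masses assumed
    return nodes + dt * velocities, velocities

# 3x3 grid of control nodes in the z = 0 plane; drag the centre node up to z = 0.5.
grid = np.array([[x, y, 0.0] for x in range(3) for y in range(3)], dtype=float)
vel = np.zeros_like(grid)
for _ in range(200):
    grid, vel = step(grid, vel, constraint_idx=4, target=np.array([1.0, 1.0, 0.5]))
print(grid[4])   # the centre node has been pulled toward the target point
```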

    Factors Affecting Human Force Perception and Performance in Haptic-Enabled Virtual Environments

    Haptic technology enables computer users to touch and/or manipulate virtual objects in virtual environments (VEs). Similar to other human-in-the-loop applications, haptic applications require interactions between humans and computers. Thus, human-factors studies are required to recognize the limitations and capabilities of the user. This thesis establishes human-factors criteria to improve various haptic applications such as perception-based haptic compression techniques and haptic-enabled computer-aided design (CAD). Today, data compression plays a significant role in the transmission of haptic information, since the efficient use of the available bandwidth is a concern. Most lossy haptic compression techniques rely on the limitations of human force perception, which are exploited in the design of perception-based compression schemes. Researchers have studied force perception when a user is in static interaction with a stationary object. This thesis focuses on cases where the human user and the object are in relative motion. The limitations of force perception are quantified using psychophysical methods, and the effects of several factors, including user hand velocity and sensory adaptation, are investigated. The results indicate that fewer haptic details need to be calculated or transmitted when the user's hand is in motion. In traditional CAD systems, users usually design virtual prototypes using a mouse via their vision system only, and it is difficult to design curved surfaces due to the number, shape, and position of the curves. Adding haptics to CAD systems enables users to explore and manipulate virtual objects using the sense of touch. In addition, human performance is important in CAD environments. To maintain accuracy, active haptic manipulation of the user response can be incorporated in CAD applications. This thesis investigates the effect of forces on the accuracy of movement in VEs. The results indicate that factors such as the base force intensity and force increment/decrement can be incorporated in the control of users' movements in VEs. In other words, we can pull/push the users' hands by increasing/decreasing the force without the users being aware of it.
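
    One widely used way to exploit force-perception limits for data reduction is a deadband scheme that transmits a new force sample only when it differs from the last transmitted one by more than a Weber-fraction threshold. The sketch below illustrates this idea; the 10% threshold and the test signal are assumed placeholders, not the thesis's measured JND values or its specific compression method.

```python
# Hypothetical sketch of perception-based (deadband) haptic data reduction:
# send a sample only when it leaves the perceptual deadband around the last sent one.
import numpy as np

def deadband_compress(forces, weber_fraction=0.10):
    """Return indices of the samples that would actually be transmitted."""
    sent = [0]                                   # always send the first sample
    last = forces[0]
    for i, f in enumerate(forces[1:], start=1):
        if abs(f - last) > weber_fraction * max(abs(last), 1e-6):
            sent.append(i)
            last = f
    return sent

# A slowly varying 1 kHz force signal: most samples fall inside the deadband.
t = np.linspace(0.0, 1.0, 1000)
force = 2.0 + 0.5 * np.sin(2 * np.pi * t)
kept = deadband_compress(force)
print(f"transmitted {len(kept)} of {len(force)} samples")
```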

    Tactile Haptics: A Study of Roughness Perception in Virtual Environments

    This thesis presents the design of a tactile device that can display varying magnitudes of roughness. The device is designed to be attached to an existing force-feedback device to create a package able to display both macro-level (force feedback) and micro-level (tactile feedback) information to users. The device allows users to feel a simulated texture by placing an index finger on an aperture. The stimulus is created with a spiral brush made of nylon bristles. The brush is attached to a DC motor, and the speed and direction of rotation of the brush are used to generate textures at the fingertip through the aperture. Three psychophysical experiments are conducted to study the effects of speed and direction on roughness perception. The first experiment investigates the sensitivity to a change in the speed of the brush. It is conducted for two levels of base speed, and it is found that as the base speed increases, the just noticeable difference (JND) with respect to speed decreases. In the second experiment, it is found that this tactile device is able to represent rough textures, such as sandpaper. It is also found that human roughness perception cannot be described in a unique manner: two opposite definitions of rough textures are identified in this experiment. While some users relate an increase in the speed of the brush to increasing roughness, others relate it to decreasing roughness. Further, the results show that the effect of direction on roughness perception is insignificant for both groups of users. In the third experiment, the effects of direction are studied more closely by presenting the two directions successively with a time gap of 0.5 s. It is found that with this small time gap, users are able to discriminate between directions, unlike in the previous experiment; roughness perception is affected by the change in direction when the time gap is small. These findings open further areas that need to be investigated before a robust tactile device can be designed.
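
    JND estimates of this kind are often obtained with adaptive staircase procedures. The sketch below runs a simple 1-up/1-down staircase against a simulated observer; the base speed, step size, stopping rule, and observer model are illustrative assumptions, not the thesis's experimental protocol.

```python
# Hypothetical sketch of a 1-up/1-down adaptive staircase for estimating a speed JND.
# The simulated observer stands in for a real participant; all values are assumed.
import numpy as np

rng = np.random.default_rng(1)
BASE_SPEED = 100.0        # brush base speed in rpm (assumed unit)
TRUE_JND = 12.0           # hidden threshold of the simulated observer

def observer_detects(delta):
    """Simulated participant: detection probability rises with the speed increment."""
    p = 1.0 / (1.0 + np.exp(-(delta - TRUE_JND) / 3.0))
    return rng.random() < p

delta, step, reversals, history = 30.0, 4.0, 0, []
while reversals < 8:
    detected = observer_detects(delta)
    history.append((delta, detected))
    new_delta = delta - step if detected else delta + step   # 1-up/1-down rule
    if len(history) > 1 and detected != history[-2][1]:      # response flip = reversal
        reversals += 1
    delta = max(new_delta, 0.0)

estimate = np.mean([d for d, _ in history[-8:]])             # rough threshold estimate
print(f"estimated speed JND of roughly {estimate:.1f} rpm above base {BASE_SPEED} rpm")
```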