    Doctor of Philosophy

    Virtual environments provide a consistent and relatively inexpensive method of training individuals. They often include haptic feedback in the form of forces applied to a manipulandum or thimble to provide a more immersive and educational experience. However, the limited haptic feedback provided in these systems tends to be restrictive and frustrating to use. Providing tactile feedback in addition to this kinesthetic feedback can enhance the user's ability to manipulate and interact with virtual objects while providing a greater level of immersion. This dissertation advances the state of the art by providing a better understanding of tactile feedback and advancing combined tactile-kinesthetic systems. The tactile feedback described in this dissertation is provided by a finger-mounted device called the contact location display (CLD). Rather than displaying the entire contact surface, the device feeds back information only about the center of contact between the user's finger and a virtual surface. In prior work, the CLD used specialized two-dimensional environments to provide smooth tactile feedback. Using polygonal environments would greatly enhance the device's usefulness; however, the surface discontinuities created by the facets of these models are rendered through the CLD even when traditional force shading algorithms are applied. To address this issue, a haptic shading algorithm was developed to provide smooth tactile and kinesthetic interaction with general polygonal models. Two experiments were used to evaluate the shading algorithm. To better understand the design requirements of tactile devices, three separate experiments were run to evaluate the perception thresholds for cue localization, backlash, and system delay. These experiments establish quantitative design criteria for tactile devices, which can serve as the maximum (i.e., most demanding) device specifications for tactile-kinesthetic haptic systems in which the user experiences tactile feedback as a function of his or her limb motions. Lastly, a revised CLD was constructed and evaluated. By taking the newly established design criteria into account, the device became smaller and lighter while providing a full two-degree-of-freedom workspace that covers the bottom hemisphere of the finger. Two simple manipulation experiments were used to evaluate the new CLD device.
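
    The "traditional force shading" the abstract refers to is commonly implemented by interpolating vertex normals across each facet and rendering the contact force along the interpolated normal, much like Phong shading for graphics. The sketch below illustrates that baseline idea only (not the dissertation's own shading algorithm); the function names and the simple penalty-force model are illustrative assumptions.

```python
import numpy as np

def force_shading_normal(p, tri_vertices, vertex_normals):
    """Interpolate vertex normals at contact point p on a triangle
    (classic force shading, analogous to Phong shading).
    tri_vertices: 3x3 array of triangle vertex positions.
    vertex_normals: 3x3 array of per-vertex unit normals."""
    a, b, c = tri_vertices
    # Barycentric coordinates of p with respect to triangle (a, b, c).
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    u = 1.0 - v - w
    # Blend the vertex normals and renormalize; the force is rendered
    # along this smoothed normal instead of the facet normal.
    n = u * vertex_normals[0] + v * vertex_normals[1] + w * vertex_normals[2]
    return n / np.linalg.norm(n)

def shaded_force(penetration_depth, shaded_normal, stiffness=800.0):
    # Simple penalty force along the interpolated normal: F = k * d * n.
    # The stiffness value is an illustrative assumption.
    return stiffness * penetration_depth * shaded_normal
```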

    A hybrid method for haptic feedback to support manual virtual product assembly

    The purpose of this research is to develop methods to support manual virtual assembly using haptic (force) feedback in a virtual environment. The results of this research will be used in an engineering framework for assembly simulation, training, and maintenance. The key research challenge is to advance the ability of users to assemble complex, low-clearance CAD parts as they exist digitally, without the need to create expensive physical prototypes. The proposed method consists of a Virtual Reality (VR) system that combines voxel collision detection and boundary representation methods into a hybrid algorithm containing the necessary information for both force feedback and constraint recognition. The key to this approach is successfully developing the data structure and logic needed to switch between collision detection and constraint recognition while maintaining a haptic refresh rate of 1000 Hz. VR is a set of unique technologies that support human-centered computer interaction. Experience with current VR systems that simulate low-clearance assembly operations with haptic feedback indicates that such systems are highly desirable tools in the evaluation of preliminary designs, as well as in virtual training and maintenance processes. This work will result in a novel interface for assembly methods prototyping, one that allows intuitive interaction with parts based on a powerful combination of analytical, visual, and haptic tools.
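
    As a rough illustration of the switching logic the abstract describes, the following sketch shows a 1000 Hz servo loop that renders voxel-based penalty forces until a geometric constraint is recognized from the BREP data, then hands control to constraint guidance. All of the helper objects (device, voxel_model, brep_model, recognizer) are hypothetical placeholders, not an actual API from this work.

```python
import time

HAPTIC_DT = 0.001  # 1 ms period for a 1000 Hz servo loop

def haptic_loop(device, voxel_model, brep_model, recognizer):
    """Hypothetical hybrid servo loop: voxel collision forces until a
    BREP constraint is recognized, then constraint-guided forces."""
    mode = "voxel"
    constraint = None
    while device.running():
        t0 = time.perf_counter()
        pose = device.read_pose()
        if mode == "voxel":
            force = voxel_model.collision_force(pose)          # penalty force from voxel overlap
            constraint = recognizer.detect(pose, brep_model)   # e.g. a plane or axis mate
            if constraint is not None:
                mode = "constraint"
        else:
            force = constraint.guidance_force(pose)            # guide motion along the mate
            if constraint.released(pose):
                mode = "voxel"
        device.send_force(force)
        # Sleep off the remainder of the 1 ms budget to hold ~1000 Hz.
        time.sleep(max(0.0, HAPTIC_DT - (time.perf_counter() - t0)))
```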

    A novel haptic model and environment for maxillofacial surgical operation planning and manipulation

    This paper presents a practical method and a new haptic model to support the manipulation of bones and their segments during the planning of a surgical operation in a virtual environment using a haptic interface. To perform an effective dental surgery, it is important to have all operation-related information about the patient available beforehand in order to plan the operation and avoid complications. A haptic interface with an accurate virtual patient model to support the planning of bone cuts is therefore critical and useful for surgeons. The proposed system uses DICOM images taken from a digital tomography scanner and creates a mesh model of the filtered skull, from which the jaw bone can be isolated for further use. A novel solution for cutting the bones has been developed: the haptic tool is used to determine and define the cutting plane in the bone, and this approach creates three new meshes from the original model. In this way the computational cost is kept low, and real-time feedback can be achieved during all bone manipulations. During the cutting movement, a predefined friction profile in the haptic system simulates the force-feedback feel of the different densities in the bone.
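
    A density-dependent friction profile of this kind could, for example, take the form sketched below: a resistance force that opposes the cutting tool's motion and scales with the local bone density sampled from the CT data. The thresholds, units, and scaling are illustrative assumptions, not values from the paper.

```python
import numpy as np

def cutting_friction_force(tool_velocity, local_density,
                           min_density=200.0, max_density=1800.0,
                           max_friction=3.0):
    """Hypothetical friction profile: a force opposing the tool's motion,
    scaled by the local bone density sampled along the cutting plane
    (Hounsfield-like units assumed). Denser cortical bone resists more
    than trabecular bone."""
    speed = np.linalg.norm(tool_velocity)
    if speed < 1e-9:
        return np.zeros(3)
    # Normalize density into [0, 1] and scale the friction magnitude.
    d = np.clip((local_density - min_density) / (max_density - min_density), 0.0, 1.0)
    magnitude = max_friction * d
    # Friction acts opposite to the direction of tool motion.
    return -magnitude * tool_velocity / speed
```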

    Realistic Haptics Interaction in Complex Virtual Environments


    Constraint-based technique for haptic volume exploration

    We present a haptic rendering technique that uses directional constraints to facilitate enhanced exploration modes for volumetric datasets. The algorithm restricts user motion in certain directions by incrementally moving a proxy point along the axes of a local reference frame. Reaction forces are generated by a spring coupler between the proxy and the data probe, which can be tuned to the capabilities of the haptic interface. Secondary haptic effects, including field forces, friction, and texture, can easily be incorporated to convey information about additional characteristics of the data. We illustrate the technique with two examples: displaying fiber orientation in heart muscle layers and exploring diffusion tensor fiber tracts in brain white matter tissue. Initial evaluation of the approach indicates that haptic constraints provide an intuitive means for displaying directional information in volume data.
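
    A minimal sketch of the directional-constraint idea, assuming a 3x3 orthonormal local frame (for example, the fiber direction and two transverse axes) and a simple per-axis step limit; the spring coupler then generates the reaction force between proxy and probe. Parameter values and names are illustrative, not taken from the article.

```python
import numpy as np

def update_proxy(proxy, probe, frame_axes, free_axes, step=0.0005):
    """Move the proxy a small increment toward the probe, but only along
    the axes of the local reference frame that are marked free.
    frame_axes: 3x3 array whose rows are orthonormal axes.
    free_axes: iterable of booleans, True where motion is permitted."""
    delta = probe - proxy
    move = np.zeros(3)
    for axis, free in zip(frame_axes, free_axes):
        if free:
            # Project the tracking error onto the free axis and clamp the step.
            d = float(np.dot(delta, axis))
            move += np.clip(d, -step, step) * axis
    return proxy + move

def coupling_force(proxy, probe, stiffness=500.0):
    # Spring coupler between proxy and probe, tuned to the device's range.
    return stiffness * (proxy - probe)
```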

    BREP Identification During Voxel-Based Collision Detection for Haptic Manual Assembly

    This paper presents a novel method to tie geometric boundary representation (BREP) data to voxel-based collision detection for use in haptic manual assembly simulation. Virtual reality, in particular haptics, has been applied with promising results to improve preliminary product design, assembly prototyping, and maintenance operations. However, current methodologies do not support low-clearance assembly tasks, which limits the applicability of haptics to a small subset of potential situations. This paper discusses a new approach that combines highly accurate CAD geometry (boundary representation) with voxel models in a hybrid method, using both geometric constraint enforcement and voxel-based collision detection to provide stable haptic force feedback. With the methods presented here, BREP data can be accessed during voxel-based collision detection; this information can be used for constraint recognition and can lead to constraint guidance during the assembly process.
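
    One plausible way to make BREP data reachable from the voxel stage is to record, at voxelization time, which BREP faces contribute to each voxel, so that colliding voxels can be traced back to candidate faces at haptic rates. The sketch below assumes a hypothetical occupied_voxels helper on each face object and is not the paper's actual data structure.

```python
from collections import defaultdict

def build_voxel_to_brep_map(brep_faces, voxel_size):
    """Hypothetical preprocessing step: for each BREP face, record which
    voxels it occupies, so a colliding voxel can be traced back to
    candidate faces during the haptic loop."""
    voxel_to_faces = defaultdict(list)
    for face_id, face in enumerate(brep_faces):
        for key in face.occupied_voxels(voxel_size):   # assumed helper
            voxel_to_faces[key].append(face_id)
    return voxel_to_faces

def faces_for_collision(colliding_voxels, voxel_to_faces):
    """Collect the BREP faces touched by this frame's colliding voxels;
    these become the candidates handed to constraint recognition."""
    faces = set()
    for key in colliding_voxels:
        faces.update(voxel_to_faces.get(key, ()))
    return faces
```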

    Six Degrees-of-Freedom Haptic Interaction with Fluids

    In this work, we propose a novel approach that allows real-time six degree-of-freedom (DoF) haptic interaction with fluids of variable viscosity. Our haptic rendering technique, based on a Smoothed-Particle Hydrodynamics (SPH) physical model, provides realistic haptic feedback through physically based forces. Six-DoF haptic interaction with fluids is made possible by a new coupling scheme and a unified particle model that allow the use of arbitrarily shaped rigid bodies. In particular, fluid containers can be created to hold fluid and thereby transmit to the user force feedback from stirring, pouring, shaking, and scooping, among other interactions. In addition, we adapted an existing visual rendering algorithm to meet the frame-rate requirements of the haptic algorithms. We evaluate and illustrate the main features of our approach through different scenarios, highlighting the 6-DoF haptic feedback and the use of containers.
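
    On the rigid-body side of such a coupling, the per-particle fluid forces acting on a body's boundary particles are typically accumulated into a net force and torque about the center of mass, i.e. the 6-DoF wrench sent to the haptic device. The sketch below shows only that aggregation step; the SPH force computation and the paper's actual coupling scheme are not reproduced here.

```python
import numpy as np

def rigid_body_wrench(particle_positions, particle_forces, center_of_mass):
    """Sum per-particle fluid forces on a rigid body's boundary particles
    into a net force and torque about the center of mass (the 6-DoF
    wrench rendered to the user)."""
    force = np.zeros(3)
    torque = np.zeros(3)
    for p, f in zip(particle_positions, particle_forces):
        force += f
        # Torque contribution of each particle force about the center of mass.
        torque += np.cross(p - center_of_mass, f)
    return force, torque
```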

    The Use of the Voxmap Pointshell Method of Collision Detection in Virtual Assembly Methods Planning

    Virtual reality (VR) provides the ability to work with digital models in an environment that supports three-dimensional interaction. This technology can be used to evaluate how humans interact with products before costly physical prototypes are built. One of the advantages of using VR technology in design evaluation is the ability to easily explore many different what-if design scenarios. One area of current VR research is assembly methods planning. Prior work performed at Iowa State University made it clear that collision detection is an important component in the development of virtual assembly methods planning applications. This paper describes the use of the Voxmap Pointshell method of collision detection as applied to a general-purpose virtual assembly planning application.
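
    In broad strokes, the Voxmap Pointshell approach samples the moving object as a shell of points with inward-pointing normals and tests each point against a voxelized map of the static environment; points found inside occupied voxels contribute penalty forces along their normals. The sketch below assumes a voxmap_lookup callback that returns a penetration estimate (0 when free) and is only an illustration of that idea, not the method's actual implementation.

```python
import numpy as np

def pointshell_force(points, normals, voxmap_lookup, stiffness=0.5):
    """Illustrative Voxmap Pointshell-style force: each shell point that
    lands in an occupied voxel contributes a penalty force along its
    inward normal, scaled by the penetration estimate."""
    total = np.zeros(3)
    for p, n in zip(points, normals):
        depth = voxmap_lookup(p)   # assumed callback into the voxel map
        if depth > 0.0:
            total += stiffness * depth * n
    return total
```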