8 research outputs found

    Asynchronous haptic simulation of contacting deformable objects with variable stiffness

    Get PDF
    Abstract: This paper presents a new asynchronous approach for haptic rendering of deformable objects. When stiff nonlinear deformations take place, they introduce large and rapid variations in the force sent to the user. This problem is similar to that of the stiff virtual wall, for which a high refresh rate is required to obtain stable haptic feedback. However, when dealing with several interacting deformable objects, it is usually impossible to simulate all objects at high rates. To address this problem, we propose a quasi-static framework that allows for stable interactions between asynchronously computed deformable objects. In the proposed approach, one deformable object can be computed at a high refresh rate, while the remaining deformable virtual objects are computed at low refresh rates. Moreover, contacts and other constraints between the different objects of the virtual environment are accurately solved using a shared Linear Complementarity Problem (LCP). Finally, we demonstrate our method on two test cases: a snap-in example involving nonlinear deformations and a virtual thread interacting with a deformable object.
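The shared LCP mentioned in the abstract is the standard formulation for unilateral contact: find contact impulses that are non-negative, produce non-negative separations, and are complementary to them. A minimal sketch of solving such an LCP with projected Gauss-Seidel (a common choice for contact problems; the paper does not specify its solver, and the matrix/variable names here are assumptions):

```python
import numpy as np

def solve_contact_lcp(A, b, iters=200):
    """Projected Gauss-Seidel for the contact LCP:
        w = A @ lam + b,  lam >= 0,  w >= 0,  lam . w = 0,
    where lam holds contact impulses and w the resulting gaps.
    A is the (symmetric positive definite) Delassus operator."""
    n = b.shape[0]
    lam = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            # Residual from all other contacts, excluding contact i itself.
            r = b[i] + A[i] @ lam - A[i, i] * lam[i]
            # Project onto the non-negative impulses.
            lam[i] = max(0.0, -r / A[i, i])
    return lam
```

Because the LCP couples all objects through one system, asynchronously simulated objects can still share consistent contact forces, which is the point of the paper's shared formulation.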

    Haptic rendering of complex deformations through handle-space force linearization

    Full text link
    The force-update-rate requirements of transparent rendering of virtual environments are in conflict with the computational cost required for computing complex interactions between deforming objects. In this paper we introduce a novel method for satisfying high force update rates with deformable objects, while retaining the visual quality of complex deformations and interactions. The objects that are haptically manipulated may have many degrees of freedom, but haptic interaction is often implemented in practice through low-dimensional force-feedback devices. We exploit the low-dimensional domain of the interaction to devise a novel linear approximation of interaction forces that can be efficiently evaluated at force-update rates. Moreover, our linearized force model is time-implicit, which implies that it accounts for contact constraints and the internal dynamics of deforming objects. In this paper we show examples of haptic interaction in complex situations such as large deformations, collision between deformable objects (with friction), and even self-collision.
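The core idea above is a first-order force model in the handle space: the slow simulation thread periodically publishes a linearization point and a force Jacobian, and the fast haptic loop evaluates the cheap linear model between updates. A minimal sketch, assuming a 3-DOF handle (the class and attribute names are illustrative, not the paper's API):

```python
import numpy as np

class LinearizedHandleForce:
    """Linear force model in the low-dimensional handle space:
        F(x) ~= F0 + K @ (x - x0)
    x0, F0, and the 3x3 Jacobian K come from the slow simulation
    thread; force() is cheap enough for the 1 kHz haptic loop."""

    def update(self, x0, F0, K):
        # Called at the (low) simulation rate with a fresh linearization.
        self.x0, self.F0, self.K = x0, F0, K

    def force(self, x):
        # Called at the (high) haptic rate with the current handle position.
        return self.F0 + self.K @ (x - self.x0)
```

With `K = -k * np.eye(3)` and `F0 = 0` this degenerates to a plain virtual spring; the paper's contribution is computing `K` implicitly so it reflects contact constraints and internal dynamics rather than free-space stiffness alone.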

    Multi-scale simulation for microsurgery trainer

    Full text link

    Virtual environments for medical training : graphic and haptic simulation of tool-tissue interactions

    Get PDF
    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2004. Includes bibliographical references (leaves 122-127). For more than 2,500 years, surgical teaching has been based on the so-called "see one, do one, teach one" paradigm, in which the surgical trainee learns by operating on patients under the close supervision of peers and superiors. However, higher demands on the quality of patient care and rising malpractice costs have made it increasingly risky to train on patients. Minimally invasive surgery, in particular, has made it more difficult for an instructor to demonstrate the required manual skills. It has been recognized that, similar to flight simulators for pilots, virtual reality (VR) based surgical simulators promise a safer and more comprehensive way to train the manual skills of medical personnel in general and surgeons in particular. One of the major challenges in the development of VR-based surgical trainers is the real-time and realistic simulation of interactions between surgical instruments and biological tissues. It involves multi-disciplinary research areas, including soft tissue mechanical behavior, tool-tissue contact mechanics, computer haptics, computer graphics, and robotics, integrated into VR-based training systems. The research described in this thesis addresses many of the problems of simulating tool-tissue interactions in medical virtual environments. First, two kinds of physically based real-time soft tissue models -- the local deformation model and the hybrid deformation model -- were developed to compute interaction forces and visual deformation fields that provide real-time feedback to the user. Second, a system to measure the in vivo mechanical properties of soft tissues was designed, and eleven sets of animal experiments were performed to measure the in vivo and in vitro biomechanical properties of porcine intra-abdominal organs. Viscoelastic tissue parameters were then extracted by matching finite element model predictions with the empirical data. Finally, the tissue parameters were combined with geometric organ models segmented from the Visible Human Dataset and integrated into a minimally invasive surgical simulation system consisting of haptic interface devices inside a mannequin and a graphic display. This system was used to demonstrate deformation and cutting of the esophagus, where the user can haptically interact with the virtual soft tissues and see the corresponding organ deformation on the visual display at the same time. By Jung Kim. Ph.D.
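The parameter-extraction step above fits a viscoelastic model to measured force data. A drastically simplified sketch of the same idea, replacing the thesis's finite element model with a lumped Kelvin-Voigt element F = k*x + c*v fitted by least squares (the function and variable names are assumptions for illustration):

```python
import numpy as np

def fit_kelvin_voigt(disp, vel, force):
    """Least-squares fit of a lumped Kelvin-Voigt model
        F = k * x + c * v
    to measured (displacement, velocity, force) samples.
    This stands in for the thesis's far richer FEM-matching
    procedure; it recovers one stiffness k and one damping c."""
    X = np.column_stack([disp, vel])      # design matrix [x, v]
    (k, c), *_ = np.linalg.lstsq(X, force, rcond=None)
    return k, c
```

Real tissue data would need the frequency- and strain-dependent terms the thesis identifies, but the fitting principle (minimize the model/measurement mismatch over the parameters) is the same.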

    Haptics-based Modeling and Simulation of Micro-Implants Surgery

    Get PDF
    Thesis (Ph.D.), Doctor of Philosophy

    Surface Geometry and the Haptic Rendering of Rigid Point Contacts

    Get PDF
    This thesis examines the haptic rendering of rigid point contacts in virtual simulations. The haptic renderers generate force feedback so that the operator can interact with the virtual scenes in a realistic way. They must be able to recreate the physical phenomena experienced in the real world without displaying any haptic artifacts. The existing renderers are decomposed into a projection function and a regulation scheme. It is shown that the pop-through artifact, whereby the virtual tool instantaneously jumps between two distant surface points, is caused whenever the operator encounters a singularity within the renderer's projection function. This was well known for minimum-distance-based renderers, but it is shown here that such singularities arise with constraint-based renderers as well. A new projection function is designed to minimize the existence of singularities within the model. When paired with an appropriate regulation scheme, this forms the proposed mapping renderer. The new projection is calculated by mapping the model onto a canonical shape where the haptic problem is trivial, e.g. a circle in the case of a 2D model of genus zero, which avoids pop-through on smooth models. The haptic problem is then recast as a virtual constraint problem, where the traditional regulation schemes, designed originally for planar surfaces, are shown to introduce a velocity-dependent error on curved surfaces that can distort the model's rendering and to couple the regulation towards the constraint with the dynamics along it. Set stabilization control, based on feedback linearizing the haptic device with respect to a virtual output consisting of coordinates transversal and tangential to the model surface, is proposed as an alternative. It is shown to be able to decouple the system into transversal and tangential subsystems that can then be made asymptotically stable and assigned arbitrary dynamics, respectively.
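The canonical-shape idea above can be illustrated in the simplest case the abstract names: a 2D star-shaped model whose boundary is a radius function r(theta). Projecting the tool point by its angle (rather than to the nearest surface point) varies continuously with the tool position, which is what avoids pop-through. A minimal sketch; `radius_fn` is a hypothetical boundary parameterization, not the thesis's actual mapping:

```python
import numpy as np

def project_via_circle(p, radius_fn):
    """Project a 2D tool point p onto a star-shaped model boundary
    given as r = radius_fn(theta), by going through the canonical
    circle: take the point's angle, then evaluate the boundary there.
    The angle map is continuous away from the origin, so the surface
    point never jumps discontinuously as p moves (no pop-through)."""
    theta = np.arctan2(p[1], p[0])
    r = radius_fn(theta)
    return np.array([r * np.cos(theta), r * np.sin(theta)])
```

A nearest-point projection on a non-convex boundary can flip between two distant candidates as the tool crosses the medial axis; the angle-based map has no such tie-breaking, at the cost of requiring a star-shaped (genus-zero) model.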