
Virtual environments for medical training: graphic and haptic simulation of tool-tissue interactions

Abstract

Thesis (Ph.D.) -- Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2004. By Jung Kim.
Includes bibliographical references (leaves 122-127).

For more than 2,500 years, surgical teaching has been based on the so-called "see one, do one, teach one" paradigm, in which the surgical trainee learns by operating on patients under the close supervision of peers and superiors. However, higher demands on the quality of patient care and rising malpractice costs have made it increasingly risky to train on patients. Minimally invasive surgery, in particular, has made it more difficult for an instructor to demonstrate the required manual skills. It has been recognized that, similar to flight simulators for pilots, virtual reality (VR) based surgical simulators promise a safer and more comprehensive way to train the manual skills of medical personnel in general and surgeons in particular.

One of the major challenges in the development of VR-based surgical trainers is the real-time and realistic simulation of interactions between surgical instruments and biological tissues. This draws on multidisciplinary research areas, including soft tissue mechanical behavior, tool-tissue contact mechanics, computer haptics, computer graphics, and robotics, integrated into VR-based training systems. The research described in this thesis addresses many of the problems of simulating tool-tissue interactions in medical virtual environments.

First, two kinds of physically based real-time soft tissue models - the local deformation model and the hybrid deformation model - were developed to compute interaction forces and visual deformation fields that provide real-time feedback to the user. Second, a system to measure in vivo mechanical properties of soft tissues was designed, and eleven sets of animal experiments were performed to measure in vivo and in vitro biomechanical properties of porcine intra-abdominal organs. Viscoelastic tissue parameters were then extracted by matching finite element model predictions with the empirical data. Finally, the tissue parameters were combined with geometric organ models segmented from the Visible Human Dataset and integrated into a minimally invasive surgical simulation system consisting of haptic interface devices inside a mannequin and a graphic display. This system was used to demonstrate deformation and cutting of the esophagus, where the user can haptically interact with the virtual soft tissues and see the corresponding organ deformation on the visual display at the same time.
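The abstract describes the local and hybrid deformation models only at a high level. As a rough illustration of the haptic side of such a pipeline, the sketch below computes a penalty-style reaction force from tool penetration into a locally deformable surface patch. The stiffness and damping values, the penalty formulation, and the function names are assumptions made here for illustration; they are not the models developed in the thesis.

```python
# Hypothetical sketch: penalty-based reaction force for a tool tip
# contacting a locally deformable tissue patch. Parameter values and
# the spring-damper law itself are assumed for illustration only.
import numpy as np

K_TISSUE = 200.0   # N/m, assumed local contact stiffness
B_TISSUE = 1.5     # N*s/m, assumed local damping

def reaction_force(tool_pos, tool_vel, surface_point, surface_normal):
    """Force pushing the tool back along the tissue surface normal."""
    penetration = np.dot(surface_point - tool_pos, surface_normal)
    if penetration <= 0.0:          # tool is outside the tissue
        return np.zeros(3)
    pen_rate = -np.dot(tool_vel, surface_normal)
    magnitude = K_TISSUE * penetration + B_TISSUE * pen_rate
    return max(magnitude, 0.0) * surface_normal

# One haptic-loop step: tool tip 2 mm below a horizontal tissue surface.
f = reaction_force(np.array([0.0, 0.0, -0.002]),
                   np.array([0.0, 0.0, -0.01]),
                   np.array([0.0, 0.0, 0.0]),
                   np.array([0.0, 0.0, 1.0]))
print(f)   # approximately [0, 0, 0.415] N
```

In practice such a force law would run inside the high-rate haptic loop, while a slower loop updates the visual deformation field.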
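The parameter-extraction step, matching finite element predictions to the in vivo measurements, is likewise only summarized. The sketch below shows the general flavor of such a fit using a one-term Prony-series relaxation curve and least squares; the relaxation form, the synthetic data, and the parameter names are illustrative assumptions, not the thesis's inverse finite element procedure.

```python
# Hypothetical sketch: fit a lumped viscoelastic relaxation model to
# indentation force-relaxation data. A one-term Prony series
# F(t) = F_inf + dF * exp(-t / tau) stands in as a simplified surrogate
# for the full finite element matching described in the abstract.
import numpy as np
from scipy.optimize import curve_fit

def relaxation_force(t, f_inf, f_delta, tau):
    """Force relaxation under a held indentation (one-term Prony series)."""
    return f_inf + f_delta * np.exp(-t / tau)

# Synthetic "measured" data standing in for an in vivo indentation test.
t = np.linspace(0.0, 10.0, 200)                      # seconds
f_true = relaxation_force(t, 0.8, 0.5, 1.5)          # newtons
f_meas = f_true + 0.01 * np.random.default_rng(0).normal(size=t.size)

# Least-squares fit of the viscoelastic parameters to the data.
(p_inf, p_delta, p_tau), _ = curve_fit(
    relaxation_force, t, f_meas, p0=(1.0, 1.0, 1.0)
)
print(f"F_inf = {p_inf:.3f} N, dF = {p_delta:.3f} N, tau = {p_tau:.3f} s")
```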
