128 research outputs found

    Realistic tool-tissue interaction models for surgical simulation and planning

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in pre- and intra-operative surgical planning. Realistic modeling of medical interventions involving tool-tissue interactions is considered a key requirement in the development of high-fidelity simulators and planners. The soft-tissue constitutive laws, the organ geometry and boundary conditions imposed by the connective tissues surrounding the organ, and the shape of the surgical tool interacting with the organ are some of the factors that govern the accuracy of medical intervention planning. This thesis is divided into three parts. First, we compare the accuracy of linear and nonlinear constitutive laws for tissue. An important consequence of nonlinear models is the Poynting effect, in which shearing of tissue produces a normal force; this effect is not seen in a linear elastic model. The magnitude of the normal force for myocardial tissue is shown to be larger than the human contact force discrimination threshold. Further, in order to investigate and quantify the role of the Poynting effect in material discrimination, we perform a multidimensional scaling study. Second, we consider the effects of organ geometry and boundary constraints in needle path planning. Using medical images and tissue mechanical properties, we develop a model of the prostate and surrounding organs. We show that, for needle procedures such as biopsy or brachytherapy, organ geometry and boundary constraints have more impact on target motion than tissue material parameters. Finally, we investigate the effects of surgical tool shape on the accuracy of medical intervention planning. We consider the specific case of robotic needle steering, in which the asymmetry of a bevel-tip needle causes the needle to bend naturally when it is inserted into soft tissue.
We present analytical and finite element (FE) models for the loads developed at the bevel tip during needle-tissue interaction. The analytical model explains trends observed in the experiments. We incorporate physical parameters (rupture toughness and nonlinear material elasticity) into the FE model, which includes both contact and cohesive zone models to simulate tissue cleavage. The model shows that the tip forces are sensitive to the rupture toughness. In order to model the mechanics of needle deflection, we use an energy-based formulation that incorporates tissue-specific parameters, such as rupture toughness, nonlinear material elasticity, and interaction stiffness, as well as needle geometric and material properties. Simulation results follow trends (in deflection and radius of curvature) similar to those observed in macroscopic experimental studies of a robot-driven needle interacting with gels.
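The Poynting effect described above can be reproduced with a few lines of continuum mechanics. A minimal sketch, assuming an incompressible neo-Hookean solid in simple shear with a traction-free out-of-plane face; the shear modulus and shear amount below are illustrative values, not the thesis's myocardial parameters:

```python
import numpy as np

def neo_hookean_simple_shear(mu, gamma):
    """Cauchy stress for an incompressible neo-Hookean solid in simple shear.

    mu    : shear modulus (Pa)
    gamma : amount of shear (dimensionless)
    Returns (shear_stress, normal_stress); the normal stress is the Poynting
    term, which vanishes in a linear elastic model.
    """
    F = np.array([[1.0, gamma, 0.0],
                  [0.0, 1.0,   0.0],
                  [0.0, 0.0,   1.0]])          # simple-shear deformation gradient
    B = F @ F.T                                # left Cauchy-Green tensor
    p = mu * B[2, 2]                           # pressure from the traction-free z-face
    sigma = -p * np.eye(3) + mu * B            # Cauchy stress
    return sigma[0, 1], sigma[0, 0]            # mu*gamma and mu*gamma**2

tau, n1 = neo_hookean_simple_shear(mu=10e3, gamma=0.2)
# shear stress = mu*gamma = 2000 Pa; Poynting normal stress = mu*gamma**2 = 400 Pa
```

The quadratic dependence of the normal stress on the shear amount is why the effect only appears in nonlinear models: linearizing in gamma discards the mu*gamma**2 term.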

    Research on real-time physics-based deformation for haptic-enabled medical simulation

    This study developed an effective, multi-purpose visuo-haptic surgical engine that handles a variety of surgical manipulations in real time. Soft-tissue models are based on biomechanical experiments and continuum mechanics for greater accuracy. Such models will increase the realism of future training systems and of VR/AR/MR implementations for the operating room.

    Computer simulated needle manipulation of Chinese acupuncture with realistic haptic feedback.

    Leung Ka Man. Thesis submitted in August 2002. Thesis (M.Phil.)--Chinese University of Hong Kong, 2003. Includes bibliographical references (leaves 81-84). Abstracts in English and Chinese. The thesis is organized as follows. Chapter 1 (Introduction) covers surgical needle simulation (data sources, computer-aided training simulation, existing systems), the research goal, and the organization of the thesis. Chapter 2 (Haptization of Needle Interactions) covers data collection (force measurement, data correlation, expert opinion), haptic display devices (general-purpose and tailor-made), and haptic models for tissues (stiffness models, friction models, and modelling of needle operations). Chapter 3 (Haptic Rendering of Bi-directional Needle Manipulation) covers data source and pre-processing (virtual body surface construction, tissue mapping for haptic rendering), the PHANToM® haptic device, force profile analysis, haptic model construction for skin, adipose tissue, muscle, and bone, force composition (structure weight compensation, path constraint force, needle axial force), interactive calibration, and skin deformation. Chapter 4 covers parallel visual-haptic rendering (parallel network architecture, visual and haptic rendering pipelines). Chapter 5 covers the user interface (needle practice, moving mode, acupuncture atlas, training results, user controls, device calibration, and model settings). Chapter 6 concludes with a research summary, suggested improvements, and future research. Appendices provide a mapping table for tissues, an incremental viscoelastic model, and model parameter values.
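Chapters 2-3 above build the haptic model from per-tissue stiffness and friction terms composed into a needle axial force. A minimal layered sketch of such a model; the layer table (depths, stiffnesses, friction coefficients) is an illustrative assumption, not the thesis's calibrated parameter values:

```python
# Hypothetical layer table: (name, depth where layer starts in mm,
# stiffness in N/mm, friction in N per mm of embedded shaft).
LAYERS = [
    ("skin",    0.0, 0.8, 0.02),
    ("adipose", 3.0, 0.2, 0.01),
    ("muscle", 12.0, 0.5, 0.03),
    ("bone",   40.0, 5.0, 0.00),
]

def axial_force(depth_mm, velocity_sign=1):
    """Axial resistance felt at the needle handle at a given insertion depth.

    Spring term: stiffness of the layer currently holding the tip, acting over
    the penetration into that layer. Friction term: per-layer friction summed
    over the shaft length embedded in each layer, opposing the motion
    direction (velocity_sign = +1 inserting, -1 withdrawing).
    """
    spring = 0.0
    friction = 0.0
    for i, (name, start, k, mu) in enumerate(LAYERS):
        end = LAYERS[i + 1][1] if i + 1 < len(LAYERS) else float("inf")
        if depth_mm <= start:
            break                              # needle has not reached this layer
        embedded = min(depth_mm, end) - start  # shaft length inside this layer
        friction += mu * embedded
        if start < depth_mm <= end:            # tip is inside this layer
            spring = k * (depth_mm - start)
    return spring + velocity_sign * friction
```

A real implementation would also need a puncture event per layer boundary (force drop on rupture), which the thesis handles in its force profile analysis; this sketch only shows the stiffness-plus-friction composition.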

    Real-time hybrid cutting with dynamic fluid visualization for virtual surgery

    It is widely accepted that a reform of medical teaching must be made to meet today's high-volume training requirements. Virtual simulation offers a potential method of providing such training, and some current medical training simulators integrate haptic and visual feedback to enhance procedure learning. The purpose of this project is to explore the capability of Virtual Reality (VR) technology to develop a training simulator for surgical cutting and bleeding in general surgery.

    Haptics in Robot-Assisted Surgery: Challenges and Benefits

    Robotic surgery is transforming current surgical practice, not only by improving conventional surgical methods but also by introducing innovative robot-enhanced approaches that broaden the capabilities of clinicians. Being mainly of a man-machine collaborative type, surgical robots are seen as media that transfer pre- and intra-operative information to the operator and reproduce his/her motion, with appropriate filtering, scaling, or limiting, to physically interact with the patient. The field, however, is far from maturity and, more critically, is still a subject of controversy in medical communities. Limited or absent haptic feedback is reputed to be among the reasons that impede the further spread of surgical robots. In this paper, the objectives and challenges of deploying haptic technologies in surgical robotics are discussed, and a systematic review is performed of works that have studied the effects of providing haptic information to users in the major branches of robotic surgery. We encompass both classical works and state-of-the-art approaches, aiming to deliver a comprehensive and balanced survey both for researchers starting their work in this field and for experts.

    Haptic Training Simulator for Pedicle Screw Insertion in Scoliosis Surgery

    This thesis develops a haptic training simulator that imitates the sensations experienced by a surgeon during pedicle screw insertion in scoliosis surgery. Pedicle screw insertion is a common treatment for fixing spinal deformities in idiopathic scoliosis. Surgeons using the free-hand technique are guided primarily by haptic feedback. A vital step in this technique is the use of a probe to make a channel through the vertebral pedicle. This is a sensitive process that carries a risk of serious mechanical, neurological, and vascular complications. Surgeons are currently trained using cadavers or live patients. Cadavers often have vertebrae that are softer than those surgeons typically encounter, while training on live patients carries the obvious issue of increased risk of complications to the patient. In this thesis, a haptic virtual reality simulator is designed and studied as a training tool for this procedure. Creating a pathway through the pedicle with the free-hand technique involves two main degrees of freedom: rotation and linear progression. The rotary stage of the device, developed by a previous student, is enhanced in this research by adding hardware, improving the haptic model, and proposing techniques to couple the rotary and linear degrees of freedom. Haptic model parameters for a spine surgery with normal bone density are then clinically tuned within a user study. Over ten surgeons of varying experience levels used the simulator and were able to change various parameters in order to tune the simulator to what felt most realistic. The surgeons also evaluated the simulator for its feasibility and usefulness. Four research questions were investigated. First, can a reference set of values be found that replicates the surgeons' interpretation of the surgical scenario? Second, how are the rotary stage parameters influenced by the presence of linear effects? Third, do the results differ across expertise levels? Finally, can the simulator serve as a useful tool in the education of surgical trainees for teaching channel creation in pedicle screw insertion? Statistical analyses are carried out to examine these research questions. The results indicate the feasibility of the simulator for surgical education.
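One way to couple the rotary and linear degrees of freedom described above is to let the rendered torque depend on the linear advance. A minimal sketch of such a coupled feedback law; the functional form and every coefficient below are hypothetical illustrations, not the tuned values from the user study:

```python
def probe_feedback(omega, v, depth, b_rot=0.02, c_couple=0.3, k_wall=1.5):
    """Hypothetical coupled haptic model for free-hand channel creation.

    omega : rotary speed of the probe (rad/s)
    v     : linear advance speed (mm/s)
    depth : current channel depth (mm)
    Returns (torque, axial_force). The coupling term makes rotary resistance
    grow while the probe is simultaneously advanced, which is one plausible
    way to make the rotary stage parameters depend on linear effects.
    """
    # Viscous rotary resistance, amplified by forward linear motion.
    torque = b_rot * omega * (1.0 + c_couple * max(v, 0.0))
    # Axial resistance grows with depth; withdrawal is rendered softer.
    force = k_wall * depth * (1.0 if v >= 0 else 0.3)
    return torque, force

torque, force = probe_feedback(omega=10.0, v=2.0, depth=5.0)
```

In a real simulator these coefficients would be exactly what the surgeons tune in the user study, with separate values for cortical and cancellous bone.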

    3D Multimodal Interaction with Physically-based Virtual Environments

    The virtual has become a huge field of exploration for researchers: it can assist the surgeon, help the prototyping of industrial objects, simulate natural phenomena, act as a fantastic time machine, or entertain users through games and movies. Far beyond the mere visual rendering of the virtual environment, Virtual Reality aims at literally immersing the user in the virtual world. VR technologies simulate digital environments with which users can interact and thereby perceive, through different modalities, the effects of their actions in real time. The challenges are huge: the user's motions need to be captured and to have an immediate impact on the virtual world by modifying its objects in real time. In addition, the targeted immersion of the user is not only visual: auditory and haptic feedback need to be taken into account, merging all the sensory modalities of the user into a multimodal response. The global objective of my research activities is to improve 3D interaction with complex virtual environments by proposing novel approaches for physically-based and multimodal interaction. I have laid the foundations of my work on designing interactions with complex virtual worlds, referring to ever higher demands on the characteristics of the virtual environments. My research can be described within three main research axes inherent to the 3D interaction loop: (1) the physically-based modeling of the virtual world, to take into account the complexity of virtual object behavior, their topology modifications, and their interactions; (2) multimodal feedback, combining the sensory modalities into a global response from the virtual world to the user; and (3) the design of body-based 3D interaction techniques and devices (involving the head, both hands, the fingers, the legs, or even the whole body), establishing the interfaces between the user and the virtual world. All these contributions can be gathered in a general framework within the 3D interaction loop. By improving all the components of this framework, I aim at proposing approaches that can be used in future virtual reality applications, but also more generally in other areas such as medical simulation, gesture training, robotics, virtual prototyping for industry, and web content.

    Patient Specific Systems for Computer Assisted Robotic Surgery Simulation, Planning, and Navigation

    The evolving scenario of surgery, starting from modern surgery through the birth of medical imaging and the introduction of minimally invasive techniques, has in recent years seen the advent of surgical robotics. These systems, by making it possible to overcome the difficulties of endoscopic surgery, allow improved surgical performance and a better quality of intervention. Information technology has contributed to this evolution since the beginning of the digital revolution, providing innovative medical imaging devices and computer-assisted surgical systems. Afterwards, progress in computer graphics brought innovative visualization modalities for medical datasets, and later the birth of virtual reality paved the way for virtual surgery. Although many surgical simulators already exist, there are no patient-specific solutions. This thesis presents the development of patient-specific software systems for preoperative planning, simulation, and intraoperative assistance, designed for robotic surgery: in particular for bimanual robots, which are becoming the future of single-port interventions. The first software application is a virtual reality simulator for this kind of surgical robot. The system has been designed to validate the initial port placement and the operative workspace for the potential application of this surgical device. Given a bimanual robot with its own geometry and kinematics, and a patient-specific 3D virtual anatomy, the surgical simulator allows the surgeon to choose the optimal positioning of the robot and of the access port in the abdominal wall. Additionally, it makes it possible to evaluate in a virtual environment whether dexterous movability of the robot is achievable, avoiding unwanted collisions with the surrounding anatomy to prevent potential damage in the real surgical procedure. Even though the software has been designed for a specific bimanual surgical robot, it supports any open kinematic chain structure, as long as it can be described in our custom format. The robot's capability to accomplish specific tasks can be virtually tested using the deformable models: interacting directly with the target virtual organs while avoiding unwanted collisions with the surrounding anatomy not involved in the intervention. Moreover, the surgical simulator has been enhanced with algorithms and data structures to integrate biomechanical parameters into virtual deformable models (based on a mass-spring-damper network) of target solid organs, in order to properly reproduce the physical behaviour of the patient anatomy during the interactions. The main biomechanical parameters (Young's modulus and density) have been integrated, allowing the automatic tuning of some model network elements, such as the node mass and the spring stiffness. The spring damping coefficient has been modeled using the Rayleigh approach. Furthermore, the developed method automatically detects the external layer, allowing the usage of both surface and internal Young's moduli, in order to model the main parts of dense organs: the stroma and the parenchyma. Finally, the model can be manually tuned to represent lesions with specific biomechanical properties. Additionally, some software modules of the simulator have been extended for integration into a patient-specific computer guidance system for intraoperative navigation and assistance in robotic single-port interventions. This application provides guidance functionalities working in three different modalities: passive, as a surgical navigator; assistive, as a guide for single-port placement; and active, as a tutor preventing unwanted collisions during the intervention. The simulation system has been tested by five surgeons, simulating the robot access port placement and evaluating the robot's movability and workspace inside the patient's abdomen. The tested functionalities, rated by expert surgeons, have shown good quality and performance of the simulation. Moreover, the integration of biomechanical parameters into deformable models has been tested with various material samples. The results have shown good visual realism while ensuring the performance required by an interactive simulation. Finally, the intraoperative navigator has been tested by performing a cholecystectomy on a synthetic patient mannequin, in order to evaluate the intraoperative navigation accuracy, the network communication latency, and the overall usability of the system. The tests performed demonstrated the effectiveness and usability of the software systems developed, encouraging the introduction of the proposed solution into clinical practice and the implementation of further improvements. Surgical robotics will be enhanced by an advanced integration of medical images into software systems, allowing the detailed planning of surgical interventions by means of virtual surgery simulation based on patient-specific biomechanical parameters. Furthermore, the advanced functionalities offered by these systems enable surgical robots to improve intraoperative surgical assistance, benefiting from the knowledge of the virtual patient anatomy.
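The parameter mapping described above (node mass from density, spring stiffness from Young's modulus, Rayleigh damping for the spring damping coefficient) can be sketched as follows. The truss-analogy stiffness formula k = E*A/L and all numerical values are illustrative assumptions, not the thesis's actual tuning rules:

```python
def tune_network(node_volumes, edges, rest_lengths, cross_areas,
                 density, young_modulus, alpha=0.1, beta=0.01):
    """Tune a mass-spring-damper network from biomechanical parameters.

    node_volumes : volume share of each node (m^3), e.g. from a tetrahedralization
    edges        : list of (i, j) node index pairs
    rest_lengths : rest length of each spring (m)
    cross_areas  : effective cross-section of each spring (m^2)
    Node mass comes from density; spring stiffness from the truss analogy
    k = E*A/L; damping from the Rayleigh form c = alpha*m_eff + beta*k.
    Returns (masses, springs) with springs as (i, j, k, c) tuples.
    """
    masses = [density * v for v in node_volumes]
    springs = []
    for (i, j), length, area in zip(edges, rest_lengths, cross_areas):
        k = young_modulus * area / length          # stiffness from Young's modulus
        m_eff = 0.5 * (masses[i] + masses[j])      # effective mass at the spring
        c = alpha * m_eff + beta * k               # Rayleigh damping coefficient
        springs.append((i, j, k, c))
    return masses, springs

# Toy two-node network with hypothetical soft-tissue-like values.
masses, springs = tune_network(node_volumes=[1e-6, 1e-6], edges=[(0, 1)],
                               rest_lengths=[0.01], cross_areas=[1e-4],
                               density=1000.0, young_modulus=1e4)
```

Modeling the surface/internal Young's modulus split described above would amount to selecting a different `young_modulus` per edge depending on whether it lies on the detected external layer.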

    Development and Validation of a Hybrid Virtual/Physical Nuss Procedure Surgical Trainer

    With the continuous advancement and adoption of minimally invasive surgery, proficiency with the nontrivial surgical skills involved is becoming a greater concern. Consequently, the use of surgical simulation has been increasingly embraced for training and skill transfer purposes. Some systems utilize haptic feedback within a high-fidelity, anatomically correct virtual environment, whereas others use manikins, synthetic components, or box trainers to mimic the primary components of a corresponding procedure. Surgical simulation development for some minimally invasive procedures is still, however, suboptimal or otherwise embryonic. This is true for the Nuss procedure, which is a minimally invasive surgery for correcting pectus excavatum (PE), a congenital chest wall deformity. This work aims to address this gap by exploring the challenges of developing both a purely virtual and a purely physical simulation platform of the Nuss procedure and their implications in a training context. This work then describes the development of a hybrid mixed-reality system that integrates virtual and physical constituents, as well as an augmentation of the haptic interface, to reproduce the primary steps of the Nuss procedure and satisfy clinically relevant prerequisites for its training platform. Furthermore, this work carries out a user study to investigate the system's face, content, and construct validity, to establish its faithfulness as a training platform.