    Realistic tool-tissue interaction models for surgical simulation and planning

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in pre- and intra-operative surgical planning. Realistic modeling of medical interventions involving tool-tissue interactions is considered a key requirement in the development of high-fidelity simulators and planners. The soft-tissue constitutive laws, the organ geometry and boundary conditions imposed by the connective tissues surrounding the organ, and the shape of the surgical tool interacting with the organ are some of the factors that govern the accuracy of medical intervention planning.
    This thesis is divided into three parts. First, we compare the accuracy of linear and nonlinear constitutive laws for tissue. An important consequence of nonlinear models is the Poynting effect, in which shearing of tissue produces a normal force; this effect is not seen in a linear elastic model. The magnitude of the normal force for myocardial tissue is shown to be larger than the human contact force discrimination threshold. Further, to investigate and quantify the role of the Poynting effect in material discrimination, we perform a multidimensional scaling study. Second, we consider the effects of organ geometry and boundary constraints in needle path planning. Using medical images and tissue mechanical properties, we develop a model of the prostate and surrounding organs. We show that, for needle procedures such as biopsy or brachytherapy, organ geometry and boundary constraints have more impact on target motion than tissue material parameters. Finally, we investigate the effects of surgical tool shape on the accuracy of medical intervention planning. We consider the specific case of robotic needle steering, in which the asymmetry of a bevel-tip needle causes the needle to bend naturally as it is inserted into soft tissue. We present analytical and finite element (FE) models for the loads developed at the bevel tip during needle-tissue interaction. The analytical model explains trends observed in the experiments. The FE model incorporates physical parameters (rupture toughness and nonlinear material elasticity) and includes both contact and cohesive zone models to simulate tissue cleavage; it shows that the tip forces are sensitive to the rupture toughness. To model the mechanics of needle deflection, we use an energy-based formulation that incorporates tissue-specific parameters (rupture toughness, nonlinear material elasticity, and interaction stiffness) as well as needle geometric and material properties. Simulation results follow trends (deflection and radius of curvature) similar to those observed in macroscopic experimental studies of a robot-driven needle interacting with gels.
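
    The Poynting effect mentioned in this abstract can be illustrated with a minimal sketch, assuming an incompressible neo-Hookean material in simple shear (the shear modulus below is an illustrative placeholder, not the myocardial value reported in the thesis): a linear elastic model predicts only the shear stress Ό·γ, whereas the nonlinear model also develops a normal stress Ό·γ², i.e. a normal force under shear.

```python
import numpy as np

def simple_shear_stresses_neo_hookean(mu, gamma):
    """Cauchy stresses for an incompressible neo-Hookean solid in simple shear.

    F = [[1, g, 0], [0, 1, 0], [0, 0, 1]];  sigma = -p*I + mu*B, with the
    pressure p fixed by the traction-free choice sigma_33 = 0.
    """
    F = np.array([[1.0, gamma, 0.0],
                  [0.0, 1.0,   0.0],
                  [0.0, 0.0,   1.0]])
    B = F @ F.T
    p = mu * B[2, 2]                     # enforce sigma_33 = 0
    return -p * np.eye(3) + mu * B

mu = 10e3                                # shear modulus in Pa (illustrative value)
for gamma in (0.05, 0.1, 0.2):
    s = simple_shear_stresses_neo_hookean(mu, gamma)
    # Linear elasticity predicts only the shear stress mu*gamma and no normal stress;
    # here sigma_11 = mu*gamma**2 appears in addition (Poynting-type effect).
    print(f"gamma={gamma:.2f}  shear s12={s[0,1]:.1f} Pa  normal s11={s[0,0]:.1f} Pa")
```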

    Haptics in Robot-Assisted Surgery: Challenges and Benefits

    Robotic surgery is transforming current surgical practice, not only by improving conventional surgical methods but also by introducing innovative robot-enhanced approaches that broaden the capabilities of clinicians. Being mainly of a man-machine collaborative type, surgical robots are seen as media that transfer pre- and intra-operative information to the operator and reproduce his or her motion, with appropriate filtering, scaling, or limitation, to physically interact with the patient. The field, however, is far from maturity and, more critically, is still a subject of controversy in medical communities. Limited or absent haptic feedback is reputed to be among the reasons that impede the further spread of surgical robots. In this paper, the objectives and challenges of deploying haptic technologies in surgical robotics are discussed, and a systematic review is performed of works that have studied the effects of providing haptic information to users in the major branches of robotic surgery. We have tried to encompass both classical works and state-of-the-art approaches, aiming to deliver a comprehensive and balanced survey both for researchers starting their work in this field and for experts.
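
    The motion scaling and force reflection described above can be sketched generically, assuming a simple position-forward / force-feedback teleoperation scheme; the gains, scale factors, and plant are illustrative and do not come from any system covered in the review.

```python
import numpy as np

# One update of a simple position-forward / force-feedback teleoperation scheme.
MOTION_SCALE = 0.2      # master motion is scaled down for fine manipulation
FORCE_SCALE = 2.0       # slave-side forces are amplified back to the operator
KP, KD = 400.0, 20.0    # PD gains of the slave position controller (illustrative)

def teleop_step(x_master, x_slave, v_slave, f_env):
    """Return (slave command force, force fed back to the master handle)."""
    x_des = MOTION_SCALE * x_master                  # scaled position command
    f_cmd = KP * (x_des - x_slave) - KD * v_slave    # PD tracking on the slave
    f_back = FORCE_SCALE * f_env                     # reflected environment force
    return f_cmd, f_back

f_cmd, f_back = teleop_step(x_master=np.array([0.05, 0.0, 0.0]),
                            x_slave=np.array([0.009, 0.0, 0.0]),
                            v_slave=np.zeros(3),
                            f_env=np.array([0.3, 0.0, 0.0]))
print(f_cmd, f_back)
```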

    Robotics-Assisted Needle Steering for Percutaneous Interventions: Modeling and Experiments

    Needle insertion and guidance play an important role in medical procedures such as brachytherapy and biopsy. Flexible needles have the potential to facilitate precise targeting and avoid collisions during medical interventions while reducing trauma to the patient and post-puncture issues. Nevertheless, error introduced during guidance degrades the effectiveness of the planned therapy or diagnosis. Although steering using flexible bevel-tip needles provides great mobility and dexterity, a major barrier is the complexity of needle-tissue interaction, which does not lend itself to intuitive control. To overcome this problem, a robotic system can be employed to perform trajectory planning and tracking by manipulating the needle base. This research project focuses on a control-theoretic approach and draws on the rich literature from control and systems theory to model needle-tissue interaction and needle flexion, and then to design a robotics-based strategy for needle insertion and steering. The resulting solutions will directly benefit a wide range of needle-based interventions. The outcome of this computer-assisted approach will not only enable efficient preoperative trajectory planning, but will also provide more insight into needle-tissue interaction that will be helpful in developing advanced intraoperative algorithms for needle steering. Experimental validation of the proposed methodologies was carried out on a state-of-the-art 5-DOF robotic system designed and constructed in-house primarily for prostate brachytherapy. The system is equipped with a Nano43 6-DOF force/torque sensor (ATI Industrial Automation) to measure forces and torques acting on the needle shaft. In our setup, an Aurora electromagnetic tracker (Northern Digital Inc.) is the sensing device used for measuring needle deflection. A multi-threaded application for control, sensor readings, data logging, and communication over Ethernet was developed using Microsoft Visual C 2005, MATLAB 2007, and the QuaRC Toolbox (Quanser Inc.). Various artificial phantoms were developed to create a realistic medium in terms of elasticity and insertion force ranges; however, they simulated a uniform environment without exhibiting the complexities of organic tissues. Experiments were also conducted on beef liver and on fresh chicken breast, beef, and ham to investigate the behavior of a variety of biological tissues.
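
    A common kinematic abstraction in the bevel-tip needle-steering literature, sketched below, is that the tip follows arcs of approximately constant curvature whose bending direction is set by axial rotation of the needle. This is a generic planar sketch under that assumption, not the specific model developed in this project; the curvature, insertion depth, and flip depth are illustrative numbers.

```python
import numpy as np

def steer_planar(kappa, insert_len, flips, step=1e-3):
    """Integrate a planar bevel-tip trajectory under the constant-curvature
    assumption: the tip traces arcs of curvature `kappa`, and a 180-degree
    axial rotation of the needle (a "flip") mirrors the bending direction.
    `flips` is a sorted list of insertion depths at which the needle is flipped.
    """
    x, z, theta, sign = 0.0, 0.0, 0.0, 1.0
    flips = list(flips)
    path = [(x, z)]
    s = 0.0
    while s < insert_len:
        if flips and s >= flips[0]:
            sign, flips = -sign, flips[1:]     # bevel now bends the other way
        theta += sign * kappa * step           # heading changes with curvature
        x += np.sin(theta) * step              # lateral deflection
        z += np.cos(theta) * step              # insertion depth
        s += step
        path.append((x, z))
    return np.array(path)

# Illustrative numbers: 0.10 m insertion, 0.20 m radius of curvature, one flip at 0.05 m.
path = steer_planar(kappa=1.0 / 0.2, insert_len=0.10, flips=[0.05])
print("final deflection (m):", path[-1, 0])
```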

    3D Multimodal Interaction with Physically-based Virtual Environments

    The virtual has become a huge field of exploration for researchers: it can assist the surgeon, help the prototyping of industrial objects, simulate natural phenomena, act as a fantastic time machine, or entertain users through games and movies. Far beyond the visual rendering of the virtual environment alone, Virtual Reality aims at literally immersing the user in the virtual world. VR technologies simulate digital environments with which users can interact and, as a result, perceive through different modalities the effects of their actions in real time. The challenges are huge: the user's motions need to be captured and to have an immediate impact on the virtual world by modifying its objects in real time. In addition, the targeted immersion of the user is not only visual: auditory and haptic feedback need to be taken into account, merging all the sensory modalities of the user into a multimodal response. The global objective of my research activities is to improve 3D interaction with complex virtual environments by proposing novel approaches for physically based and multimodal interaction. I have laid the foundations of my work on designing interactions with complex virtual worlds, in response to the ever higher demands placed on the characteristics of virtual environments. My research can be described within three main research axes inherent to the 3D interaction loop: (1) the physically based modeling of the virtual world, to take into account the complexity of virtual object behavior, their topology modifications, and their interactions; (2) multimodal feedback, combining the sensory modalities into a global response from the virtual world to the user; and (3) the design of body-based 3D interaction techniques and devices (involving the head, both hands, the fingers, the legs, or even the whole body) that establish the interfaces between the user and the virtual world. All these contributions can be gathered into a general framework covering the whole 3D interaction loop. By improving all the components of this framework, I aim at proposing approaches that can be used in future virtual reality applications, but also more generally in other areas such as medical simulation, gesture training, robotics, virtual prototyping for industry, or web content.
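
    One common building block of such a physically based interaction loop is a spring-damper "virtual coupling" between the haptic device and a simulated object: the coupling force drives the object and is rendered back to the user. The sketch below is a generic illustration of that idea, not the author's implementation; the gains, mass, and time step are illustrative.

```python
import numpy as np

# Minimal spring-damper "virtual coupling" at a 1 kHz haptic rate.
K, B = 500.0, 5.0          # coupling stiffness (N/m) and damping (N*s/m)
DT, MASS = 1e-3, 0.2       # time step (s) and simulated object mass (kg)

x_obj = np.zeros(3)        # simulated object state
v_obj = np.zeros(3)

def haptic_step(x_device, v_device):
    """Advance the coupled object one step; return the force for the device."""
    global x_obj, v_obj
    f_couple = K * (x_device - x_obj) + B * (v_device - v_obj)
    v_obj += (f_couple / MASS) * DT          # semi-implicit Euler on the object
    x_obj += v_obj * DT
    return -f_couple                         # reaction force rendered to the user

force = haptic_step(x_device=np.array([0.01, 0.0, 0.0]), v_device=np.zeros(3))
print(force)
```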

    Design, Development, and Evaluation of a Teleoperated Master-Slave Surgical System for Breast Biopsy under Continuous MRI Guidance

    The goal of this project is to design and develop a teleoperated master-slave surgical system that can potentially assist the physician in performing breast biopsy with a magnetic resonance imaging (MRI)-compatible robotic system. MRI provides superior soft-tissue contrast compared to other imaging modalities such as computed tomography or ultrasound and is used for both diagnostic and therapeutic procedures. The strong magnetic field and the limited space inside the MRI bore, however, restrict direct means of breast biopsy while performing real-time imaging. Therefore, current breast biopsy procedures employ a blind targeting approach based on magnetic resonance (MR) images obtained a priori. Due to possible involuntary patient motion or inaccurate insertion through the registration grid, such an approach could lead to tool-tip positioning errors, thereby affecting diagnostic accuracy and leading to a long and painful process if repeated procedures are required. Hence, it is desirable to develop the aforementioned teleoperation system to take advantage of real-time MR imaging and avoid multiple biopsy needle insertions, improving procedure accuracy as well as reducing sampling errors. The design, implementation, and evaluation of the teleoperation system are presented in this dissertation. An MRI-compatible slave robot is implemented, consisting of a 1-degree-of-freedom (DOF) needle driver, a 3-DOF parallel mechanism, and a 2-DOF X-Y stage. This slave robot is actuated with pneumatic cylinders through long transmission lines, except for the 1-DOF needle driver, which is actuated with a piezo motor. Pneumatic actuation through long transmission lines is then investigated using proportional pressure valves, and controllers based on sliding mode control are presented. A dedicated master robot is also developed, and the kinematic map between the master and the slave robot is established. The two robots are integrated into a teleoperation system, and a graphical user interface is developed to provide visual feedback to the physician. MRI experiments show that the slave robot is MRI-compatible, and ex vivo tests show an over 85% success rate in targeting with the MRI-compatible robotic system. The success of in vivo animal experiments further confirms the potential of developing the proposed robotic system for clinical applications.
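
    The sliding mode control mentioned for the pneumatic actuation can be illustrated with a minimal, generic first-order sketch; the sliding-surface slope, switching gain, boundary layer, and unit-mass plant below are assumptions for illustration, not the dissertation's controller or valve model.

```python
import numpy as np

LAM, ETA, PHI = 20.0, 8.0, 0.02   # surface slope, switching gain, boundary layer

def smc_control(x, v, x_des, v_des, a_des=0.0):
    """Return a commanded acceleration (to be mapped to a valve pressure)."""
    e, e_dot = x - x_des, v - v_des
    s = e_dot + LAM * e                       # sliding surface s = de/dt + lam*e
    u_eq = a_des - LAM * e_dot                # equivalent control (nominal model)
    u_sw = -ETA * np.tanh(s / PHI)            # smoothed switching term (anti-chatter)
    return u_eq + u_sw

# Toy simulation of a unit-mass axis tracking a 2 cm step over 2 s.
x, v, dt = 0.0, 0.0, 1e-3
for _ in range(2000):
    a = smc_control(x, v, x_des=0.02, v_des=0.0)
    v += a * dt
    x += v * dt
print("final position (m):", round(x, 4))
```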

    Sensorless Motion Planning for Medical Needle Insertion in Deformable Tissues

    Minimally invasive medical procedures such as biopsies, anesthesia drug injections, and brachytherapy cancer treatments require inserting a needle to a specific target inside soft tissues. This is difficult because needle insertion displaces and deforms the surrounding soft tissues, causing the target to move during the procedure. To facilitate physician training and preoperative planning for these procedures, we develop a needle insertion motion planning system based on an interactive simulation of needle insertion in deformable tissues and numerical optimization to reduce placement error. We describe a 2-D, physically based, dynamic simulation of needle insertion that uses a finite-element model of deformable soft tissues and models needle cutting and frictional forces along the needle shaft. The simulation offers guarantees on simulation stability for mesh modifications and achieves interactive, real-time performance on a standard PC. Using texture mapping, the simulation provides visualization comparable to the ultrasound images that the physician would see during the procedure. We use the simulation as a component of a sensorless planning algorithm that uses numerical optimization to compute needle insertion offsets that compensate for tissue deformations. We apply the method to radioactive seed implantation during permanent-seed prostate brachytherapy to minimize seed placement error.
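
    The sensorless planning idea described here can be sketched by treating the deformable-tissue simulation as a black-box map from an insertion offset to placement error and minimizing it numerically. The `simulate_placement_error` function below is a hypothetical stand-in (a toy quadratic model) for the paper's finite-element simulation, and the choice of a derivative-free Nelder-Mead optimizer is an assumption for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_placement_error(offset):
    """Stand-in for the needle-insertion simulation: given a 2-D insertion
    offset, return the distance between the delivered seed and the target
    after tissue deformation.  A toy model is used so the sketch runs on
    its own; the real planner would call the simulator here instead.
    """
    target_shift = np.array([0.004, -0.002])   # deformation-induced shift (toy value)
    return float(np.linalg.norm(offset - target_shift))

# Search for the insertion offset that cancels the simulated target motion.
result = minimize(simulate_placement_error, x0=np.zeros(2), method="Nelder-Mead")
print("planned offset (m):", result.x, " residual error (m):", result.fun)
```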

    Real-time hybrid cutting with dynamic fluid visualization for virtual surgery

    It is widely accepted that a reform of medical teaching must be made to meet today's high-volume training requirements. Virtual simulation offers a potential method of providing such training, and some current medical training simulators integrate haptic and visual feedback to enhance procedure learning. The purpose of this project is to explore the capability of Virtual Reality (VR) technology to develop a training simulator for surgical cutting and bleeding in general surgery.