6 research outputs found

    Towards Closed-loop, Robot Assisted Percutaneous Interventions under MRI Guidance

    Get PDF
    Image-guided therapy procedures under MRI guidance have been a focused research area over the past decade. Over the same period, various MRI-guided robotic devices have been developed and used clinically for percutaneous interventions such as prostate biopsy, brachytherapy, and tissue ablation. Although MRI provides better soft-tissue contrast than computed tomography and ultrasound, it poses challenges such as constrained space, less ergonomic patient access, and limited material choices due to its high magnetic field. Despite advancements in MRI-compatible actuation methods and the robotic devices that use them, most MRI-guided interventions remain open-loop in nature and rely on preoperative or intraoperative images. In this thesis, an intraoperative MRI-guided robotic system for prostate biopsy is presented, comprising an MRI-compatible 4-DOF robotic manipulator, a robot controller, and a control application with a Clinical User Interface (CUI) and surgical planning applications (3DSlicer and RadVision). The system utilizes intraoperative images acquired after each full or partial needle insertion for needle tip localization. The presented system was approved by the Institutional Review Board at Brigham and Women's Hospital (BWH) and has been used in 30 patient trials. Successful translation of such a system utilizing intraoperative MR images motivated the development of a system architecture for closed-loop, real-time MRI-guided percutaneous interventions. Robot-assisted, closed-loop intervention could help in accurate positioning and localization of the therapy delivery instrument, improve physician and patient comfort, and allow real-time therapy monitoring. Utilizing real-time MR images could also allow correction of the surgical instrument trajectory and controlled therapy delivery. Two applications validating the presented architecture, closed-loop needle steering and MRI-guided brain tumor ablation, are demonstrated under real-time MRI guidance.
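    The closed-loop workflow described in this abstract amounts to an insert-image-localize-correct cycle. The Python sketch below illustrates that idea only; the robot, scanner, and localize_tip interfaces, the RAS target convention, and the tolerance and iteration bounds are illustrative assumptions, not the thesis software.

        # Minimal sketch of a closed-loop targeting cycle under intraoperative MRI.
        # All interfaces and constants here are hypothetical placeholders.
        import numpy as np

        TOLERANCE_MM = 1.0   # assumed acceptable tip-to-target error
        MAX_CYCLES = 10      # bound on insert-image-correct cycles

        def closed_loop_insertion(robot, scanner, localize_tip, target_ras):
            """Advance the needle stepwise, imaging after each step and correcting the path."""
            tip_ras = None
            for _ in range(MAX_CYCLES):
                robot.advance_needle()                    # partial insertion step
                volume = scanner.acquire_confirmation()   # intraoperative MR volume
                tip_ras = localize_tip(volume)            # needle-tip position in RAS (mm)
                error = np.asarray(target_ras, float) - np.asarray(tip_ras, float)
                if np.linalg.norm(error) <= TOLERANCE_MM:
                    break                                 # within tolerance: stop correcting
                robot.adjust_trajectory(error)            # send correction to the controller
            return tip_ras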

    Design, Development, and Evaluation of a Teleoperated Master-Slave Surgical System for Breast Biopsy under Continuous MRI Guidance

    Get PDF
    The goal of this project is to design and develop a teleoperated master-slave surgical system that can potentially assist the physician in performing breast biopsy with a magnetic resonance imaging (MRI) compatible robotic system. MRI provides superior soft-tissue contrast compared to other imaging modalities such as computed tomography or ultrasound and is used for both diagnostic and therapeutic procedures. The strong magnetic field and the limited space inside the MRI bore, however, restrict direct means of breast biopsy while performing real-time imaging. Therefore, current breast biopsy procedures employ a blind targeting approach based on magnetic resonance (MR) images obtained a priori. Due to possible involuntary patient motion or inaccurate insertion through the registration grid, such an approach could lead to tool-tip positioning errors, thereby affecting diagnostic accuracy and leading to a long and painful process if repeated procedures are required. Hence, it is desirable to develop the aforementioned teleoperation system to take advantage of real-time MR imaging and avoid multiple biopsy needle insertions, improving procedure accuracy as well as reducing sampling errors. The design, implementation, and evaluation of the teleoperation system are presented in this dissertation. An MRI-compatible slave robot is implemented, which consists of a 1 degree of freedom (DOF) needle driver, a 3-DOF parallel mechanism, and a 2-DOF X-Y stage. This slave robot is actuated with pneumatic cylinders through long transmission lines, except for the 1-DOF needle driver, which is actuated with a piezo motor. Pneumatic actuation through long transmission lines is then investigated using proportional pressure valves, and controllers based on sliding mode control are presented. A dedicated master robot is also developed, and the kinematic map between the master and the slave robot is established. The two robots are integrated into a teleoperation system, and a graphical user interface is developed to provide visual feedback to the physician. An MRI experiment shows that the slave robot is MRI-compatible, and an ex vivo test shows over 85% success rate in targeting with the MRI-compatible robotic system. The success in performing in vivo animal experiments further confirms the potential of further developing the proposed robotic system for clinical applications.
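    As one way to picture the sliding mode control mentioned above, the Python sketch below shows a basic boundary-layer sliding-mode law mapping position and velocity tracking errors to a valve command; the gains, boundary-layer width, and command convention are illustrative assumptions rather than the dissertation's tuned controller.

        # Minimal sketch of a boundary-layer sliding-mode law for a pneumatic axis
        # driven through a proportional pressure valve. Gains are illustrative.
        import numpy as np

        LAMBDA = 20.0    # sliding-surface slope (1/s), assumed
        K_SMC = 5.0      # switching gain, assumed
        PHI = 0.5        # boundary-layer width to limit chattering, assumed

        def sliding_mode_command(pos, vel, pos_ref, vel_ref):
            """Return a valve command from position/velocity tracking errors."""
            e = pos_ref - pos
            e_dot = vel_ref - vel
            s = e_dot + LAMBDA * e    # sliding surface s = de/dt + lambda*e
            # Saturated switching term (boundary layer) instead of a pure sign function
            return K_SMC * np.clip(s / PHI, -1.0, 1.0)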

    AUGMENTED REALITY AND INTRAOPERATIVE C-ARM CONE-BEAM COMPUTED TOMOGRAPHY FOR IMAGE-GUIDED ROBOTIC SURGERY

    Get PDF
    Minimally invasive robot-assisted surgery is a rapidly growing alternative to traditional open and laparoscopic procedures; nevertheless, challenges remain. The standard of care derives surgical strategies from preoperative volumetric data (i.e., computed tomography (CT) and magnetic resonance (MR) images) that benefit from the ability of multiple modalities to delineate different anatomical boundaries. However, preoperative images may not reflect a possibly highly deformed perioperative setup or intraoperative deformation. Additionally, in current clinical practice, the correspondence of preoperative plans to the surgical scene is established as a mental exercise; thus, the accuracy of this practice is highly dependent on the surgeon's experience and therefore subject to inconsistencies. In order to address these fundamental limitations in minimally invasive robotic surgery, this dissertation combines a high-end robotic C-arm imaging system and a modern robotic surgical platform into an integrated intraoperative image-guided system. We performed deformable registration of preoperative plans to a perioperative cone-beam computed tomography (CBCT) volume acquired after the patient is positioned for intervention. From the registered surgical plans, we overlaid critical information onto the primary intraoperative visual source, the robotic endoscope, using augmented reality. Guidance afforded by this system not only uses augmented reality to fuse virtual medical information, but also provides tool localization and other dynamically updated intraoperative behavior in order to present enhanced depth feedback and information to the surgeon. These techniques in guided robotic surgery required a streamlined approach to creating intuitive and effective human-machine interfaces, especially in visualization. Our software design principles create an inherently information-driven modular architecture incorporating robotics and intraoperative imaging through augmented reality. The system's performance is evaluated using phantoms and preclinical in vivo experiments for multiple applications, including transoral robotic surgery, robot-assisted thoracic interventions, and cochleostomy for cochlear implantation. The resulting functionality, proposed architecture, and implemented methodologies can be further generalized to other C-arm-based image guidance for additional extensions in robotic surgery.
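    To illustrate the augmented-reality overlay step described above, the short Python sketch below projects plan points, already registered into the CBCT frame, into an endoscope image using a standard pinhole camera model; the transform and intrinsics names are hypothetical placeholders, not the dissertation's actual pipeline.

        # Minimal sketch: map registered plan points (CBCT frame) into endoscope pixels.
        # T_cam_from_cbct (4x4) and K (3x3 intrinsics) are assumed to be available
        # from registration and camera calibration; names are placeholders.
        import numpy as np

        def project_plan_points(points_cbct, T_cam_from_cbct, K):
            """Project Nx3 CBCT-frame points into pixel coordinates (pinhole camera)."""
            pts_h = np.c_[points_cbct, np.ones(len(points_cbct))]   # homogeneous Nx4
            pts_cam = (T_cam_from_cbct @ pts_h.T).T[:, :3]          # into camera frame
            pts_cam = pts_cam[pts_cam[:, 2] > 0]                    # keep points in front of camera
            uv = (K @ pts_cam.T).T                                  # apply intrinsics
            return uv[:, :2] / uv[:, 2:3]                           # normalize to pixel coordinates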

    The radiological investigation of musculoskeletal tumours : chairperson's introduction

    No full text

    Infective/inflammatory disorders

    Get PDF