
    Image-Guided Robot-Assisted Needle Intervention Devices and Methods to Improve Targeting Accuracy

    This dissertation addresses the development of medical devices, image-guided robots, and their application in needle-based interventions, as well as methods to improve accuracy and safety in clinical procedures. Needle access is an essential component of minimally invasive diagnostic and therapeutic procedures. Image-guiding devices are often required to help physicians handle the needle based on the images. Integrating robotic accuracy and precision with digital medical imaging has the potential to improve clinical outcomes. The dissertation presents two robotic devices for interventions under Magnetic Resonance Imaging (MRI) and Computed Tomography (CT)–Ultrasound (US) cross-modality guidance, respectively. The MRI robot is an MR-Safe Remote Center of Motion (RCM) robot for direct image-guided needle interventions such as brain surgery. The dissertation also presents the integration of the robot with an intraoperative MRI scanner and preclinical tests for deep brain needle access. The CT–US guidance uses a robotic manipulator to handle a US probe within a CT scanner. The dissertation presents methods for the co-registration of multiple image spaces through an intermediary frame, together with needle-targeting experiments. It also presents a method for using optical tracking measurements specifically for medical robots; the method was derived to test the robots presented above. Even with advanced image guidance, such as the robotic approaches, needle targeting accuracy may still be degraded by errors related to needle deflections. Methods and associated devices for steering the needle along a straight path are presented: a robotic approach that uses real-time ultrasound guidance to steer the needle; modeling and testing of a method to markedly reduce targeting errors with bevel-point needles; and the design, manufacture, and testing of a novel core biopsy needle with a straighter path, power assistance, reduced noise, and safer operation.
Overall, the dissertation presents several developments that contribute to the field of medical devices, image-guided robots, and needle interventions. These include robot testing methods that can be used by other researchers, needle steering methods that can be applied directly by physicians or to robotic devices, as well as several methods to improve accuracy in image-guided interventions. Collectively, these contribute to the field and may have a significant clinical impact.
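The co-registration of multiple image spaces through an intermediary frame, mentioned above, amounts to chaining rigid-body transforms. The following is a minimal sketch of that idea, not the dissertation's actual registration pipeline; the transforms, frame names, and numbers are hypothetical, using an identity rotation for clarity:

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibrated transforms: the intermediary marker frame as seen
# in CT space, and the same marker frame as seen in ultrasound (US) space.
T_ct_marker = homogeneous(np.eye(3), [10.0, 0.0, 0.0])
T_us_marker = homogeneous(np.eye(3), [0.0, 5.0, 0.0])

# Chain through the intermediary frame: US -> marker -> CT.
T_ct_us = T_ct_marker @ np.linalg.inv(T_us_marker)

# Map a needle target from US coordinates into CT coordinates.
p_us = np.array([1.0, 2.0, 3.0, 1.0])
p_ct = T_ct_us @ p_us  # -> [11., -3., 3., 1.]
```

The intermediary frame lets the two image spaces be related without ever registering them to each other directly, which is useful when no feature is visible in both modalities at once.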

    Optical techniques for 3D surface reconstruction in computer-assisted laparoscopic surgery

    One of the main challenges for computer-assisted surgery (CAS) is to determine the intra-operative morphology and motion of soft tissues. This information is a prerequisite to the registration of multi-modal patient-specific data for enhancing the surgeon's navigation capabilities by observing beyond exposed tissue surfaces and for providing intelligent control of robotic-assisted instruments. In minimally invasive surgery (MIS), optical techniques are an increasingly attractive approach for in vivo 3D reconstruction of the soft-tissue surface geometry. This paper reviews the state-of-the-art methods for optical intra-operative 3D reconstruction in laparoscopic surgery and discusses the technical challenges and future perspectives towards clinical translation. With the recent paradigm shift of surgical practice towards MIS and new developments in 3D optical imaging, this is a timely discussion about technologies that could facilitate complex CAS procedures in dynamic and deformable anatomical regions.
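Many of the optical reconstruction techniques surveyed in such reviews ultimately reduce to multi-view triangulation. The sketch below uses standard linear (DLT) triangulation, not any specific method from the paper; the camera intrinsics, baseline, and test point are invented for illustration:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two pixel observations."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)      # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical rectified stereo laparoscope: shared intrinsics, 5 mm baseline.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-5.0], [0.0], [0.0]])])

# Project a known tissue-surface point into both views, then recover it.
X_true = np.array([2.0, -1.0, 60.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

X_est = triangulate(P1, P2, x1, x2)  # matches X_true in this noise-free case
```

Real laparoscopic scenes add the hard parts the review discusses: specular, textureless tissue, deformation between views, and the need for dense rather than single-point reconstruction.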

    An Open-Source 7-Axis, Robotic Platform to Enable Dexterous Procedures within CT Scanners

    This paper describes the design, manufacture, and performance of a highly dexterous, low-profile, 7 Degree-of-Freedom (DOF) robotic arm for CT-guided percutaneous needle biopsy. Direct CT guidance allows physicians to localize tumours quickly; however, needle insertion is still performed by hand. This system is mounted to a fully active gantry superior to the patient's head and teleoperated by a radiologist. Unlike other similar robots, this robot's fully serial-link approach uses a unique combination of belt and cable drives for high transparency and minimal backlash, allowing for an expansive working area and numerous approach angles to targets, all while maintaining a small in-bore cross-section of less than 16 cm². Simulations verified the system's expansive collision-free workspace and its ability to hit targets across the entire chest, as required for lung cancer biopsy. Targeting error is on average <1 mm on a teleoperated accuracy task, illustrating the system's sufficient accuracy to perform biopsy procedures. The system is designed for lung biopsies due to the large working volume required for reaching peripheral lung lesions; with its large working volume and small in-bore cross-sectional area, however, the robotic system is effectively a general-purpose CT-compatible manipulation device for percutaneous procedures. Finally, given the considerable development time undertaken in designing a precise and flexible-use system, and the desire to reduce the burden on other researchers developing algorithms for image-guided surgery, this system is provided open-access and, to the best of our knowledge, is the first open-hardware image-guided biopsy robot of its kind.
    Comment: 8 pages, 9 figures, final submission to IROS 201

    Robot Autonomy for Surgery

    Autonomous surgery involves having surgical tasks performed by a robot operating under its own control, with partial or no human involvement. There are several important advantages of automation in surgery, which include increased precision of care due to sub-millimeter robot control, real-time utilization of biosignals for interventional care, improvements to surgical efficiency and execution, and computer-aided guidance under various medical imaging and sensing modalities. While these methods may displace some tasks of surgical teams and individual surgeons, they also present new capabilities in interventions that are too difficult or go beyond the skills of a human. In this chapter, we provide an overview of robot autonomy in commercial use and in research, and present some of the challenges faced in developing autonomous surgical robots.

    Computer- and robot-assisted Medical Intervention

    Medical robotics includes assistive devices used by the physician in order to make his/her diagnostic or therapeutic practices easier and more efficient. This chapter focuses on such systems. It introduces the general field of Computer-Assisted Medical Interventions, its aims and its different components, and describes the place of robots in that context. The evolution of general design and control paradigms in the development of medical robots is presented, and issues specific to that application domain are discussed. A view of existing systems, ongoing developments, and future trends is given, and a case study is detailed. Other types of robotic help in the medical environment (such as assisting a handicapped person, rehabilitating a patient, or replacing damaged or removed limbs or organs) are outside the scope of this chapter.
    Comment: Handbook of Automation, Shimon Nof (Ed.) (2009) 000-00

    Augmented Reality-based Feedback for Technician-in-the-loop C-arm Repositioning

    Interventional C-arm imaging is crucial to percutaneous orthopedic procedures as it enables the surgeon to monitor the progress of surgery at the anatomy level. Minimally invasive interventions require repeated acquisition of X-ray images from different anatomical views to verify tool placement. Achieving and reproducing these views often comes at the cost of increased surgical time and radiation dose to both patient and staff. This work proposes a marker-free "technician-in-the-loop" Augmented Reality (AR) solution for C-arm repositioning. The X-ray technician operating the C-arm interventionally is equipped with a head-mounted display capable of recording desired C-arm poses in 3D via an integrated infrared sensor. For C-arm repositioning to a particular target view, the recorded C-arm pose is restored as a virtual object and visualized in an AR environment, serving as a perceptual reference for the technician. We conduct experiments in a setting simulating orthopedic trauma surgery. Our proof-of-principle findings indicate that the proposed system can decrease the average of 2.76 X-ray images required per desired view down to zero, suggesting substantial reductions of radiation dose during C-arm repositioning. The proposed AR solution is a first step towards facilitating communication between the surgeon and the surgical staff, improving the quality of surgical image acquisition, and enabling context-aware guidance for surgery rooms of the future. The concept of technician-in-the-loop design will become relevant to various interventions considering the expected advancements of sensing and wearable computing in the near future.
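Restoring a recorded C-arm pose implicitly means driving the residual between the current pose and the stored one to zero. A minimal sketch of that residual computation follows; it is hypothetical and not the paper's actual HMD tracking pipeline, with both poses expressed as 4x4 matrices in the tracker's frame:

```python
import numpy as np

def pose_error(T_target, T_current):
    """Translation (mm) and rotation (deg) residual between two 4x4 poses."""
    dT = np.linalg.inv(T_current) @ T_target
    trans_err = np.linalg.norm(dT[:3, 3])
    # Rotation angle from the trace of the 3x3 rotation block.
    cos_angle = (np.trace(dT[:3, :3]) - 1.0) / 2.0
    rot_err = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return trans_err, rot_err

# Hypothetical recorded target pose and current C-arm pose: here the C-arm
# sits 3 mm off along x, with no orientation error.
T_target = np.eye(4)
T_current = np.eye(4)
T_current[0, 3] = -3.0

t_err, r_err = pose_error(T_target, T_current)  # 3.0 mm, 0.0 deg
```

In an AR overlay of this kind, the virtual C-arm would appear aligned with the real one exactly when both residuals reach zero.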