15 research outputs found

    Improving Rigid 3-D Calibration for Robotic Surgery

    Autonomy is the next frontier of research in robotic surgery, and its aim is to improve the quality of surgical procedures in the near future. One fundamental requirement for autonomy is advanced perception capability through vision sensors. In this article, we propose a novel calibration technique for a surgical scenario with a da Vinci Research Kit (dVRK) robot. Calibration of the camera and the robotic arms is necessary to position instruments precisely and to emulate an expert surgeon. The novel calibration technique is tailored for RGB-D cameras. Tests performed on relevant use cases show that we significantly improve precision and accuracy with respect to state-of-the-art solutions for similar devices on surgical-size setups. Moreover, our calibration method can easily be extended to the standard surgical endoscopes used in real surgical scenarios.
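
    The article itself gives no code; as a rough illustration of the kind of rigid 3-D camera-to-robot calibration it describes, the sketch below estimates a rotation and translation between corresponding 3-D points measured in the RGB-D camera frame and in the robot base frame, using the standard SVD-based (Kabsch) least-squares solution. The point sets, frame names, and numbers are hypothetical, not the paper's method or data.

```python
import numpy as np

def rigid_calibration(points_cam, points_robot):
    """Estimate R, t such that points_robot ~= R @ points_cam + t.

    points_cam, points_robot: (N, 3) arrays of corresponding 3-D points,
    e.g. calibration-marker positions seen by the RGB-D camera and
    touched by the robot end-effector (hypothetical setup).
    """
    c_cam = points_cam.mean(axis=0)
    c_rob = points_robot.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (points_cam - c_cam).T @ (points_robot - c_rob)
    U, _, Vt = np.linalg.svd(H)
    # Reflection handling keeps the result a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_rob - R @ c_cam
    return R, t

# Hypothetical usage: residual RMS error as a crude accuracy measure.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.uniform(-0.1, 0.1, size=(20, 3))           # camera-frame points (m)
    R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
    if np.linalg.det(R_true) < 0:
        R_true[:, 0] *= -1                             # ensure a proper rotation
    Q = P @ R_true.T + np.array([0.05, 0.02, 0.10])    # robot-frame points
    R_est, t_est = rigid_calibration(P, Q)
    rms = np.sqrt(((P @ R_est.T + t_est - Q) ** 2).sum(axis=1).mean())
    print(f"RMS residual: {rms:.2e} m")
```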

    Improved Performance of d31-Mode Needle-Actuating Transducer with PMN-PT Piezocrystal

    Prototypes of a PZT-based ultrasound needle-actuating device have shown the ability to reduce needle penetration force and enhance needle visibility with color Doppler imaging during needle insertion for tissue biopsy and regional anesthesia. However, the demand for smaller, lighter devices and the need for high-performance transducers have motivated investigation of a different configuration of needle-actuation transducer, utilizing the d31-mode of PZT4 piezoceramic, and exploration of further improvement in its performance using relaxor-type piezocrystal. This paper outlines the development of the d31-mode needle-actuation transducer design from simulation to fabrication and demonstration. Full characterization was performed on the transducers for performance comparison. The performance of the proposed smaller, lighter d31-mode transducer is comparable with that of previous d33-mode transducers. Furthermore, it has been found to be much more efficient when using PMN-PT piezocrystal rather than piezoceramic.
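
    The abstract gives no formulas; one quantity that commonly explains why relaxor piezocrystals outperform piezoceramics in d31-mode devices is the transverse electromechanical coupling coefficient k31 = |d31| / sqrt(s11^E * eps33^T). The sketch below evaluates it for approximate textbook PZT-4 constants; the material values are illustrative assumptions, not measurements from the paper.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def k31(d31, s11E, eps33T_rel):
    """Transverse electromechanical coupling coefficient.

    d31        : piezoelectric charge constant, C/N
    s11E       : elastic compliance at constant electric field, m^2/N
    eps33T_rel : relative permittivity at constant stress (dimensionless)
    """
    return abs(d31) / math.sqrt(s11E * eps33T_rel * EPS0)

# Approximate textbook constants for PZT-4 (illustrative only):
print(f"PZT-4 k31 = {k31(-123e-12, 12.3e-12, 1300):.2f}")  # roughly 0.33
```

    Relaxor crystals such as PMN-PT have a much larger d31, which more than offsets their higher compliance and permittivity and yields a substantially higher k31, qualitatively consistent with the higher efficiency reported above.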

    Semi-Automated Needle Steering in Biological Tissue Using an Ultrasound-Based Deflection Predictor

    The performance of needle-based interventions depends on the accuracy of needle tip positioning. Here, a novel needle steering strategy is proposed that enhances the accuracy of needle steering. In our approach the surgeon is in charge of needle insertion to ensure the safety of the operation, while the needle tip bevel location is robotically controlled to minimize the targeting error. The system has two main components: (1) a real-time predictor for estimating future needle deflection as the needle is steered inside soft tissue, and (2) an online motion planner that calculates control decisions and steers the needle toward the target by iterative optimization of the needle deflection predictions. The predictor uses ultrasound-based curvature information to estimate the needle deflection. Given the specification of anatomical obstacles and a target from preoperative images, the motion planner uses the deflection predictions to estimate control actions, i.e., the depth(s) at which the needle should be rotated to reach the target. Ex-vivo needle insertions are performed with and without obstacles to validate our approach. The results demonstrate that the needle steering strategy guides the needle to the targets with a maximum error of 1.22 mm.
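
    As an illustration of the predict-then-plan loop described above (not the authors' implementation), the sketch below uses a simple planar kinematic bevel-tip model: the tip deflects along an arc of fixed curvature whose sign flips at the chosen rotation depth, and the planner grid-searches the rotation depth that minimizes the predicted error at the final insertion depth. The curvature value, target, and depths are hypothetical.

```python
import numpy as np

def predict_deflection(depths, kappa, flip_depth):
    """Predict lateral tip deflection y(z) for a bevel-tip needle.

    Simple planar kinematic model: deflection grows along a constant-curvature
    arc; a single 180-degree bevel rotation at `flip_depth` reverses the sign
    of the curvature from that depth onward.
    depths     : insertion depths (m), increasing
    kappa      : curvature estimated from ultrasound images (1/m), assumed
    flip_depth : depth (m) at which the needle is rotated
    """
    y = np.zeros_like(depths)
    theta = 0.0  # tip angle relative to the insertion axis
    for i in range(1, len(depths)):
        dz = depths[i] - depths[i - 1]
        k = kappa if depths[i] < flip_depth else -kappa
        theta += k * dz
        y[i] = y[i - 1] + np.tan(theta) * dz
    return y

def plan_rotation_depth(depths, kappa, target_y):
    """Grid-search the rotation depth minimizing error at the final depth."""
    candidates = np.linspace(depths[0], depths[-1], 200)
    errors = [abs(predict_deflection(depths, kappa, d)[-1] - target_y)
              for d in candidates]
    return candidates[int(np.argmin(errors))]

# Hypothetical example: 120 mm insertion, curvature 2.5 1/m, 4 mm lateral target.
depths = np.linspace(0.0, 0.12, 240)
flip = plan_rotation_depth(depths, kappa=2.5, target_y=0.004)
err = abs(predict_deflection(depths, 2.5, flip)[-1] - 0.004)
print(f"rotate at {flip * 1000:.1f} mm, predicted error {err * 1000:.2f} mm")
```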

    Tendon-Driven Notched Needle for Robot-Assisted Prostate Interventions

    M.S.

    Study of techniques for tracking flexible needles in ultrasound images

    Undergraduate thesis (Trabalho de Conclusão de Curso), Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2019. In this work, computational methods are sought to enhance techniques that aid in the tracking of flexible needles in ultrasound images during minimally invasive clinical procedures. An additional method for needle-tip localization was implemented, alongside two existing ones, on video segments from tests performed in gelatine phantoms. Additionally, a Kalman filter algorithm was implemented, aiming to use information from previous frames and to generate more precise and smoother estimates of the needle tip's position. The results obtained were validated against an existing database of previous results and against new coordinates generated from simulations run on the videos.
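
    The thesis abstract mentions a Kalman filter for smoothing needle-tip estimates across ultrasound frames. A minimal constant-velocity Kalman filter over a 2-D tip coordinate, with hypothetical noise levels and pixel measurements, might look like the sketch below; it is not the thesis code.

```python
import numpy as np

class TipKalmanFilter:
    """Constant-velocity Kalman filter for a 2-D needle-tip position (pixels)."""

    def __init__(self, dt=1.0, process_var=1.0, meas_var=25.0):
        # State: [x, y, vx, vy]
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = process_var * np.eye(4)   # process noise (assumed)
        self.R = meas_var * np.eye(2)      # measurement noise (assumed)
        self.x = np.zeros(4)
        self.P = np.eye(4) * 1e3           # large initial uncertainty

    def update(self, z):
        """Predict one step, then fuse the measured tip position z = (u, v)."""
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z, float) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]  # smoothed tip position

# Hypothetical usage with noisy detections along a straight trajectory:
kf = TipKalmanFilter()
rng = np.random.default_rng(1)
for k in range(50):
    detection = np.array([100 + 2 * k, 200 + 1 * k]) + rng.normal(0, 5, 2)
    smoothed = kf.update(detection)
print("last smoothed tip estimate:", np.round(smoothed, 1))
```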

    SMART IMAGE-GUIDED NEEDLE INSERTION FOR TISSUE BIOPSY

    M.S.

    Medical SLAM in an autonomous robotic system

    One of the main challenges for computer-assisted surgery (CAS) is to determine the intra-operative morphology and motion of soft tissues. This information is a prerequisite for the registration of multi-modal patient-specific data, both for enhancing the surgeon's navigation capabilities by observing beyond exposed tissue surfaces and for providing intelligent control of robot-assisted instruments. In minimally invasive surgery (MIS), optical techniques are an increasingly attractive approach for in vivo 3D reconstruction of the soft-tissue surface geometry. This thesis addresses the ambitious goal of achieving surgical autonomy through the study of the anatomical environment, starting with the technology needed to analyze the scene: vision sensors. The first part of the thesis presents a novel endoscope for autonomous surgical task execution, which combines a standard stereo camera with a depth sensor. This solution introduces several key advantages, such as the possibility of reconstructing the 3D surface at a greater distance than traditional endoscopes. The problem of hand-eye calibration is then tackled, which unites the vision system and the robot in a single reference frame and increases the accuracy of the surgical work plan. The second part of the thesis addresses the problem of 3D reconstruction and the algorithms currently in use. In MIS, simultaneous localization and mapping (SLAM) can be used to localize the pose of the endoscopic camera and build a 3D model of the tissue surface. Another key element for MIS is real-time knowledge of the pose of surgical tools with respect to the surgical camera and the underlying anatomy. Starting from the ORB-SLAM algorithm, we modified the architecture to make it usable in an anatomical environment by registering pre-operative information about the intervention to the map obtained from SLAM. Once the SLAM algorithm was proven usable in an anatomical environment, it was improved by adding semantic segmentation to distinguish dynamic features from static ones. All the results in this thesis are validated on training setups that mimic some of the challenges of real surgery, and on setups that simulate the human body, within the Autonomous Robotic Surgery (ARS) and Smart Autonomous Robotic Assistant Surgeon (SARAS) projects.
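
    Among the steps described above, the thesis mentions using semantic segmentation to discard dynamic features before SLAM tracking. A hedged sketch of that idea (not the thesis code): keep only ORB keypoints that fall on pixels a segmentation network labels as static tissue, then hand the surviving features to the SLAM front end. The label value and mask source are assumptions.

```python
import cv2
import numpy as np

STATIC_LABEL = 1  # hypothetical segmentation label for static tissue

def static_orb_features(frame_gray, seg_mask, n_features=1000):
    """Detect ORB keypoints and keep only those on pixels labeled as static.

    frame_gray : 8-bit grayscale endoscopic frame
    seg_mask   : per-pixel label map of the same size from a segmentation
                 network, e.g. instruments vs. tissue (assumed available)
    """
    orb = cv2.ORB_create(nfeatures=n_features)
    keypoints = orb.detect(frame_gray, None)
    static_kps = [kp for kp in keypoints
                  if seg_mask[int(round(kp.pt[1])), int(round(kp.pt[0]))] == STATIC_LABEL]
    # Descriptors are computed only for the surviving (static) keypoints,
    # which would then be passed to the SLAM front end for tracking.
    static_kps, descriptors = orb.compute(frame_gray, static_kps)
    return static_kps, descriptors

# Hypothetical usage with a synthetic frame and an all-static mask:
frame = (np.random.rand(480, 640) * 255).astype(np.uint8)
mask = np.full(frame.shape, STATIC_LABEL, dtype=np.uint8)
kps, desc = static_orb_features(frame, mask)
print(f"{len(kps)} static keypoints retained")
```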
