3,840 research outputs found

    Autonomous Tissue Scanning under Free-Form Motion for Intraoperative Tissue Characterisation

    In Minimally Invasive Surgery (MIS), tissue scanning with imaging probes is required for subsurface visualisation to characterise the state of the tissue. However, scanning large tissue surfaces in the presence of deformation is a challenging task for the surgeon. Recently, robot-assisted local tissue scanning has been investigated to stabilise the motion of imaging probes, facilitating the capture of good-quality images and reducing the surgeon's cognitive load. Nonetheless, these approaches require the tissue surface to be static or to deform with periodic motion. To eliminate these assumptions, we propose a visual servoing framework for autonomous tissue scanning that is able to deal with free-form tissue deformation. The 3D structure of the surgical scene is recovered, and a feature-based method is proposed to estimate the motion of the tissue in real time. A desired scanning trajectory is manually defined on a reference frame and continuously updated using projective geometry to follow the tissue motion and control the movement of the robotic arm. The advantage of the proposed method is that it does not require learning the tissue motion prior to scanning and can deal with free-form deformation. We deployed this framework on the da Vinci surgical robot using the da Vinci Research Kit (dVRK) for ultrasound tissue scanning. Since the framework does not rely on information from the ultrasound data, it can easily be extended to other probe-based imaging modalities. Comment: 7 pages, 5 figures, ICRA 202
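    The trajectory-update step described above can be illustrated with a minimal sketch: a homography between the reference frame and the current frame is estimated from tracked tissue features, and the planned waypoints are warped accordingly. This assumes 2D feature correspondences are already available from a tracker and uses OpenCV; the function and parameter names are illustrative, not the authors' implementation.

```python
# Minimal sketch of a projective trajectory update: waypoints defined on a
# reference frame are warped into the current frame with a homography
# estimated from tracked tissue features (illustrative, assumed interface).
import numpy as np
import cv2


def update_scan_trajectory(ref_pts, cur_pts, ref_trajectory):
    """ref_pts, cur_pts: (N, 2) matched features in the reference/current frame.
    ref_trajectory: (M, 2) scanning waypoints drawn on the reference frame.
    Returns the (M, 2) waypoints following the current tissue motion."""
    H, _ = cv2.findHomography(ref_pts.astype(np.float32),
                              cur_pts.astype(np.float32), cv2.RANSAC, 3.0)
    if H is None:                       # not enough inliers: keep the old path
        return ref_trajectory
    warped = cv2.perspectiveTransform(
        ref_trajectory.reshape(-1, 1, 2).astype(np.float32), H)
    return warped.reshape(-1, 2)
```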

    Automated pick-up of suturing needles for robotic surgical assistance

    Robot-assisted laparoscopic prostatectomy (RALP) is a treatment for prostate cancer that involves complete or nerve-sparing removal of the prostate tissue that contains cancer. After removal, the bladder neck is subsequently sutured directly to the urethra. The procedure is called urethrovesical anastomosis and is one of the most dexterity-demanding tasks during RALP. Two suturing instruments and a pair of needles are used in combination to perform a running stitch during urethrovesical anastomosis. While robotic instruments provide enhanced dexterity to perform the anastomosis, it is still highly challenging and difficult to learn. In this paper, we present a vision-guided needle-grasping method for automatically grasping a needle that has been inserted into the patient prior to anastomosis. We aim to automatically grasp the suturing needle in a position that avoids hand-offs and immediately enables the start of suturing. The full grasping process can be broken down into: a needle detection algorithm; an approach phase, in which the surgical tool moves closer to the needle based on visual feedback; and a grasping phase, performed through path planning based on observed surgical practice. Our experimental results show examples of successful autonomous grasping that has the potential to simplify the procedure and decrease operative time in RALP by assisting with a small component of urethrovesical anastomosis.
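    The three-phase grasping process can be sketched as a simple state machine. The detector, visual-servo step and grasp executor below are hypothetical stand-ins for the components described in the abstract, shown only to make the control flow concrete.

```python
# Schematic sketch of the detect / approach / grasp pipeline; the callables
# passed in are hypothetical stand-ins for the paper's actual components.
from enum import Enum, auto


class Phase(Enum):
    DETECT = auto()
    APPROACH = auto()
    GRASP = auto()
    DONE = auto()


def autonomous_needle_pickup(detect_needle, servo_towards, execute_grasp,
                             approach_tolerance_mm=2.0):
    phase, needle_pose = Phase.DETECT, None
    while phase is not Phase.DONE:
        if phase is Phase.DETECT:
            needle_pose = detect_needle()              # needle detection algorithm
            if needle_pose is not None:
                phase = Phase.APPROACH
        elif phase is Phase.APPROACH:
            remaining_mm = servo_towards(needle_pose)  # move tool using visual feedback
            if remaining_mm < approach_tolerance_mm:
                phase = Phase.GRASP
        else:                                          # Phase.GRASP
            execute_grasp(needle_pose)                 # planned grasp motion
            phase = Phase.DONE
```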

    Computer- and robot-assisted Medical Intervention

    Medical robotics includes assistive devices used by the physician to make his or her diagnostic or therapeutic practice easier and more efficient. This chapter focuses on such systems. It introduces the general field of Computer-Assisted Medical Interventions, its aims and its different components, and describes the place of robots in that context. The evolution of general design and control paradigms in the development of medical robots is presented, and issues specific to this application domain are discussed. A view of existing systems, ongoing developments and future trends is given, and a case study is detailed. Other types of robotic help in the medical environment (such as assisting a handicapped person, rehabilitating a patient, or replacing damaged or removed limbs or organs) are outside the scope of this chapter. Comment: Handbook of Automation, Shimon Nof (Ed.) (2009) 000-00

    Recent trends, technical concepts and components of computer-assisted orthopedic surgery systems: A comprehensive review

    Computer-assisted orthopedic surgery (CAOS) systems have become one of the most important and challenging types of systems in clinical orthopedics, as they enable precise treatment of musculoskeletal diseases using modern clinical navigation systems and surgical tools. This paper provides a comprehensive review of recent trends in and possibilities of CAOS systems. There are three types of surgical planning systems: systems based on volumetric images (computed tomography (CT), magnetic resonance imaging (MRI) or ultrasound images); systems that utilize either 2D or 3D fluoroscopic images; and systems that utilize kinetic information about the joints and morphological information about the target bones. This review focuses on three fundamental aspects of CAOS systems: their essential components, the types of CAOS systems, and the mechanical tools used in CAOS systems. We also outline the possibility of using ultrasound computer-assisted orthopedic surgery (UCAOS) systems as an alternative to conventionally used CAOS systems.

    Computer-assisted access to the kidney

    OBJECTIVES: The aim of this paper is to introduce the principles of computer-assisted access to the kidney. The system provides the surgeon with pre-operative 3D planning on computed tomography (CT) images. After rigid registration with spatially localized ultrasound (US) data, the pre-operative planning can be transferred to the intra-operative conditions, and an intuitive man-machine interface allows the user to perform a puncture. MATERIAL AND METHODS: Both CT and US images of an informed normal volunteer were obtained to evaluate the accuracy of the registration, and punctures were carried out on a kidney phantom to measure the precision of the whole system. RESULTS: We achieved millimetric registration accuracy on real data, and guidance experiments on a kidney phantom showed an encouraging error of 4.7 mm between planned and reached targets. We noticed that the most significant error was related to needle deflection during the puncture. CONCLUSION: Preliminary results are encouraging. Further work will be undertaken to improve efficiency and accuracy, and to take breathing into account.
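    As a concrete illustration of transferring a CT-based plan into the US frame, the sketch below performs a generic point-based rigid registration (Kabsch/SVD) and applies the resulting transform to a planned puncture target. This is a stand-in under the assumption of paired corresponding points; the paper's own registration pipeline may differ.

```python
# Generic point-based rigid registration (Kabsch/SVD) used here only to
# illustrate mapping a pre-operative CT plan into intra-operative US space.
import numpy as np


def rigid_register(ct_pts, us_pts):
    """Estimate R, t such that us ~= R @ ct + t from paired (N, 3) points."""
    ct_c = ct_pts - ct_pts.mean(axis=0)
    us_c = us_pts - us_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd(ct_c.T @ us_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = us_pts.mean(axis=0) - R @ ct_pts.mean(axis=0)
    return R, t


def transfer_target(target_ct, R, t):
    """Map a planned puncture target from CT coordinates to US coordinates."""
    return R @ target_ct + t
```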

    Robotic Ultrasound Guidance by B-scan Plane Positioning Control

    Ultrasound is an indispensable imaging modality for clinical diagnosis, such as fetal and cardiac assessment. Moreover, many ultrasound applications for image-guided procedures have been proposed and attempted because US is less invasive, lower in cost, and highly portable. However, to obtain US images, a US imaging probe has to be held manually and kept in contact with the patient's body. To address this issue, we have proposed a robotic system for automatic probe scanning. The system consists of a probe-scanning robot, navigation software, an optical tracking device, and an ultrasound imaging device. The robot, which has six degrees of freedom, is composed of a frame mechanism and a probe-holding mechanism. The frame mechanism has six pneumatic actuators to reduce its weight, and the probe-holding mechanism has one DC motor and is connected to the pneumatic actuators by wires. The robot can control the position and orientation of the B-scan plane based on the transformation between an optical tracker attached to the US probe and the B-scan plane. The navigation system, which is connected to the tracking device and a US imaging device via a VGA cable, computes the relative position between a therapeutic tool and the B-scan plane and sends it to the robot. The position of the B-scan plane can then be controlled based on the tool position. The navigation system also displays the plane, textured with the actual echogram, together with a tool model in three dimensions to monitor the relative position of the tool and the B-scan plane. To validate the basic system performance, phantom tests were conducted using a phantom made of gelatin and poly(ethylene glycol). In the tests, a needle was inserted into the phantom, and the B-scan plane was controlled in real time to contain the tracked needle. The needle was continuously visualized during insertion, confirming that the system has great potential for automatic US image-guided procedures.
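    The B-scan plane control relies on a chain of homogeneous transforms: the tracked pose of the probe-mounted marker composed with a probe calibration gives the plane pose, and the tool tip can then be expressed in the plane frame. The sketch below shows this chain under assumed 4x4 matrix inputs; the names are illustrative rather than taken from the system.

```python
# Illustrative transform chain: express a tracked tool tip in the B-scan
# plane frame so the robot can drive the out-of-plane offset to zero.
import numpy as np


def tool_in_bscan_frame(T_world_tracker, T_tracker_plane, tool_tip_world):
    """T_world_tracker: 4x4 pose of the probe-mounted optical tracker.
    T_tracker_plane: 4x4 calibration from the tracker to the B-scan plane.
    tool_tip_world: (3,) tool tip position in the tracking device's frame."""
    T_world_plane = T_world_tracker @ T_tracker_plane
    tip_h = np.append(tool_tip_world, 1.0)            # homogeneous coordinates
    tip_in_plane = np.linalg.inv(T_world_plane) @ tip_h
    return tip_in_plane[:3]                           # e.g. drive the y-offset to zero
```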

    A surgical system for automatic registration, stiffness mapping and dynamic image overlay

    In this paper we develop a surgical system using the da Vinci Research Kit (dVRK) that is capable of autonomously searching for tumors and dynamically displaying the tumor location using augmented reality. Such a system has the potential to quickly reveal the location and shape of tumors and visually overlay that information to reduce the cognitive overload of the surgeon. We believe that our approach is one of the first to incorporate state-of-the-art methods in registration, force sensing and tumor localization into a unified surgical system. First, the preoperative model is registered to the intra-operative scene using a Bingham distribution-based filtering approach. An active level set estimation is then used to find the location and shape of the tumors. We use a recently developed miniature force sensor to perform the palpation. The estimated stiffness map is then dynamically overlaid onto the registered preoperative model of the organ. We demonstrate the efficacy of our system by performing experiments on phantom prostate models with embedded stiff inclusions. Comment: International Symposium on Medical Robotics (ISMR 2018)
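    For intuition, a single palpation point can be summarized by a scalar stiffness from a least-squares fit of force against indentation depth, and a collection of such values forms a stiffness map that could be overlaid on the registered model. The sketch below is a generic stand-in for this idea, not the paper's active level set estimation.

```python
# Generic per-point stiffness estimate from palpation data (force vs. depth),
# used only to illustrate how a stiffness map could be assembled.
import numpy as np


def estimate_stiffness(depth_mm, force_n):
    """Least-squares slope of f ~= k * d through the origin, in N/mm."""
    d = np.asarray(depth_mm, dtype=float)
    f = np.asarray(force_n, dtype=float)
    return float(d @ f / (d @ d))


def build_stiffness_map(palpation_samples):
    """palpation_samples: iterable of (surface_xy, depths_mm, forces_n)."""
    return {tuple(xy): estimate_stiffness(d, f) for xy, d, f in palpation_samples}
```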