    Automated pick-up of suturing needles for robotic surgical assistance

    Robot-assisted laparoscopic prostatectomy (RALP) is a treatment for prostate cancer that involves complete or nerve-sparing removal of the prostate tissue that contains cancer. After removal, the bladder neck is subsequently sutured directly to the urethra. This procedure, called urethrovesical anastomosis, is one of the most dexterity-demanding tasks during RALP. Two suturing instruments and a pair of needles are used in combination to perform a running stitch during urethrovesical anastomosis. While robotic instruments provide enhanced dexterity to perform the anastomosis, it is still highly challenging and difficult to learn. In this paper, we present a vision-guided needle grasping method for automatically grasping a needle that has been inserted into the patient prior to anastomosis. We aim to grasp the suturing needle in a position that avoids hand-offs and immediately enables the start of suturing. The full grasping process can be broken down into: a needle detection algorithm; an approach phase, in which the surgical tool moves closer to the needle based on visual feedback; and a grasping phase, with path planning based on observed surgical practice. Our experimental results show examples of successful autonomous grasping with the potential to simplify and shorten the operative time in RALP by assisting with a small component of urethrovesical anastomosis.
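The visual-feedback approach phase described above can be sketched as a proportional servo loop that steps the tool tip toward the detected needle position until it is within grasping range. This is only an illustrative sketch under assumed values: the gain, tolerance, and the `approach()` interface are not the paper's published controller.

```python
# Minimal sketch of a proportional visual-servoing approach phase.
# GAIN, TOL_MM, and the direct Cartesian stepping are illustrative
# assumptions, not the method described in the paper.

GAIN = 0.3     # proportional gain on the position error (assumed)
TOL_MM = 0.5   # hand over to the grasp phase inside this distance (assumed)

def approach(tool_mm, needle_mm, max_iters=200):
    """Iteratively step the tool tip toward the detected needle position."""
    tool = list(tool_mm)
    for _ in range(max_iters):
        err = [n - t for n, t in zip(needle_mm, tool)]
        dist = sum(e * e for e in err) ** 0.5
        if dist < TOL_MM:
            return tool, True    # close enough: start the grasping phase
        # move a fraction of the remaining error each visual-feedback cycle
        tool = [t + GAIN * e for t, e in zip(tool, err)]
    return tool, False           # did not converge within the iteration budget
```

In a real system the error would be re-estimated from the needle detector at every cycle rather than from a fixed target.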

    A hybrid camera- and ultrasound-based approach for needle localization and tracking using a 3D motorized curvilinear ultrasound probe

    Three-dimensional (3D) motorized curvilinear ultrasound probes provide an effective, low-cost tool to guide needle interventions, but localizing and tracking the needle in 3D ultrasound volumes is often challenging. In this study, a new method is introduced to localize and track the needle using 3D motorized curvilinear ultrasound probes. In particular, a low-cost camera mounted on the probe is employed to estimate the needle axis. The camera-estimated axis is used to identify a volume of interest (VOI) in the ultrasound volume that enables high needle visibility. This VOI is analyzed using local phase analysis and the random sample consensus algorithm to refine the camera-estimated needle axis. The needle tip is determined by searching the localized needle axis using a probabilistic approach. Dynamic needle tracking in a sequence of 3D ultrasound volumes is enabled by iteratively applying a Kalman filter to estimate the VOI that includes the needle in the successive ultrasound volume and limiting the localization analysis to this VOI. A series of ex vivo animal experiments was conducted to evaluate the accuracy of needle localization and tracking. The results show that the proposed method can localize the needle in individual ultrasound volumes with maximum errors of 0.7 mm for the needle axis, 1.7° for the needle angle, and 1.2 mm for the needle tip. Moreover, the proposed method can track the needle in a sequence of ultrasound volumes with maximum errors of 1.0 mm for the needle axis, 2.0° for the needle angle, and 1.7 mm for the needle tip. These results suggest the feasibility of applying the proposed method to localize and track the needle using 3D motorized curvilinear ultrasound probes.
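The Kalman-filter step above can be illustrated with a minimal per-axis constant-velocity filter: the predicted tip coordinate in the next volume defines a VOI that restricts the localization search. The state model, noise values, and VOI half-extent below are illustrative assumptions, not the paper's parameters.

```python
# Sketch: 1D constant-velocity Kalman filter used per axis to predict the
# needle tip in the next ultrasound volume; the prediction centers a VOI.
# All numeric parameters are assumed, not taken from the study.

class AxisKalman:
    """State [position (mm), velocity (mm/volume)], dt = 1 volume."""
    def __init__(self, x0, q=0.01, r=0.25):
        self.x = [x0, 0.0]                    # initial position, zero velocity
        self.P = [[1.0, 0.0], [0.0, 1.0]]     # state covariance
        self.q, self.r = q, r                 # process / measurement noise

    def predict(self):
        x, v = self.x
        self.x = [x + v, v]                   # constant-velocity motion model
        P = self.P
        self.P = [[P[0][0] + P[0][1] + P[1][0] + P[1][1] + self.q,
                   P[0][1] + P[1][1]],
                  [P[1][0] + P[1][1], P[1][1] + self.q]]
        return self.x[0]                      # predicted coordinate for the VOI

    def update(self, z):
        # position-only measurement: H = [1, 0]
        S = self.P[0][0] + self.r
        K0, K1 = self.P[0][0] / S, self.P[1][0] / S
        y = z - self.x[0]                     # innovation
        self.x = [self.x[0] + K0 * y, self.x[1] + K1 * y]
        P = self.P
        self.P = [[(1 - K0) * P[0][0], (1 - K0) * P[0][1]],
                  [P[1][0] - K1 * P[0][0], P[1][1] - K1 * P[0][1]]]

def voi_around(pred, half_extent=5.0):
    """VOI as a symmetric interval (mm) around the predicted coordinate."""
    return (pred - half_extent, pred + half_extent)
```

One filter per axis (and one for the angle) would be run across the volume sequence, with the localization analysis limited to the predicted VOI.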

    SMART IMAGE-GUIDED NEEDLE INSERTION FOR TISSUE BIOPSY

    M.S. thesis

    Design and validation of a medical robotic device system to control two collaborative robots for ultrasound-guided needle insertions

    Percutaneous biopsy is a critical intervention for diagnosis and staging in cancer therapy. Robotic systems can improve the efficiency and outcome of such procedures while alleviating stress for physicians and patients. However, the high complexity of operation and the limited possibilities for robotic integration in the operating room (OR) decrease user acceptance and the number of deployed robots. Collaborative systems and standardized device communication may provide approaches to overcome these problems. Derived from the IEEE 11073 SDC standard terminology of medical device systems, we designed and validated a medical robotic device system (MERODES) to access and control a collaborative setup of two KUKA robots for ultrasound-guided needle insertions. The system is based on a novel standard for service-oriented device connectivity and utilizes collaborative principles to enhance the user experience. Implementing separated workflow applications allows for flexible system setup and configuration. The system was validated in three separate test scenarios to measure accuracies for (1) co-registration, (2) needle target planning in a water bath, and (3) needle target planning in an abdominal phantom. The co-registration accuracy averaged 0.94 ± 0.42 mm. The positioning errors ranged from 0.86 ± 0.42 to 1.19 ± 0.70 mm in the water bath setup and from 1.69 ± 0.92 to 1.96 ± 0.86 mm in the phantom. The presented results serve as a proof of concept and add to the current state of the art, facilitating system deployment and fast configuration for percutaneous robotic interventions.
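Co-registering two robot frames, as measured above, typically reduces to rigid point-set registration over matched fiducial points. The sketch below shows the standard least-squares (Kabsch/Horn) step such a co-registration commonly relies on; the specific MERODES calibration workflow is not detailed here, so this is a generic illustration, not the system's implementation.

```python
# Sketch: rigid registration between two coordinate frames from matched
# points, plus the RMS residual (fiducial registration error). This is the
# textbook Kabsch/Horn solution, assumed here for illustration only.
import numpy as np

def register_rigid(A, B):
    """Find R, t minimizing sum ||R @ A_i + t - B_i||^2.
    A, B: (N, 3) arrays of matched points in the two frames."""
    a_mean, b_mean = A.mean(axis=0), B.mean(axis=0)
    H = (A - a_mean).T @ (B - b_mean)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = b_mean - R @ a_mean
    return R, t

def fre(A, B, R, t):
    """RMS residual after alignment (same units as the points, e.g. mm)."""
    resid = (R @ A.T).T + t - B
    return float(np.sqrt((resid ** 2).sum(axis=1).mean()))
```

With real, noisy fiducial measurements the returned residual corresponds to the millimeter-scale co-registration accuracy reported in the abstract.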

    Ultrasound-Guided Mechatronic System for Targeted Delivery of Cell-Based Cancer Vaccine Immunotherapy in Preclinical Models

    Injection of dendritic cell (DC) vaccines into lymph nodes (LN) is a promising strategy for eliciting immune responses against cancer, but these injections in mouse cancer models are challenging due to the small target scale (~1 mm × 2 mm). Direct manual intranodal injection is difficult and can cause architectural damage to the LN, potentially disrupting crucial interactions between DCs and T cells. Therefore, a second-generation ultrasound-guided mechatronic device has been developed to perform this intervention. A targeting accuracy of < 500 Όm will enable targeted delivery of the DCs specifically to the LN subcapsular space. The device was redesigned from its original CT-guided edition, which used a remote-centre-of-motion architecture, to be easily integrated onto a commercially available VisualSonics imaging rail system. Subtle modifications were made to ensure a simple workflow that allows live-animal interventions to fall within the knockout periods stated in study protocols. Several calibration and registration techniques were developed in order to achieve an overall targeting accuracy appropriate for the intended application. A variety of methods to quantify the positioning accuracy of the device were investigated. The method chosen involved validating a guided injection into a tissue-mimicking phantom using ultrasound imaging post-operatively to localize the end-point position of the needle tip in the track left behind by the needle. Ultrasound-guided injections into a tissue-mimicking phantom revealed a targeting accuracy of 285 ± 94 Όm for the developed robot, compared to 508 ± 166 Όm for a commercially available, manually actuated injection device from VisualSonics. The utility of the robot was also demonstrated by performing in vivo injections into the lymph nodes of mice.

    New Mechatronic Systems for the Diagnosis and Treatment of Cancer

    Both two-dimensional (2D) and three-dimensional (3D) imaging modalities are useful tools for viewing internal anatomy, but three-dimensional imaging techniques are required for accurate targeting of needles. This improves the efficiency of and control over the intervention, as the high temporal resolution of medical images can be used to validate the locations of the needle and target in real time. Relying on imaging alone, however, means the intervention is still operator dependent because of the difficulty of controlling the location of the needle within the image. The objective of this thesis is to improve the accuracy and repeatability of needle-based interventions over both conventional manual and automated techniques, in order to minimize the invasiveness of the procedure. In this thesis, I propose that by combining the remote-center-of-motion concept with spherical linkage components in a passive or semi-automated device, the physician will have a useful tracking and guidance system at their disposal in a package that is less threatening than a robot to both the patient and the physician. This design concept offers both the manipulative transparency of a freehand system and the tremor reduction through scaling currently offered in automated systems. In addressing the objectives of this thesis, a number of novel mechanical designs incorporating a remote-center-of-motion architecture with varying degrees of freedom are presented. Each of these designs can be deployed in a variety of imaging modalities and clinical applications, ranging from preclinical to human interventions, with an accuracy of control in the millimeter to sub-millimeter range.

    Robot Assisted Object Manipulation for Minimally Invasive Surgery

    Robotic systems have an increasingly important role in facilitating minimally invasive surgical treatments. In robot-assisted minimally invasive surgery, surgeons remotely control instruments from a console to perform operations inside the patient. However, despite the advanced technological status of surgical robots, fully autonomous systems with decision-making capabilities are not yet available. In 2017, Yang et al. proposed a structure to classify research efforts toward the autonomy achievable with surgical robots, identifying six levels: no autonomy, robot assistance, task autonomy, conditional autonomy, high autonomy, and full autonomy. All commercially available platforms in robot-assisted surgery are still at level 0 (no autonomy). Although increasing the level of autonomy remains an open challenge, doing so could introduce multiple benefits, such as decreasing surgeons' workload and fatigue and pursuing a consistent quality of procedures. Ultimately, allowing surgeons to interpret the ample and intelligent information from the system will enhance the surgical outcome and reflect positively on both patients and society. Three main capabilities are required to introduce automation into surgery: the surgical robot must move with high precision, have motion-planning capabilities, and understand the surgical scene. Depending on the type of surgery, other aspects, such as compliance and stiffness, may also play a fundamental role. This thesis addresses three technological challenges encountered when trying to achieve the aforementioned goals, in the specific case of robot-object interaction: first, how to overcome the inaccuracy of cable-driven systems when executing fine and precise movements; second, how to plan different tasks in dynamically changing environments; and lastly, how the understanding of a surgical scene can be used to solve more than one manipulation task.
To address the first challenge, a control scheme relying on accurate calibration is implemented to execute the pick-up of a surgical needle. Regarding the planning of surgical tasks, two approaches are explored: one uses learning from demonstration to pick and place a surgical object, and the other uses a gradient-based approach to trigger a smoother object-repositioning phase during intraoperative procedures. Finally, to improve scene understanding, this thesis focuses on developing a simulation environment where multiple tasks can be learned based on the surgical scene and then transferred to the real robot. Experiments proved that automation of the pick-and-place task for different surgical objects is possible. The robot successfully and autonomously picked up a suturing needle, positioned a surgical device for intraoperative ultrasound scanning, and manipulated soft tissue for intraoperative organ retraction. Although automation of surgical subtasks has been demonstrated in this work, several challenges remain open, such as the capability of the generated algorithms to generalise over different environmental conditions and different patients.

    A 3D US Guidance System for Permanent Breast Seed Implantation: Development and Validation

    Permanent breast seed implantation (PBSI) is a promising breast radiotherapy technique that suffers from operator dependence. We propose and have developed an intraoperative 3D ultrasound (US) guidance system for PBSI, in which a tracking arm mounted to a 3D US scanner registers a needle template to the image. Images were validated for linear and volumetric accuracy and for image quality in a volunteer. The tracking arm was calibrated and the 3D image registered to the scanner. Tracked and imaged needle positions were compared to assess accuracy, and a patient-specific phantom procedure was guided with the system. Median/mean linear and volumetric errors were ±1.1% and ±4.1%, respectively, with clinically suitable volunteer scans. The mean tracking arm error was 0.43 mm and the 3D US target registration error was ≤0.87 mm. The mean needle tip/trajectory error was 2.46 mm/1.55°. The modelled mean phantom procedure seed displacement was 2.50 mm. To our knowledge, this is the first reported PBSI phantom procedure with intraoperative 3D image guidance.
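The tip and trajectory errors reported above are conventionally computed as a Euclidean distance between tip positions and the angle between needle direction vectors. The helper names below are assumptions for illustration; the abstract does not specify its exact computation.

```python
# Sketch: standard needle tip error (mm) and trajectory error (degrees)
# between a tracked and an imaged needle pose. Function names and the
# pose representation are illustrative assumptions.
import math

def tip_error_mm(tracked_tip, imaged_tip):
    """Euclidean distance between two 3D tip positions (mm)."""
    return math.dist(tracked_tip, imaged_tip)

def trajectory_error_deg(v1, v2):
    """Angle (degrees) between two needle direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # clamp for floating-point safety before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))
```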

    Image guided robotic assistance for the diagnosis and treatment of tumor

    The aim of this thesis is to demonstrate the feasibility and potential of introducing robotics and image guidance into the overall oncologic workflow, from the diagnosis to the treatment phase. The popularity of robotics in the operating room has grown in recent years. Currently the most popular system is the da Vinci telemanipulator (Intuitive Surgical), a master-slave system for minimally invasive surgery used in several surgical fields such as urology, general surgery, gynecology, and cardiothoracic surgery. An accurate study of this system, from a technological point of view, has been conducted, addressing its drawbacks and advantages. The da Vinci system creates an immersive operating environment for the surgeon by providing both high-quality stereo visualization and a human-machine interface that directly connects the surgeon's hands to the motion of the surgical tool tips inside the patient's body. It has undoubted advantages for the surgeon's work and the patient's health, at least for some interventions, while its very high cost leaves many doubts on its price-benefit ratio. In the robotic surgery field, many researchers are working on the optimization and miniaturization of robot mechanics, while others are trying to obtain smart functionalities to realize robotic systems that, "knowing" the patient's anatomy from radiological images, can assist the surgeon in an active way. Regarding the second point, image-guided systems can be useful to plan and control medical robot motion and to provide the surgeon with pre-operative and intra-operative images with augmented reality visualization, enhancing his/her perceptual capacities and, as a consequence, improving the quality of treatments. To demonstrate this thesis, some prototypes have been designed, implemented, and tested.
The development of image-guided medical devices, comprising augmented reality, virtual navigation, and robotic surgical features, requires addressing several problems. The first is the choice of the robotic platform and of the image source to employ. An industrial anthropomorphic arm has been used as the testing platform. The idea of integrating industrial robot components into the clinical workflow has been supported by the da Vinci technical analysis. The algorithms and methods developed, regarding robot calibration in particular, are based on established theories and on easy integration into the clinical scenario, and can be adapted to any anthropomorphic arm. In this way, this work can be integrated with lightweight robots, for industrial or clinical use, able to work in close contact with humans, which will become numerous in the near future. Regarding the medical image source, it was decided to work with ultrasound imaging. Two-dimensional ultrasound imaging is widely used in clinical practice because it is not dangerous for the patient, inexpensive, compact, and highly flexible, allowing users to study many anatomic structures. It is routinely used for diagnosis and as guidance in percutaneous treatments. However, the use of 2D ultrasound imaging presents some disadvantages that demand great ability of the user: the clinician must mentally integrate many images to reconstruct a complete idea of the 3D anatomy. Furthermore, freehand control of the probe makes it difficult to determine anatomic positions and orientations and to reposition the probe to reach a particular location. To overcome these problems, an image-guided system has been developed that fuses real-time 2D US images with routine CT or MRI 3D images, previously acquired from the patient, to enhance clinician orientation and probe guidance.
The implemented algorithms for robot calibration and US image guidance have been used to realize two applications responding to specific clinical needs: the first to speed up the execution of routine and frequently performed procedures such as percutaneous biopsy or ablation; the second to improve a completely non-invasive type of treatment for solid tumors, HIFU (High-Intensity Focused Ultrasound). An ultrasound-guided robotic system has been developed to assist the clinician in executing complicated biopsies, or percutaneous ablations, in particular for deep abdominal organs. An integrated system was developed that provides the clinician with two types of assistance: a mixed-reality visualization allows accurate and easy planning of the needle trajectory and verification of target reaching, while the robot arm, equipped with a six-degree-of-freedom force sensor, allows precise positioning of the needle holder and lets the clinician adjust, by means of a cooperative control, the planned trajectory to overcome needle deflection and target motion. The second application consists of an augmented reality navigation system for HIFU treatment. HIFU is a completely non-invasive method for the treatment of solid tumors, hemostasis, and other vascular conditions in human tissues. The technology for HIFU treatments is still evolving, and the systems available on the market have some limitations and drawbacks. A disadvantage emerging from our experience with the machinery available in our hospital (the JC200 Haifu (HIFU) therapeutic system by Tech Co., Ltd, Chongqing), which is similar to other analogous machines, is the long time required to perform the procedure due to the difficulty of finding the target using the remote motion of an ultrasound probe under the patient. This problem has been addressed by developing an augmented reality navigation system to enhance US guidance during HIFU treatments, allowing easy target localization.
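Cooperative (hand-guided) control of the kind described above is commonly realized as an admittance law: forces measured by the wrist-mounted 6-DOF sensor are mapped to end-effector velocity so the clinician can nudge the needle holder off the planned trajectory. The gains, dead-band, and control rate below are illustrative assumptions, not the thesis' actual controller parameters.

```python
# Sketch: one cycle of a translational admittance ("cooperative") control
# loop. All numeric values are assumed for illustration.

DEADBAND_N = 1.0      # ignore forces below the sensor noise floor (assumed)
GAIN_MM_PER_N = 0.5   # admittance gain: mm/s of motion per newton (assumed)
DT_S = 0.01           # 100 Hz control loop (assumed)

def admittance_step(position_mm, force_n):
    """Map a 3-axis force reading (N) to an updated tool position (mm)."""
    new_pos = list(position_mm)
    for i, f in enumerate(force_n):
        if abs(f) < DEADBAND_N:
            continue                          # dead-band rejects noise/drift
        # commanded velocity proportional to the applied force,
        # integrated over one control cycle
        new_pos[i] += GAIN_MM_PER_N * f * DT_S
    return new_pos
```

A full implementation would also handle the three torque axes and saturate the commanded velocity for safety.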
The system was implemented using an additional freehand ultrasound probe coupled with a localizer and CT-fused imaging, offering a simple and economical solution for easy HIFU target localization. This thesis demonstrates the utility and usability of robots for the diagnosis and treatment of tumors; in particular, the combination of automatic positioning and cooperative control allows the surgeon and the robot to work in synergy. Furthermore, the work demonstrates the feasibility and potential of a mixed-reality navigation system to facilitate target localization and consequently reduce session times, increase the number of possible diagnoses/treatments, and decrease the risk of potential errors. The proposed solutions for the integration of robotics and image guidance into the overall oncologic workflow take into account currently available technologies, traditional clinical procedures, and cost minimization.

    Medical Robotics

    The first generation of surgical robots is already being installed in a number of operating rooms around the world. Robotics is being introduced to medicine because it allows for unprecedented control and precision of surgical instruments in minimally invasive procedures. So far, robots have been used to position an endoscope, perform gallbladder surgery, and correct gastroesophageal reflux and heartburn. The ultimate goal of the robotic surgery field is to design a robot that can be used to perform closed-chest, beating-heart surgery. The use of robotics in surgery will undoubtedly expand over the next decades. Minimally Invasive Surgery (MIS) is a revolutionary approach in surgery: the operation is performed with instruments and viewing equipment inserted into the body through small incisions created by the surgeon, in contrast to open surgery with large incisions. This minimizes surgical trauma and damage to healthy tissue, resulting in shorter patient recovery times. The aim of this book is to provide an overview of the state of the art and to present new ideas, original results, and practical experience in this expanding area; many chapters concern advanced research in this growing field. The book provides critical analysis of clinical trials and assessment of the benefits and risks of the application of these technologies. This book is certainly a small sample of the research activity on medical robotics going on around the globe as you read it, but it surely covers a good deal of what has been done in the field recently, and as such it is a valuable source for researchers interested in the subjects involved, whether they are currently "medical roboticists" or not.