
    Automated pick-up of suturing needles for robotic surgical assistance

    Robot-assisted laparoscopic prostatectomy (RALP) is a treatment for prostate cancer that involves complete or nerve-sparing removal of the prostate tissue that contains cancer. After removal, the bladder neck is subsequently sutured directly to the urethra. This procedure, called urethrovesical anastomosis, is one of the most dexterity-demanding tasks during RALP. Two suturing instruments and a pair of needles are used in combination to perform a running stitch during urethrovesical anastomosis. While robotic instruments provide enhanced dexterity to perform the anastomosis, it is still highly challenging and difficult to learn. In this paper, we present a vision-guided needle-grasping method for automatically grasping a needle that has been inserted into the patient prior to anastomosis. We aim to automatically grasp the suturing needle in a position that avoids hand-offs and immediately enables the start of suturing. The full grasping process can be broken down into: a needle detection algorithm; an approach phase, in which the surgical tool moves closer to the needle based on visual feedback; and a grasping phase, realized through path planning based on observed surgical practice. Our experimental results show examples of successful autonomous grasping that has the potential to simplify and decrease the operational time in RALP by assisting a small component of urethrovesical anastomosis.
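The detect/approach/grasp pipeline described in this abstract can be sketched as a minimal control loop. Everything below is an illustrative assumption, not the authors' implementation: the detection function is a stub, and the gain, tolerance, and grasp offset are made-up values.

```python
def detect_needle():
    """Placeholder for the needle detection algorithm: returns a
    hypothetical needle-tip position in camera coordinates (metres)."""
    return [0.05, 0.02, 0.10]

def approach(tool, needle, gain=0.5, tol=1e-3, max_iters=100):
    """Approach phase: proportional visual servoing that moves the tool
    a fraction of the remaining error toward the needle on each step."""
    for _ in range(max_iters):
        error = [n - t for t, n in zip(tool, needle)]
        if max(abs(e) for e in error) < tol:
            break
        tool = [t + gain * e for t, e in zip(tool, error)]
    return tool

def grasp_pose(needle, offset=0.005):
    """Grasping phase: target a pose slightly above the needle, standing
    in for the path planned from observed surgical practice."""
    x, y, z = needle
    return [x, y, z + offset]

tool = approach([0.0, 0.0, 0.0], detect_needle())
target = grasp_pose(detect_needle())
```

In practice each phase would consume stereo-camera feedback and command the robot's kinematics; the loop above only captures the coarse structure of the three phases.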

    Haptic-guided shared control for needle grasping optimization in minimally invasive robotic surgery

    During suturing tasks performed with minimally invasive surgical robots, configuration singularities and joint limits often force surgeons to interrupt the task and re-grasp the needle using dual-arm movements. This increases the operator's cognitive load, time-to-completion, and fatigue, and degrades performance. In this paper, we propose a haptic-guided shared control method for grasping the needle with the Patient Side Manipulator (PSM) of the da Vinci robot that avoids such issues. We suggest a cost function consisting of (i) the distance from robot joint limits and (ii) the task-oriented manipulability over the suturing trajectory. We evaluate the cost and its gradient on the needle-grasping manifold, which allows us to obtain the optimal grasping pose for joint-limit-free and singularity-free movements of the needle during suturing. Then, we compute force cues that are applied to the Master Tool Manipulator (MTM) of the da Vinci to guide the operator towards the optimal grasp. As such, our system helps the operator choose a grasping configuration that allows the robot to avoid joint limits and singularities during post-grasp suturing movements. We show the effectiveness of our proposed haptic-guided shared control method during suturing in both simulated and real experiments. The results illustrate that our approach significantly improves performance in terms of needle re-grasping.
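The cost-plus-force-cue idea in this abstract can be illustrated with a toy planar arm. The joint limits, weights, stiffness, and the 2-link Yoshikawa manipulability below are all assumptions for the sketch; the paper's cost is evaluated on the full PSM kinematics, not this simplified model.

```python
import math

Q_MIN, Q_MAX = -2.0, 2.0   # assumed joint limits (rad)

def joint_limit_cost(q):
    """Penalise proximity to joint limits (grows toward Q_MIN/Q_MAX)."""
    mid, span = (Q_MAX + Q_MIN) / 2, (Q_MAX - Q_MIN)
    return sum(((qi - mid) / span) ** 2 for qi in q)

def manipulability(q, l1=1.0, l2=1.0):
    """Yoshikawa manipulability sqrt(det(J J^T)); for a planar 2-link
    arm this reduces to |l1 * l2 * sin(q2)|."""
    return abs(l1 * l2 * math.sin(q[1]))

def cost(q, w=1.0):
    """Lower is better: joints near mid-range and high manipulability."""
    return joint_limit_cost(q) - w * manipulability(q)

def force_cue(q, h=1e-4, k=1.0):
    """Haptic cue for the master arm: the negative numerical gradient of
    the cost, scaled by an assumed stiffness k, so the operator's hand
    is nudged toward lower-cost grasp configurations."""
    grad = []
    for i in range(len(q)):
        qp, qm = list(q), list(q)
        qp[i] += h
        qm[i] -= h
        grad.append((cost(qp) - cost(qm)) / (2 * h))
    return [-k * g for g in grad]
```

For example, at `q = [0.0, 1.0]` the cue on the second joint is positive, pushing the elbow toward the high-manipulability configuration near `q2 = pi/2`.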

    Fast and Reliable Autonomous Surgical Debridement with Cable-Driven Robots Using a Two-Phase Calibration Procedure

    Automating precision subtasks such as debridement (removing dead or diseased tissue fragments) with Robotic Surgical Assistants (RSAs) such as the da Vinci Research Kit (dVRK) is challenging due to inherent non-linearities in cable-driven systems. We propose and evaluate a novel two-phase coarse-to-fine calibration method. In Phase I (coarse), we place a red calibration marker on the end effector and let it move randomly through a set of open-loop trajectories to obtain a large sample set of camera pixels and internal robot end-effector configurations. This coarse data is then used to train a Deep Neural Network (DNN) to learn the coarse transformation bias. In Phase II (fine), the bias from Phase I is applied to move the end-effector toward a small set of specific target points on a printed sheet. For each target, a human operator manually adjusts the end-effector position by direct contact (not through teleoperation) and the residual compensation bias is recorded. This fine data is then used to train a Random Forest (RF) to learn the fine transformation bias. Subsequent experiments suggest that without calibration, position errors average 4.55mm. Phase I reduces average error to 2.14mm, and the combination of Phases I and II reduces average error to 1.08mm. We apply these results to debridement of raisins and pumpkin seeds as fragment phantoms. Using an endoscopic stereo camera with standard edge detection, experiments with 120 trials achieved average success rates of 94.5%, exceeding prior results with much larger fragments (89.4%) and achieving a speedup of 2.1x, decreasing time per fragment from 15.8 seconds to 7.3 seconds. Source code, data, and videos are available at https://sites.google.com/view/calib-icra/. Comment: Final version for ICRA 201
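The coarse-to-fine structure of the calibration can be sketched with simple mean-offset estimators standing in for the paper's DNN (Phase I) and Random Forest (Phase II). The bias values and synthetic sample sets below are invented for illustration; the point is only that Phase II learns the residual left over after the Phase I correction is applied.

```python
def mean_offset(pairs):
    """Average (target - measured) offset per axis over sample pairs;
    a stand-in for the learned bias models in the paper."""
    n, dims = len(pairs), len(pairs[0][0])
    return [sum(t[d] - m[d] for m, t in pairs) / n for d in range(dims)]

def apply(p, bias):
    return tuple(pi + bi for pi, bi in zip(p, bias))

# Phase I (coarse): a large set of (measured, true) positions, here
# generated with an assumed systematic cable-drive error in mm.
true_bias = (4.0, -2.0)
coarse = [((x, y), (x + true_bias[0], y + true_bias[1]))
          for x in range(0, 50, 5) for y in range(0, 50, 5)]
coarse_bias = mean_offset(coarse)

# Phase II (fine): a small set of residuals remaining after the coarse
# correction, here generated with an assumed fine error in mm.
residual = (0.8, 0.3)
fine = [(apply((x, x), coarse_bias),
         (x + true_bias[0] + residual[0], x + true_bias[1] + residual[1]))
        for x in range(0, 20, 2)]
fine_bias = mean_offset(fine)

def calibrated(p):
    """Apply the coarse correction, then the fine correction."""
    return apply(apply(p, coarse_bias), fine_bias)
```

On this synthetic data the two-phase correction recovers the full 4.8mm/-1.7mm offset; the real system needs the learned, position-dependent models because cable-driven errors are non-linear rather than a constant bias.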

    Virtual Fixture Assistance for Suturing in Robot-Aided Pediatric Endoscopic Surgery

    The limited workspace in pediatric endoscopic surgery makes surgical suturing one of the most difficult tasks. During suturing, surgeons have to prevent collisions between tools and also collisions with the surrounding tissues. Surgical robots have been shown to be effective in adult laparoscopy, but assistance for suturing in constrained workspaces has not yet been fully explored. In this letter, we propose guidance virtual fixtures to enhance the performance and safety of suturing while generating the required task constraints using constrained optimization and Cartesian force feedback. We propose two guidance methods, looping virtual fixtures and a trajectory guidance cylinder, that are based on dynamic geometric elements. In simulations and experiments with a physical robot, we show that the proposed methods achieve more precise and safer looping in robot-assisted pediatric endoscopy. Comment: Accepted to RA-L/ICRA 2020, 8 pages. Fixed a few typos
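A trajectory guidance cylinder of the kind named in this abstract can be sketched as a spring-like Cartesian force that activates when the tool tip leaves a tube around the desired path. The geometry, radius, and stiffness below are illustrative assumptions, not the authors' constrained-optimization formulation.

```python
import math

def guidance_force(tip, axis_point, axis_dir, radius=0.002, k=50.0):
    """Force (N) pushing `tip` back inside a cylinder of `radius` (m)
    around the line axis_point + t * axis_dir (axis_dir must be a
    unit vector). Inside the cylinder the fixture is passive."""
    d = [p - a for p, a in zip(tip, axis_point)]
    along = sum(di * ui for di, ui in zip(d, axis_dir))
    # Radial component of the tip offset, perpendicular to the axis.
    radial = [di - along * ui for di, ui in zip(d, axis_dir)]
    dist = math.sqrt(sum(r * r for r in radial))
    if dist <= radius:
        return (0.0, 0.0, 0.0)           # inside the fixture: no force
    scale = -k * (dist - radius) / dist  # spring toward the axis
    return tuple(scale * r for r in radial)
```

A tip anywhere on the axis feels no force, while a tip 10mm off a z-aligned axis is pulled straight back toward it; rendering such forces on the master side is what makes the fixture a "guidance" rather than a "forbidden-region" fixture.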

    An Open-Source Framework for Surgical Subtask Automation

    Robot-Assisted Surgery (RAS) is becoming a standard in Minimally Invasive Surgery (MIS). Despite the benefits and potential of RAS, surgeons still have to perform a number of monotonous and time-consuming subtasks, such as knot-tying or blunt dissection, themselves. Many believe that the next big step in development is the automation of such subtasks. Partial automation can reduce the cognitive load on surgeons, allowing them to pay more attention to the most critical elements of the surgical workflow. Our aim was to develop a framework that eases and speeds up the automation of surgical subtasks. This framework was built alongside the da Vinci Research Kit (dVRK), but it can be ported easily to other robotic platforms, since it is based on the Robot Operating System (ROS). The software includes both stereo-vision-based and hierarchical motion planning, with a wide palette of frequently used surgical gestures, such as grasping, cutting, or soft-tissue manipulation, as building blocks to support the high-level implementation of autonomous surgical subtasks. This open-source surgical automation framework, irob-saf, is available at https://github.com/ABC-iRobotics/irob-saf.
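The building-block idea in this abstract, composing high-level subtasks from gesture primitives, can be sketched as follows. The gesture names mirror those listed in the abstract, but the class and dispatch machinery here is illustrative and is not the framework's actual ROS interface.

```python
class GestureLog:
    """Records the primitive gestures a subtask executes, in order;
    a stand-in for a robot interface exposing gesture-level actions."""
    def __init__(self):
        self.executed = []

    def grasp(self, target):
        self.executed.append(("grasp", target))

    def cut(self, target):
        self.executed.append(("cut", target))

    def manipulate_tissue(self, target, displacement):
        self.executed.append(("manipulate_tissue", target, displacement))

def blunt_dissection(robot, site):
    """A hypothetical high-level subtask expressed purely as a
    sequence of the gesture building blocks."""
    robot.grasp(site)
    robot.manipulate_tissue(site, (0.0, 0.0, 0.005))
    robot.cut(site)

robot = GestureLog()
blunt_dissection(robot, "tissue_plane_1")
```

In the real framework each gesture would be backed by vision and motion planning nodes communicating over ROS topics and actions; the value of the design is that subtask authors only sequence gestures, as above.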