261 research outputs found

    Gaze gesture based human robot interaction for laparoscopic surgery

    While minimally invasive surgery offers great benefits in terms of reduced patient trauma and bleeding, as well as faster recovery times, it still presents surgeons with major ergonomic challenges. Laparoscopic surgery requires the surgeon to control the surgical instruments bimanually during the operation, so a dedicated assistant is required to manoeuvre the camera, whose movements are often difficult to synchronise with the surgeon's. This article introduces a robotic system in which a rigid endoscope held by a robotic arm is controlled via the surgeon's eye movements, forgoing the need for a camera assistant. Gaze gestures, detected as a series of eye movements, are used to convey the surgeon's intention to initiate gaze-contingent camera control. Hidden Markov Models (HMMs) are used for real-time gaze gesture recognition, allowing the robotic camera to pan, tilt, and zoom while remaining immune to aberrant or unintentional eye movements. A novel online calibration method for the gaze tracker is proposed, which overcomes calibration drift and simplifies clinical application. The robotic system has been validated in comprehensive user trials, and a detailed analysis of usability metrics was performed to assess its performance. The results demonstrate that surgeons can perform their tasks more quickly and efficiently than with a camera assistant or foot switches.
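    As a rough illustration of the gesture-recognition step described above, the sketch below scores a quantised gaze sequence against a small bank of discrete HMMs with the forward algorithm and rejects low-likelihood (aberrant) sequences. The gesture set, the quantisation into four direction symbols, the model parameters and the rejection threshold are all illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the authors' implementation): classifying a gaze-gesture
# observation sequence with per-gesture discrete HMMs via the forward algorithm.
# Gaze samples are assumed to be pre-quantised into 4 direction symbols
# (0=left, 1=right, 2=up, 3=down); all model parameters below are illustrative.
import numpy as np

def log_likelihood(obs, start_p, trans_p, emit_p):
    """Forward algorithm in the log domain for one discrete HMM."""
    log_alpha = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for o in obs[1:]:
        log_alpha = (
            np.logaddexp.reduce(log_alpha[:, None] + np.log(trans_p), axis=0)
            + np.log(emit_p[:, o])
        )
    return np.logaddexp.reduce(log_alpha)

def classify_gesture(obs, models, reject_threshold=-25.0):
    """Return the best-matching gesture name, or None for aberrant movement."""
    scores = {name: log_likelihood(obs, *params) for name, params in models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > reject_threshold else None

# Two toy 2-state gesture models: "pan_right" favours symbol 1, "pan_left" symbol 0.
models = {
    "pan_right": (np.array([0.9, 0.1]),
                  np.array([[0.8, 0.2], [0.2, 0.8]]),
                  np.array([[0.1, 0.7, 0.1, 0.1], [0.1, 0.7, 0.1, 0.1]])),
    "pan_left":  (np.array([0.9, 0.1]),
                  np.array([[0.8, 0.2], [0.2, 0.8]]),
                  np.array([[0.7, 0.1, 0.1, 0.1], [0.7, 0.1, 0.1, 0.1]])),
}

print(classify_gesture([1, 1, 1, 1, 1], models))  # -> "pan_right"
```

    In a real system the rejection threshold would be tuned on recorded gaze data so that blinks and unintentional saccades fall below it.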

    Intention recognition for gaze controlled robotic minimally invasive laser ablation

    Eye tracking technology has shown promising results for allowing hands-free control of robotically mounted cameras and tools. However, existing systems offer only limited capabilities for achieving the full range of camera motions in a safe, intuitive manner. This paper introduces a framework for the recognition of surgeon intention, allowing activation and control of the camera through natural gaze behaviour. The system is resistant to noise such as blinking, while allowing the surgeon to look away safely at any time. Furthermore, this paper presents a novel approach to controlling the translation of the camera along its optical axis using a combination of eye tracking and stereo reconstruction. This combination allows the system to determine the point in 3D space on which the user is fixating, enabling a translation of the camera to the optimal viewing distance. In addition, the eye tracking information is used to perform automatic targeting for laser ablation: the desired target point of the laser, mounted on a separate robotic arm, is determined via eye tracking, removing the need to manually adjust the laser's target point before each new ablation. The calibration methodology used to obtain millimetre precision for the laser targeting without the aid of visual servoing is described. Finally, a user study validating the system is presented, showing a clear improvement, with median task times under half those of a manually controlled robotic system.
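    The following sketch illustrates the general idea of combining a gaze point with stereo depth: the gazed pixel is back-projected into the camera frame using the disparity at that pixel, and the offset along the optical axis to a desired viewing distance is computed. The camera intrinsics, baseline, disparity value and desired distance are placeholder assumptions, not values from the paper.

```python
# Minimal sketch (not the paper's implementation): estimating the 3D fixation
# point from the gaze location in a calibrated stereo endoscope and deriving a
# camera translation along the optical axis toward a desired viewing distance.
import numpy as np

def fixation_point_3d(gaze_uv, disparity, fx, fy, cx, cy, baseline):
    """Back-project the gazed pixel into the left-camera frame (metres)."""
    u, v = gaze_uv
    z = fx * baseline / disparity          # depth from stereo disparity
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def optical_axis_translation(fixation_xyz, desired_distance=0.05):
    """Signed translation along the camera's optical (z) axis so that the
    fixated point ends up at the desired viewing distance."""
    return fixation_xyz[2] - desired_distance

# Example: gaze at pixel (640, 360), 24 px disparity, 4 mm stereo baseline.
p = fixation_point_3d((640, 360), disparity=24.0,
                      fx=800.0, fy=800.0, cx=640.0, cy=360.0, baseline=0.004)
print(p, optical_axis_translation(p))      # positive result -> move camera forward
```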

    Robot Autonomy for Surgery

    Autonomous surgery involves having surgical tasks performed by a robot operating under its own will, with partial or no human involvement. Automation in surgery offers several important advantages, including increased precision of care due to sub-millimeter robot control, real-time utilization of biosignals for interventional care, improvements to surgical efficiency and execution, and computer-aided guidance under various medical imaging and sensing modalities. While these methods may displace some tasks of surgical teams and individual surgeons, they also enable interventions that are too difficult for, or beyond the skills of, a human. In this chapter, we provide an overview of robot autonomy in commercial use and in research, and present some of the challenges faced in developing autonomous surgical robots.

    Prevalence of haptic feedback in robot-mediated surgery : a systematic review of literature

    © 2017 Springer-Verlag. This is a post-peer-review, pre-copyedit version of an article published in the Journal of Robotic Surgery; the final authenticated version is available online at https://doi.org/10.1007/s11701-017-0763-4. With the successful uptake and inclusion of robotic systems in minimally invasive surgery, and with the increasing application of robotic surgery (RS) in numerous surgical specialities worldwide, there is now a need to develop and enhance the technology further. One such improvement is the implementation and amalgamation of haptic feedback technology into RS, which would permit the operating surgeon at the console to receive haptic information about the type of tissue being operated on. The main advantage is to allow the operating surgeon to feel and control the amount of force applied to different tissues during surgery, thus minimising the risk of tissue damage due to both the direct and indirect effects of excessive force or tension applied during RS. We performed a two-rater systematic review to identify the latest developments and potential avenues for improving the application and implementation of haptic feedback technology for the operating surgeon at the console during RS. This review provides a summary of technological enhancements in RS, considering different stages of work, from proof of concept to cadaver tissue testing, surgery in animals, and finally implementation in surgical practice. We identify that, at the time of this review, while there is unanimous agreement regarding the need for haptic and tactile feedback, there are no solutions or products available that address this need. There is scope and a need for new developments in haptic augmentation for robot-mediated surgery, with the aim of further improving patient care and robotic surgical technology.

    A gaze-contingent framework for perceptually-enabled applications in healthcare

    Patient safety and quality of care remain the focus of the smart operating room of the future. Some of the most influential detrimental factors are related to suboptimal communication among the staff, poor flow of information, staff workload and fatigue, ergonomics, and sterility in the operating room. While technological developments constantly transform the operating room layout and the interaction between surgical staff and machinery, a vast array of opportunities arise for the design of systems and approaches that can enhance patient safety and improve workflow and efficiency. The aim of this research is to develop a real-time gaze-contingent framework towards a "smart" operating suite that will enhance the operator's ergonomics by allowing perceptually-enabled, touchless and natural interaction with the environment. The main feature of the proposed framework is the ability to acquire and utilise the plethora of information provided by the human visual system to allow touchless interaction with medical devices in the operating room. In this thesis, a gaze-guided robotic scrub nurse, a gaze-controlled robotised flexible endoscope and a gaze-guided assistive robotic system are proposed. Firstly, the gaze-guided robotic scrub nurse is presented: surgical teams performed a simulated surgical task with the assistance of a robotic scrub nurse, which complements the human scrub nurse in the delivery of surgical instruments following gaze selection by the surgeon. Then, the gaze-controlled robotised flexible endoscope is introduced: experienced endoscopists and novice users performed a simulated examination of the upper gastrointestinal tract using predominantly their natural gaze. Finally, a gaze-guided assistive robotic system is presented, which aims to facilitate activities of daily living. The results of this work provide valuable insights into the feasibility of integrating the developed gaze-contingent framework into clinical practice without significant workflow disruptions.

    A flexible access platform for robot-assisted minimally invasive surgery

    Advances in Minimally Invasive Surgery (MIS) are driven by the clinical demand to reduce the invasiveness of surgical procedures so that patients undergo less trauma and experience faster recoveries. These well-documented benefits of MIS have been achieved through parallel advances in the technology and instrumentation used during procedures. The new and evolving field of Flexible Access Surgery (FAS), where surgeons access the operative site through a single incision or a natural orifice incision, is being promoted as the next potential step in the evolution of surgery. In order to achieve similar levels of success and adoption as MIS, technology again has its role to play in developing new instruments to solve the unmet clinical challenges of FAS. As procedures become less invasive, these instruments should not just address the challenges presented by the complex access routes of FAS, but should also build on the recent advances in pre- and intraoperative imaging techniques to provide surgeons with new diagnostic and interventional decision-making capabilities. The main focus of this thesis is the development and application of a flexible robotic device that is capable of providing controlled flexibility along curved pathways inside the body. The principal component of the device is its modular mechatronic joint design, which utilises an embedded micromotor-tendon actuation scheme to provide independently addressable degrees of freedom and three internal working channels. Connecting multiple modules together allows a seven degree-of-freedom (DoF) flexible access platform to be constructed. The platform is intended for use as a research test-bed to explore the engineering and surgical challenges of FAS. Navigation of the platform is realised using a handheld controller optimised for functionality and ergonomics, or in a "hands-free" manner via a gaze-contingent control framework. Under this framework, the operator's gaze fixation point is used as feedback to close the servo control loop. The feasibility and potential of integrating multi-spectral imaging capabilities into flexible robotic devices is also demonstrated. A force adaptive servoing mechanism is developed to simplify the deployment and improve the consistency of probe-based optical imaging techniques by automatically controlling the contact force between the probe tip and target tissue. The thesis concludes with the description of two FAS case studies performed with the platform during in vivo porcine experiments. These studies demonstrate the ability of the platform to perform large-area explorations within the peritoneal cavity and to provide a stable base for the deployment of interventional instruments and imaging probes.
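    As a simple illustration of the force adaptive servoing idea mentioned above, the sketch below runs a proportional control loop that adjusts probe velocity to hold a target contact force. The gain, force target, safety limit and the sensor/actuator hooks are assumed placeholders rather than the thesis implementation.

```python
# Minimal sketch (assumed, not the thesis implementation): a proportional
# force-servo loop that keeps a probe tip at a target contact force by
# adjusting its advance/retract velocity. The sensor-reading and
# velocity-command callables are hypothetical placeholders.
import time

TARGET_FORCE_N = 0.3        # desired probe-tissue contact force (illustrative)
GAIN_M_PER_N = 0.002        # proportional gain: m/s of velocity per newton of error
MAX_SPEED_M_S = 0.005       # safety limit on probe velocity

def force_servo_step(measured_force_n):
    """Return the probe velocity command for one control cycle."""
    error = TARGET_FORCE_N - measured_force_n
    velocity = GAIN_M_PER_N * error
    return max(-MAX_SPEED_M_S, min(MAX_SPEED_M_S, velocity))

def run_force_servo(read_force, command_velocity, period_s=0.01):
    """Run the servo loop at ~100 Hz; read_force and command_velocity are
    supplied by the robot integration layer (placeholders here)."""
    while True:
        command_velocity(force_servo_step(read_force()))
        time.sleep(period_s)
```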

    Motion Estimation and Reconstruction of a Heart Surface by Means of 2D-/3D-Membrane Models

    In order to assist surgeons during minimally invasive interventions on the beating heart, it would be helpful to develop a robotic surgery system that synchronizes the instruments with the heart surface, so that their positions do not change relative to the point of interest (POI). The synchronization of the robotic manipulators requires an estimation of the heart surface motion. In this paper, a model-based motion estimation of the heart surface is presented. The motion of a partition of the heart surface is modelled by means of a thin or thick vibrating membrane in order to represent the epicardial surface or the connected epicardium and myocardium. The membrane motion is described by means of a system of coupled linear partial differential equations (PDEs), whose 3D input function is assumed to be known. After spatial discretization of the PDE solution space by the Finite Spectral Element Method, a bank of lumped systems is obtained. A Kalman filter is used to estimate the state of the lumped systems by incorporating noisy measurements of the heart surface. Measurements can be the position or velocity of sonomicrometry-based sensors or of certain landmarks tracked by optical sensors. With the model-based estimation it is possible to reconstruct the entire partition of the heart surface even at non-measurement points, and thus at each POI.
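    The sketch below shows the estimation step in its simplest form: a linear Kalman filter recovering the position and velocity of a single tracked surface point from noisy position measurements. The constant-velocity model and noise covariances are illustrative stand-ins for the lumped systems obtained from the spectral discretization in the paper.

```python
# Minimal sketch (illustrative, not the authors' spectral-element model): a
# standard linear Kalman filter estimating position and velocity of one
# surface point from noisy position measurements. A, C, Q and R are placeholders.
import numpy as np

dt = 0.01                                   # 100 Hz measurement rate (assumed)
A = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity surface-point model
C = np.array([[1.0, 0.0]])                  # only position is measured
Q = 1e-4 * np.eye(2)                        # process noise covariance
R = np.array([[1e-2]])                      # measurement noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle given state x, covariance P, measurement z."""
    # Predict
    x = A @ x
    P = A @ P @ A.T + Q
    # Update
    S = C @ P @ C.T + R
    K = P @ C.T @ np.linalg.inv(S)
    x = x + K @ (z - C @ x)
    P = (np.eye(2) - K @ C) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
for z in np.sin(np.linspace(0, 1, 100)) + 0.05 * np.random.randn(100):
    x, P = kalman_step(x, P, np.array([z]))
print(x)   # estimated position and velocity of the tracked point
```

    In the paper's setting the same predict/update cycle runs over the states of every lumped system, so the reconstructed membrane also provides estimates at points where no sensor or landmark is available.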

    Current Trends and Future Developments in Robotic Cardiac Surgery

    Robotic Cardiac Surgery has revolutionised cardiac operations, offering patients less operative pain, shorter hospital stays and improved quality of life. As surgeons constantly try new techniques, Robotic Cardiac Surgery now encompasses mitral valve surgery, coronary revascularisation, atrial fibrillation surgery, pacing lead implantation, congenital cardiac operations, cardiac tumour resection and diaphragmatic pacing. Robotic technology is gradually becoming more affordable, and so more centres are investing in training surgeons in these techniques. As a result, robotic cardiac surgery has developed into a rapidly evolving speciality with exciting new possibilities... (excerpt)

    Evaluation of head-free eye tracking as an input device for air traffic control

    The purpose of this study was to investigate the possibility of integrating a free head motion eye-tracking system as an input device in air traffic control (ATC) activity. Sixteen participants used an eye tracker to select targets displayed on a screen as quickly and accurately as possible. We assessed the impact of the presence of visual feedback about gaze position and of the target selection method on selection performance, under different difficulty levels induced by variations in target size and target-to-target separation. We consider the combined use of gaze dwell-time selection and continuous eye-gaze feedback to be the best condition, as it fits naturally with gaze displacement over the ATC display and frees the hands of the controller, despite a small cost in terms of selection speed. In addition, target size had a greater impact on accuracy and selection time than target distance. These findings provide guidelines on possible further implementation of eye tracking in everyday ATC activity.
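    For concreteness, the sketch below implements a dwell-time selection rule of the kind evaluated in the study, in its simplest form: a target is selected once gaze remains within its boundary for a fixed dwell period. The target geometry, the 500 ms dwell value and the target name are illustrative assumptions.

```python
# Minimal sketch (assumed, not the study's software): dwell-time target
# selection - a target is "clicked" once gaze has stayed inside it for a
# fixed dwell period.
DWELL_S = 0.5   # illustrative dwell duration

class DwellSelector:
    def __init__(self, dwell_s=DWELL_S):
        self.dwell_s = dwell_s
        self.current = None     # target currently under gaze
        self.entered_at = None  # timestamp when gaze entered it

    def update(self, gaze_xy, targets, t):
        """targets: {name: (x, y, radius)}; returns a target name on selection."""
        hit = next((n for n, (x, y, r) in targets.items()
                    if (gaze_xy[0] - x) ** 2 + (gaze_xy[1] - y) ** 2 <= r ** 2), None)
        if hit != self.current:             # gaze moved to a new target (or away)
            self.current, self.entered_at = hit, t
            return None
        if hit is not None and t - self.entered_at >= self.dwell_s:
            self.entered_at = float("inf")  # avoid repeated selections
            return hit
        return None

sel = DwellSelector()
targets = {"FL123": (100.0, 200.0, 30.0)}   # hypothetical on-screen target
for t in (0.0, 0.2, 0.4, 0.6):
    print(t, sel.update((105.0, 195.0), targets, t))   # selects at t = 0.6
```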