    Augmented reality (AR) for surgical robotic and autonomous systems: State of the art, challenges, and solutions

    Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), the focus of most devices remains on improving end-effector dexterity and precision, as well as improving access to minimally invasive surgeries. This paper provides a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology used to perform complex surgical interventions, for increased user perception of the augmented world. Researchers in the field have long faced issues with low accuracy in tool placement around complex trajectories, pose estimation, and difficulty in depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We attempt to outline the shortcomings of current optimization algorithms for surgical robots (such as YOLO and LSTM) while providing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. The results on robot end-effector collisions and reduced occlusion remain promising within the scope of our research, validating the propositions made for the surgical clearance of ever-expanding AR technology in the future.
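    To make the tool-to-organ collision detection mentioned above concrete, the following is a minimal sketch (not taken from the reviewed systems) that flags a potential collision when a tracked tool tip comes within a safety margin of sampled points on an organ surface; the surface samples, tip position, and margin value are illustrative assumptions.

```python
import numpy as np

def min_distance_to_surface(tool_tip, surface_points):
    """Smallest Euclidean distance from the tool tip to a sampled organ surface."""
    diffs = surface_points - tool_tip          # (N, 3) offsets to each surface sample
    return np.min(np.linalg.norm(diffs, axis=1))

def check_collision(tool_tip, surface_points, safety_margin_mm=2.0):
    """Return True if the tip is within the safety margin of the organ surface."""
    return min_distance_to_surface(tool_tip, surface_points) < safety_margin_mm

# Illustrative data: a few sampled organ-surface points (mm) and a tracked tool tip.
organ = np.array([[30.0, 0.0, 0.0],
                  [0.0, 30.0, 0.0],
                  [0.0, 0.0, 30.0]])
tip = np.array([31.0, 0.0, 0.0])     # 1 mm from the nearest surface sample
print(check_collision(tip, organ))   # True: within the 2 mm safety margin
```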

    Haptic Training Simulator for Pedicle Screw Insertion in Scoliosis Surgery

    This thesis develops a haptic training simulator that imitates the sensations experienced by a surgeon during pedicle screw insertion in scoliosis surgery. Pedicle screw insertion is a common treatment for fixing spinal deformities in idiopathic scoliosis. Surgeons using the free-hand technique are guided primarily by haptic feedback. A vital step in this free-hand technique is the use of a probe to make a channel through the vertebral pedicle. This is a sensitive process which carries risk of serious mechanical, neurological, and vascular complications. Surgeons are currently trained using cadavers or live patients. Cadavers often have vertebrae that are softer than those surgeons would typically encounter, while training on live patients carries the obvious issue of increased risk of complications to the patient. In this thesis, a haptic virtual reality simulator is designed and studied as a training tool for surgeons in this procedure. Creating a pathway through the pedicle with the free-hand technique involves two main degrees of freedom: rotation and linear progression. The rotary stage of the device, which was developed by a previous student, is enhanced in this research by adding hardware, improving the haptic model, and proposing techniques to couple the rotary and linear degrees of freedom. Haptic model parameters for a spine surgery with normal bone density are then clinically tuned within a user study. Over ten surgeons of varying experience levels used the simulator and were able to change various parameters in order to tune the simulator to what felt most realistic. The surgeons also evaluated the simulator for its feasibility and usefulness. Four research questions were investigated. First, can a reference set of values be found that replicates the surgeons' interpretation of the surgical scenario? Second, how are the rotary stage parameters influenced by the presence of linear effects? Third, do the results differ across expertise levels? Finally, can the simulator serve as a useful tool in the education of surgical trainees for teaching channel creation in pedicle screw insertion? Statistical analyses are carried out to examine these research questions. The results indicate the feasibility of the simulator for surgical education.
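    As a rough illustration of the kind of haptic model being tuned, the sketch below renders a resistive force along the linear (insertion) degree of freedom with a simple spring-damper law; the stiffness and damping values are placeholder assumptions, not the clinically tuned parameters reported in the thesis.

```python
def linear_axis_force(penetration_mm, velocity_mm_s,
                      stiffness_n_per_mm=0.8, damping_n_s_per_mm=0.01):
    """Resistive force along the probe axis for a simple spring-damper bone model.

    penetration_mm : depth of the probe tip past the bone surface
    velocity_mm_s  : current insertion velocity (positive = advancing)
    """
    if penetration_mm <= 0.0:        # no contact, no force
        return 0.0
    return stiffness_n_per_mm * penetration_mm + damping_n_s_per_mm * velocity_mm_s

# One step of a haptic loop (such loops typically run at ~1 kHz on the device):
print(linear_axis_force(penetration_mm=3.0, velocity_mm_s=5.0))  # -> 2.45 N
```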

    Trustworthy and Intelligent COVID-19 Diagnostic IoMT through XR and Deep-Learning-Based Clinic Data Access

    This article presents a novel extended reality (XR) and deep-learning-based Internet-of-Medical-Things (IoMT) solution for COVID-19 telemedicine diagnostics, which systematically combines virtual reality/augmented reality (AR) remote surgical plan/rehearse hardware, customized 5G cloud computing, and deep learning algorithms to provide real-time COVID-19 treatment scheme clues. Compared to existing perception therapy techniques, our new technique can significantly improve performance and security. The system collected 25 clinical data items from the 347 positive and 2270 negative COVID-19 patients in the Red Zone via 5G transmission. A novel auxiliary classifier generative adversarial network-based intelligent prediction algorithm is then used to train the new COVID-19 prediction model. Furthermore, the Copycat network is employed for model stealing and attack on the IoMT to improve its security performance. To simplify the user interface and achieve an excellent user experience, we combined the Red Zone's guiding images with the Green Zone's view through AR navigation cues delivered over 5G. The XR surgical plan/rehearse framework is designed to include all requisite COVID-19 surgical details, with a guaranteed real-time response. The accuracy, recall, F1-score, and area under the ROC curve (AUC) of our new IoMT were 0.92, 0.98, 0.95, and 0.98, respectively, outperforming existing perception techniques with significantly higher accuracy. The model stealing also performs well, with the Copycat AUC of 0.90 only slightly lower than that of the original model. This study suggests a new framework for COVID-19 diagnostic integration and opens new research on the integration of XR and deep learning for IoMT implementation.
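    For reference, the reported accuracy, recall, F1-score, and AUC are standard binary-classification metrics; the sketch below shows how such figures are typically computed from predicted scores, using illustrative arrays rather than the study's data.

```python
from sklearn.metrics import accuracy_score, recall_score, f1_score, roc_auc_score

# Illustrative labels and model outputs (1 = COVID-19 positive), not the study's data.
y_true  = [1, 1, 1, 0, 0, 0, 0, 1]
y_score = [0.92, 0.85, 0.40, 0.10, 0.30, 0.05, 0.55, 0.77]   # predicted probabilities
y_pred  = [1 if s >= 0.5 else 0 for s in y_score]             # thresholded decisions

print("accuracy:", accuracy_score(y_true, y_pred))
print("recall:  ", recall_score(y_true, y_pred))
print("F1-score:", f1_score(y_true, y_pred))
print("AUC:     ", roc_auc_score(y_true, y_score))
```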

    Computer simulated needle manipulation of Chinese acupuncture with realistic haptic feedback.

    Leung Ka Man. Thesis submitted August 2002; M.Phil. thesis, Chinese University of Hong Kong, 2003. Includes bibliographical references (leaves 81-84). Abstracts in English and Chinese. Contents: 1. Introduction (surgical needle simulation, data sources, computer-aided training simulation, existing systems, research goal); 2. Haptization of Needle Interactions (data collection, force measurement, data correlation, expert opinion, haptic display devices, haptic models for tissues); 3. Haptic Rendering of Bi-directional Needle Manipulation (data source and pre-processing, virtual body surface construction, tissue mapping, the PHANToM haptic device, force profile analysis, haptic model construction for skin, adipose tissue, muscle, and bone, force composition, interactive calibration, skin deformation); 4. Parallel Visual-Haptic Rendering (parallel network architecture, visual and haptic rendering pipelines); 5. User Interface (needle practice, acupuncture atlas, training results, user controls, device calibration, model settings); 6. Conclusion (research summary, suggested improvements, future research). Appendices cover the tissue mapping table, an incremental viscoelastic model, and model parameter values.

    Development and validation of real-time simulation of X-ray imaging with respiratory motion

    We present a framework that combines evolutionary optimisation, soft tissue modelling, and ray tracing on GPU to simultaneously compute respiratory motion and X-ray imaging in real time. Our aim is to provide validated building blocks with high fidelity to closely match both human physiology and the physics of X-rays. A CPU-based set of algorithms is presented to model organ behaviours during respiration. Soft tissue deformation is computed with an extension of the Chain Mail method. Rigid elements move according to kinematic laws. A GPU-based surface rendering method is proposed to compute the X-ray image using the Beer-Lambert law. It is provided as an open-source library. A quantitative validation study is provided to objectively assess the accuracy of both components: i) the respiration against anatomical data, and ii) the X-ray imaging against the Beer-Lambert law and the results of Monte Carlo simulations. Our implementation can be used in various applications, such as interactive medical virtual environments to train percutaneous transhepatic cholangiography in interventional radiology, 2D/3D registration, computation of digitally reconstructed radiographs, and simulation of 4D sinograms to test tomography reconstruction tools.
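    The X-ray component is based on the Beer-Lambert law, I = I_0 exp(-Σ_i μ_i d_i), where μ_i is the linear attenuation coefficient of material i and d_i is the path length of the ray through it. The sketch below evaluates this law for a single ray; the attenuation coefficients and path lengths are illustrative, not values from the paper.

```python
import numpy as np

def beer_lambert(i0, mu_per_cm, path_cm):
    """Transmitted intensity after a ray crosses several homogeneous materials.

    i0         : incident intensity
    mu_per_cm  : linear attenuation coefficient of each material (1/cm)
    path_cm    : path length of the ray through each material (cm)
    """
    mu = np.asarray(mu_per_cm)
    d = np.asarray(path_cm)
    return i0 * np.exp(-np.sum(mu * d))

# A ray crossing 3 cm of soft tissue and 1 cm of bone (illustrative coefficients).
print(beer_lambert(1.0, mu_per_cm=[0.2, 0.5], path_cm=[3.0, 1.0]))  # ~0.33
```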

    Proceedings of the Second PHANToM Users Group Workshop : October 19-22, 1997 : Endicott House, Dedham, MA, Massachusetts Institute of Technology, Cambridge, MA

    "December, 1997." Cover title.Includes bibliographical references.Sponsored by SensAble Technologies, Inc., Cambridge, MA."[edited by J. Kennedy Salisbury and Mandayam A. Srinivasan]

    Patient Specific Systems for Computer Assisted Robotic Surgery Simulation, Planning, and Navigation

    The evolving scenario of surgery, from modern surgery through the birth of medical imaging and the introduction of minimally invasive techniques, has in recent years seen the advent of surgical robotics. These systems, which make it possible to overcome the difficulties of endoscopic surgery, allow improved surgical performance and a better quality of intervention. Information technology has contributed to this evolution since the beginning of the digital revolution, providing innovative medical imaging devices and computer-assisted surgical systems. Progress in computer graphics subsequently brought innovative visualization modalities for medical datasets, and the later birth of virtual reality paved the way for virtual surgery. Although many surgical simulators already exist, there are no patient-specific solutions. This thesis presents the development of patient-specific software systems for preoperative planning, simulation, and intraoperative assistance, designed for robotic surgery: in particular for bimanual robots that are becoming the future of single-port interventions. The first software application is a virtual reality simulator for this kind of surgical robot. The system has been designed to validate the initial port placement and the operative workspace for the potential application of this surgical device. Given a bimanual robot with its own geometry and kinematics, and a patient-specific 3D virtual anatomy, the surgical simulator allows the surgeon to choose the optimal positioning of the robot and of the access port in the abdominal wall. Additionally, it makes it possible to evaluate in a virtual environment whether dexterous movability of the robot is achievable, avoiding unwanted collisions with the surrounding anatomy to prevent potential damage in the real surgical procedure. Even though the software has been designed for a specific bimanual surgical robot, it supports any open kinematic chain structure, as long as it can be described in our custom format. The robot's capability to accomplish specific tasks can be virtually tested using the deformable models: interacting directly with the target virtual organs while avoiding unwanted collisions with the surrounding anatomy not involved in the intervention. Moreover, the surgical simulator has been enhanced with algorithms and data structures to integrate biomechanical parameters into virtual deformable models (based on a mass-spring-damper network) of target solid organs, in order to properly reproduce the physical behaviour of the patient anatomy during the interactions. The main biomechanical parameters (Young's modulus and density) have been integrated, allowing the automatic tuning of some model network elements, such as the node mass and the spring stiffness. The spring damping coefficient has been modeled using the Rayleigh approach. Furthermore, the developed method automatically detects the external layer, allowing the use of both surface and internal Young's moduli in order to model the main parts of dense organs: the stroma and the parenchyma. Finally, the model can be manually tuned to represent lesions with specific biomechanical properties. Additionally, some software modules of the simulator have been extended for integration into a patient-specific computer guidance system for intraoperative navigation and assistance in robotic single-port interventions.
This application provides guidance functionalities working in three different modalities: passive, as a surgical navigator; assistive, as a guide for single-port placement; and active, as a tutor preventing unwanted collisions during the intervention. The simulation system has been tested by five surgeons, simulating robot access port placement and evaluating robot movability and workspace inside the patient's abdomen. The tested functionalities, rated by expert surgeons, have shown good quality and performance of the simulation. Moreover, the integration of biomechanical parameters into deformable models has been tested with various material samples. The results have shown good visual realism while ensuring the performance required by an interactive simulation. Finally, the intraoperative navigator has been tested by performing a cholecystectomy on a synthetic patient mannequin, in order to evaluate the intraoperative navigation accuracy, the network communication latency, and the overall usability of the system. The tests performed demonstrate the effectiveness and usability of the software systems developed, encouraging the introduction of the proposed solution into clinical practice and the implementation of further improvements. Surgical robotics will be enhanced by an advanced integration of medical images into software systems, allowing the detailed planning of surgical interventions by means of virtual surgery simulation based on patient-specific biomechanical parameters. Furthermore, the advanced functionalities offered by these systems enable surgical robots to improve intraoperative surgical assistance, benefiting from the knowledge of the virtual patient anatomy.
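    As a hedged illustration of how biomechanical parameters can drive the mass-spring-damper network described above, the sketch below lumps node mass from tissue density and an assigned nodal volume, scales spring stiffness from Young's modulus with a simple truss-like rule, and applies stiffness-proportional (Rayleigh-style) damping; the scaling rule and numeric values are assumptions for illustration, not necessarily the thesis's exact formulation.

```python
def node_mass(density_kg_m3, node_volume_m3):
    """Mass lumped at a node from tissue density and the volume assigned to the node."""
    return density_kg_m3 * node_volume_m3

def spring_stiffness(young_modulus_pa, cross_section_m2, rest_length_m):
    """Axial stiffness k = E*A/L of a spring linking two nodes (truss-like scaling)."""
    return young_modulus_pa * cross_section_m2 / rest_length_m

def rayleigh_damping(stiffness_n_m, beta_s=0.01):
    """Stiffness-proportional (Rayleigh) damping coefficient c = beta * k."""
    return beta_s * stiffness_n_m

# Illustrative soft-organ values: E = 5 kPa, density = 1050 kg/m^3.
k = spring_stiffness(young_modulus_pa=5.0e3, cross_section_m2=1.0e-4, rest_length_m=0.01)
print(node_mass(1050.0, 1.0e-6), k, rayleigh_damping(k))
```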

    Medical Robotics

    The first generation of surgical robots is already being installed in a number of operating rooms around the world. Robotics is being introduced to medicine because it allows unprecedented control and precision of surgical instruments in minimally invasive procedures. So far, robots have been used to position an endoscope, perform gallbladder surgery, and correct gastroesophageal reflux and heartburn. The ultimate goal of the robotic surgery field is to design a robot that can be used to perform closed-chest, beating-heart surgery. The use of robotics in surgery will undoubtedly expand over the coming decades. Minimally Invasive Surgery (MIS) is a revolutionary approach in surgery. In MIS, the operation is performed with instruments and viewing equipment inserted into the body through small incisions created by the surgeon, in contrast to open surgery with large incisions. This minimizes surgical trauma and damage to healthy tissue, resulting in shorter patient recovery times. The aim of this book is to provide an overview of the state of the art and to present new ideas, original results, and practical experiences in this expanding area. Many chapters in the book concern advanced research in this growing field. The book provides critical analysis of clinical trials and assessment of the benefits and risks of the application of these technologies. This book is certainly a small sample of the research activity on medical robotics going on around the globe as you read it, but it surely covers a good deal of what has been done in the field recently, and as such it serves as a valuable source for researchers interested in the subjects involved, whether or not they are currently “medical roboticists”.

    Realistic tool-tissue interaction models for surgical simulation and planning

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in pre- and intra-operative surgical planning. Realistic modeling of medical interventions involving tool-tissue interactions is considered a key requirement in the development of high-fidelity simulators and planners. The soft-tissue constitutive laws, the organ geometry and boundary conditions imposed by the connective tissues surrounding the organ, and the shape of the surgical tool interacting with the organ are some of the factors that govern the accuracy of medical intervention planning. This thesis is divided into three parts. First, we compare the accuracy of linear and nonlinear constitutive laws for tissue. An important consequence of nonlinear models is the Poynting effect, in which shearing of tissue results in a normal force; this effect is not seen in a linear elastic model. The magnitude of the normal force for myocardial tissue is shown to be larger than the human contact force discrimination threshold. Further, in order to investigate and quantify the role of the Poynting effect in material discrimination, we perform a multidimensional scaling study. Second, we consider the effects of organ geometry and boundary constraints in needle path planning. Using medical images and tissue mechanical properties, we develop a model of the prostate and surrounding organs. We show that, for needle procedures such as biopsy or brachytherapy, organ geometry and boundary constraints have more impact on target motion than tissue material parameters. Finally, we investigate the effects of surgical tool shape on the accuracy of medical intervention planning. We consider the specific case of robotic needle steering, in which the asymmetry of a bevel-tip needle causes the needle to bend naturally when it is inserted into soft tissue. We present an analytical and a finite element (FE) model for the loads developed at the bevel tip during needle-tissue interaction. The analytical model explains trends observed in the experiments. We incorporated physical parameters (rupture toughness and nonlinear material elasticity) into the FE model, which included both contact and cohesive zone models to simulate tissue cleavage. The model shows that the tip forces are sensitive to the rupture toughness. In order to model the mechanics of needle deflection, we use an energy-based formulation that incorporates tissue-specific parameters such as rupture toughness, nonlinear material elasticity, and interaction stiffness, as well as needle geometric and material properties. Simulation results follow trends (deflection and radius of curvature) similar to those observed in macroscopic experimental studies of a robot-driven needle interacting with gels.
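    To make the Poynting effect mentioned above concrete: under simple shear of amount γ, a linear elastic model predicts only a shear stress τ = μγ and no normal stress, whereas an incompressible neo-Hookean model additionally predicts a first normal stress difference N1 = μγ², i.e., shearing generates a normal force. The sketch below compares the two; the shear modulus value is illustrative only.

```python
def linear_shear_response(mu_pa, gamma):
    """Linear elasticity: shear stress only, no normal stress under simple shear."""
    return {"shear_stress": mu_pa * gamma, "normal_stress_difference": 0.0}

def neo_hookean_shear_response(mu_pa, gamma):
    """Incompressible neo-Hookean solid: same shear stress, plus N1 = mu * gamma^2."""
    return {"shear_stress": mu_pa * gamma,
            "normal_stress_difference": mu_pa * gamma ** 2}

mu = 10.0e3          # shear modulus in Pa, illustrative of soft tissue
gamma = 0.2          # amount of shear
print(linear_shear_response(mu, gamma))       # no normal stress
print(neo_hookean_shear_response(mu, gamma))  # nonzero N1: the Poynting effect
```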