
    Robotic-Assisted Systems for Spinal Surgery

    Robotic-assisted spinal surgery is in its infancy. It aims to improve the accuracy of screw placement, lower the risk of surgical complications, and reduce radiation exposure to the patient and the surgical team. The present chapter provides an overview of the evolution of robotic-assisted spinal surgery and highlights the different commercially available spine robotic systems in present use. The review concludes with future applications of robotics in spinal surgery.

    Miniature Robotic Guidance for Spine Surgery


    Recent trends, technical concepts and components of computer-assisted orthopedic surgery systems: A comprehensive review

    Computer-assisted orthopedic surgery (CAOS) systems have become one of the most important and challenging types of system in clinical orthopedics, as they enable precise treatment of musculoskeletal diseases by employing modern clinical navigation systems and surgical tools. This paper provides a comprehensive review of recent trends and possibilities of CAOS systems. There are three types of surgical planning systems: systems based on volumetric images (computed tomography (CT), magnetic resonance imaging (MRI), or ultrasound images), systems that utilize either 2D or 3D fluoroscopic images, and systems that utilize kinetic information about the joints together with morphological information about the target bones. The review focuses on three fundamental aspects of CAOS systems: their essential components, the types of CAOS systems, and the mechanical tools used in them. We also outline the possibilities of using ultrasound computer-assisted orthopedic surgery (UCAOS) systems as an alternative to conventionally used CAOS systems.
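
    As a purely illustrative aside (not drawn from the review itself), the three-way taxonomy of planning systems described above can be expressed as a small data model; the class and field names below are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum, auto


class PlanningSource(Enum):
    """Image or data source a CAOS planning system is built on (per the review's taxonomy)."""
    VOLUMETRIC = auto()            # CT, MRI, or ultrasound volumes
    FLUOROSCOPIC = auto()          # 2D or 3D fluoroscopic images
    KINETIC_MORPHOLOGIC = auto()   # joint kinematics plus bone morphology


@dataclass
class CAOSPlanningSystem:
    """Minimal, hypothetical record describing one planning system."""
    name: str
    source: PlanningSource
    intraoperative_imaging: bool   # whether new images are acquired during surgery


# Example usage: classify a CT-based planner.
planner = CAOSPlanningSystem(
    name="generic CT-based planner",
    source=PlanningSource.VOLUMETRIC,
    intraoperative_imaging=False,
)
print(planner)
```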

    Robotics in neurosurgery: A literature review

    Robotic surgery has been the forte of minimally invasive stereotactic procedures for some decades now. Ongoing advancements and evolutionary developments require substantial evidence to build consensus about its efficacy in the field of neurosurgery. The main obstacles to obtaining successful results in neurosurgery are the fine neural structures and other anatomical limitations. Currently, human rationalisation and robotic precision work in symbiosis to provide improved results. We reviewed the current data about recent interventions. Robots are capable of providing virtual data, superior spatial resolution and geometric accuracy, superior dexterity, faster manoeuvring, and non-fatigability with steady motion. Robotic surgery also allows the simulation of virtual procedures, which proves to be of great succour to young apprentice surgeons practising their surgical skills in a safe environment. It also allows senior professionals to rehearse difficult cases before engaging in considerably risky procedures.

    Augmented Reality: Mapping Methods and Tools for Enhancing the Human Role in Healthcare HMI

    Background: Augmented Reality (AR) represents an innovative technology to improve data visualization and strengthen human perception. Among Human–Machine Interaction (HMI) applications, medicine can benefit most from the adoption of these digital technologies. In this perspective, the literature on orthopedic surgery techniques based on AR was evaluated, focusing on identifying the limitations and challenges of AR-based healthcare applications, to support research and the development of further studies. Methods: Studies published from January 2018 to December 2021 were analyzed after a comprehensive search of the PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library databases. To improve the review reporting, the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were followed. Results: The authors selected sixty-two articles meeting the inclusion criteria, which were categorized according to the purpose of the study (intraoperative, training, rehabilitation) and according to the surgical procedure addressed. Conclusions: AR has the potential to improve orthopedic training and practice by providing an increasingly human-centered clinical approach. This review also identifies directions for further research addressing hardware limitations, the lack of accurate registration and tracking systems, and the absence of security protocols.
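
    To make the screening and categorization step concrete, the sketch below is a minimal, hypothetical illustration (not the authors' pipeline; the record fields and example entries are invented) of filtering records to the 2018–2021 window and grouping included studies by purpose.

```python
from collections import Counter
from datetime import date

# Hypothetical records as they might come back from a database export.
records = [
    {"title": "AR-guided pedicle screw placement", "published": date(2020, 5, 1), "purpose": "intraoperative"},
    {"title": "AR simulator for arthroscopy training", "published": date(2019, 3, 12), "purpose": "training"},
    {"title": "Early AR prototype", "published": date(2016, 7, 2), "purpose": "intraoperative"},
]

WINDOW_START, WINDOW_END = date(2018, 1, 1), date(2021, 12, 31)

# Inclusion by publication window (one of several screening criteria).
included = [r for r in records if WINDOW_START <= r["published"] <= WINDOW_END]

# Categorization by purpose, mirroring the intraoperative/training/rehabilitation split.
by_purpose = Counter(r["purpose"] for r in included)
print(by_purpose)  # Counter({'intraoperative': 1, 'training': 1})
```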

    Augmented navigation

    Spinal fixation procedures carry the inherent risk of damaging vulnerable anatomical structures such as the spinal cord, nerve roots, and blood vessels. To prevent complications, several technological aids have been introduced. Surgical navigation is the most widely used; it guides the surgeon by providing the position of the surgical instruments and implants in relation to the patient anatomy based on radiographic images. Navigation can be extended by the addition of a robotic arm that replaces the surgeon's hand to increase accuracy. Another line of surgical aids is tissue sensing equipment that recognizes different tissue types and provides a warning system built into surgical instruments. All these technologies are under continuous development and the optimal solution is yet to be found. The aim of this thesis was to study the use of Augmented Reality (AR), Virtual Reality (VR), Artificial Intelligence (AI), and tissue sensing technology in spinal navigation to improve precision and prevent surgical errors. The aim of Paper I was to develop and validate an algorithm for automating the intraoperative planning of pedicle screws. An AI algorithm for automatic segmentation of the spine and screw path suggestion was developed and evaluated. In a clinical study of advanced deformity cases, the algorithm provided correct suggestions for 86% of all pedicles, or 95% when cases with extremely altered anatomy were excluded. Paper II evaluated the accuracy of pedicle screw placement using a novel augmented reality surgical navigation (ARSN) system harboring the above-developed algorithm. Twenty consecutively enrolled patients, eligible for deformity correction surgery in the thoracolumbar region, were operated on using the ARSN system. In this cohort, we found a pedicle screw placement accuracy of 94%, as measured according to the Gertzbein grading scale. The primary goal of Paper III was to validate an extension of the ARSN system for placing pedicle screws using instrument tracking and VR. In a porcine cadaver model, it was demonstrated that VR instrument tracking could successfully be integrated with the ARSN system, resulting in pedicle devices placed within 1.7 ± 1.0 mm of the planned path. Paper IV examined the feasibility of a robot-guided system for semi-automated, minimally invasive pedicle screw placement in a cadaveric model. Using the robotic arm, pedicle devices were placed within 0.94 ± 0.59 mm of the planned path. The use of a semi-automated surgical robot was feasible, providing higher technical accuracy than non-robotic solutions. Paper V investigated the use of a tissue sensing technology, diffuse reflectance spectroscopy (DRS), for detecting the cortical bone boundary in vertebrae during pedicle screw insertions. The technology could accurately differentiate between cancellous and cortical bone and warn the surgeon before a cortical breach. Using machine learning models, the technology demonstrated a sensitivity of 98% [range: 94-100%] and a specificity of 98% [range: 91-100%]. In conclusion, several technological aids can be used to improve accuracy during spinal fixation procedures. In this thesis, the advantages of adding AR, VR, AI, and tissue sensing technology to conventional navigation solutions were studied.
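
    For readers unfamiliar with the performance metrics quoted for Paper V, the short sketch below shows how sensitivity and specificity are computed from a confusion matrix; the counts are invented for illustration and are not the thesis data.

```python
# Sensitivity and specificity for a binary "cortical breach imminent" classifier,
# as evaluated in diffuse reflectance spectroscopy experiments (illustrative counts only).

def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity


# Hypothetical counts: 49 of 50 breaches detected, 47 of 48 non-breaches correctly passed.
sens, spec = sensitivity_specificity(tp=49, fn=1, tn=47, fp=1)
print(f"sensitivity={sens:.2%}, specificity={spec:.2%}")
```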

    Exploiting Temporal Image Information in Minimally Invasive Surgery

    Minimally invasive procedures rely on medical imaging instead of the surgeon's direct vision. While preoperative images can be used for surgical planning and navigation, once the surgeon arrives at the target site, real-time intraoperative imaging is needed. However, acquiring and interpreting these images can be challenging, and much of the rich temporal information present in them is not visible. The goal of this thesis is to improve image guidance for minimally invasive surgery in two main areas: first, by showing how high-quality ultrasound video can be obtained by integrating an ultrasound transducer directly into delivery devices for beating-heart valve surgery; and second, by extracting hidden temporal information through video processing methods to help the surgeon localize important anatomical structures. Prototypes of delivery tools with integrated ultrasound imaging were developed for both transcatheter aortic valve implantation and mitral valve repair. These tools provided an on-site view that shows the tool-tissue interactions during valve repair. Additionally, augmented reality environments were used to add anatomical context, aiding navigation and the interpretation of the on-site video. Other procedures can be improved by extracting hidden temporal information from the intraoperative video. In ultrasound-guided epidural injections, dural pulsation provides a cue for finding a clear trajectory to the epidural space. By processing the video using extended Kalman filtering, subtle pulsations were automatically detected and visualized in real time. A statistical framework for analyzing periodicity was developed based on dynamic linear modelling. In addition to detecting dural pulsation in lumbar spine ultrasound, this approach was used to image tissue perfusion in natural video and to generate ventilation maps from free-breathing magnetic resonance imaging. A second statistical method, based on spectral analysis of pixel intensity values, allowed blood flow to be detected directly from high-frequency B-mode ultrasound video. Finally, pulsatile cues in endoscopic video were enhanced through Eulerian video magnification to help localize critical vasculature. This approach shows particular promise in identifying the basilar artery in endoscopic third ventriculostomy and the prostatic artery in nerve-sparing prostatectomy. A real-time implementation was developed that processed full-resolution stereoscopic video on the da Vinci Surgical System.
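
    As a rough illustration of the general idea of extracting hidden temporal information, and not the thesis's Kalman-filter or dynamic-linear-model implementation, the sketch below finds the dominant periodic component (such as a cardiac-rate pulsation) in a single pixel's intensity trace using simple spectral analysis; the function name and band limits are assumptions.

```python
import numpy as np


def dominant_frequency(intensity: np.ndarray, fps: float, f_lo: float = 0.8, f_hi: float = 2.5) -> float:
    """Return the strongest frequency (Hz) in a pixel-intensity trace within a band
    (default 0.8-2.5 Hz, roughly 48-150 beats/min), a crude stand-in for pulsation detection."""
    x = intensity - intensity.mean()              # remove DC component
    spectrum = np.abs(np.fft.rfft(x))             # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)  # frequency axis in Hz
    band = (freqs >= f_lo) & (freqs <= f_hi)      # restrict to the physiological band
    return float(freqs[band][np.argmax(spectrum[band])])


# Synthetic example: a 1.2 Hz (72 bpm) pulsation buried in noise, sampled at 30 frames/s.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 30)
trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 1, t.size)
print(dominant_frequency(trace, fps=30.0))  # approximately 1.2
```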

    A Multi-surgeon Robotic-guided Thoracolumbar Fusion Experience: Accuracy, Radiation, Complications, Readmissions, and Revisions of 3,874 Screws across Three Robotic Generations

    Objective: Robotic guidance provides indirect visualization of key anatomic landmarks to facilitate minimally invasive surgery (MIS) and is emerging as a reliable and accurate technique for posterior spine instrumentation. We sought to describe eight years of experience with robotic guidance at a high-volume, multi-surgeon center. We hypothesized that robotic guidance would lead to (1) low rates of complications, readmissions, and revision surgery, (2) reduced fluoroscopic radiation exposure, and (3) accurate thoracolumbar instrumentation. Methods: A retrospective review was conducted of complications, revision surgery, and readmission rates in patients undergoing thoracolumbar fusion surgery utilizing three robotic generations. A secondary analysis compared the three robotic generations for complications, revision surgery, accuracy, and readmission rates, along with intraoperative fluoroscopic duration. Results: A total of 628 patients (3,874 robotic-guided screws) aged 12–81 years (43.9% male) were included in the study. At one year, the cumulative complication incidence was 15.5%, with a 10.3% incidence of surgical complications (3.7% wound, 1.2% robot-related, and 5.4% non-robot-related complications). At one year, the revision surgery incidence was 9.4%. There was no statistical difference in complications, readmissions, or revision surgery after initial admission among the three robotic generations. The average intraoperative fluoroscopic duration was 53.8 seconds (11.9 seconds per screw and 17.6 seconds per instrumented level). Conclusion: Robotic guidance in thoracolumbar instrumented fusions was associated with low complication, revision surgery, and readmission rates. Our results suggest robotic guidance can provide accurate guidance with minimal adverse events in thoracolumbar instrumentation.
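
    To clarify how per-screw and per-level fluoroscopy figures can be derived from case-level data, here is a minimal sketch with invented cases; it assumes the per-screw and per-level values are means of per-case ratios, which is an assumption rather than a detail stated in the abstract.

```python
# Hypothetical per-case records: fluoroscopy duration (s), screws placed, instrumented levels.
cases = [
    {"fluoro_s": 48.0, "screws": 4, "levels": 2},
    {"fluoro_s": 66.0, "screws": 6, "levels": 4},
    {"fluoro_s": 40.0, "screws": 4, "levels": 3},
]

mean_fluoro = sum(c["fluoro_s"] for c in cases) / len(cases)
# Assumption: per-screw and per-level figures are means of the per-case ratios.
per_screw = sum(c["fluoro_s"] / c["screws"] for c in cases) / len(cases)
per_level = sum(c["fluoro_s"] / c["levels"] for c in cases) / len(cases)

print(f"mean fluoroscopy {mean_fluoro:.1f} s; {per_screw:.1f} s/screw; {per_level:.1f} s/level")
```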

    Minimally Invasive Treatment of the Thoracic Spine Disease: Completely Percutaneous and Hybrid Approaches


    Intra-operative fiducial-based CT/fluoroscope image registration framework for image-guided robot-assisted joint fracture surgery

    Purpose: Joint fractures must be accurately reduced while minimising soft tissue damage to avoid negative surgical outcomes. To this end, we have developed the RAFS surgical system, which allows the percutaneous reduction of intra-articular fractures and provides intra-operative real-time 3D image guidance to the surgeon. Earlier experiments showed the effectiveness of the RAFS system on phantoms, but also revealed key issues that precluded its use in a clinical application. This work proposes a redesign of the RAFS navigation system that overcomes the earlier version's issues, aiming to move the RAFS system into the surgical environment. Methods: The navigation system is improved through an image registration framework allowing the intra-operative registration between pre-operative CT images and intra-operative fluoroscopic images of a fractured bone using a custom-made fiducial marker. The objective of the registration is to estimate the relative pose between a bone fragment and an orthopaedic manipulation pin inserted into it intra-operatively. The actual pose of the bone fragment can then be updated in real time using an optical tracker, enabling image guidance. Results: Experiments on phantoms and cadavers demonstrated the accuracy and reliability of the registration framework, showing a reduction accuracy (sTRE) of about 0.88 ± 0.2 mm (phantom) and 1.15 ± 0.8 mm (cadavers). Four distal femur fractures were successfully reduced in cadaveric specimens using the improved navigation system and the RAFS system following the new clinical workflow (reduction error 1.2 ± 0.3 mm, 2 ± 1°). Conclusion: The experiments showed the feasibility of the image registration framework. It was successfully integrated into the navigation system, allowing the use of the RAFS system in a realistic surgical application.
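
    The fiducial-based registration step can be illustrated with the standard point-based rigid registration widely used in image-guided surgery; the sketch below is not the RAFS implementation, uses synthetic fiducials, and reports an RMS residual that is only loosely analogous to the sTRE figures quoted above.

```python
import numpy as np


def rigid_register(src: np.ndarray, dst: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Least-squares rigid transform (R, t) mapping src fiducials onto dst fiducials
    (Kabsch/SVD method). Both arrays are N x 3, with rows in corresponding order."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                           # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # avoid reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t


def registration_error(src: np.ndarray, dst: np.ndarray, R: np.ndarray, t: np.ndarray) -> float:
    """Root-mean-square distance between transformed src points and dst points (mm)."""
    residuals = (R @ src.T).T + t - dst
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))


# Synthetic check: recover a known pose from noiseless fiducials.
rng = np.random.default_rng(1)
ct_fiducials = rng.uniform(-30, 30, (4, 3))            # mm, in CT space
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R_true = Q if np.linalg.det(Q) > 0 else -Q             # proper rotation only
fluoro_fiducials = (R_true @ ct_fiducials.T).T + np.array([5.0, -2.0, 10.0])
R, t = rigid_register(ct_fiducials, fluoro_fiducials)
print(registration_error(ct_fiducials, fluoro_fiducials, R, t))  # close to 0
```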