75 research outputs found

    Augmented reality navigation for spinal pedicle screw instrumentation using intraoperative 3D imaging

    BACKGROUND CONTEXT Due to recent developments in augmented reality with head-mounted devices, holograms of a surgical plan can be displayed directly in the surgeon's field of view. To the best of our knowledge, three-dimensional (3D) intraoperative fluoroscopy has not been explored for use with holographic navigation by head-mounted devices in spine surgery. PURPOSE To evaluate the surgical accuracy of holographic pedicle screw navigation by head-mounted device using 3D intraoperative fluoroscopy. STUDY DESIGN In this experimental cadaver study, the accuracy of surgical navigation using a head-mounted device was compared with navigation with a state-of-the-art pose-tracking system. METHODS Three lumbar cadaver spines were embedded in nontransparent agar gel, leaving only commonly visible anatomy in sight. Intraoperative registration of preoperative planning was achieved by 3D fluoroscopy and fiducial markers attached to lumbar vertebrae. Trackable custom-made drill sleeve guides enabled real-time navigation. In total, 30 K-wires were navigated into lumbar pedicles: 20 using AR navigation and 10 using the state-of-the-art pose-tracking system. 3D models obtained from postexperimental CT scans were used to measure surgical accuracy. RESULTS No significant difference in accuracy was measured between AR-navigated drillings and the pose-tracking gold standard, with mean translational errors between entry points (3D vector distance; p=.85) of 3.4±1.6 mm compared with 3.2±2.0 mm, and mean angular errors between trajectories (3D angle; p=.30) of 4.3°±2.3° compared with 3.5°±1.4°. CONCLUSIONS Holographic navigation using a head-mounted device achieves accuracy comparable to the gold standard of high-end pose-tracking systems. CLINICAL SIGNIFICANCE These promising results could lead to a new form of surgical navigation with minimal infrastructural requirements, but they now have to be confirmed in clinical studies. DISCLOSURES MF is the founder and shareholder of Incremed AG, a Balgrist University Hospital start-up focusing on the development of innovative techniques for surgical executions. The other authors declare no conflict of interest concerning the contents of this study. No external funding was received for this study.
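
The accuracy metrics reported above (3D vector distance between entry points, 3D angle between trajectories) can be reproduced from planned and postoperative screw axes. A minimal NumPy sketch, with hypothetical function names not taken from the study:

```python
import numpy as np

def entry_point_error(planned_entry, actual_entry):
    """Translational error: 3D vector distance between entry points (mm)."""
    return float(np.linalg.norm(np.asarray(actual_entry, float) - np.asarray(planned_entry, float)))

def trajectory_angle_error(planned_dir, actual_dir):
    """Angular error: 3D angle between trajectory directions (degrees)."""
    u = np.asarray(planned_dir, float); u /= np.linalg.norm(u)
    v = np.asarray(actual_dir, float); v /= np.linalg.norm(v)
    # clip guards against arccos domain errors from floating-point round-off
    return float(np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))))
```

For example, a planned entry at the origin and an actual entry at (3, 4, 0) mm gives a 5 mm translational error.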

    Real-time integration between Microsoft HoloLens 2 and 3D Slicer with demonstration in pedicle screw placement planning

    We established a direct communication channel between Microsoft HoloLens 2 and 3D Slicer to exchange transform and image messages between the platforms in real time. This allows us to seamlessly display a CT reslice of a patient in the AR world. Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. Research supported by projects PI122/00601 and AC20/00102 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III, Asociación Española Contra el Cáncer and European Regional Development Fund “Una manera de hacer Europa”), project PerPlanRT (ERA PerMed), TED2021-129392B-I00 and TED2021-132200B-I00 (MCIN/AEI/10.13039/501100011033 and European Union “NextGenerationEU”/PRTR) and EU Horizon 2020 research and innovation programme CONEX-Plus UC3M (grant agreement 801538). APC funded by Universidad Carlos III de Madrid (Read & Publish Agreement CRUE-CSIC 2023).
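
The abstract describes streaming transform messages so that a CT reslice follows a tracked pose in the AR view. One way to turn a streamed 4x4 transform into a reslice plane is sketched below; this is an illustrative helper under the assumption that the reslice plane is the tracked tool's XY plane, not the authors' actual implementation:

```python
import numpy as np

def reslice_plane_from_transform(T):
    """Given a 4x4 tool-to-image transform (e.g. received as a transform
    message), return the reslice plane's origin and unit normal in image
    coordinates, taking the plane to be the tool's local XY plane."""
    T = np.asarray(T, dtype=float)
    origin = T[:3, 3]        # translation column: a point on the plane
    normal = T[:3, 2]        # third rotation column: the tool's Z axis
    return origin, normal / np.linalg.norm(normal)
```

With the identity transform, the reslice plane is simply the image's axial plane through the origin.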

    Augmented navigation

    Spinal fixation procedures have the inherent risk of causing damage to vulnerable anatomical structures such as the spinal cord, nerve roots, and blood vessels. To prevent complications, several technological aids have been introduced. Surgical navigation is the most widely used; it guides the surgeon by providing the position of the surgical instruments and implants in relation to the patient anatomy based on radiographic images. Navigation can be extended by the addition of a robotic arm to replace the surgeon’s hand to increase accuracy. Another line of surgical aids is tissue-sensing equipment, which recognizes different tissue types and provides a warning system built into surgical instruments. All these technologies are under continuous development and the optimal solution is yet to be found. The aim of this thesis was to study the use of Augmented Reality (AR), Virtual Reality (VR), Artificial Intelligence (AI), and tissue-sensing technology in spinal navigation to improve precision and prevent surgical errors. The aim of Paper I was to develop and validate an algorithm for automatizing the intraoperative planning of pedicle screws. An AI algorithm for automatic segmentation of the spine and screw path suggestion was developed and evaluated. In a clinical study of advanced deformity cases, the algorithm could provide correct suggestions for 86% of all pedicles, or 95% when cases with extremely altered anatomy were excluded. Paper II evaluated the accuracy of pedicle screw placement using a novel augmented reality surgical navigation (ARSN) system, harboring the above-developed algorithm. Twenty consecutively enrolled patients, eligible for deformity correction surgery in the thoracolumbar region, were operated on using the ARSN system. In this cohort, we found a pedicle screw placement accuracy of 94%, as measured according to the Gertzbein grading scale.
The primary goal of Paper III was to validate an extension of the ARSN system for placing pedicle screws using instrument tracking and VR. In a porcine cadaver model, it was demonstrated that VR instrument tracking could successfully be integrated with the ARSN system, resulting in pedicle devices placed within 1.7 ± 1.0 mm of the planned path. Paper IV examined the feasibility of a robot-guided system for semi-automated, minimally invasive, pedicle screw placement in a cadaveric model. Using the robotic arm, pedicle devices were placed within 0.94 ± 0.59 mm of the planned path. The use of a semi-automated surgical robot was feasible, providing a higher technical accuracy compared to non-robotic solutions. Paper V investigated the use of a tissue-sensing technology, diffuse reflectance spectroscopy (DRS), for detecting the cortical bone boundary in vertebrae during pedicle screw insertions. The technology could accurately differentiate between cancellous and cortical bone and warn the surgeon before a cortical breach. Using machine learning models, the technology demonstrated a sensitivity of 98% [range: 94-100%] and a specificity of 98% [range: 91-100%]. In conclusion, several technological aids can be used to improve accuracy during spinal fixation procedures. In this thesis, the advantages of adding AR, VR, AI and tissue-sensing technology to conventional navigation solutions were studied.
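
The sensitivity and specificity figures quoted for the DRS breach-warning models follow the standard confusion-matrix definitions. A minimal sketch, with a label coding assumed for illustration (1 = cortical bone / breach imminent, 0 = cancellous bone):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)
```

A sensitivity of 98% then means that 98% of true cortical-bone measurements were flagged as such; a specificity of 98% means 98% of cancellous-bone measurements were left unflagged.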

    Augmented Reality: Mapping Methods and Tools for Enhancing the Human Role in Healthcare HMI

    Background: Augmented Reality (AR) represents an innovative technology to improve data visualization and strengthen human perception. Among Human–Machine Interaction (HMI) domains, medicine can benefit most from the adoption of these digital technologies. In this perspective, the literature on orthopedic surgery techniques based on AR was evaluated, focusing on identifying the limitations and challenges of AR-based healthcare applications, to support the research and the development of further studies. Methods: Studies published from January 2018 to December 2021 were analyzed after a comprehensive search on the PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library databases. In order to improve the review reporting, the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were used. Results: The authors selected sixty-two articles meeting the inclusion criteria, which were categorized according to the purpose of the study (intraoperative, training, rehabilitation) and according to the surgical procedure used. Conclusions: AR has the potential to improve orthopedic training and practice by providing an increasingly human-centered clinical approach. Further research can be guided by this review to cover problems related to hardware limitations, lack of accurate registration and tracking systems, and absence of security protocols.

    Feasibility and accuracy of a robotic guidance system for navigated spine surgery in a hybrid operating room: a cadaver study.

    The combination of navigation and robotics in spine surgery has the potential to accurately identify and maintain bone entry position and planned trajectory. The goal of this study was to examine the feasibility, accuracy and efficacy of a new robot-guided system for semi-automated, minimally invasive, pedicle screw placement. A custom robotic arm was integrated into a hybrid operating room (OR) equipped with an augmented reality surgical navigation system (ARSN). The robot was mounted on the OR table and used to assist in placing Jamshidi needles in 113 pedicles in four cadavers. The ARSN system was used for planning screw paths and directing the robot. The robot arm autonomously aligned with the planned screw trajectory, and the surgeon inserted the Jamshidi needle into the pedicle. Accuracy measurements were performed on verification cone beam computed tomographies with the planned paths superimposed. To provide a clinical grading according to the Gertzbein scale, pedicle screw diameters were simulated on the placed Jamshidi needles. A technical accuracy at the bone entry point of 0.48 ± 0.44 mm and 0.68 ± 0.58 mm was achieved in the axial and sagittal views, respectively. The corresponding angular errors were 0.94 ± 0.83° and 0.87 ± 0.82°. The accuracy was statistically superior (p < 0.001) to ARSN without robotic assistance. Simulated pedicle screw grading resulted in a clinical accuracy of 100%. This study demonstrates that the use of a semi-automated surgical robot for pedicle screw placement provides an accuracy well above what is clinically acceptable.
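
Reporting separate entry-point errors for the axial and sagittal views, as above, amounts to projecting the 3D deviation onto the corresponding image planes. A hypothetical decomposition (the exact view conventions are an assumption, not taken from the study):

```python
import numpy as np

def view_errors(error_vec):
    """Split a 3D entry-point deviation (x = left-right, y = antero-posterior,
    z = cranio-caudal, in mm) into the in-plane magnitudes visible in the
    axial (xy) and sagittal (yz) views."""
    e = np.asarray(error_vec, dtype=float)
    axial = float(np.hypot(e[0], e[1]))
    sagittal = float(np.hypot(e[1], e[2]))
    return axial, sagittal
```

A purely cranio-caudal deviation, for instance, is invisible in the axial view but fully visible in the sagittal view.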

    Advanced cranial navigation

    Neurosurgery is performed with extremely low margins of error. Surgical inaccuracy may have disastrous consequences. The overall aim of this thesis was to improve accuracy in cranial neurosurgical procedures by the application of new technical aids. Two technical methods were evaluated: augmented reality (AR) for surgical navigation (Papers I-II) and the optical technique of diffuse reflectance spectroscopy (DRS) for real-time tissue identification (Papers III-V). Minimally invasive skull-base endoscopy has several potential benefits compared to traditional craniotomy, but approaching the skull base through this route implies that at-risk organs and surgical targets are covered by bone and out of the surgeon’s direct line of sight. In Paper I, a new application for AR-navigated endoscopic skull-base surgery, based on an augmented-reality surgical navigation (ARSN) system, was developed. The accuracy of the system, defined by mean target registration error (TRE), was evaluated and found to be 0.55±0.24 mm, the lowest reported error in the literature. As a first step toward the development of a cranial application for AR navigation, in Paper II this ARSN system was used to enable insertions of biopsy needles and external ventricular drains (EVDs). The technical accuracy (i.e., deviation from the target or intended path) and efficacy (i.e., insertion time) were assessed on a 3D-printed, realistic, anthropomorphic skull and brain phantom: thirty cranial biopsies and 10 EVD insertions were performed. Accuracy for biopsy was 0.8±0.43 mm with a median insertion time of 149 (87-233) seconds; for EVDs, accuracy was 2.9±0.8 mm at the tip with an angular deviation of 0.7±0.5° and a median insertion time of 188 (135-400) seconds. Glial tumors grow diffusely in the brain, and patient survival is correlated with the extent of tumor removal. Tumor borders are often invisible.
Resection beyond borders as defined by conventional methods may further improve a patient’s prognosis. In Paper III, DRS was evaluated for discrimination between glioma and normal brain tissue ex vivo. DRS spectra and histology were acquired from 22 tumor samples and 9 brain tissue samples retrieved from 30 patients. Sensitivity and specificity for the detection of low-grade gliomas were 82.0% and 82.7%, respectively, with an AUC of 0.91. Acute ischemic stroke caused by large vessel occlusion is treated with endovascular thrombectomy, but treatment failure can occur when clot composition and thrombectomy technique are mismatched. Intra-procedural knowledge of clot composition could guide the choice of treatment modality. In Paper IV, DRS was evaluated in vivo for intravascular clot characterization. Three types of clot analogs, red blood cell (RBC)-rich, fibrin-rich and mixed clots, were injected into the external carotids of a domestic pig. An intravascular DRS probe was used for in-situ measurements of clots, blood, and vessel walls, and the spectral data were analyzed. DRS could differentiate clot types, vessel walls, and blood in vivo (p<0.001). The sensitivity and specificity for detection were 73.8% and 98.8% for RBC clots, 100% and 100% for mixed clots, and 80.6% and 97.8% for fibrin clots, respectively. Paper V evaluated DRS for characterization of human clot composition ex vivo: 45 clot units were retrieved from 29 stroke patients and examined with DRS and histopathological evaluation. DRS parameters correlated with the clot RBC fraction (R=0.81, p<0.001) and could be used for the classification of clot type, with sensitivity and specificity rates for the detection of RBC-rich clots of 0.722 and 0.846, respectively. Applied in an intravascular probe, DRS may provide intra-procedural information on clot composition to improve endovascular thrombectomy efficiency.
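
The correlation between DRS parameters and clot RBC fraction quoted above is a standard Pearson correlation coefficient. A self-contained sketch of its computation (illustrative only, not the thesis code):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences:
    covariance of x and y divided by the product of their standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

An R of 0.81 indicates a strong, but not perfect, linear relationship between the spectroscopic parameter and the histologically measured RBC fraction.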

    Augmented reality for computer assisted orthopaedic surgery

    In recent years, computer assistance and robotics have established their presence in operating theatres and found success in orthopaedic procedures. Benefits of computer-assisted orthopaedic surgery (CAOS) have been thoroughly explored in research, finding improvements in clinical outcomes through increased control and precision over surgical actions. However, human-computer interaction in CAOS remains an evolving field, through emerging display technologies including augmented reality (AR) – a fused view of the real environment with virtual, computer-generated holograms. Interactions between clinicians and patient-specific data generated during CAOS are limited to basic 2D interactions on touchscreen monitors, potentially creating clutter and cognitive challenges in surgery. Work described in this thesis sought to explore the benefits of AR in CAOS through: an integration between commercially available AR and CAOS systems, creating a novel AR-centric surgical workflow to support various tasks of computer-assisted knee arthroplasty; and three pre-clinical studies exploring the impact of the new AR workflow on both existing and newly proposed quantitative and qualitative performance metrics. Early research focused on cloning the (2D) user interface of an existing CAOS system onto a virtual AR screen and investigating any resulting impacts on usability and performance. An infrared-based registration system is also presented, describing a protocol for calibrating commercial AR headsets with optical trackers by calculating a spatial transformation between surgical and holographic coordinate frames. The main contribution of this thesis is a novel AR workflow designed to support computer-assisted patellofemoral arthroplasty. The reported workflow provided 3D in-situ holographic guidance for CAOS tasks including patient registration, pre-operative planning, and assisted cutting.
Pre-clinical experimental validation on a commercial system (NAVIO®, Smith & Nephew) for these contributions demonstrates encouraging early-stage results showing successful deployment of AR to CAOS systems, and promising indications that AR can enhance the clinician’s interactions in the future. The thesis concludes with a summary of achievements, corresponding limitations and future research opportunities.
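
Calibrating an AR headset against an optical tracker, as described above, reduces to estimating the rigid transform between the two coordinate frames from paired point measurements. A common closed-form solution is the Kabsch/SVD method; a sketch under the assumption of paired, corresponding marker positions (not the thesis' actual protocol):

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transform mapping point set P (Nx3, e.g. markers in
    the optical tracker frame) onto Q (same markers in the holographic frame).
    Returns rotation R (3x3) and translation t (3,) with Q ≈ P @ R.T + t."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

With at least three non-collinear marker pairs the transform is uniquely determined; residual distances after applying it give a direct estimate of the calibration error.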

    Augmented reality for minimally invasive spinal surgery

    BACKGROUND Augmented reality (AR) is an emerging technology that can overlay computer graphics onto the real world and enhance visual feedback from information systems. Within the past several decades, innovations related to AR have been integrated into our daily lives; however, its application in medicine, specifically in minimally invasive spine surgery (MISS), may be most important to understand. AR navigation provides auditory and haptic feedback, which can further enhance surgeons’ capabilities and improve safety. PURPOSE The purpose of this article is to address previous and current applications of AR, AR in MISS, limitations of today's technology, and future areas of innovation. METHODS A literature review of AR technology applications across previous and current generations was conducted. RESULTS AR systems have been implemented for spinal surgery in recent years, and AR may be an alternative to current approaches such as traditional navigation, robotically assisted navigation, fluoroscopic guidance, and the freehand technique. As AR is capable of projecting patient anatomy directly onto the surgical field, it can eliminate concerns about surgeon attention shifting from the surgical field to remote navigation screens, line-of-sight interruption, and cumulative radiation exposure as the demand for MISS increases. CONCLUSION AR is a novel technology that can improve spinal surgery, and addressing its current limitations will likely have a great impact on future technology.

    Optimization of computer-assisted intraoperative guidance for complex oncological procedures

    International Mention in the doctoral degree. The role of technology inside the operating room is constantly increasing, allowing surgical procedures previously considered impossible or too risky due to their complexity or limited access. These reliable tools have improved surgical efficiency and safety. Cancer treatment is one of the surgical specialties that has benefited most from these techniques, due to its high incidence and the accuracy required for tumor resections with conservative approaches and clear margins. However, in many cases, introducing these technologies into surgical scenarios is expensive and entails complex setups that are obtrusive, invasive, and increase the operative time. In this thesis, we proposed convenient, accessible, reliable, and non-invasive solutions for two highly complex regions for tumor resection surgeries: the pelvis and the head and neck. We explored how the introduction of 3D printing, surgical navigation, and augmented reality in these scenarios provided high intraoperative precision. First, we presented a less invasive setup for osteotomy guidance in pelvic tumor resections based on small patient-specific instruments (PSIs) fabricated with a desktop 3D printer at a low cost. We evaluated their accuracy in a cadaveric study, following a realistic workflow, and obtained similar results to previous studies with more invasive setups. We also identified the ilium as the region most prone to errors. Then, we proposed surgical navigation using these small PSIs for image-to-patient registration. Artificial landmarks included in the PSIs substitute the anatomical landmarks and the bone surface commonly used for this step, which require additional bone exposure and are, therefore, more invasive. We also presented an alternative and more convenient installation of the dynamic reference frame used to track the patient's movements in surgical navigation.
The reference frame is inserted into a socket included in the PSIs and can be attached and detached without losing precision, simplifying the installation. We validated the setup in a cadaveric study, evaluating the accuracy and finding the optimal PSI configuration in the three most common scenarios for pelvic tumor resection. The results demonstrated high accuracy, where the main source of error was again incorrect placement of PSIs in regular and homogeneous regions such as the ilium. The main limitation of PSIs is the guidance error resulting from incorrect placement. To overcome this issue, we proposed augmented reality as a tool to guide PSI installation on the patient's bone. We developed an application for smartphones and HoloLens 2 that displays the correct position intraoperatively. We measured the placement errors in a conventional phantom and in a realistic phantom that includes a silicone layer to simulate tissue. The results demonstrated a significant reduction of errors with augmented reality compared to freehand placement, ensuring an installation of the PSI close to the target area. Finally, we proposed three setups for surgical navigation in palate tumor resections, using optical trackers and augmented reality. The tracking tools for the patient and surgical instruments were fabricated with low-cost desktop 3D printers and designed to provide less invasive setups compared to previous solutions. All setups presented similar results with high accuracy when tested in a 3D-printed patient-specific phantom. They were then validated in a real surgical case, and one of the solutions was applied for intraoperative guidance. Postoperative results demonstrated high navigation accuracy, obtaining optimal surgical outcomes. The proposed solution enabled a conservative surgical approach with a less invasive navigation setup. To conclude, in this thesis we have proposed new setups for intraoperative navigation in two complex surgical scenarios for tumor resection.
We analyzed their navigation precision, defining the optimal configurations to ensure accuracy. With this, we have demonstrated that computer-assisted surgery techniques can be integrated into the surgical workflow with accessible and non-invasive setups. These results are a step further towards optimizing the procedures and continuing to improve surgical outcomes in complex surgical scenarios. Doctoral Programme in Biomedical Science and Technology, Universidad Carlos III de Madrid. President: Raúl San José Estépar. Secretary: Alba González Álvarez. Committee member: Simon Droui