Augmented navigation
Spinal fixation procedures carry an inherent risk of damaging vulnerable anatomical structures such as the spinal cord, nerve roots, and blood vessels. To prevent complications, several technological aids have been introduced. Surgical navigation is the most widely used; it guides the surgeon by providing the position of surgical instruments and implants in relation to the patient's anatomy based on radiographic images. Navigation can be extended with a robotic arm that replaces the surgeon's hand to increase accuracy. Another line of surgical aids is tissue sensing equipment, which recognizes different tissue types and provides a warning system built into surgical instruments. All these technologies are under continuous development, and the optimal solution is yet to be found. The aim of this thesis was to study the use of Augmented Reality (AR), Virtual Reality (VR), Artificial Intelligence (AI), and tissue sensing technology in spinal navigation to improve precision and prevent surgical errors.
The aim of Paper I was to develop and validate an algorithm for automating the intraoperative planning of pedicle screws. An AI algorithm for automatic segmentation of the spine and suggestion of screw paths was developed and evaluated. In a clinical study of advanced deformity cases, the algorithm provided correct suggestions for 86% of all pedicles, or 95% when cases with extremely altered anatomy were excluded.
Paper II evaluated the accuracy of pedicle screw placement using a novel augmented reality surgical navigation (ARSN) system incorporating the algorithm developed above. Twenty consecutively enrolled patients, eligible for deformity correction surgery in the thoracolumbar region, were operated on using the ARSN system. In this cohort, we found a pedicle screw placement accuracy of 94%, as measured according to the Gertzbein grading scale.
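The Gertzbein grading mentioned above scores each screw by the extent of its cortical breach, and "clinically accurate" placement is conventionally taken as grades A and B. A minimal sketch of the standard Gertzbein-Robbins cut-offs (the breach distances below are hypothetical examples, not data from the study):

```python
def gertzbein_grade(breach_mm: float) -> str:
    """Map a pedicle screw's cortical breach distance (mm) to a
    Gertzbein-Robbins grade. A: fully contained within the pedicle;
    B: breach < 2 mm; C: 2-4 mm; D: 4-6 mm; E: > 6 mm."""
    if breach_mm <= 0:
        return "A"
    if breach_mm < 2:
        return "B"
    if breach_mm < 4:
        return "C"
    if breach_mm < 6:
        return "D"
    return "E"

def clinical_accuracy(breaches_mm):
    """Fraction of screws graded A or B, the figure usually reported
    as 'clinically accurate' placement."""
    grades = [gertzbein_grade(b) for b in breaches_mm]
    return sum(g in ("A", "B") for g in grades) / len(grades)

# Hypothetical cohort of breach measurements in mm (illustrative only).
screws = [0.0, 0.0, 1.2, 0.0, 3.1, 0.5, 0.0, 1.9, 4.5, 0.0]
print(gertzbein_grade(3.1))       # C
print(clinical_accuracy(screws))  # 0.8
```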
The primary goal of Paper III was to validate an extension of the ARSN system for placing pedicle screws using instrument tracking and VR. In a porcine cadaver model, it was demonstrated that VR instrument tracking could successfully be integrated with the ARSN system, resulting in pedicle devices placed within 1.7 ± 1.0 mm of the planned path.
Paper IV examined the feasibility of a robot-guided system for semi-automated, minimally invasive, pedicle screw placement in a cadaveric model. Using the robotic arm, pedicle devices were placed within 0.94 ± 0.59 mm of the planned path. The use of a semi-automated surgical robot was feasible, providing a higher technical accuracy compared to non-robotic solutions.
Paper V investigated the use of a tissue sensing technology, diffuse reflectance spectroscopy (DRS), for detecting the cortical bone boundary in vertebrae during pedicle screw insertions. The technology could accurately differentiate between cancellous and cortical bone and warn the surgeon before a cortical breach. Using machine learning models, the technology demonstrated a sensitivity of 98% [range: 94-100%] and a specificity of 98% [range: 91-100%].
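As a rough illustration of how such a breach-warning classifier is scored (the labels below are invented, not the study's data), the sensitivity and specificity reported above follow directly from the confusion counts:

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP) for a binary
    breach-warning classifier (1 = cortical bone ahead, 0 = cancellous)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical ground truth and predictions for 10 DRS measurements.
truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
pred  = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
sens, spec = sensitivity_specificity(truth, pred)
print(sens, spec)  # sensitivity 0.75, specificity ~0.83
```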
In conclusion, several technological aids can be used to improve accuracy during spinal fixation procedures. In this thesis, the advantages of adding AR, VR, AI, and tissue sensing technology to conventional navigation solutions were studied.
Computer assisted navigation in spine surgery
INTRODUCTION: Computer-aided navigation is an important tool that can enhance surgical accuracy while reducing negative outcomes. However, it is a relatively new technology and has not yet been accepted as the standard of care in all settings.
OBJECTIVES: The objectives of the present study are to present the development and current state of computer-aided navigation technologies in orthopedic spine surgery, specifically navigated pedicle screw placement; to examine the clinical need for navigation and its effect on surgical accuracy and clinical outcomes; to determine whether the benefits justify the costs; and to make recommendations for future use and enhancements.
CONCLUSION: Computer-aided navigation in pedicle screw placement enhances accuracy, reduces the probability of negative outcomes, reduces the exposure of the patient and staff to radiation, reduces operative time, and provides cost savings. Future investigations may enhance this effect further with the use of innovative augmented-reality displays.
Recent trends, technical concepts and components of computer-assisted orthopedic surgery systems: A comprehensive review
Computer-assisted orthopedic surgery (CAOS) systems have become one of the most important and challenging types of systems in clinical orthopedics, as they enable precise treatment of musculoskeletal diseases, employing modern clinical navigation systems and surgical tools. This paper provides a comprehensive review of recent trends and possibilities of CAOS systems. Surgical planning systems fall into three types: systems based on volumetric images (computed tomography (CT), magnetic resonance imaging (MRI), or ultrasound); systems that utilize 2D or 3D fluoroscopic images; and systems that utilize kinetic information about the joints together with morphological information about the target bones. This review focuses on three fundamental aspects of CAOS systems: their essential components, the types of CAOS systems, and the mechanical tools used in CAOS systems. We also outline the possibilities of using ultrasound computer-assisted orthopedic surgery (UCAOS) systems as an alternative to conventionally used CAOS systems.
Sonification as a Reliable Alternative to Conventional Visual Surgical Navigation
Despite the undeniable advantages of image-guided surgical assistance systems in terms of accuracy, such systems have not yet fully met surgeons' needs or expectations regarding usability, time efficiency, and integration into the surgical workflow. On the other hand, perceptual studies have shown that presenting independent but causally correlated information via multimodal feedback involving different sensory modalities can improve task performance. This article investigates an alternative method for computer-assisted surgical navigation, introduces a novel sonification methodology for navigated pedicle screw placement, and discusses advanced solutions based on multisensory feedback. The proposed method comprises a novel sonification solution for alignment tasks in four degrees of freedom based on frequency modulation (FM) synthesis. We compared the accuracy and execution time of the proposed sonification method with those of visual navigation, which is currently considered the state of the art. In a phantom study, 17 surgeons executed the pedicle screw placement task in the lumbar spine, guided by either the proposed sonification-based method or the traditional visual navigation method. The results demonstrated that the proposed method is as accurate as the state of the art while reducing the surgeon's need to focus on visual navigation displays rather than maintaining the natural focus on surgical tools and the targeted anatomy during task execution.
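The core idea of FM-synthesis sonification, mapping alignment error to a modulation index so that a perfectly aligned instrument yields a pure tone and growing deviation adds audible sidebands, can be sketched as follows. The carrier and modulator frequencies here are illustrative choices, not the parameters used in the article:

```python
import math

def fm_tone(duration_s, error, f_carrier=440.0, f_mod=110.0,
            sr=8000, max_index=8.0):
    """FM-synthesis sketch for one degree of freedom: the normalized
    alignment error (0..1) drives the modulation index, so zero error
    produces a pure carrier tone and larger deviations produce an
    increasingly rough, sideband-rich tone."""
    index = max(0.0, min(1.0, error)) * max_index
    n = int(duration_s * sr)
    return [math.sin(2 * math.pi * f_carrier * t / sr
                     + index * math.sin(2 * math.pi * f_mod * t / sr))
            for t in range(n)]

aligned = fm_tone(0.01, error=0.0)  # pure 440 Hz sine
off     = fm_tone(0.01, error=0.8)  # heavily modulated tone
```

A full system would map each of the four degrees of freedom to its own synthesis parameter; this sketch shows only the error-to-modulation mapping.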
Advanced cranial navigation
Neurosurgery is performed with extremely low margins of error. Surgical inaccuracy may
have disastrous consequences. The overall aim of this thesis was to improve accuracy in
cranial neurosurgical procedures by the application of new technical aids. Two technical
methods were evaluated: augmented reality (AR) for surgical navigation (Papers I-II) and the
optical technique of diffuse reflectance spectroscopy (DRS) for real-time tissue identification
(Papers III-V).
Minimally invasive skull-base endoscopy has several potential benefits compared to
traditional craniotomy, but approaching the skull base through this route implies that at-risk
organs and surgical targets are covered by bone and out of the surgeon's direct line of sight.
In Paper I, a new application for AR-navigated endoscopic skull-base surgery, based on an
augmented-reality surgical navigation (ARSN) system, was developed. The accuracy of the
system, defined by mean target registration error (TRE), was evaluated and found to be
0.55±0.24 mm, the lowest error reported in the literature.
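The target registration error reported above is typically computed as the mean ± SD of point-wise Euclidean distances between navigated and true target positions. A minimal sketch, using hypothetical points rather than the study's measurements:

```python
import math

def target_registration_error(navigated, ground_truth):
    """Mean and standard deviation of the Euclidean distances (mm)
    between navigated target points and their ground-truth positions."""
    d = [math.dist(p, q) for p, q in zip(navigated, ground_truth)]
    mean = sum(d) / len(d)
    sd = math.sqrt(sum((x - mean) ** 2 for x in d) / len(d))
    return mean, sd

# Hypothetical targets (mm), each displaced by 0.3 mm and 0.4 mm.
nav = [(10.3, 20.4, 5.0), (1.3, 2.4, 3.0)]
gt  = [(10.0, 20.0, 5.0), (1.0, 2.0, 3.0)]
print(target_registration_error(nav, gt))  # ~ (0.5, 0.0)
```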
As a first step toward the development of a cranial application for AR
navigation, in Paper II this ARSN system was used to enable insertions of biopsy needles
and external ventricular drainages (EVDs). The technical accuracy (i.e., deviation from the
target or intended path) and efficacy (i.e., insertion time) were assessed on a 3D-printed
realistic, anthropomorphic skull and brain phantom. Thirty cranial biopsies and 10 EVD
insertions were performed. Accuracy for biopsy was 0.8±0.43 mm with a median insertion
time of 149 (87-233) seconds, and for EVD accuracy was 2.9±0.8 mm at the tip with a median
angular deviation of 0.7±0.5° and a median insertion time of 188 (135-400) seconds.
Glial tumors grow diffusely in the brain, and patient survival is correlated with
the extent of tumor removal. Tumor borders are often invisible. Resection beyond borders as
defined by conventional methods may further improve a patient's prognosis. In Paper III,
DRS was evaluated for discrimination between glioma and normal brain tissue ex vivo. DRS
spectra and histology were acquired from 22 tumor samples and 9 brain tissue samples
retrieved from 30 patients. Sensitivity and specificity for the detection of low-grade gliomas
were 82.0% and 82.7%, respectively, with an AUC of 0.91.
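The reported AUC can be read via the Mann-Whitney interpretation: the probability that a randomly chosen positive (tumour) sample scores higher than a randomly chosen negative (normal tissue) sample, with ties counting half. A minimal sketch with invented scores, not the study's data:

```python
def auc(scores, labels):
    """ROC AUC computed as the normalized Mann-Whitney U statistic:
    the fraction of positive/negative pairs in which the positive
    sample scores higher (ties count 0.5)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical classifier scores: 1 = glioma, 0 = normal brain tissue.
print(auc([0.9, 0.8, 0.4, 0.7, 0.3, 0.2], [1, 1, 1, 0, 0, 0]))  # 8/9
```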
Acute ischemic stroke caused by large vessel occlusion is treated with
endovascular thrombectomy, but treatment failure can occur when clot composition and
thrombectomy technique are mismatched. Intra-procedural knowledge of clot composition
could guide the choice of treatment modality. In Paper IV, DRS, in vivo, was evaluated for
intravascular clot characterization. Three types of clot analogs, red blood cell (RBC)-rich,
fibrin-rich and mixed clots, were injected into the external carotids of a domestic pig. An
intravascular DRS probe was used for in-situ measurements of clots, blood, and vessel walls,
and the spectral data were analyzed. DRS could differentiate clot types, vessel walls, and
blood in vivo (p < 0.001). The sensitivity and specificity for detection were 73.8% and 98.8%
for RBC clots, 100% and 100% for mixed clots, and 80.6% and 97.8% for fibrin clots,
respectively.
Paper V evaluated DRS for characterization of human clot composition ex
vivo: 45 clot units were retrieved from 29 stroke patients and examined with DRS and
histopathological evaluation. DRS parameters correlated with clot RBC fraction (R = 0.81,
p<0.001) and could be used for the classification of clot type with sensitivity and specificity
rates for the detection of RBC-rich clots of 0.722 and 0.846, respectively. Applied in an
intravascular probe, DRS may provide intra-procedural information on clot composition to
improve endovascular thrombectomy efficiency.
Augmented reality for computer assisted orthopaedic surgery
In recent years, computer-assistance and robotics have established their presence in operating
theatres and found success in orthopaedic procedures. Benefits of computer assisted orthopaedic
surgery (CAOS) have been thoroughly explored in research, finding improvements in clinical outcomes through increased control and precision over surgical actions. However, human-computer interaction in CAOS remains an evolving field through emerging display technologies, including augmented reality (AR): a fused view of the real environment with virtual, computer-generated holograms. Interactions between clinicians and patient-specific data generated during CAOS are limited to basic 2D interactions on touchscreen monitors, potentially creating clutter and cognitive challenges in surgery.
Work described in this thesis sought to explore the benefits of AR in CAOS through: an integration between commercially available AR and CAOS systems, creating a novel AR-centric surgical workflow to support various tasks of computer-assisted knee arthroplasty, and three pre-clinical studies exploring the impact of the new AR workflow on both existing and newly proposed quantitative and qualitative performance metrics.
Early research focused on cloning the (2D) user interface of an existing CAOS system onto a virtual AR screen and investigating any resulting impacts on usability and performance. An infrared-based registration system is also presented, describing a protocol for calibrating commercial AR headsets with optical trackers and calculating a spatial transformation between surgical and holographic coordinate frames. The main contribution of this thesis is a novel AR workflow designed to support computer-assisted patellofemoral arthroplasty. The reported workflow provided 3D in-situ holographic guidance for CAOS tasks including patient registration, pre-operative planning, and assisted cutting. Pre-clinical experimental validation of these contributions on a commercial system (NAVIO®, Smith & Nephew) demonstrates encouraging early-stage results, showing successful deployment of AR to CAOS systems and promising indications that AR can enhance the clinician's interactions in the future. The thesis concludes with a summary of achievements, corresponding limitations, and future research opportunities.
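A standard way to compute such a spatial transformation between tracker and holographic coordinate frames from corresponding calibration points is least-squares rigid registration (the Kabsch algorithm). The sketch below is a generic illustration of that technique, not the thesis's actual calibration protocol:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~ R @ src + t,
    via the Kabsch algorithm. src and dst are (N, 3) arrays of
    corresponding points measured in the two coordinate frames."""
    src_c = src - src.mean(axis=0)          # center both point clouds
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With at least three non-collinear corresponding points (e.g., probe-tip touches on fiducials seen by both systems), R and t map any tracked pose into the headset's holographic frame.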
Feasibility and accuracy of a robotic guidance system for navigated spine surgery in a hybrid operating room: a cadaver study.
The combination of navigation and robotics in spine surgery has the potential to accurately identify and maintain the bone entry position and planned trajectory. The goal of this study was to examine the feasibility, accuracy, and efficacy of a new robot-guided system for semi-automated, minimally invasive pedicle screw placement. A custom robotic arm was integrated into a hybrid operating room (OR) equipped with an augmented reality surgical navigation system (ARSN). The robot was mounted on the OR table and used to assist in placing Jamshidi needles in 113 pedicles in four cadavers. The ARSN system was used for planning screw paths and directing the robot. The robot arm autonomously aligned with the planned screw trajectory, and the surgeon inserted the Jamshidi needle into the pedicle. Accuracy measurements were performed on verification cone-beam computed tomographies with the planned paths superimposed. To provide a clinical grading according to the Gertzbein scale, pedicle screw diameters were simulated on the placed Jamshidi needles. A technical accuracy at the bone entry point of 0.48 ± 0.44 mm and 0.68 ± 0.58 mm was achieved in the axial and sagittal views, respectively. The corresponding angular errors were 0.94 ± 0.83° and 0.87 ± 0.82°. The accuracy was statistically superior (p < 0.001) to ARSN without robotic assistance. Simulated pedicle screw grading resulted in a clinical accuracy of 100%. This study demonstrates that the use of a semi-automated surgical robot for pedicle screw placement provides an accuracy well above what is clinically acceptable.
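The angular errors reported above are the angles between the planned and achieved trajectory direction vectors; a minimal sketch with hypothetical vectors:

```python
import math

def angular_error_deg(planned, achieved):
    """Angle in degrees between a planned and an achieved trajectory,
    each given as a 3D direction vector (need not be unit length)."""
    dot = sum(a * b for a, b in zip(planned, achieved))
    na = math.sqrt(sum(a * a for a in planned))
    nb = math.sqrt(sum(b * b for b in achieved))
    c = max(-1.0, min(1.0, dot / (na * nb)))  # clamp rounding error
    return math.degrees(math.acos(c))

print(angular_error_deg((0, 0, 1), (0, 0.0175, 1)))  # ~1.0 degree
```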
Neurosurgical Ultrasound Pose Estimation Using Image-Based Registration and Sensor Fusion - A Feasibility Study
Modern neurosurgical procedures often rely on computer-assisted real-time guidance using multiple medical imaging modalities. State-of-the-art commercial products enable the fusion of pre-operative with intra-operative images (e.g., magnetic resonance [MR] with ultrasound [US] images), as well as the on-screen visualization of procedures in progress. In so doing, US images can be employed as a template to which pre-operative images can be registered, to correct for anatomical changes, to provide live-image feedback, and consequently to improve confidence when making resection margin decisions near eloquent regions during tumour surgery.
In spite of the potential for tracked ultrasound to improve many neurosurgical procedures, it is not widely used. State-of-the-art systems are handicapped by optical tracking's need for consistent line-of-sight, keeping tracked rigid bodies clean and rigidly fixed, and requiring a calibration workflow. The goal of this work is to improve the value offered by co-registered ultrasound images without the workflow drawbacks of conventional systems. The novel work in this thesis includes: the exploration and development of a GPU-enabled 2D-3D multi-modal registration algorithm based on the existing LC2 metric; and the use of this registration algorithm in the context of a sensor and image-fusion algorithm.
The work presented here is a motivating step in a vision towards a heterogeneous tracking framework for image-guided interventions, in which knowledge from intraoperative imaging, pre-operative imaging, and (potentially disjoint) wireless sensors in the surgical field is seamlessly integrated for the benefit of the surgeon. The technology described in this thesis, inspired by advances in robot localization, demonstrates how inaccurate pose data from disjoint sources can produce a localization system greater than the sum of its parts.