
    Personalized medicine in surgical treatment combining tracking systems, augmented reality and 3D printing

    Mención Internacional en el título de doctor (International Mention in the doctoral degree)

    In the last twenty years, significant advances in healthcare technology have enabled a new way of practicing medicine that focuses on the problems and needs of each patient as an individual: so-called personalized medicine. In surgical treatments, personalization has been possible thanks to key technologies adapted to the specific anatomy of each patient and the needs of the physicians. Tracking systems, augmented reality (AR), three-dimensional (3D) printing, and artificial intelligence (AI) have previously supported this individualized medicine in many ways. However, their independent contributions show several limitations in terms of patient-to-image registration, lack of flexibility to adapt to the requirements of each case, long preoperative planning times, and navigation complexity. The main objective of this thesis is to increase patient personalization in surgical treatments by combining these technologies to bring surgical navigation to new complex cases: developing new patient registration methods, designing patient-specific tools, facilitating access to augmented reality for the medical community, and automating surgical workflows. In the first part of this dissertation, we present a novel framework for acral tumor resection that combines intraoperative open-source navigation software, based on an optical tracking system, with desktop 3D printing. We used additive manufacturing to create a patient-specific mold that kept the distal extremity in the same position during image-guided surgery as in the preoperative images. The feasibility of the proposed workflow was evaluated in two clinical cases (soft-tissue sarcomas in the hand and foot). The overall accuracy of the system was 1.88 mm, evaluated on patient-specific 3D-printed phantoms. Surgical navigation was feasible during both surgeries, allowing surgeons to verify the tumor resection margin.
Then, we propose an augmented reality navigation system that uses 3D-printed surgical guides with a tracking pattern, enabling automatic patient-to-image registration in orthopedic oncology. This patient-specific tool fits on the patient in only one pre-designed location, in this case on bone tissue. The solution was developed as a software application running on Microsoft HoloLens. The workflow was validated on a 3D-printed phantom replicating the anatomy of a patient presenting an extraosseous Ewing’s sarcoma, and then tested during the actual surgical intervention. The results showed that the surgical guide with the reference marker can be placed precisely, with an accuracy of 2 mm and a visualization error lower than 3 mm. The application allowed physicians to visualize the skin, bone, tumor, and medical images overlaid on the phantom and the patient. To enable the use of AR and 3D printing by inexperienced users without broad technical knowledge, we designed a step-by-step methodology. The proposed protocol describes how to develop an AR smartphone application that superimposes any patient-based 3D model onto the real-world environment using a 3D-printed marker tracked by the smartphone camera. Our solution brings AR closer to the final clinical user by combining free and open-source software with an open-access protocol. The proposed guide is already helping to accelerate the adoption of these technologies by medical professionals and researchers. In the next section of the thesis, we show the benefits of combining these technologies at different stages of the surgical workflow in orthopedic oncology. We designed a novel AR-based smartphone application that can display the patient’s anatomy and the tumor’s location. A 3D-printed reference marker, designed to fit in a unique position on the affected bone tissue, enables automatic registration.
The system was evaluated in terms of visualization accuracy and usability during the whole surgical workflow on six realistic phantoms, achieving a visualization error below 3 mm. The AR system was also tested in two clinical cases during surgical planning, patient communication, and surgical intervention. These results and the positive feedback obtained from surgeons and patients suggest that the combination of AR and 3D printing can improve efficacy, accuracy, and patients’ experience. In the final section, two surgical navigation systems, based on optical tracking and augmented reality, were developed and evaluated to guide electrode placement in sacral neurostimulation (SNS) procedures. Our results show that both systems could minimize patient discomfort and improve surgical outcomes by reducing needle insertion time and the number of punctures. Additionally, we proposed a feasible clinical workflow for guiding SNS interventions with both navigation methodologies, including the automatic creation of sacral virtual 3D models for trajectory definition using artificial intelligence and intraoperative patient-to-image registration. To conclude, in this thesis we have demonstrated that combining technologies such as tracking systems, augmented reality, 3D printing, and artificial intelligence overcomes many current limitations in surgical treatments. Our results encourage the medical community to combine these technologies to improve surgical workflows and outcomes in more clinical scenarios.
    Programa de Doctorado en Ciencia y Tecnología Biomédica, Universidad Carlos III de Madrid. Committee: President, María Jesús Ledesma Carbayo; Secretary, María Arrate Muñoz Barrutia; Member, Csaba Pinte
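The patient-to-image registration and accuracy figures quoted throughout this thesis abstract (e.g. the 1.88 mm overall error) rest on paired-point rigid registration. As a minimal illustrative sketch of that technique, not the thesis's actual implementation, the standard Kabsch/SVD solution and its fiducial registration error can be written as:

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid transform (rotation R, translation t) mapping
    paired source points onto target points (Kabsch method, no scaling)."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

def fre(source, target, R, t):
    """Root-mean-square fiducial registration error after applying (R, t)."""
    residuals = (R @ source.T).T + t - target
    return np.sqrt((residuals ** 2).sum(axis=1).mean())
```

For noise-free paired fiducials, the recovered transform reproduces the target points and the FRE approaches zero; with real tracked landmarks the FRE is the mm-scale error figure typically reported.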

    Development of the acquisition software for a cone beam in-vitro micro-CT

    During the 1970s, Computed Tomography (CT) opened the door to obtaining anatomical information from living subjects non-invasively. Since then, this technology has become indispensable for clinical diagnosis as well as for health research. Micro-CT scanners, in turn, arose from the preclinical need for the advantages that CT offers clinicians, providing high-resolution images of small samples. The Universidad Carlos III de Madrid has designed a new high-resolution in-vitro x-ray micro-CT that will serve as a test bench for a wide range of applications in research and teaching. The goal of this project is to implement control and data acquisition software for this device. LabVIEW was chosen as the development environment because of the advantages it offers in creating graphical user interfaces, its convenient configuration for communicating with many types of hardware, and its flexibility for extending the software in future work with few modifications. The project comprises the development of a set of libraries to control the hardware elements of the UC3M test bench: the x-ray source, the flat-panel detector, and the mechanical system. In addition, a “step and shoot” acquisition protocol was implemented, combining the previously developed control libraries. The architecture of the software makes it possible to extend its functionality with more advanced features, such as advanced acquisition protocols or imaging techniques. The work included in this project is framed within one of the lines of research carried out at the Biomedical Imaging and Instrumentation Group of the Departamento de Bioingeniería e Ingeniería Aeroespacial of the Universidad Carlos III de Madrid.
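The acquisition software itself was written in LabVIEW; as a language-neutral illustration of the "step and shoot" protocol it implements (rotate, settle, expose, read out, repeat), the control loop can be sketched in Python. The `stage`, `source`, and `detector` interfaces below are hypothetical stand-ins, not the UC3M hardware drivers:

```python
class StepAndShootScan:
    """Sketch of a 'step and shoot' micro-CT protocol: one angular step,
    then one exposure and detector readout per projection."""

    def __init__(self, stage, source, detector, n_projections=360):
        self.stage = stage              # mechanical rotation stage (hypothetical API)
        self.source = source            # x-ray source (hypothetical API)
        self.detector = detector        # flat-panel detector (hypothetical API)
        self.n_projections = n_projections

    def run(self):
        step = 360.0 / self.n_projections
        projections = []
        for i in range(self.n_projections):
            self.stage.move_to(i * step)                    # step: rotate, then settle
            self.source.expose_on()                         # shoot: turn the beam on,
            projections.append(self.detector.read_frame())  # grab one projection,
            self.source.expose_off()                        # and turn the beam off
        return projections
```

Because each library (source, detector, mechanics) hides behind a small interface, the same loop can later be swapped for continuous-rotation or other advanced protocols, which mirrors the extensibility argument made above.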

    Evaluation of optical tracking and augmented reality for needle navigation in sacral nerve stimulation

    Background and objective: Sacral nerve stimulation (SNS) is a minimally invasive procedure in which an electrode lead is implanted through the sacral foramina to stimulate the nerve that modulates colonic and urinary functions. One of the most crucial steps in SNS procedures is the placement of the tined lead close to the sacral nerve, but needle insertion is very challenging for surgeons. Several x-ray projections are required to interpret the needle position correctly, and in many cases multiple punctures are needed, increasing surgical time and the patient's discomfort and pain. In this work, we propose and evaluate two navigation systems to guide electrode placement in SNS surgeries, designed to reduce surgical time, minimize patient discomfort, and improve surgical outcomes. Methods: For the first alternative, we developed open-source navigation software that guides electrode placement by tracking the needle in real time with an optical tracking system (OTS). For the second, we present a smartphone-based AR application that displays virtual guidance elements directly on the affected area, using a 3D-printed reference marker placed on the patient; this guidance facilitates needle insertion along a predefined trajectory. Both techniques were evaluated to determine whether they improve on the current surgical procedure. To compare the proposals with the clinical method, we developed an x-ray software tool that calculates a digitally reconstructed radiograph, simulating the fluoroscopy acquisitions performed during the procedure. Twelve physicians (inexperienced and experienced users) performed needle insertions toward several specific targets to evaluate the alternative SNS guidance methods on a realistic patient-based phantom.
Results: With each navigation solution, users took less time on average to complete each insertion (36.83 s and 44.43 s for the OTS and AR methods, respectively) and needed fewer punctures on average to reach the target (1.23 and 1.96 for the OTS and AR methods, respectively) than with the standard clinical method (189.28 s and 3.65 punctures). Conclusions: We have presented two navigation alternatives that could improve surgical outcomes by significantly reducing needle insertions, surgical time, and patients' pain in SNS procedures. We believe these solutions are suitable for training surgeons and could even replace current SNS clinical procedures.
    Research supported by projects PI18/01625 and AC20/00102 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III, Asociación Española Contra el Cáncer and European Regional Development Fund "Una manera de hacer Europa"), IND2018/TIC-9753 (Comunidad de Madrid) and project PerPlanRT (ERA PerMed). Funding for APC: Universidad Carlos III de Madrid (Read & Publish Agreement CRUE-CSIC 2022)
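The comparison tool described in the Methods computes a digitally reconstructed radiograph (DRR) to simulate fluoroscopy. In its simplest parallel-beam form, which is a didactic simplification rather than the authors' actual simulator, a DRR is a line integral of attenuation through the CT volume followed by Beer-Lambert decay:

```python
import numpy as np

def drr_parallel(volume, axis=0, voxel_size_mm=1.0):
    """Parallel-beam digitally reconstructed radiograph: integrate the
    linear attenuation coefficients (1/mm) in `volume` along one axis
    and apply Beer-Lambert decay to get transmitted intensity."""
    path_integral = volume.sum(axis=axis) * voxel_size_mm  # line integral of mu dl
    return np.exp(-path_integral)                          # fraction transmitted per pixel
```

An empty volume transmits everything (image of ones); denser voxels darken the corresponding detector pixels, which is what lets users judge the simulated needle position against bone.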

    Surgical navigation, augmented reality, and 3D printing for hard palate adenoid cystic carcinoma en-bloc resection: case report and literature review

    Adenoid cystic carcinoma is a rare and aggressive tumor representing less than 1% of head and neck cancers. This malignancy often arises from the minor salivary glands, the palate being its most common location. Surgical en-bloc resection with clear margins is the primary treatment. However, this location offers a limited line of sight and a high risk of injury, making the surgical procedure challenging. In this context, technologies such as intraoperative navigation can become an effective tool, reducing morbidity and improving the safety and accuracy of the procedure. Although such navigation is well established in fields such as neurosurgery, its application in maxillofacial surgery has not been widely reported. One reason is the need to rigidly fix a navigation reference to the patient, which often entails an invasive setup. In this work, we studied three alternative, less invasive setups using optical tracking, 3D printing, and augmented reality. We evaluated their precision on a patient-specific phantom, obtaining errors below 1 mm. The optimal setup was then applied in a clinical case, where the navigation software was used to guide the tumor resection. Points were collected along the surgical margins after resection and compared with the real ones identified in the postoperative CT; distances of less than 2 mm were obtained in 90% of the samples. Moreover, the navigation gave the surgeons confidence to undertake a less invasive and more conservative approach. The postoperative CT scans showed adequate resection margins and confirmed that the patient is free of disease after two years of follow-up.
    This work has been supported by projects PI18/01625 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III and European Regional Development Fund "Una manera de hacer Europa") and IND2018/TIC-9753 (Comunidad de Madrid)
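The margin verification above ("distances of less than 2 mm in 90% of the samples") amounts to a nearest-neighbor distance check between intraoperatively collected points and the margin identified in the postoperative CT. A minimal sketch of that metric, assuming both sets are given as point clouds in the same coordinate frame:

```python
import numpy as np

def nearest_distances(collected, reference):
    """Distance (same units as the inputs, e.g. mm) from each
    intraoperatively collected point to its closest reference point
    from the postoperative CT segmentation."""
    diff = collected[:, None, :] - reference[None, :, :]   # pairwise differences
    return np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)    # min over reference points

def fraction_within(collected, reference, tol_mm=2.0):
    """Fraction of collected points lying within `tol_mm` of the reference."""
    return (nearest_distances(collected, reference) <= tol_mm).mean()
```

With dense reference clouds, a spatial index (e.g. a KD-tree) would replace the brute-force pairwise computation, but the reported statistic is the same.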

    Desktop 3D Printing: Key for Surgical Navigation in Acral Tumors?

    Surgical navigation techniques have shown potential benefits in orthopedic oncologic surgery. However, translating these results to acral tumor resection is challenging because of the many joints, with complex movements, in the affected areas (located in the distal extremities). This study proposes a surgical workflow that combines intraoperative open-source navigation software, based on multi-camera tracking, with desktop three-dimensional (3D) printing for accurate navigation of these tumors. Desktop 3D printing was used to fabricate patient-specific molds that ensure the distal extremity is in the same position in the preoperative images and during image-guided surgery (IGS). The feasibility of the proposed workflow was evaluated in two clinical cases (soft-tissue sarcomas in the hand and foot). The validation involved deformation analysis of the 3D-printed mold after sterilization, accuracy of the system on patient-specific 3D-printed phantoms, and feasibility of the workflow during the surgical intervention. The sterilization process did not lead to significant deformation of the mold (mean error below 0.20 mm). The overall accuracy of the system, evaluated on the phantoms, was 1.88 mm. IGS guidance was feasible during both surgeries, allowing surgeons to verify a sufficient margin during tumor resection. These results demonstrate the viability of combining open-source navigation and desktop 3D printing for acral tumor surgeries. The suggested framework can be easily personalized to any patient and could be adapted to other surgical scenarios.
    This work was supported by projects TEC2013-48251-C2-1-R (Ministerio de Economía y Competitividad); PI18/01625 and PI15/02121 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III and European Regional Development Fund "Una manera de hacer Europa") and IND2018/TIC-9753 (Comunidad de Madrid)

    Combining Augmented Reality and 3D Printing to Improve Surgical Workflows in Orthopedic Oncology: Smartphone Application and Clinical Evaluation

    During the last decade, orthopedic oncology has experienced the benefits of computerized medical imaging, reducing human dependency and improving accuracy and clinical outcomes. However, traditional surgical navigation systems do not always adapt well to this kind of intervention. Augmented reality (AR) and three-dimensional (3D) printing are technologies recently introduced into the surgical environment with promising results. Here we present an innovative solution combining 3D printing and AR in orthopedic oncological surgery. A new surgical workflow is proposed, including 3D-printed models and a novel AR-based smartphone application (app). The app can display the patient's anatomy and the tumor's location. A 3D-printed reference marker, designed to fit in a unique position on the affected bone tissue, enables automatic registration. The system was evaluated in terms of visualization accuracy and usability during the whole surgical workflow. Experiments on six realistic phantoms yielded a visualization error below 3 mm. The AR system was tested in two clinical cases during surgical planning, patient communication, and surgical intervention. These results and the positive feedback obtained from surgeons and patients suggest that the combination of AR and 3D printing can improve efficacy, accuracy, and patients' experience.
    This work was supported by projects PI18/01625 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III and European Regional Development Fund "Una manera de hacer Europa") and IND2018/TIC-9753 (Comunidad de Madrid)

    Computer-assisted dental implant placement following free flap reconstruction: virtual planning, CAD/CAM templates, dynamic navigation and augmented reality

    Image-guided surgery, prosthetic-based virtual planning, 3D printing, and CAD/CAM technology are changing head and neck ablative and reconstructive surgical oncology. Because of the resulting quality-of-life improvement, dental implant rehabilitation should be considered in every patient treated with curative intent. Accurate implant placement is mandatory for the long-term stability and success of the prosthesis in oncologic patients. We present a prospective study with a novel workflow comprising 11 patients reconstructed with free flaps and 56 osseointegrated implants placed in bone flaps or remnant jaws (iliac crest, fibula, radial forearm, anterolateral thigh). Starting from CT data and scans of jaw plaster models, a virtual dental prosthesis was designed. Prosthetically driven dental implant placement was then virtually planned and transferred to the patient by means of intraoperative infrared optical navigation (first four patients) or a combination of a conventional static tooth-supported 3D-printed acrylic guide stent, intraoperative dynamic navigation, and augmented reality for final intraoperative verification (last seven patients). Coronal, apical, and angular deviations between the virtual surgical plan and the final guided intraoperative position were measured for each implant. There is a clear learning curve for surgeons applying guided methods. The initial navigation-only cases achieved low accuracy, comparable to non-guided freehand positioning, because of jig registration instability. Subsequent dynamic navigation cases, which combined highly stable static acrylic guides as reference and registration markers, achieved the highest accuracy, with a 1-1.5 mm deviation at the insertion point. Smartphone-based augmented reality visualization is a valuable tool for intraoperative visualization and final verification, although it remains a difficult technique for guiding surgery. A fixed screw-retained ideal dental prosthesis was achieved in every case, as virtually planned.
Implant placement, the final step in free flap oncological reconstruction, can be accurately planned and executed with image-guided surgery, 3D printing, and CAD/CAM technology. The learning curve can be overcome with preclinical laboratory training, but the registration stability of the virtually designed, 3D-printed tracer is crucial for accurate and predictable results. Applying these concepts to our difficult oncologic patient subgroup, with deep anatomic alterations, yielded results comparable to those reported in non-oncologic patients.
    This work was supported by grant PI18/01625 (Ministerio de Ciencia e Innovación-Instituto de Salud Carlos III and European Regional Development Fund "Una manera de hacer Europa"). This study was also supported by Ticare® implants (Mozo-Grau, Valladolid, Spain). The funder was not involved in the study design, collection, analysis, interpretation of data, the writing of this article, or the decision to submit it for publication
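The coronal, apical, and angular deviations reported above compare the planned implant axis (entry point to tip) with the final intraoperative one. As a minimal sketch of how those three metrics are typically computed, assuming both axes are expressed as entry/tip points in a common frame:

```python
import numpy as np

def implant_deviation(planned_entry, planned_tip, placed_entry, placed_tip):
    """Coronal (entry point), apical (tip), and angular deviation between
    a planned implant axis and the final placed one. Points in mm;
    returns (coronal_mm, apical_mm, angle_deg)."""
    coronal = np.linalg.norm(placed_entry - planned_entry)   # entry-point offset
    apical = np.linalg.norm(placed_tip - planned_tip)        # tip offset
    a = planned_tip - planned_entry                          # planned axis
    b = placed_tip - placed_entry                            # placed axis
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return coronal, apical, angle
```

A parallel 1 mm shift of the whole implant gives 1 mm coronal and apical deviations with 0° of angular error, whereas a tilt around the entry point shows up purely in the apical and angular terms.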

    HoloLens 1 vs. HoloLens 2: Improvements in the New Model for Orthopedic Oncological Interventions

    This work analyzed the use of Microsoft HoloLens 2 in orthopedic oncological surgery and compared it to its predecessor (Microsoft HoloLens 1). Specifically, we developed two equivalent applications, one for each device, and evaluated their augmented reality (AR) projection accuracy in an experimental scenario using phantoms based on two patients. We achieved automatic registration between the virtual and real worlds using patient-specific surgical guides on each phantom. The guides contained a small adaptor for a 3D-printed AR marker, whose characteristic patterns were easily recognized by both Microsoft HoloLens devices. The newer model improved AR projection accuracy by almost 25%, and both devices yielded an RMSE below 3 mm. After ascertaining this improvement, we went a step further with Microsoft HoloLens 2 and tested it during the surgical intervention of one of the patients. During this experience, we collected the surgeons' feedback on comfort, usability, and ergonomics. Our goal was to estimate whether the improved technical features of the newer model facilitate its implementation in actual surgical scenarios. All of the results point to Microsoft HoloLens 2 being better in every aspect affecting surgical interventions and support its use in future experiences.
    This work was supported by projects PI18/01625, AC20/00102-3 and ERA PerMed PerPlanRT (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III, Asociación Española Contra el Cáncer and European Regional Development Fund "Una manera de hacer Europa") and IND2018/TIC-9753 (Comunidad de Madrid)
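The two headline numbers of this comparison, the per-device RMSE and the roughly 25% relative improvement, follow directly from the projected-versus-reference landmark positions. A minimal sketch of both computations, assuming landmarks are matched pairs in mm:

```python
import numpy as np

def projection_rmse(projected, reference):
    """Root-mean-square error (mm) between AR-projected landmark
    positions and their ground-truth locations on the phantom."""
    return np.sqrt(((projected - reference) ** 2).sum(axis=1).mean())

def relative_improvement(rmse_old, rmse_new):
    """Fractional accuracy gain of the newer device over the older one."""
    return (rmse_old - rmse_new) / rmse_old
```

For example, a drop from 2.0 mm (HoloLens 1-class error) to 1.5 mm would correspond to a 25% relative improvement; the specific per-device values here are illustrative, not the study's measurements.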

    Combining Augmented Reality and 3D Printing to Display Patient Models on a Smartphone.

    This report was supported by projects PI18/01625 and PI15/02121 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III and European Regional Development Fund "Una manera de hacer Europa") and IND2018/TIC-9753 (Comunidad de Madrid)