2,013 research outputs found

    Three-dimensional virtual reality in surgical planning for breast cancer with reconstruction

    Breast surgery is performed to achieve local control in patients with breast cancer. Visualization of the anatomy with a virtual reality software platform reconstructed from magnetic resonance imaging data improves surgical planning with regard to the volume and localization of the tumor, lymph nodes, blood vessels, and surrounding tissue to perform oncoplastic tissue rearrangement. We report the use and advantages of virtual reality added to the magnetic resonance imaging assessment in a 36-year-old woman with breast cancer who underwent nipple-sparing mastectomy with tissue expander reconstruction.

    Mixed Reality system to study deformable objects: Breast Cancer application

    Bachelor's thesis in Biomedical Engineering, Facultat de Medicina i Ciències de la Salut, Universitat de Barcelona. Academic year: 2020-2021. Directors: Eduardo Soudah and Óscar de Coss. Tutor: Aida Niñerola. A significant number of women who undergo breast-conserving surgery for early-stage breast cancer require a repeat operation because of concerns that residual tumor was left behind. To avoid this, tumor localization is needed to help the surgeon determine tumor extent, and it is also critical to account for tissue deformations. For these reasons, new navigation systems, like the one proposed in this project, are emerging to cover those needs. This project focuses on the use of a Mixed Reality system to improve the accuracy of placing a static hologram of the tumor and to implement a dynamic hologram when deformation takes place. To do so, two different molds with objects inside were manufactured. Two approaches were then considered: a mathematical approach, creating a 3D CAD model of the molds, and a medical approach, which consisted of performing a CT scan and then segmenting the images. The models were post-processed and imported into the HoloLens head-mounted display. The system was tested on the molds and on a breast phantom provided by the Hospital Clinic. The results obtained were encouraging and, although some aspects still need to be improved, this new use for Augmented Reality has the potential to improve the lives of many patients.
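    The CT-based pipeline described above (segment the scan, post-process the model, import it into the HoloLens) can be illustrated with a minimal sketch. The threshold, spacing, and OBJ export below are assumptions for illustration, using scikit-image's marching cubes; this is not the project's actual implementation.

```python
# Hedged sketch: turning a segmented CT volume into a surface mesh that could be
# post-processed and imported into a HoloLens scene. Threshold, spacing, and file
# names are illustrative, not taken from the project.
import numpy as np
from skimage import measure

def ct_to_mesh(volume_hu, level=300.0, spacing=(1.0, 1.0, 1.0)):
    """Extract an isosurface from a CT volume (Hounsfield units) with marching cubes."""
    verts, faces, normals, _ = measure.marching_cubes(volume_hu, level=level, spacing=spacing)
    return verts, faces, normals

def write_obj(path, verts, faces):
    """Write a minimal Wavefront OBJ file that mesh tools and Unity can import."""
    with open(path, "w") as f:
        for v in verts:
            f.write(f"v {v[0]} {v[1]} {v[2]}\n")
        for tri in faces + 1:                      # OBJ face indices are 1-based
            f.write(f"f {tri[0]} {tri[1]} {tri[2]}\n")
```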

    Crepuscular Rays for Tumor Accessibility Planning


    Binocular Goggle Augmented Imaging and Navigation System provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping

    The inability to identify microscopic tumors and assess surgical margins in real time during oncologic surgery leads to incomplete tumor removal, increases the chances of tumor recurrence, and necessitates costly repeat surgery. To overcome these challenges, we have developed a wearable goggle augmented imaging and navigation system (GAINS) that can provide accurate intraoperative visualization of tumors and sentinel lymph nodes in real time without disrupting the normal surgical workflow. GAINS projects both near-infrared fluorescence from tumors and the natural color images of tissue onto a head-mounted display without latency. Aided by tumor-targeted contrast agents, the system detected tumors in subcutaneous and metastatic mouse models with high accuracy (sensitivity = 100%, specificity = 98% ± 5% standard deviation). Human pilot studies in breast cancer and melanoma patients using a near-infrared dye show that GAINS detected sentinel lymph nodes with 100% sensitivity. Clinical use of GAINS to guide tumor resection and sentinel lymph node mapping promises to improve surgical outcomes, reduce rates of repeat surgery, and improve the accuracy of cancer staging.
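    The core display idea, compositing a co-registered near-infrared fluorescence frame onto the natural color view, can be sketched as follows. This is a generic alpha-blending example with an illustrative threshold and green pseudo-color, not the actual GAINS rendering pipeline.

```python
# Hedged sketch: one simple way to composite a near-infrared fluorescence frame
# onto a color video frame for display. The blending rule and threshold are
# illustrative assumptions, not the GAINS implementation.
import numpy as np

def overlay_fluorescence(color_rgb, nir_frame, threshold=0.1, alpha=0.6):
    """Blend a pseudo-colored NIR signal (shown green) over the color image.

    color_rgb : (H, W, 3) float array in [0, 1]
    nir_frame : (H, W) float array in [0, 1], co-registered with color_rgb
    """
    signal = np.clip((nir_frame - threshold) / (1.0 - threshold), 0.0, 1.0)
    pseudo = np.zeros_like(color_rgb)
    pseudo[..., 1] = signal                       # map fluorescence to the green channel
    weight = (alpha * signal)[..., None]          # blend only where signal is present
    return (1.0 - weight) * color_rgb + weight * pseudo
```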

    Optimization of computer-assisted intraoperative guidance for complex oncological procedures

    International Mention in the doctoral degree. The role of technology inside the operating room is constantly increasing, allowing surgical procedures previously considered impossible or too risky because of their complexity or limited access. These reliable tools have improved surgical efficiency and safety. Cancer treatment is one of the surgical specialties that has benefited most from these techniques, owing to its high incidence and the accuracy required for tumor resections with conservative approaches and clear margins. However, in many cases, introducing these technologies into surgical scenarios is expensive and entails complex setups that are obtrusive, invasive, and increase operative time. In this thesis, we proposed convenient, accessible, reliable, and non-invasive solutions for two highly complex regions for tumor resection surgeries: the pelvis and the head and neck. We explored how the introduction of 3D printing, surgical navigation, and augmented reality in these scenarios provided high intraoperative precision.

    First, we presented a less invasive setup for osteotomy guidance in pelvic tumor resections based on small patient-specific instruments (PSIs) fabricated with a desktop 3D printer at low cost. We evaluated their accuracy in a cadaveric study, following a realistic workflow, and obtained results similar to previous studies with more invasive setups. We also identified the ilium as the region most prone to errors. Then, we proposed surgical navigation using these small PSIs for image-to-patient registration. Artificial landmarks included in the PSIs substitute for the anatomical landmarks and the bone surface commonly used for this step, which require additional bone exposure and are therefore more invasive. We also presented an alternative and more convenient installation of the dynamic reference frame used to track patient movements in surgical navigation: the reference frame is inserted into a socket included in the PSIs and can be attached and detached without losing precision, simplifying the installation. We validated the setup in a cadaveric study, evaluating the accuracy and finding the optimal PSI configuration in the three most common scenarios for pelvic tumor resection. The results demonstrated high accuracy, where the main source of error was again incorrect placement of PSIs in regular and homogeneous regions such as the ilium.

    The main limitation of PSIs is the guidance error resulting from incorrect placement. To overcome this issue, we proposed augmented reality as a tool to guide PSI installation on the patient's bone. We developed an application for smartphones and HoloLens 2 that displays the correct position intraoperatively. We measured the placement errors in a conventional phantom and in a realistic phantom that includes a silicone layer to simulate tissue. The results demonstrated a significant reduction of errors with augmented reality compared to freehand placement, ensuring an installation of the PSI close to the target area.

    Finally, we proposed three setups for surgical navigation in palate tumor resections, using optical trackers and augmented reality. The tracking tools for the patient and the surgical instruments were fabricated with low-cost desktop 3D printers and designed to provide less invasive setups compared to previous solutions. All setups presented similar results with high accuracy when tested in a 3D-printed patient-specific phantom. They were then validated in a real surgical case, and one of the solutions was applied for intraoperative guidance. Postoperative results demonstrated high navigation accuracy, obtaining optimal surgical outcomes. The proposed solution enabled a conservative surgical approach with a less invasive navigation setup.

    To conclude, in this thesis we have proposed new setups for intraoperative navigation in two complex surgical scenarios for tumor resection. We analyzed their navigation precision, defining the optimal configurations to ensure accuracy. With this, we have demonstrated that computer-assisted surgery techniques can be integrated into the surgical workflow with accessible and non-invasive setups. These results are a step further towards optimizing these procedures and continuing to improve surgical outcomes in complex surgical scenarios. Doctoral Programme in Biomedical Science and Technology, Universidad Carlos III de Madrid. Committee: President: Raúl San José Estépar; Secretary: Alba González Álvarez; Member: Simon Droui
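    The image-to-patient registration step described above relies on paired landmarks (here, artificial landmarks embedded in the PSIs). A minimal sketch of that kind of rigid point-based registration, using the standard SVD (Kabsch) solution with illustrative coordinates, could look like this; it is not the thesis's actual implementation.

```python
# Hedged sketch: rigid paired-point (landmark) registration of the kind used for
# image-to-patient registration. Landmark coordinates below are made up.
import numpy as np

def register_landmarks(image_pts, patient_pts):
    """Return rotation R and translation t mapping image_pts onto patient_pts."""
    image_pts = np.asarray(image_pts, float)
    patient_pts = np.asarray(patient_pts, float)
    ci, cp = image_pts.mean(axis=0), patient_pts.mean(axis=0)    # centroids
    H = (image_pts - ci).T @ (patient_pts - cp)                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = cp - R @ ci
    return R, t

def fiducial_registration_error(R, t, image_pts, patient_pts):
    """RMS distance between transformed image landmarks and patient landmarks."""
    image_pts = np.asarray(image_pts, float)
    patient_pts = np.asarray(patient_pts, float)
    mapped = (R @ image_pts.T).T + t
    return np.sqrt(np.mean(np.sum((mapped - patient_pts) ** 2, axis=1)))

# Illustrative use with four made-up landmark pairs (millimetres)
img = np.array([[0, 0, 0], [40, 0, 0], [0, 30, 0], [0, 0, 20]], float)
pat = (img @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T) + [5.0, -2.0, 10.0]
R, t = register_landmarks(img, pat)
print(fiducial_registration_error(R, t, img, pat))   # ~ 0 for noise-free landmarks
```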

    A fiber optoacoustic guide with augmented reality for precision breast-conserving surgery

    Lumpectomy, also called breast-conserving surgery, has become the standard surgical treatment for early-stage breast cancer. However, accurately locating the tumor during a lumpectomy, especially when the lesion is small and nonpalpable, is a challenge. Such difficulty can lead to either incomplete tumor removal or prolonged surgical time, both of which result in high re-operation rates (~25%) and increased surgical costs. Here, we report a fiber optoacoustic guide (FOG) with augmented reality (AR) for sub-millimeter tumor localization and intuitive surgical guidance with minimal interference. The FOG is preoperatively implanted in the tumor. Under external pulsed light excitation, the FOG omnidirectionally broadcasts acoustic waves through the optoacoustic effect by a specially designed nano-composite layer at its tip. By capturing the acoustic wave, three ultrasound sensors on the breast skin triangulate the FOG tip's position with 0.25-mm accuracy. An AR system with a tablet measures the coordinates of the ultrasound sensors and transforms the FOG tip's position into visual feedback with <1-mm accuracy, thus aiding surgeons in directly visualizing the tumor location and performing fast and accurate tumor removal. We further show the use of a head-mounted display to visualize the same information in the surgeon's first-person view and achieve hands-free guidance. Towards clinical application, a surgeon successfully deployed the FOG to excise a "pseudo tumor" in a female human cadaver. With the high-accuracy tumor localization by the FOG and the intuitive surgical guidance by AR, the surgeon performed accurate and fast tumor removal, which will significantly reduce re-operation rates and shorten the surgery time.
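    The localization step, three skin-mounted ultrasound sensors capturing the pulse emitted at the fiber tip, amounts to time-of-flight trilateration. Below is a minimal sketch of that computation, assuming the laser trigger defines the emission time, a nominal speed of sound in soft tissue, and SciPy for the least-squares solve; the geometry and numbers are illustrative, not taken from the paper.

```python
# Hedged sketch: trilateration of the FOG tip from pulsed-light time-of-flight.
# Sensor positions, arrival times, and the speed of sound are illustrative.
import numpy as np
from scipy.optimize import least_squares

C_TISSUE = 1540.0  # approximate speed of sound in soft tissue, m/s

def locate_fog_tip(sensor_xyz, arrival_times, x0=None):
    """Estimate the emitter (FOG tip) position from time-of-flight ranges.

    sensor_xyz    : (N, 3) array of ultrasound sensor positions in metres
    arrival_times : (N,) array of pulse arrival times in seconds (N >= 3)
    """
    ranges = C_TISSUE * np.asarray(arrival_times)      # one sphere per sensor
    if x0 is None:
        # start on the tissue side of the sensor plane; with (near-)coplanar
        # sensors there is a mirror ambiguity resolved by knowing that side
        x0 = sensor_xyz.mean(axis=0) + np.array([0.0, 0.0, 0.01])

    def residuals(p):
        return np.linalg.norm(sensor_xyz - p, axis=1) - ranges

    return least_squares(residuals, x0).x

# Illustrative use with made-up geometry (units: metres)
sensors = np.array([[0.00, 0.00, 0.00],
                    [0.08, 0.00, 0.00],
                    [0.04, 0.07, 0.00]])
true_tip = np.array([0.04, 0.03, 0.04])
times = np.linalg.norm(sensors - true_tip, axis=1) / C_TISSUE
print(locate_fog_tip(sensors, times))   # ~ [0.04, 0.03, 0.04]
```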

    SURGICAL NAVIGATION AND AUGMENTED REALITY FOR MARGINS CONTROL IN HEAD AND NECK CANCER

    Head and neck malignancies are a heterogeneous group of tumors with diverse pathological, epidemiological, and prognostic characteristics. Surgery represents the mainstay of treatment for the large majority of head and neck cancers, with ablation aimed at removing the tumor completely; radiotherapy and systemic therapy also have a substantial role in multidisciplinary management. The quality of surgical ablation is intimately related to margin status evaluated at a microscopic level. Indeed, margin involvement has a remarkably negative effect on the prognosis of patients and mandates escalation of postoperative treatment by adding concomitant chemotherapy to radiotherapy, accordingly increasing the toxicity of the overall treatment. The rate of margin involvement in the head and neck is among the highest in the entire field of surgical oncology.

    In this context, the present PhD project was aimed at testing the utility of two technologies, namely surgical navigation with three-dimensional rendering and pico-projector-based augmented reality, in decreasing the rate of involved margins during oncologic surgical ablations in the craniofacial area. Experiments were performed at the University of Brescia, the University of Padua, and the University Health Network (Toronto, Ontario, Canada). The research activities completed in the context of this PhD course demonstrated that surgical navigation with three-dimensional rendering confers a higher quality on oncologic ablations in the head and neck, irrespective of the open or endoscopic surgical technique. The benefits deriving from this implementation come with no relevant drawbacks from a logistical and practical standpoint, nor were major adverse events observed. Thus, implementation of this technology into standard care is the logical proposed step forward. However, the genuine presence of a prognostic advantage needs longer and larger studies to be formally addressed. On the other hand, pico-projector-based augmented reality showed no sufficient advantages to encourage translation into the clinical setting. Although a clear practical advantage was observed from projecting osteotomy lines onto the surgical field, no substantial benefits were measured when comparing this technology with surgical navigation with three-dimensional rendering. While recognizing a potential value of this technology from an educational standpoint, the performance displayed in the preclinical setting in terms of surgical margin optimization does not favor a clinical translation with this specific aim.

    Virtual 3D tumor marking-exact intraoperative coordinate mapping improve post-operative radiotherapy

    The quality of the interdisciplinary interface in oncological treatment between surgery, pathology, and radiotherapy depends mainly on reliable anatomical three-dimensional (3D) allocation of specimens and their context-sensitive interpretation, which defines further treatment protocols. Computer-assisted preoperative planning (CAPP) allows for outlining macroscopic tumor size and margins. A new technique facilitates 3D virtual marking and mapping of frozen sections and resection margins, or other important intraoperative surgical information. These data can be stored in DICOM (Digital Imaging and Communications in Medicine) format, used for augmented reality, and transferred to communicate patient-specific tumor information (invasion of vessels and nerves, non-resectable tumor) to oncologists, radiotherapists, and pathologists.
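    The exact coordinate mapping that such 3D marking depends on ultimately rests on the standard DICOM slice-to-patient geometry. A minimal sketch of that mapping, using the ImagePositionPatient, ImageOrientationPatient, and PixelSpacing attributes, is shown below; the helper name and usage are illustrative, not part of the paper.

```python
# Hedged sketch: mapping a pixel index in one DICOM slice to patient coordinates
# with the standard header attributes; the values would normally be read from the
# DICOM header (e.g. via pydicom), and the helper name is illustrative.
import numpy as np

def pixel_to_patient(row, col, image_position, image_orientation, pixel_spacing):
    """Return the patient-space (x, y, z) in mm of pixel (row, col) in a slice.

    image_position    : ImagePositionPatient, (x, y, z) of the first transmitted pixel
    image_orientation : ImageOrientationPatient, 6 direction cosines (row axis, then column axis)
    pixel_spacing     : PixelSpacing, (spacing between rows, spacing between columns) in mm
    """
    ipp = np.asarray(image_position, dtype=float)
    row_cos = np.asarray(image_orientation[:3], dtype=float)   # direction of increasing column index
    col_cos = np.asarray(image_orientation[3:], dtype=float)   # direction of increasing row index
    d_row, d_col = float(pixel_spacing[0]), float(pixel_spacing[1])
    return ipp + col * d_col * row_cos + row * d_row * col_cos
```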

    Goggle Augmented Imaging and Navigation System for Fluorescence-Guided Surgery

    Surgery remains the only curative option for most solid tumors. The standard of care usually involves tumor resection and sentinel lymph node biopsy for cancer staging. Surgeons rely on their vision and touch to distinguish healthy from cancerous tissue during surgery, often leading to incomplete tumor resection that necessitates repeat surgery. Sentinel lymph node biopsy by conventional radioactive tracking exposes patients and caregivers to ionizing radiation, while blue dye tracking stains the tissue and highlights only superficial lymph nodes. Improper identification of sentinel lymph nodes may lead to misdiagnosis of the cancer stage. Therefore, there is a clinical need for accurate intraoperative tumor and sentinel lymph node visualization.

    Conventional imaging modalities such as X-ray computed tomography, positron emission tomography, magnetic resonance imaging, and ultrasound are excellent for preoperative cancer diagnosis and surgical planning. However, they are not suitable for intraoperative use due to bulky, complicated hardware, high cost, non-real-time imaging, severe restrictions to the surgical workflow, and lack of sufficient resolution for tumor boundary assessment. This has propelled interest in fluorescence-guided surgery, given the availability of simple hardware that can achieve real-time, high-resolution, and sensitive imaging. Near-infrared fluorescence imaging is of particular interest because of low background absorbance by photoactive biomolecules, enabling thick-tissue assessment. As a result, several near-infrared fluorescence-guided surgery systems have been developed. However, they are limited by bulky hardware, disruptive information display, and a field of view that is not matched to the user's.

    To address these limitations, we have developed a compact, lightweight, and wearable goggle augmented imaging and navigation system (GAINS). It detects the near-infrared fluorescence from a tumor-accumulated contrast agent along with the normal color view and displays accurately aligned color-fluorescence images in real time via a head-mounted display worn by the surgeon. GAINS is a platform technology capable of very sensitive fluorescence detection. Image display options include both video see-through and optical see-through head-mounted displays for high-contrast image guidance as well as direct visual access to the surgical bed. Image capture options, from a large field-of-view camera as well as a high-magnification handheld microscope, ensure macroscopic as well as microscopic assessment of the tumor bed. Aided by tumor-targeted near-infrared contrast agents, GAINS guided complete tumor resection in subcutaneous, metastatic, and spontaneous mouse models of cancer with high sensitivity and specificity, in real time. Using a clinically approved near-infrared contrast agent, GAINS provided real-time image guidance for accurate visualization of lymph nodes in a porcine model and of sentinel lymph nodes in human breast cancer and melanoma patients with high sensitivity. This work has addressed issues that have limited the clinical adoption of fluorescence-guided surgery and paved the way for developing this approach towards standard-of-care practice that can potentially improve surgical outcomes in cancer.