237 research outputs found

    Engineering precision surgery: Design and implementation of surgical guidance technologies

    In the quest for precision surgery, this thesis introduces several novel detection and navigation modalities for the localization of cancer-related tissues in the operating room. The engineering efforts have focused on image-guided surgery modalities that use the complementary tracer signatures of nuclear and fluorescence radiation. The first part of the thesis covers the use of “GPS-like” navigation concepts to navigate fluorescence cameras during surgery, based on SPECT images of the patient. The second part of the thesis introduces several new imaging modalities, such as a hybrid 3D freehand fluorescence and freehand SPECT imaging and navigation device. Furthermore, to improve the detection of radioactive tracer emissions during robot-assisted laparoscopic surgery, a tethered DROP-IN gamma probe is introduced. The clinical indications used to evaluate the new technologies were all focused on sentinel lymph node procedures in urology (i.e. prostate and penile cancer). Nevertheless, all presented techniques are of such a nature that they can be applied to other surgical indications, including sentinel lymph node and tumor-receptor-targeted procedures and localization of the primary tumor and metastatic spread. This will hopefully contribute towards more precise, less invasive and more effective surgical procedures in the field of oncology.
    Crystal Photonics GmbH; Eurorad S.A.; Intuitive Surgical Inc.; KARL STORZ Endoscopie Nederland B.V.; MILabs B.V.; PI Medical Diagnostic Equipment B.V.; SurgicEye GmbH; Verb Surgical Inc.; LUMC / Geneeskund
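The “GPS-like” navigation described above hinges on rigidly registering tracked-tool coordinates to the preoperative SPECT volume. A minimal sketch of point-based rigid registration (the Kabsch/Procrustes solution on paired fiducials) is shown below; all coordinates and values are illustrative, not taken from the thesis:

```python
import numpy as np

def rigid_register(src, dst):
    """Best-fit rotation R and translation t mapping src points onto dst (Kabsch)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(0), dst.mean(0)           # centroids
    H = (src - cs).T @ (dst - cd)               # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Fiducials in tracker space and the same points in SPECT image space (mm)
tracker_pts = np.array([[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]], float)
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
image_pts = tracker_pts @ R_true.T + np.array([10.0, -5.0, 2.0])

R, t = rigid_register(tracker_pts, image_pts)
# Fiducial registration error: ~0 for these noise-free synthetic points
fre = np.linalg.norm(tracker_pts @ R.T + t - image_pts, axis=1).mean()
print(round(fre, 6))
```

With real tracker data the fiducial registration error is nonzero, and a target registration error at the lesion is the clinically relevant figure.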

    Building and validation of low-cost breast phantoms for interventional procedures

    Breast cancer is one of the cancers with the highest incidence in the female population. The current treatment for breast cancer is lumpectomy, a breast-conserving tumor excision procedure based on localizing the tumor with the help of hook-wire needle placement. Although this constitutes the standard approach in clinical practice, these procedures do not ensure complete removal of the lesion, as demonstrated by the high rate of positive margins. Improvements in these techniques are needed to reduce the number of second interventions, which usually involve mastectomy. This is where ultrasound-guided interventions with real-time position tracking find their place. The problem is that these techniques require a high level of expertise and present long learning curves. Therefore, training is needed to draw the highest potential from these tools and have a real impact on the lives of patients. For this purpose, breast phantoms were manufactured using liquid vinyl to achieve a material mimicking mammary tissue. The optimal manufacturing technique was determined based on a gold standard (a commercial phantom). CT and ultrasound imaging were used to assess the identification of lesions. In addition, the manufactured breast phantoms were evaluated by an expert clinician, and surgical navigation was tested. This was done with the purpose of validating the breast phantom as a training tool for improving the outcomes of these procedures. The results indicated that the optimized formula for manufacturing low-cost breast phantoms was suitable for training the skill set required in interventions related to breast cancer treatment.
    Biomedical Engineering (Plan 2010
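Navigation testing on a phantom like the one above typically reduces to comparing navigated needle-tip positions against known lesion centers. A minimal sketch of that targeting-error metric, using made-up measurements rather than data from the study:

```python
import math

def targeting_error(tip, lesion_center):
    """Euclidean distance (mm) between navigated needle tip and lesion center."""
    return math.dist(tip, lesion_center)

# Illustrative phantom measurements (mm) -- hypothetical, not from the study
trials = [
    ((31.2, 14.8, 22.1), (30.0, 15.0, 21.0)),
    ((48.9, 10.2, 18.5), (50.0, 10.0, 19.0)),
    ((25.4, 30.1, 24.9), (25.0, 31.0, 25.0)),
]
errors = [targeting_error(tip, center) for tip, center in trials]
mean_err = sum(errors) / len(errors)
print(f"per-trial errors (mm): {[round(e, 2) for e in errors]}")
print(f"mean targeting error: {mean_err:.2f} mm")
```

Reporting mean and worst-case error per lesion location is a common way to summarize such phantom validations.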

    Modular framework for a breast biopsy smart navigation system

    Master's dissertation in Informatics Engineering. Breast cancer is currently one of the most commonly diagnosed cancers and the fifth leading cause of cancer-related deaths. Its treatment has a higher survival rate when the disease is diagnosed in its early stages. The screening procedure uses medical imaging techniques, such as mammography or ultrasound, to discover possible lesions. When a physician finds a lesion that is likely to be malignant, a biopsy is performed to obtain a sample and determine its characteristics. Currently, real-time ultrasound is the preferred medical imaging modality for this procedure. The breast biopsy procedure is highly reliant on the operator’s skill and experience, due to the difficulty of interpreting ultrasound images and correctly aiming the needle. Robotic solutions, and the use of automatic lesion segmentation in ultrasound imaging along with advanced visualization techniques such as augmented reality, can potentially make this process simpler, safer, and faster. The OncoNavigator project, of which this dissertation is part, aims to improve the precision of current breast cancer interventions. To accomplish this objective, various medical training and robotic biopsy aids were developed. An augmented reality ultrasound training solution was created, and the device’s tracking capabilities were validated by comparing it with an electromagnetic tracking device. Another solution, for ultrasound-guided breast biopsy assisted with augmented reality, was developed. This solution displays real-time ultrasound video, automatic lesion segmentation, and the biopsy needle trajectory in the user’s field of view. This solution was validated by comparing its usability with the traditional procedure. A modular software framework was also developed that focuses on the integration of a collaborative medical robot with real-time ultrasound imaging and automatic lesion segmentation.
Overall, the developed solutions offered good results. The augmented reality glasses’ tracking capabilities proved to be as capable as the electromagnetic system, and the augmented-reality-assisted breast biopsy proved to make the procedure more accurate and precise than the traditional system.
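Displaying the needle trajectory in the user’s field of view, as described above, requires projecting tracked 3D points through the headset camera model. A minimal pinhole-projection sketch follows; the intrinsics and needle path are illustrative assumptions, not parameters from the dissertation:

```python
import numpy as np

def project(points_cam, fx, fy, cx, cy):
    """Pinhole projection of 3D camera-frame points (m) to pixel coordinates."""
    pts = np.asarray(points_cam, float)
    u = fx * pts[:, 0] / pts[:, 2] + cx
    v = fy * pts[:, 1] / pts[:, 2] + cy
    return np.stack([u, v], axis=1)

# Hypothetical intrinsics and a straight needle path in front of the camera
fx = fy = 600.0
cx, cy = 320.0, 240.0
entry = np.array([0.00, 0.00, 0.30])   # tracked needle entry point (m)
target = np.array([0.02, 0.01, 0.35])  # planned target (m)
samples = entry + np.linspace(0, 1, 5)[:, None] * (target - entry)
pixels = project(samples, fx, fy, cx, cy)
print(pixels[0], pixels[-1])  # overlay endpoints in image coordinates
```

A real AR overlay would also compose the tracker-to-camera transform before projecting, and redraw per frame as the head moves.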

    Magnetic resonance imaging and navigation of ferromagnetic thermoseeds to deliver thermal ablation therapy

    Minimally invasive therapies aim to deliver effective treatment whilst reducing off-target burden, limiting side effects, and shortening patient recovery times. Remote navigation of untethered devices is one method that can be used to deliver targeted treatment to deep and otherwise inaccessible locations within the body. Minimally invasive image-guided ablation (MINIMA) is a novel thermal ablation therapy for the treatment of solid tumours, whereby an untethered ferromagnetic thermoseed is navigated through tissue to a target site within the body using the magnetic field gradients generated by a magnetic resonance imaging (MRI) system. Once at the tumour, the thermoseed is heated remotely using an alternating magnetic field to induce cell death in the surrounding cancer tissue. The thermoseed is then navigated through the tumour, heating at pre-defined locations until the entire volume has been ablated. The aim of this PhD project is to develop MINIMA through a series of proof-of-concept studies and to assess the efficacy of the three key project components: imaging, navigation, and heating. First, an MR imaging sequence was implemented to track the thermoseeds during navigation and subsequently assessed for precision and accuracy. Second, movement of the thermoseeds through a viscous fluid was characterised by measuring the effect of different navigation parameters. This was followed by navigation experiments performed in ex vivo tissue. To assess thermoseed heating, a series of in vitro experiments was conducted in air, water, and ex vivo liver tissue, before moving on to in vivo experiments in the rat brain and a murine subcutaneous tumour model. These final experiments allowed the extent of cell death induced by thermoseed heating to be determined in healthy and diseased tissue, respectively.
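The extent of cell death from localized heating such as thermoseed ablation is commonly quantified with the cumulative-equivalent-minutes-at-43 °C (CEM43) thermal-dose model of Sapareto and Dewey. The thesis does not state that it uses this metric; the sketch below is a generic illustration with a made-up heating profile:

```python
def cem43(temps_c, dt_s):
    """Cumulative equivalent minutes at 43 degC for a sampled temperature series.

    Sapareto-Dewey model: R = 0.5 at or above 43 degC, 0.25 below it.
    """
    dose_min = 0.0
    for t in temps_c:
        r = 0.5 if t >= 43.0 else 0.25
        dose_min += (dt_s / 60.0) * r ** (43.0 - t)
    return dose_min

# Illustrative heating profile: 10 s ramp from 37 to 50 degC, then hold,
# sampled once per second for 60 s in total
profile = [37 + 13 * min(i / 10.0, 1.0) for i in range(60)]
dose = cem43(profile, dt_s=1.0)
print(f"thermal dose: {dose:.0f} CEM43")
```

Thresholds on the order of 240 CEM43 are often cited for complete necrosis in soft tissue, though the threshold is tissue-dependent.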

    Advancements and Breakthroughs in Ultrasound Imaging

    Ultrasonic imaging is a powerful diagnostic tool available to medical practitioners, engineers and researchers today. Due to its relative safety and non-invasive nature, ultrasonic imaging has become one of the most rapidly advancing technologies. These rapid advances are directly related to parallel advancements in electronics, computing, and transducer technology, together with sophisticated signal processing techniques. This book focuses on state-of-the-art developments in ultrasonic imaging applications and the underlying technologies, presented by leading practitioners and researchers from many parts of the world.

    Optimization of computer-assisted intraoperative guidance for complex oncological procedures

    International Mention in the doctoral degree. The role of technology inside the operating room is constantly increasing, allowing surgical procedures previously considered impossible or too risky due to their complexity or limited access. These reliable tools have improved surgical efficiency and safety. Cancer treatment is one of the surgical specialties that has benefited most from these techniques, due to its high incidence and the accuracy required for tumor resections with conservative approaches and clear margins. However, in many cases, introducing these technologies into surgical scenarios is expensive and entails complex setups that are obtrusive and invasive and increase the operative time. In this thesis, we proposed convenient, accessible, reliable, and non-invasive solutions for two highly complex regions for tumor resection surgeries: the pelvis and the head and neck. We explored how the introduction of 3D printing, surgical navigation, and augmented reality in these scenarios provided high intraoperative precision. First, we presented a less invasive setup for osteotomy guidance in pelvic tumor resections based on small patient-specific instruments (PSIs) fabricated with a desktop 3D printer at low cost. We evaluated their accuracy in a cadaveric study, following a realistic workflow, and obtained results similar to previous studies with more invasive setups. We also identified the ilium as the region most prone to errors. Then, we proposed surgical navigation using these small PSIs for image-to-patient registration. Artificial landmarks included in the PSIs substitute the anatomical landmarks and the bone surface commonly used for this step, which require additional bone exposure and are, therefore, more invasive. We also presented an alternative and more convenient installation of the dynamic reference frame used to track patient movements in surgical navigation.
The reference frame is inserted into a socket included in the PSIs and can be attached and detached without losing precision, simplifying the installation. We validated the setup in a cadaveric study, evaluating the accuracy and finding the optimal PSI configuration in the three most common scenarios for pelvic tumor resection. The results demonstrated high accuracy, with the main source of error again being incorrect placement of PSIs in regular, homogeneous regions such as the ilium. The main limitation of PSIs is the guidance error resulting from incorrect placement. To overcome this issue, we proposed augmented reality as a tool to guide PSI installation on the patient’s bone. We developed an application for smartphones and HoloLens 2 that displays the correct position intraoperatively. We measured the placement errors in a conventional phantom and in a realistic phantom that includes a silicone layer to simulate soft tissue. The results demonstrated a significant reduction of errors with augmented reality compared to freehand placement, ensuring installation of the PSI close to the target area. Finally, we proposed three setups for surgical navigation in palate tumor resections, using optical trackers and augmented reality. The tracking tools for the patient and surgical instruments were fabricated with low-cost desktop 3D printers and designed to provide less invasive setups compared to previous solutions. All setups presented similar results with high accuracy when tested on a 3D-printed patient-specific phantom. They were then validated in the real surgical case, and one of the solutions was applied for intraoperative guidance. Postoperative results demonstrated high navigation accuracy, obtaining optimal surgical outcomes. The proposed solution enabled a conservative surgical approach with a less invasive navigation setup. To conclude, in this thesis we have proposed new setups for intraoperative navigation in two complex surgical scenarios for tumor resection.
We analyzed their navigation precision, defining the optimal configurations to ensure accuracy. With this, we have demonstrated that computer-assisted surgery techniques can be integrated into the surgical workflow with accessible and non-invasive setups. These results are a step towards optimizing these procedures and continuing to improve surgical outcomes in complex surgical scenarios.
Doctoral Program in Biomedical Science and Technology, Universidad Carlos III de Madrid. President: Raúl San José Estépar. Secretary: Alba González Álvarez. Committee member: Simon Droui
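The dynamic reference frame discussed in this thesis is the standard way navigation systems cancel out patient motion: instrument poses are expressed relative to a patient-attached frame rather than the tracker. A minimal sketch of that transform composition with illustrative 4x4 poses (not data from the study):

```python
import numpy as np

def hom(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def tool_in_reference(T_tracker_tool, T_tracker_ref):
    """Instrument pose relative to the patient-attached reference frame."""
    return np.linalg.inv(T_tracker_ref) @ T_tracker_tool

def Rz(a):
    """Rotation about the z axis by angle a (radians)."""
    return np.array([[np.cos(a), -np.sin(a), 0],
                     [np.sin(a),  np.cos(a), 0],
                     [0, 0, 1]])

# Illustrative poses of reference frame and tool, both seen by the tracker (mm)
T_ref  = hom(Rz(np.deg2rad(15)), [100.0, 50.0, 0.0])
T_tool = hom(Rz(np.deg2rad(15)), [110.0, 50.0, 0.0])
T_rel = tool_in_reference(T_tool, T_ref)

# If the patient (with the attached reference) moves, T_rel is unchanged:
T_move = hom(Rz(np.deg2rad(40)), [-20.0, 5.0, 3.0])
T_rel_moved = tool_in_reference(T_move @ T_tool, T_move @ T_ref)
print(np.allclose(T_rel, T_rel_moved))  # True
```

This invariance is why the reference frame can sit in a detachable socket: only its rigid attachment to the bone matters, not where the patient lies under the tracker.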

    A Comprehensive Framework for Image Guided Breast Surgery


    Personalized medicine in surgical treatment combining tracking systems, augmented reality and 3D printing

    International Mention in the doctoral degree. In the last twenty years, a new way of practicing medicine has focused on the problems and needs of each patient as an individual, thanks to significant advances in healthcare technology: so-called personalized medicine. In surgical treatments, personalization has been possible thanks to key technologies adapted to the specific anatomy of each patient and the needs of the physicians. Tracking systems, augmented reality (AR), three-dimensional (3D) printing and artificial intelligence (AI) have previously supported this individualized medicine in many ways. However, their independent contributions show several limitations in terms of patient-to-image registration, lack of flexibility to adapt to the requirements of each case, long preoperative planning times, and navigation complexity. The main objective of this thesis is to increase patient personalization in surgical treatments by combining these technologies to bring surgical navigation to new complex cases: developing new patient registration methods, designing patient-specific tools, facilitating access to augmented reality for the medical community, and automating surgical workflows. In the first part of this dissertation, we present a novel framework for acral tumor resection combining intraoperative open-source navigation software, based on an optical tracking system, with desktop 3D printing. We used additive manufacturing to create a patient-specific mold that maintained the distal extremity in the same position during image-guided surgery as in the preoperative images. The feasibility of the proposed workflow was evaluated in two clinical cases (soft-tissue sarcomas in the hand and foot). We achieved an overall system accuracy of 1.88 mm, evaluated on the patient-specific 3D-printed phantoms. Surgical navigation was feasible during both surgeries, allowing surgeons to verify the tumor resection margin.
Then, we propose an augmented reality navigation system that uses 3D-printed surgical guides with a tracking pattern, enabling automatic patient-to-image registration in orthopedic oncology. This specific tool fits on the patient in only one pre-designed location, in this case on bone tissue. This solution has been developed as a software application running on Microsoft HoloLens. The workflow was validated on a 3D-printed phantom replicating the anatomy of a patient presenting with an extraosseous Ewing’s sarcoma, and then tested during the actual surgical intervention. The results showed that the surgical guide with the reference marker can be placed precisely, with an accuracy of 2 mm and a visualization error lower than 3 mm. The application allowed physicians to visualize the skin, bone, tumor and medical images overlaid on the phantom and patient. To enable the use of AR and 3D printing by inexperienced users without broad technical knowledge, we designed a step-by-step methodology. The proposed protocol describes how to develop an AR smartphone application that superimposes any patient-based 3D model onto the real-world environment using a 3D-printed marker tracked by the smartphone camera. Our solution brings AR closer to the final clinical user, combining free and open-source software with an open-access protocol. The proposed guide is already helping to accelerate the adoption of these technologies by medical professionals and researchers. In the next section of the thesis, we show the benefits of combining these technologies during different stages of the surgical workflow in orthopedic oncology. We designed a novel AR-based smartphone application that can display the patient’s anatomy and the tumor’s location. A 3D-printed reference marker, designed to fit in a unique position on the affected bone tissue, enables automatic registration.
The system was evaluated in terms of visualization accuracy and usability during the whole surgical workflow on six realistic phantoms, achieving a visualization error below 3 mm. The AR system was then tested in two clinical cases during surgical planning, patient communication, and surgical intervention. These results and the positive feedback obtained from surgeons and patients suggest that the combination of AR and 3D printing can improve efficacy, accuracy, and patients’ experience. In the final section, two surgical navigation systems were developed and evaluated to guide electrode placement in sacral neurostimulation (SNS) procedures, based on optical tracking and augmented reality. Our results show that both systems could minimize patient discomfort and improve surgical outcomes by reducing needle insertion time and the number of punctures. Additionally, we proposed a feasible clinical workflow for guiding SNS interventions with both navigation methodologies, including automatically creating sacral virtual 3D models for trajectory definition using artificial intelligence, and intraoperative patient-to-image registration. To conclude, in this thesis we have demonstrated that the combination of technologies such as tracking systems, augmented reality, 3D printing, and artificial intelligence overcomes many current limitations in surgical treatments. Our results encourage the medical community to combine these technologies to improve surgical workflows and outcomes in more clinical scenarios.
Doctoral Program in Biomedical Science and Technology, Universidad Carlos III de Madrid. President: María Jesús Ledesma Carbayo. Secretary: María Arrate Muñoz Barrutia. Committee member: Csaba Pinte
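Automatically created anatomical models, such as the AI-generated sacral models mentioned in this thesis, are conventionally validated by overlap with manual expert delineations using the Dice coefficient. A minimal sketch with tiny synthetic voxel sets (illustrative only, not data from the thesis):

```python
def dice(a, b):
    """Dice overlap between two binary masks given as sets of voxel indices."""
    a, b = set(a), set(b)
    return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0

# Illustrative voxel sets: AI-predicted vs. manually delineated structure
pred   = {(x, y, 0) for x in range(10) for y in range(10)}     # 100 voxels
manual = {(x, y, 0) for x in range(2, 12) for y in range(10)}  # 100 voxels, shifted
print(f"Dice = {dice(pred, manual):.2f}")
```

Dice ranges from 0 (no overlap) to 1 (perfect agreement); surface-distance metrics are often reported alongside it when the model is used to define needle trajectories.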