8 research outputs found

    Building and validation of low-cost breast phantoms for interventional procedures

    Breast cancer is one of the cancers with the highest incidence in the female population. The current treatment for breast cancer is lumpectomy, a breast-conserving tumor excision procedure in which the tumor is localized with the help of hook-wire needle placement. Although this constitutes the standard approach in clinical practice, these procedures do not ensure complete removal of the lesion, as demonstrated by the high rate of positive margins. Improvements in these techniques are needed to reduce the number of second interventions, which usually involve mastectomy. This is where ultrasound-guided interventions with real-time position tracking find their place. The problem is that these techniques require a high level of expertise and present long learning curves. Training is therefore needed to realize the full potential of these tools and have a real impact on patients' lives. For this purpose, breast phantoms were manufactured using liquid vinyl to mimic mammary tissue. The optimal manufacturing technique was determined against a gold standard (a commercial phantom). CT and ultrasound imaging were used to assess the identification of lesions. In addition, the manufactured breast phantoms were evaluated by an expert clinician and surgical navigation was tested, with the purpose of validating the breast phantom as a training tool for improving the outcomes of these procedures. The results indicated that the optimized formula for manufacturing low-cost breast phantoms was suitable for training the skill set required in interventions related to breast cancer treatment. Ingeniería Biomédica (Plan 2010)

    Advanced Endoscopic Navigation: Surgical Big Data, Methodology, and Applications

    With the rapid development of science and technology, health and environmental problems have become among the most significant challenges facing humanity. Research at the intersection of information science, computer technology, electronic engineering, and biomedical engineering applies modern engineering methods to the early diagnosis, treatment, and rehabilitation of diseases such as cancer. This work reviews computer-assisted minimally invasive surgical navigation, multimodal medical big data, their methodology, and their clinical applications. Starting from the concept of minimally invasive surgical navigation, it introduces preoperative and intraoperative multimodal medical imaging methods for medical big data; describes the core workflow of advanced surgical navigation, including computational anatomical models, intraoperative real-time navigation schemes, three-dimensional visualization, and interactive software techniques; and summarizes the clinical applications of the various minimally invasive surgical procedures. It also discusses the strengths and weaknesses of surgical navigation technologies in clinical use worldwide and analyzes the latest technical methods in the field. On this basis, it identifies a trend in minimally invasive surgery toward digitalization, personalization, precision, integrated diagnosis and treatment, robotization, and high levels of intelligence.
    Interventional endoscopy (e.g., bronchoscopy, colonoscopy, laparoscopy, cystoscopy) is a widely performed procedure that involves either diagnosis of suspicious lesions or guidance for minimally invasive surgery in a variety of organs within the body cavity. Endoscopy may also be used to guide the introduction of certain items (e.g., stents) into the body. Endoscopic navigation systems seek to integrate big data with multimodal information (e.g., computed tomography, magnetic resonance images, endoscopic video sequences, ultrasound images, external trackers) relative to the patient's anatomy, control the movement of medical endoscopes and surgical tools, and guide the surgeon's actions during endoscopic interventions. Nevertheless, it remains challenging to realize the next generation of context-aware navigated endoscopy. This review presents a broad survey of various aspects of endoscopic navigation, particularly with respect to the development of endoscopic navigation techniques. First, we investigate big data with multimodal information involved in endoscopic navigation. Next, we focus on numerous methodologies used for endoscopic navigation. We then review different endoscopic procedures in clinical applications. Finally, we discuss novel techniques and promising directions for the development of endoscopic navigation. X.L. acknowledges funding from the Fundamental Research Funds for the Central Universities. T.M.P. acknowledges funding from the Canadian Foundation for Innovation, the Canadian Institutes for Health Research, the National Sciences and Engineering Research Council of Canada, and a grant from Intuitive Surgical Inc.
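
    For context, the core computation in any such navigation system is a chain of coordinate transforms that expresses a tracked instrument in the frame of the preoperative images. The sketch below is a generic illustration in Python, not code from the review; the frame names, the 4x4 homogeneous convention, and the example numbers are assumptions.

```python
import numpy as np

def to_homogeneous(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def tip_in_image(T_image_from_tracker: np.ndarray,
                 T_tracker_from_tool: np.ndarray,
                 tip_in_tool: np.ndarray) -> np.ndarray:
    """Map the endoscope tip (known in the tool frame) into image/CT coordinates
    by composing the patient-to-image registration with the live tracker pose."""
    tip_h = np.append(tip_in_tool, 1.0)  # homogeneous point
    return (T_image_from_tracker @ T_tracker_from_tool @ tip_h)[:3]

# Hypothetical example: identity registration, tool translated 10 mm along x,
# tip offset 50 mm along the tool's z axis.
T_reg = np.eye(4)
T_pose = to_homogeneous(np.eye(3), np.array([10.0, 0.0, 0.0]))
print(tip_in_image(T_reg, T_pose, np.array([0.0, 0.0, 50.0])))  # -> [10.  0. 50.]
```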

    ConoSurf: Open-source 3D scanning system based on a conoscopic holography device for acquiring surgical surfaces

    Background. A difficulty in computer-assisted interventions is acquiring the patient's anatomy intraoperatively. Standard modalities have several limitations: low image quality (ultrasound), radiation exposure (computed tomography) or high costs (magnetic resonance imaging). An alternative approach uses a tracked pointer; however, the pointer causes tissue deformation and requires sterilizing. Recent proposals, utilizing a tracked conoscopic holography device, have shown promising results without the previously mentioned drawbacks. Methods. We have developed an open-source software system that enables real-time surface scanning using a conoscopic holography device and a wide variety of tracking systems, integrated into pre-existing and well-supported software solutions. Results. The mean target registration error of point measurements was 1.46 mm. For a quick guidance scan, surface reconstruction improved the surface registration error compared with point-set registration. Conclusions. We have presented a system enabling real-time surface scanning using a tracked conoscopic holography device. Results show that it can be useful for acquiring the patient's anatomy during surgery. Funding information: Comunidad de Madrid, Grant/Award Number TOPUS-CM S2013/MIT-3024; Ministerio de Economía y Competitividad, ISCIII, Grant/Award Numbers PI15/02121, DTS14/00192, TEC2013-48251-C2-1-R; FEDER funds
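
    As a rough illustration of the measurement principle, a conoscopic reading is a single distance along the device's laser axis, which the tracked device pose turns into a 3D surface point; the mean target registration error is then the mean distance between measured and reference targets. The snippet below is a minimal sketch assuming a z-aligned laser axis and an already calibrated device, not the ConoSurf implementation itself; all names are illustrative.

```python
import numpy as np

def surface_point_from_conoprobe(T_tracker_from_device: np.ndarray,
                                 distance_mm: float) -> np.ndarray:
    """Convert one conoscopic distance reading into a 3D surface point in tracker
    coordinates: the measured point lies `distance_mm` along the device's laser
    axis (assumed here to be the device's +z axis)."""
    point_in_device = np.array([0.0, 0.0, distance_mm, 1.0])  # homogeneous
    return (T_tracker_from_device @ point_in_device)[:3]

def mean_target_registration_error(measured: np.ndarray, reference: np.ndarray) -> float:
    """Mean Euclidean distance (mm) between N measured and N reference targets,
    both given as (N, 3) arrays in the same coordinate frame."""
    return float(np.mean(np.linalg.norm(measured - reference, axis=1)))
```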

    Desktop 3D Printing: Key for Surgical Navigation in Acral Tumors?

    Surgical navigation techniques have shown potential benefits in orthopedic oncologic surgery. However, translating these results to acral tumor resection surgeries is challenging due to the large number of joints with complex movements in the affected areas (located in the distal extremities). This study proposes a surgical workflow that combines intraoperative open-source navigation software, based on multi-camera tracking, with desktop three-dimensional (3D) printing for accurate navigation of these tumors. Desktop 3D printing was used to fabricate patient-specific molds that ensure the distal extremity is in the same position both in the preoperative images and during image-guided surgery (IGS). The feasibility of the proposed workflow was evaluated in two clinical cases (soft-tissue sarcomas in the hand and foot). The validation involved deformation analysis of the 3D-printed mold after sterilization, accuracy of the system on patient-specific 3D-printed phantoms, and feasibility of the workflow during the surgical intervention. The sterilization process did not lead to significant deformation of the mold (mean error below 0.20 mm). The overall accuracy of the system, evaluated on the phantoms, was 1.88 mm. IGS guidance was feasible during both surgeries, allowing surgeons to verify a sufficient margin during tumor resection. These results demonstrate the viability of combining open-source navigation and desktop 3D printing for acral tumor surgeries. The suggested framework can be easily personalized to any patient and could be adapted to other surgical scenarios. This work was supported by projects TEC2013-48251-C2-1-R (Ministerio de Economía y Competitividad); PI18/01625 and PI15/02121 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III and European Regional Development Fund “Una manera de hacer Europa”) and IND2018/TIC-9753 (Comunidad de Madrid).
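
    The mold deformation analysis mentioned above can be approximated by comparing the printed surface before and after sterilization as point clouds. The snippet below is a generic sketch of such a metric (mean nearest-neighbor distance), not the exact evaluation pipeline used in the study; the function name and the use of SciPy are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_surface_deviation(points_before: np.ndarray, points_after: np.ndarray) -> float:
    """Mean nearest-neighbor distance (mm) from each vertex of the pre-sterilization
    mold surface to the post-sterilization surface, both given as (N, 3) arrays."""
    tree = cKDTree(points_after)              # index the post-sterilization vertices
    distances, _ = tree.query(points_before)  # closest post-sterilization point per vertex
    return float(distances.mean())
```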

    Personalized medicine in surgical treatment combining tracking systems, augmented reality and 3D printing

    Mención Internacional en el título de doctor. In the last twenty years, a new way of practicing medicine has focused on the problems and needs of each patient as an individual, thanks to significant advances in healthcare technology: the so-called personalized medicine. In surgical treatments, personalization has been possible thanks to key technologies adapted to the specific anatomy of each patient and the needs of the physicians. Tracking systems, augmented reality (AR), three-dimensional (3D) printing and artificial intelligence (AI) have previously supported this individualized medicine in many ways. However, their independent contributions show several limitations in terms of patient-to-image registration, lack of flexibility to adapt to the requirements of each case, long preoperative planning times, and navigation complexity. The main objective of this thesis is to increase patient personalization in surgical treatments by combining these technologies to bring surgical navigation to new complex cases, developing new patient registration methods, designing patient-specific tools, facilitating access to augmented reality for the medical community, and automating surgical workflows. In the first part of this dissertation, we present a novel framework for acral tumor resection combining intraoperative open-source navigation software, based on an optical tracking system, and desktop 3D printing. We used additive manufacturing to create a patient-specific mold that maintained the distal extremity in the same position during image-guided surgery as in the preoperative images. The feasibility of the proposed workflow was evaluated in two clinical cases (soft-tissue sarcomas in the hand and foot). We achieved an overall system accuracy of 1.88 mm, evaluated on patient-specific 3D printed phantoms. Surgical navigation was feasible during both surgeries, allowing surgeons to verify the tumor resection margin. Then, we propose an augmented reality navigation system that uses 3D printed surgical guides with a tracking pattern, enabling automatic patient-to-image registration in orthopedic oncology. This specific tool fits on the patient only in a pre-designed location, in this case on bone tissue. The solution was developed as a software application running on Microsoft HoloLens. The workflow was validated on a 3D printed phantom replicating the anatomy of a patient presenting an extraosseous Ewing's sarcoma, and then tested during the actual surgical intervention. The results showed that the surgical guide with the reference marker can be placed precisely, with an accuracy of 2 mm and a visualization error lower than 3 mm. The application allowed physicians to visualize the skin, bone, tumor and medical images overlaid on the phantom and the patient. To enable the use of AR and 3D printing by inexperienced users without broad technical knowledge, we designed a step-by-step methodology. The proposed protocol describes how to develop an AR smartphone application that superimposes any patient-based 3D model onto the real-world environment using a 3D printed marker tracked by the smartphone camera. Our solution brings AR closer to the final clinical user by combining free and open-source software with an open-access protocol. The proposed guide is already helping to accelerate the adoption of these technologies by medical professionals and researchers.
In the next section of the thesis, we show the benefits of combining these technologies during different stages of the surgical workflow in orthopedic oncology. We designed a novel AR-based smartphone application that can display the patient's anatomy and the tumor's location. A 3D printed reference marker, designed to fit in a unique position on the affected bone tissue, enables automatic registration. The system was evaluated in terms of visualization accuracy and usability during the whole surgical workflow on six realistic phantoms, achieving a visualization error below 3 mm. The AR system was tested in two clinical cases during surgical planning, patient communication, and surgical intervention. These results and the positive feedback obtained from surgeons and patients suggest that the combination of AR and 3D printing can improve efficacy, accuracy, and the patient experience. In the final section, two surgical navigation systems, based on optical tracking and augmented reality, were developed and evaluated to guide electrode placement in sacral neurostimulation (SNS) procedures. Our results show that both systems could minimize patient discomfort and improve surgical outcomes by reducing needle insertion time and the number of punctures. Additionally, we proposed a feasible clinical workflow for guiding SNS interventions with both navigation methodologies, including the automatic creation of virtual 3D sacral models for trajectory definition using artificial intelligence, and intraoperative patient-to-image registration. To conclude, in this thesis we have demonstrated that combining technologies such as tracking systems, augmented reality, 3D printing, and artificial intelligence overcomes many current limitations in surgical treatments. Our results encourage the medical community to combine these technologies to improve surgical workflows and outcomes in more clinical scenarios. Programa de Doctorado en Ciencia y Tecnología Biomédica por la Universidad Carlos III de Madrid. Presidenta: María Jesús Ledesma Carbayo. Secretaria: María Arrate Muñoz Barrutia. Vocal: Csaba Pinte
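
The patient-to-image registration step that recurs throughout this work can, for paired landmarks (for example, points sampled on a 3D printed guide and their counterparts in the preoperative model), be solved in closed form. The following is a generic Kabsch/Horn least-squares sketch, not the thesis implementation; the function name and point pairing are assumptions.

```python
import numpy as np

def rigid_registration(patient_pts: np.ndarray, image_pts: np.ndarray) -> np.ndarray:
    """Least-squares rigid transform (Kabsch/Horn via SVD) mapping paired (N, 3)
    patient-space landmarks onto their image-space counterparts; returns a 4x4
    homogeneous matrix T such that image_pt ~= T @ patient_pt."""
    cp, ci = patient_pts.mean(axis=0), image_pts.mean(axis=0)
    H = (patient_pts - cp).T @ (image_pts - ci)   # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = ci - R @ cp
    return T
```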

    Advances in Biomedical Applications and Assessment of Ultrasound Nonrigid Image Registration.

    Image volume based registration (IVBaR) is the process of determining a one-to-one transformation between points in two images that quantitatively relates the information in one image to that in the other. IVBaR is done primarily to spatially align the two images in the same coordinate system in order to allow better comparison and visualization of changes. The potential use of IVBaR has been explored in three different contexts. In a preliminary study on biometric identification from internal finger structure, a semi-automated IVBaR-based study provided a sensitivity and specificity of 0.93 and 1.00, respectively. Visual matching of all image pairs by four readers yielded a 96% successful match. IVBaR could potentially be useful for routine breast cancer screening and diagnosis. Nearly whole-breast ultrasound (US) scanning with mammographic-style compression and successful IVBaR were achieved. The image volumes were registered off-line with a mutual information cost function and global interpolation based on non-rigid thin-plate spline deformation. This Institutional Review Board approved study was conducted on 10 patients undergoing chemotherapy and 14 patients with a suspicious/unknown mass scheduled to undergo biopsy. IVBaR was successful, with a mean registration error (MRE) of 5.2±2 mm, in 12 of 17 ABU image pairs collected before, during or after 115±14 days of chemotherapy. Semi-automated tumor volume estimation performed on the registered image volumes gave 86±8% mean accuracy compared with radiologist hand-segmented tumor volumes in 7 cases, with a correlation coefficient of 0.99 (p<0.001). In a reader study in which 3 radiologists were asked to mark the tumor boundary, a significant reduction in the time taken (p<0.03) was seen with IVBaR in 6 cases. Three new methods were developed for independent validation of IVBaR based on Doppler US signals. Non-rigid registration tools were also applied to the interventional guidance of medical tools used in minimally invasive surgery. The mean positional error in a CT scanner environment improved from 3.9±1.5 mm to 1.0±0.3 mm (p<0.0002). These results show that 3D image volumes and data can be spatially aligned using non-rigid registration for comparison as well as quantification of changes. Ph.D., Applied Physics, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/64802/1/gnarayan_1.pd
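
    The mutual information cost function mentioned above can be illustrated with a simple histogram-based estimate over two co-sampled volumes. This is a generic sketch (the bin count and NumPy-only implementation are assumptions), not the dissertation's off-line registration code.

```python
import numpy as np

def mutual_information(img_a: np.ndarray, img_b: np.ndarray, bins: int = 64) -> float:
    """Histogram-based mutual information between two co-sampled image volumes,
    the kind of similarity measure maximized during intensity-based registration."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = joint / joint.sum()                  # joint intensity distribution
    p_a = p_ab.sum(axis=1, keepdims=True)       # marginal of image A
    p_b = p_ab.sum(axis=0, keepdims=True)       # marginal of image B
    nonzero = p_ab > 0
    return float(np.sum(p_ab[nonzero] * np.log(p_ab[nonzero] / (p_a @ p_b)[nonzero])))
```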

    The Largest Unethical Medical Experiment in Human History

    This monograph describes the largest unethical medical experiment in human history: the implementation and operation of non-ionizing non-visible EMF radiation (hereafter called wireless radiation) infrastructure for communications, surveillance, weaponry, and other applications. It is unethical because it violates the key ethical medical experiment requirement for “informed consent” by the overwhelming majority of the participants. The monograph provides background on unethical medical research/experimentation, and frames the implementation of wireless radiation within that context. The monograph then identifies a wide spectrum of adverse effects of wireless radiation as reported in the premier biomedical literature for over seven decades. Even though many of these reported adverse effects are extremely severe, the true extent of their severity has been grossly underestimated. Most of the reported laboratory experiments that produced these effects are not reflective of the real-life environment in which wireless radiation operates. Many experiments do not include pulsing and modulation of the carrier signal, and most do not account for synergistic effects of other toxic stimuli acting in concert with the wireless radiation. These two factors greatly exacerbate the severity of the adverse effects from wireless radiation, and their neglect in current (and past) experimentation results in substantial underestimation of the breadth and severity of adverse effects to be expected in a real-life situation. This lack of credible safety testing, combined with depriving the public of the opportunity to provide informed consent, contextualizes the wireless radiation infrastructure operation as an unethical medical experiment.