
    Intraoperative Navigation Systems for Image-Guided Surgery

    Recent technological advancements in medical imaging equipment have resulted in a dramatic improvement in image accuracy, now capable of providing useful information previously not available to clinicians. In the surgical context, intraoperative imaging provides crucial value for the success of the operation. Many nontrivial scientific and technical problems need to be addressed in order to efficiently exploit the different information sources available in advanced operating rooms. In particular, it is necessary to provide: (i) accurate tracking of surgical instruments, (ii) real-time matching of images from different modalities, and (iii) reliable guidance toward the surgical target. Satisfying all of these requisites is needed to realize effective intraoperative navigation systems for image-guided surgery. Various solutions have been proposed and successfully tested in the field of image navigation systems in the last ten years; nevertheless, several problems still arise in most applications regarding the precision, usability and capabilities of existing systems. Identifying and solving these issues represents an urgent scientific challenge. This thesis investigates the current state of the art in the field of intraoperative navigation systems, focusing in particular on the challenges related to efficient and effective usage of ultrasound imaging during surgery. The main contributions of this thesis to the state of the art are: (i) techniques for automatic motion compensation and therapy monitoring applied to a novel ultrasound-guided surgical robotic platform in the context of abdominal tumor thermoablation; (ii) novel image-fusion-based navigation systems for ultrasound-guided neurosurgery in the context of brain tumor resection, highlighting their applicability as off-line surgical training instruments. The proposed systems, which were designed and developed in the framework of two international research projects, have been tested in real or simulated surgical scenarios, showing promising results toward their application in clinical practice.
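
A building block commonly used for the kind of automatic motion compensation mentioned above is template matching between consecutive B-mode frames. The Python sketch below illustrates that general idea with normalized cross-correlation over a small search window; the function and the synthetic frames are illustrative assumptions, not the thesis's actual algorithm.

```python
import numpy as np

def estimate_shift(prev_frame, curr_frame, template_box, search_margin=10):
    """Estimate in-plane motion of a region between two B-mode frames.

    prev_frame, curr_frame : 2D arrays (grayscale ultrasound frames).
    template_box           : (row, col, height, width) of the tracked region.
    search_margin          : search radius in pixels around the original position.
    Returns the (d_row, d_col) shift maximizing normalized cross-correlation.
    """
    r, c, h, w = template_box
    template = prev_frame[r:r + h, c:c + w].astype(np.float64)
    t_zero = template - template.mean()
    t_norm = np.linalg.norm(t_zero) + 1e-12

    best_score, best_shift = -np.inf, (0, 0)
    for dr in range(-search_margin, search_margin + 1):
        for dc in range(-search_margin, search_margin + 1):
            rr, cc = r + dr, c + dc
            if rr < 0 or cc < 0 or rr + h > curr_frame.shape[0] or cc + w > curr_frame.shape[1]:
                continue  # candidate patch would fall outside the frame
            patch = curr_frame[rr:rr + h, cc:cc + w].astype(np.float64)
            p_zero = patch - patch.mean()
            score = float((t_zero * p_zero).sum() / (t_norm * (np.linalg.norm(p_zero) + 1e-12)))
            if score > best_score:
                best_score, best_shift = score, (dr, dc)
    return best_shift, best_score

# Toy usage: a synthetic frame pair where the tracked region moved 3 px down, 2 px right.
rng = np.random.default_rng(0)
frame0 = rng.random((128, 128))
frame1 = np.roll(np.roll(frame0, 3, axis=0), 2, axis=1)
shift, score = estimate_shift(frame0, frame1, template_box=(40, 40, 32, 32))
print(shift, round(score, 3))   # expected: (3, 2) with a correlation close to 1.0
```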

    AUGMENTED REALITY AND INTRAOPERATIVE C-ARM CONE-BEAM COMPUTED TOMOGRAPHY FOR IMAGE-GUIDED ROBOTIC SURGERY

    Minimally-invasive robotic-assisted surgery is a rapidly-growing alternative to traditional open and laparoscopic procedures; nevertheless, challenges remain. The standard of care derives surgical strategies from preoperative volumetric data (i.e., computed tomography (CT) and magnetic resonance (MR) images) that benefit from the ability of multiple modalities to delineate different anatomical boundaries. However, preoperative images may not reflect a possibly highly deformed perioperative setup or intraoperative deformation. Additionally, in current clinical practice, the correspondence of preoperative plans to the surgical scene is established as a mental exercise; thus, the accuracy of this practice is highly dependent on the surgeon’s experience and therefore subject to inconsistencies. In order to address these fundamental limitations in minimally-invasive robotic surgery, this dissertation combines a high-end robotic C-arm imaging system and a modern robotic surgical platform as an integrated intraoperative image-guided system. We performed deformable registration of preoperative plans to a perioperative cone-beam computed tomography (CBCT) scan, acquired after the patient is positioned for intervention. From the registered surgical plans, we overlaid critical information onto the primary intraoperative visual source, the robotic endoscope, using augmented reality. Guidance afforded by this system not only uses augmented reality to fuse virtual medical information, but also provides tool localization and other dynamic intraoperative updates in order to present enhanced depth feedback and information to the surgeon. These techniques in guided robotic surgery required a streamlined approach to creating intuitive and effective human-machine interfaces, especially in visualization. Our software design principles create an inherently information-driven modular architecture incorporating robotics and intraoperative imaging through augmented reality. The system's performance is evaluated using phantoms and preclinical in-vivo experiments for multiple applications, including transoral robotic surgery, robot-assisted thoracic interventions, and cochleostomy for cochlear implantation. The resulting functionality, proposed architecture, and implemented methodologies can be further generalized to other C-arm-based image guidance systems for additional extensions in robotic surgery.
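
The augmented-reality overlay described above ultimately reduces to projecting plan geometry, already registered to the intraoperative CBCT, into the calibrated endoscope view. The following minimal sketch assumes a pinhole camera model, a known CBCT-to-camera rigid transform, and made-up intrinsics; it is a generic illustration, not the dissertation's rendering pipeline.

```python
import numpy as np

def project_points(points_ct, T_cam_from_ct, K):
    """Project 3D points defined in CBCT coordinates onto the endoscope image.

    points_ct     : (N, 3) points in CBCT/plan coordinates (mm).
    T_cam_from_ct : (4, 4) rigid transform mapping CBCT coords to camera coords.
    K             : (3, 3) intrinsic matrix of the calibrated endoscope camera.
    Returns (N, 2) pixel coordinates.
    """
    pts_h = np.hstack([points_ct, np.ones((points_ct.shape[0], 1))])   # homogeneous coords
    pts_cam = (T_cam_from_ct @ pts_h.T).T[:, :3]                       # into camera frame
    pix_h = (K @ pts_cam.T).T                                          # pinhole projection
    return pix_h[:, :2] / pix_h[:, 2:3]                                # normalize by depth

# Hypothetical numbers for illustration only.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)
T[:3, 3] = [0.0, 0.0, 100.0]          # plan geometry sits 100 mm in front of the camera
tumor_margin = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0], [0.0, 5.0, 0.0]])
print(project_points(tumor_margin, T, K))   # pixel positions of the overlaid margin points
```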

    Projection-based Spatial Augmented Reality for Interactive Visual Guidance in Surgery

    Ph.D. (Doctor of Philosophy)

    Personalized medicine in surgical treatment combining tracking systems, augmented reality and 3D printing

    International Mention in the doctoral degree. In the last twenty years, a new way of practicing medicine has been focusing on the problems and needs of each patient as an individual thanks to the significant advances in healthcare technology, the so-called personalized medicine. In surgical treatments, personalization has been possible thanks to key technologies adapted to the specific anatomy of each patient and the needs of the physicians. Tracking systems, augmented reality (AR), three-dimensional (3D) printing and artificial intelligence (AI) have previously supported this individualized medicine in many ways. However, their independent contributions show several limitations in terms of patient-to-image registration, lack of flexibility to adapt to the requirements of each case, large preoperative planning times, and navigation complexity. The main objective of this thesis is to increase patient personalization in surgical treatments by combining these technologies to bring surgical navigation to new complex cases: developing new patient registration methods, designing patient-specific tools, facilitating access to augmented reality for the medical community, and automating surgical workflows. In the first part of this dissertation, we present a novel framework for acral tumor resection combining intraoperative open-source navigation software, based on an optical tracking system, and desktop 3D printing. We used additive manufacturing to create a patient-specific mold that maintained the same position of the distal extremity during image-guided surgery as in the preoperative images. The feasibility of the proposed workflow was evaluated in two clinical cases (soft-tissue sarcomas in hand and foot). We achieved an overall system accuracy of 1.88 mm evaluated on the patient-specific 3D printed phantoms. Surgical navigation was feasible during both surgeries, allowing surgeons to verify the tumor resection margin. Then, we propose an augmented reality navigation system that uses 3D printed surgical guides with a tracking pattern enabling automatic patient-to-image registration in orthopedic oncology. This specific tool fits on the patient only in a pre-designed location, in this case bone tissue. This solution has been developed as a software application running on Microsoft HoloLens. The workflow was validated on a 3D printed phantom replicating the anatomy of a patient presenting an extraosseous Ewing’s sarcoma, and then tested during the actual surgical intervention. The results showed that the surgical guide with the reference marker can be placed precisely, with an accuracy of 2 mm and a visualization error lower than 3 mm. The application allowed physicians to visualize the skin, bone, tumor and medical images overlaid on the phantom and the patient. To enable the use of AR and 3D printing by inexperienced users without broad technical knowledge, we designed a step-by-step methodology. The proposed protocol describes how to develop an AR smartphone application that allows superimposing any patient-based 3D model onto a real-world environment using a 3D printed marker tracked by the smartphone camera. Our solution brings AR closer to the final clinical user, combining free and open-source software with an open-access protocol. The proposed guide is already helping to accelerate the adoption of these technologies by medical professionals and researchers.
In the next section of the thesis, we wanted to show the benefits of combining these technologies during different stages of the surgical workflow in orthopedic oncology. We designed a novel AR-based smartphone application that can display the patient’s anatomy and the tumor’s location. A 3D printed reference marker, designed to fit in a unique position of the affected bone tissue, enables automatic registration. The system has been evaluated in terms of visualization accuracy and usability during the whole surgical workflow on six realistic phantoms, achieving a visualization error below 3 mm. The AR system was tested in two clinical cases during surgical planning, patient communication, and surgical intervention. These results and the positive feedback obtained from surgeons and patients suggest that the combination of AR and 3D printing can improve efficacy, accuracy, and patients’ experience. In the final section, two surgical navigation systems have been developed and evaluated to guide electrode placement in sacral neurostimulation (SNS) procedures based on optical tracking and augmented reality. Our results show that both systems could minimize patient discomfort and improve surgical outcomes by reducing needle insertion time and the number of punctures. Additionally, we proposed a feasible clinical workflow for guiding SNS interventions with both navigation methodologies, including automatic creation of sacral virtual 3D models for trajectory definition using artificial intelligence and intraoperative patient-to-image registration. To conclude, in this thesis we have demonstrated that the combination of technologies such as tracking systems, augmented reality, 3D printing, and artificial intelligence overcomes many current limitations in surgical treatments. Our results encourage the medical community to combine these technologies to improve surgical workflows and outcomes in more clinical scenarios. Doctoral Program in Biomedical Science and Technology, Universidad Carlos III de Madrid. Chair: María Jesús Ledesma Carbayo. Secretary: María Arrate Muñoz Barrutia. Member: Csaba Pinte
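
Registration accuracies such as the 1.88 mm and 2 mm figures above are typically obtained by rigidly aligning corresponding fiducials (for example, points on the 3D printed marker located in both image space and tracker/AR space) and reporting the residual error. The sketch below shows a generic point-based rigid registration (SVD/Kabsch solution) and its fiducial registration error on synthetic data; it is an illustration of the technique, not the thesis's implementation.

```python
import numpy as np

def rigid_register(fixed, moving):
    """Least-squares rigid transform (R, t) such that R @ moving + t ~ fixed,
    using the standard SVD (Kabsch/Horn) solution.
    fixed, moving : (N, 3) corresponding fiducial sets."""
    fc, mc = fixed.mean(axis=0), moving.mean(axis=0)
    H = (moving - mc).T @ (fixed - fc)                                # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])      # avoid reflections
    R = Vt.T @ D @ U.T
    t = fc - R @ mc
    return R, t

def fiducial_registration_error(fixed, moving, R, t):
    """Root-mean-square residual distance after registration (mm)."""
    residual = fixed - (moving @ R.T + t)
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))

# Illustration with synthetic fiducials: rotate/translate them and add 0.5 mm noise.
rng = np.random.default_rng(1)
image_fids = rng.uniform(-50, 50, size=(6, 3))            # fiducials in image space (mm)
angle = np.deg2rad(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
tracker_fids = (image_fids - [5.0, -3.0, 10.0]) @ R_true.T  # same fiducials seen by tracker
tracker_fids += rng.normal(scale=0.5, size=tracker_fids.shape)

R, t = rigid_register(image_fids, tracker_fids)
print("FRE (mm):", round(fiducial_registration_error(image_fids, tracker_fids, R, t), 2))
```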

    Developmental delays and subcellular stress as downstream effects of sonoporation

    Posters: no. 2. Control ID: 1672434. OBJECTIVES: The biological impact of sonoporation has often been overlooked. Here we seek to obtain insight into the cytotoxic impact of sonoporation by gaining new perspectives on anti-proliferative characteristics that may emerge within sonoporated cells. We particularly focused on investigating the cell-cycle progression kinetics of sonoporated cells and identifying organelles that may be stressed in the recovery process. METHODS: In line with recommendations on exposure hardware design, an immersion-based ultrasound platform was developed. It delivers 1 MHz ultrasound pulses (100 cycles; 1 kHz PRF; 60 s total duration) with 0.45 MPa peak negative pressure to a cell chamber that housed HL-60 leukemia cells and lipid-shelled microbubbles at a 10:1 cell-to-bubble ratio (for 1e6/ml cell density). Calcein was used to facilitate tracking of sonoporated cells with enhanced uptake of exogenous molecules. The developmental trend of sonoporated cells was quantitatively analyzed using BrdU/DNA flow cytometry, which monitors the cell population’s DNA synthesis kinetics. This allowed us to measure the temporal progression of DNA synthesis in sonoporated cells. To investigate whether sonoporation would upset subcellular homeostasis, post-exposure cell samples were also assayed for various proteins using Western blot analysis. The analysis focus was placed on the endoplasmic reticulum (ER): an important organelle with a multi-faceted role in cellular functioning. The post-exposure observation time spanned 0-24 h. RESULTS: Despite maintaining viability, sonoporated cells were found to exhibit delays in cell-cycle progression. Specifically, their DNA synthesis time was lengthened substantially (for HL-60 cells: 8.7 h for control vs 13.4 h for the sonoporated group). This indicates that sonoporated cells were under stress: a phenomenon that is supported by our Western blot assays showing upregulation of ER-resident enzymes (PDI, Ero1), ER stress sensors (PERK, IRE1), and ER-triggered pro-apoptotic signals (CHOP, JNK). CONCLUSIONS: Sonoporation, whilst being able to facilitate internalization of exogenous molecules, may inadvertently elicit a cellular stress response. These findings seem to echo recent calls for reconsideration of efficiency issues in sonoporation-mediated drug delivery. Further efforts would be necessary to improve the efficiency of sonoporation-based biomedical applications where cell death is not desirable.
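
For readers unfamiliar with the exposure notation, the quoted settings (1 MHz, 100 cycles per pulse, 1 kHz PRF, 60 s) fix the pulse duration, duty cycle and total pulse count. The short calculation below only makes that arithmetic explicit.

```python
# Pulse timing implied by the stated exposure settings:
# 1 MHz centre frequency, 100 cycles per pulse, 1 kHz PRF, 60 s total exposure.
centre_frequency_hz = 1.0e6
cycles_per_pulse = 100
prf_hz = 1.0e3
total_exposure_s = 60.0

pulse_duration_s = cycles_per_pulse / centre_frequency_hz   # on-time per pulse
duty_cycle = pulse_duration_s * prf_hz                      # fraction of time "on"
pulses_delivered = prf_hz * total_exposure_s

print(f"pulse duration: {pulse_duration_s * 1e6:.0f} us")   # 100 us
print(f"duty cycle:     {duty_cycle * 100:.0f} %")          # 10 %
print(f"pulses in 60 s: {pulses_delivered:.0f}")            # 60000
```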

    A study on the change in plasma membrane potential during sonoporation

    Posters: no. 4. Control ID: 1680329. OBJECTIVES: It has been validated that sonoporation correlates with calcium transients generated by ultrasound-mediated microbubble activity. Besides calcium, other ionic flows are likely involved in sonoporation. Our hypothesis is that cell electrophysiological properties are related to intracellular delivery by ultrasound and microbubbles. In this study, a real-time live cell imaging platform is used to determine whether plasma membrane potential change is related to the sonoporation process at the cellular level. METHODS: HeLa cells were cultured in DMEM supplemented with 10% FBS in an Opticell chamber at 37 °C and 5% CO2, and reached 80% confluency before experiments. Cells loaded with Calcein Blue-AM and DiBAC4(3), in the Opticell chamber filled with PI solution and SonoVue microbubbles, were immersed in a water tank on an inverted fluorescence microscope. Pulsed ultrasound (1 MHz frequency, 20 cycles, 20 Hz PRF, 0.2-0.5 MPa PNP) was applied at an angle of 45° to the region of interest for 1 s. Real-time fluorescence imaging for the different probes was acquired by a cooled CCD camera every 20 s for 10 min. The time-lapse fluorescence images were quantitatively analyzed to evaluate the correlation of cell viability and intracellular delivery with plasma membrane potential change. RESULTS: Our preliminary data showed that PI fluorescence, which indicated intracellular delivery, accumulated immediately in cells adjacent to microbubbles after exposure, suggesting that their membranes were damaged by ultrasound-activated microbubbles. However, the fluorescence reached its highest level within 4 to 6 minutes and was unchanged thereafter, indicating that the membrane was gradually repaired within this period. Furthermore, using DiBAC4(3), which detects changes in the cell membrane potential, we found that the loss of membrane potential might be associated with intracellular delivery, because PI fluorescence accumulation was usually accompanied by a change in DiBAC4(3) fluorescence. CONCLUSIONS: Our study suggests that there may be a linkage between cell membrane potential change and intracellular delivery mediated by ultrasound and microbubbles. We also suggest that other ionic flows or ion channels may be involved in the cell membrane potential change in sonoporation. Further efforts to explore the cellular mechanism of this phenomenon will improve our understanding of sonoporation.
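
The time-lapse quantification described above boils down to extracting a background-corrected mean intensity inside a cell region for every frame. The sketch below shows one generic way to do that with NumPy on a synthetic image stack; the masks, timing and values are illustrative assumptions, not the study's actual analysis code.

```python
import numpy as np

def roi_intensity_trace(frames, roi_mask, background_mask):
    """Mean background-subtracted fluorescence inside an ROI for each frame.

    frames          : (T, H, W) stack of time-lapse fluorescence images.
    roi_mask        : (H, W) boolean mask of the cell of interest.
    background_mask : (H, W) boolean mask of a cell-free region.
    Returns a length-T array, one value per frame (every 20 s in this study's timing).
    """
    trace = []
    for frame in frames:
        signal = frame[roi_mask].mean()
        background = frame[background_mask].mean()
        trace.append(signal - background)
    return np.asarray(trace)

# Synthetic example: PI-like uptake rising over the first frames, then a plateau.
T, H, W = 30, 64, 64
rng = np.random.default_rng(2)
frames = rng.normal(10.0, 1.0, size=(T, H, W))               # background noise
roi = np.zeros((H, W), dtype=bool); roi[20:30, 20:30] = True
bg = np.zeros((H, W), dtype=bool); bg[50:60, 50:60] = True
uptake = np.minimum(np.arange(T) / 15.0, 1.0) * 50.0         # plateau after frame 15
frames[:, roi] += uptake[:, None]

trace = roi_intensity_trace(frames, roi, bg)
print(np.round(trace[:5], 1), "...", np.round(trace[-1], 1))
```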

    How sonoporation disrupts cellular structural integrity: morphological and cytoskeletal observations

    Posters: no. 1. Control ID: 1672429. OBJECTIVES: In considering sonoporation for drug delivery applications, it is essential to understand how living cells respond to this puncturing force. Here we seek to investigate the effects of sonoporation on cellular structural integrity. We hypothesize that the membrane morphology and cytoskeletal behavior of sonoporated cells under recovery would inherently differ from that of normal viable cells. METHODS: A customized and calibrated exposure platform was developed for this work, and ZR-75-30 breast carcinoma cells were used as the cell model. The cells were exposed to either single or multiple pulses of 1 MHz ultrasound (pulse length: 30 or 100 cycles; PRF: 1 kHz; duration: up to 60 s) with 0.45 MPa spatial-averaged peak negative pressure and in the presence of lipid-shelled microbubbles. Confocal microscopy was used to examine in situ the structural integrity of sonoporated cells (identified as ones with exogenous fluorescent marker internalization). For investigations on membrane morphology, FM 4-64 was used as the membrane dye (red), and calcein was used as the sonoporation marker (green); for studies on cytoskeletal behavior, CellLight (green) and propidium iodide (red) were used to respectively label actin filaments and sonoporated cells. Observation started before exposure and lasted up to 2 h after exposure, and confocal images were acquired at real-time frame rates. Cellular structural features and their temporal kinetics were quantitatively analyzed to assess the consistency of trends amongst a group of cells. RESULTS: Sonoporated cells exhibited membrane shrinkage (a 61% decrease in cell cross-sectional area) and intracellular lipid accumulation (a 381% increase compared to control) over a 2 h period. The morphological repression of sonoporated cells was also found to correspond with post-sonoporation cytoskeletal processes: actin depolymerization was observed as soon as pores were induced on the membrane. These results show that cellular structural integrity is indeed disrupted over the course of sonoporation. CONCLUSIONS: Our investigation shows that the biophysical impact of sonoporation is by no means limited to the induction of membrane pores: structural integrity, for example, is concomitantly affected in the process. This prompts the need for further fundamental studies to unravel the complex sequence of biological events involved in sonoporation.
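
The reported 61% shrinkage in cross-sectional area is the kind of figure obtained by comparing segmented cell masks before and after exposure. A minimal illustration of that measurement, on a toy disc-shaped "cell", is sketched below; the pixel size and shapes are assumptions, not the study's data.

```python
import numpy as np

def area_change_percent(mask_before, mask_after, pixel_size_um=0.2):
    """Percentage change in a cell's cross-sectional area between two time points.

    mask_before, mask_after : boolean segmentation masks of the same cell.
    pixel_size_um           : side length of one pixel (hypothetical value).
    """
    area_before = mask_before.sum() * pixel_size_um ** 2
    area_after = mask_after.sum() * pixel_size_um ** 2
    return 100.0 * (area_after - area_before) / area_before

# Toy example: a "cell" disc that shrinks from radius 40 px to radius 25 px.
yy, xx = np.mgrid[:128, :128]
dist2 = (yy - 64) ** 2 + (xx - 64) ** 2
before = dist2 <= 40 ** 2
after = dist2 <= 25 ** 2
print(f"{area_change_percent(before, after):.0f} % change")   # roughly -61 % for this toy disc
```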

    Image guided robotic assistance for the diagnosis and treatment of tumor

    The aim of this thesis is to demonstrate the feasibility and the potential of introducing robotics and image guidance into the overall oncologic workflow, from the diagnosis to the treatment phase. The popularity of robotics in the operating room has grown in recent years. Currently the most popular system is the da Vinci telemanipulator (Intuitive Surgical); it is based on master-slave control, is designed for minimally invasive surgery, and is used in several surgical fields such as urology, general surgery, gynecology, and cardiothoracic surgery. An accurate study of this system, from a technological point of view, has been conducted, addressing all the drawbacks and advantages of the system. The da Vinci System creates an immersive operating environment for the surgeon by providing both high-quality stereo visualization and a human-machine interface that directly connects the surgeon’s hands to the motion of the surgical tool tips inside the patient’s body. It has undoubted advantages for the surgeon’s work and for the patient’s health, at least for some interventions, while its very high cost leaves many doubts about its price-benefit ratio. In the robotic surgery field, many researchers are working on the optimization and miniaturization of robot mechanics, while others are trying to obtain smart functionalities to realize robotic systems that, "knowing" the patient anatomy from radiological images, can assist the surgeon in an active way. Regarding the second point, image-guided systems can be useful to plan and control medical robot motion and to provide the surgeon with pre-operative and intra-operative images with augmented reality visualization, to enhance his/her perceptual capacities and, as a consequence, to improve the quality of treatments. To demonstrate this thesis, some prototypes have been designed, implemented and tested. The development of image-guided medical devices, comprising augmented reality, virtual navigation and robotic surgical features, requires addressing several problems. The first is the choice of the robotic platform and of the image source to employ. An industrial anthropomorphic arm has been used as the testing platform. The idea of integrating industrial robot components in the clinical workflow has been supported by the da Vinci technical analysis. The algorithms and methods developed, regarding in particular robot calibration, are based on literature theories and on easy integration in the clinical scenario, and can be adapted to any anthropomorphic arm. In this way this work can be integrated with lightweight robots, for industrial or clinical use, able to work in close contact with humans, which will become numerous in the near future. Regarding the medical image source, it was decided to work with ultrasound imaging. Two-dimensional ultrasound imaging is widely used in clinical practice because it is not dangerous for the patient, inexpensive, compact, and a highly flexible imaging modality that allows users to study many anatomic structures. It is routinely used for diagnosis and as guidance in percutaneous treatments. However, the use of 2D ultrasound imaging presents some disadvantages that require great ability of the user: the clinician must mentally integrate many images to reconstruct a complete idea of the anatomy in 3D. Furthermore, the freehand control of the probe makes it difficult to identify anatomic positions and orientations and to reposition the probe to reach a particular location.
To overcome these problems, an image-guided system has been developed that fuses 2D US real-time images with routine CT or MRI 3D images, previously acquired from the patient, to enhance clinician orientation and probe guidance. The implemented algorithms for robot calibration and US image guidance have been used to realize two applications responding to specific clinical needs: the first to speed up the execution of routine and very frequent procedures like percutaneous biopsy or ablation; the second to improve a new, completely non-invasive type of treatment for solid tumors, HIFU (High Intensity Focused Ultrasound). An ultrasound-guided robotic system has been developed to assist the clinician in executing complicated biopsies, or percutaneous ablations, in particular for deep abdominal organs. An integrated system was developed that provides the clinician with two types of assistance: a mixed reality visualization allows accurate and easy planning of the needle trajectory and verification of target reaching; the robot arm, equipped with a six-degree-of-freedom force sensor, allows precise positioning of the needle holder and allows the clinician to adjust, by means of a cooperative control, the planned trajectory to overcome needle deflection and target motion. The second application consists of an augmented reality navigation system for HIFU treatment. HIFU represents a completely non-invasive method for treatment of solid tumors, hemostasis and other vascular features in human tissues. The technology for HIFU treatments is still evolving, and the systems available on the market have some limitations and drawbacks. A disadvantage resulting from our experience with the machinery available in our hospital (JC200 therapeutic system Haifu (HIFU) by Tech Co., Ltd, Chongqing), which is similar to other analogous machines, is the long time required to perform the procedure due to the difficulty of finding the target using the remote motion of an ultrasound probe under the patient. This problem has been addressed by developing an augmented reality navigation system to enhance US guidance during HIFU treatments, allowing easy target localization. The system was implemented using an additional freehand ultrasound probe coupled with a localizer and CT fused imaging. It offers a simple and economical solution for easy HIFU target localization. This thesis demonstrates the utility and usability of robots for the diagnosis and treatment of tumors; in particular, the combination of automatic positioning and cooperative control allows the surgeon and the robot to work in synergy. Further, the work demonstrates the feasibility and the potential of using a mixed reality navigation system to facilitate target localization and, consequently, to reduce session times, to increase the number of possible diagnoses/treatments and to decrease the risk of potential errors. The proposed solutions for the integration of robotics and image guidance in the overall oncologic workflow take into account currently available technologies, traditional clinical procedures and cost minimization.
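
The cooperative control mentioned above maps the force the clinician applies at the needle holder into a robot velocity command, so that the arm yields to the operator's hand while holding position otherwise. A common way to realize this is a simple admittance law with a dead-band and a speed limit; the sketch below illustrates that idea with assumed gains and is not the actual controller of the described platform.

```python
import numpy as np

def admittance_velocity(wrench, gain=0.002, deadband=1.5, v_max=0.02):
    """Map a measured 6-DOF wrench into a Cartesian velocity command.

    wrench   : (6,) force/torque reading [Fx, Fy, Fz, Tx, Ty, Tz] (N, N*m).
    gain     : compliance gain (m/s per N, rad/s per N*m) - assumed value.
    deadband : forces below this magnitude (N) are ignored to reject noise.
    v_max    : saturation limit on the commanded linear speed (m/s).
    Returns a (6,) twist [vx, vy, vz, wx, wy, wz].
    """
    wrench = np.asarray(wrench, dtype=float)
    force = wrench[:3]
    if np.linalg.norm(force) < deadband:           # ignore sensor noise / gravity residue
        force = np.zeros(3)
    twist = gain * np.concatenate([force, wrench[3:]])
    speed = np.linalg.norm(twist[:3])
    if speed > v_max:                              # clamp for safety near the patient
        twist[:3] *= v_max / speed
    return twist

# The clinician pushes with ~8 N along x while fine-tuning the needle-holder pose.
print(admittance_velocity([8.0, 0.5, 0.0, 0.0, 0.0, 0.0]))
```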

    Design-centric Method for an Augmented Reality Robotic Surgery

    Master's (Master of Engineering)

    Real-time imaging of cellular dynamics during low-intensity pulsed ultrasound exposure

    Control ID: 1671584. Oral Session 5 - Bioeffects of therapeutic ultrasound. OBJECTIVE: Although the therapeutic potential of low-intensity pulsed ultrasound is unquestionable, the wave-matter interactions involved in the process remain only vaguely characterized. Here we seek to undertake a series of in-situ cellular imaging studies that aim to analyze the mechanical impact of low-intensity pulsed ultrasound on attached fibroblasts from three different aspects: membrane, cytoskeleton, and nucleus. METHODS: Our experimental platform comprised in-house ultrasound exposure hardware coupled to a confocal microscopy system. The waveguided ultrasound beam was geometrically aligned to the microscope’s field of view, which corresponds to the center of a polystyrene dish containing fibroblasts. Short ultrasound pulses (5 cycles; 2 kHz PRF) with 0.8 MPa peak acoustic pressure (0.21 W/cm2 SPTA intensity) were delivered over a 10 min period. Live imaging was performed on both the membrane (CellMask) and the cytoskeleton (actin-GFP, tubulin-RFP) over the entire observation period (up to 30 min after the end of exposure). Also, pre- and post-exposure fixed-cell imaging was conducted on the nucleus (Hoechst 33342) and two cytoskeleton components related to stress fibers: F-actin (phalloidin-FITC) and vinculin (Alexa Fluor 647 conjugated). To study whether mechanotransduction was responsible for mediating ultrasound-cell interactions, some experiments were conducted with the addition of gadolinium, which blocks stretch-sensitive ion channels. RESULTS: Cell shrinkage was evident over the course of low-intensity pulsed ultrasound exposure. This was accompanied by contraction of actin and tubulin. Also, an increase in central stress fibers was observed at the end of exposure, while the nucleus was found to have decreased in size. Interestingly, after the exposure, a significant rebound in cell volume was observed over a 30 min period. These effects were not observed in cases with gadolinium blockage of mechanosensitive ion channels. CONCLUSIONS: Our results suggest that low-intensity pulsed ultrasound transiently induces remodeling of a cell’s membrane and cytoskeleton, and leads to repression of the nucleus. This indicates that ultrasound, after all, represents a mechanical stress on the cellular membrane. The post-exposure outgrowth phenomenon is also of practical relevance as it may be linked to the stimulatory effects that have already been observed in low-intensity pulsed ultrasound treatments.
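
As a consistency check on the quoted 0.21 W/cm2 SPTA intensity, the plane-wave relation I = p^2 / (2*rho*c), scaled by the duty cycle, yields a value of that order. The snippet below assumes a 1 MHz centre frequency (not stated in this abstract) and water-like acoustic properties; it is an illustrative back-of-the-envelope calculation, not the study's calibration procedure.

```python
# Back-of-the-envelope SPTA intensity under plane-wave assumptions.
peak_pressure_pa = 0.8e6        # 0.8 MPa peak acoustic pressure
density = 1000.0                # kg/m^3 (water-like medium, assumed)
sound_speed = 1480.0            # m/s (water-like medium, assumed)
cycles_per_pulse = 5
centre_frequency_hz = 1.0e6     # assumed, not stated in this abstract
prf_hz = 2.0e3

i_sppa = peak_pressure_pa ** 2 / (2 * density * sound_speed)    # W/m^2, pulse-average
duty_cycle = (cycles_per_pulse / centre_frequency_hz) * prf_hz  # fraction of time "on"
i_spta = i_sppa * duty_cycle

print(f"duty cycle: {duty_cycle * 100:.0f} %")                  # 1 %
print(f"I_SPTA: {i_spta / 1e4:.2f} W/cm^2")                     # ~0.22 W/cm^2, close to the quoted 0.21
```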