
    Navigated Ultrasound in Laparoscopic Surgery


    Essential updates 2020/2021 : Current topics of simulation and navigation in hepatectomy

    With the development of three-dimensional (3D) simulation software, preoperative simulation technology is now largely established. The remaining issue is how to recognize anatomy three-dimensionally. Extended reality is a newly developed technology with several merits for surgical application: no requirement for a sterilized display monitor, better spatial awareness, and the ability to share 3D images among all surgeons. Various technologies and devices for intraoperative navigation have also been developed to support the safety and certainty of liver surgery. Consensus recommendations regarding indocyanine green fluorescence were established in 2021. Extended reality has also been applied to intraoperative navigation, and artificial intelligence (AI) is one of the topics of real-time navigation; AI might overcome the problem of liver deformity through automatic registration. Covering the issues above, this article reviews recent advances in simulation and navigation in liver surgery from 2020 to 2021.

    Image-Fusion for Biopsy, Intervention, and Surgical Navigation in Urology


    The Challenge of Augmented Reality in Surgery

    Imaging has revolutionized surgery over the last 50 years. Diagnostic imaging is a key tool for deciding to perform surgery during disease management; intraoperative imaging is one of the primary drivers for minimally invasive surgery (MIS); and postoperative imaging enables effective follow-up and patient monitoring. Notably, however, there is still relatively little interchange of information or imaging-modality fusion between these different stages of the clinical pathway. This book chapter provides a critique of existing augmented reality (AR) methods and application studies described in the literature, using relevant examples. The aim is not to provide a comprehensive review, but rather to indicate the clinical areas in which AR has been proposed, to begin to explain the lack of clinical systems, and to provide some clear guidelines to those intending to pursue research in this area.

    Optical techniques for 3D surface reconstruction in computer-assisted laparoscopic surgery

    One of the main challenges for computer-assisted surgery (CAS) is to determine the intra-operative morphology and motion of soft tissues. This information is a prerequisite for registering multi-modal patient-specific data, both for enhancing the surgeon's navigation capabilities by seeing beyond exposed tissue surfaces and for providing intelligent control of robot-assisted instruments. In minimally invasive surgery (MIS), optical techniques are an increasingly attractive approach for in vivo 3D reconstruction of the soft-tissue surface geometry. This paper reviews the state-of-the-art methods for optical intra-operative 3D reconstruction in laparoscopic surgery and discusses the technical challenges and future perspectives towards clinical translation. With the recent paradigm shift of surgical practice towards MIS and new developments in 3D optical imaging, this is a timely discussion about technologies that could facilitate complex CAS procedures in dynamic and deformable anatomical regions.
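    One common family of the stereo-based methods surveyed in such reviews recovers a surface point by triangulating matched pixels from two calibrated views. The sketch below shows linear (DLT) triangulation for a single point; the camera matrices and variable names are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    def triangulate_point(P1, P2, x1, x2):
        """Linear (DLT) triangulation of one 3D point from two calibrated views.

        P1, P2 : 3x4 camera projection matrices (assumed known from calibration).
        x1, x2 : 2D image coordinates of the same surface point in each view.
        """
        # Each view contributes two linear constraints on the homogeneous point X.
        A = np.array([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        # Solve A X = 0 in a least-squares sense via SVD.
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        return X[:3] / X[3]  # dehomogenize
    ```

    Dense surface reconstruction repeats this per matched pixel pair; in practice a nonlinear refinement and outlier rejection step would follow.
    
    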

    Goggle Augmented Imaging and Navigation System for Fluorescence-Guided Surgery

    Surgery remains the only curative option for most solid tumors. The standard of care usually involves tumor resection and sentinel lymph node biopsy for cancer staging. Surgeons rely on their vision and touch to distinguish healthy from cancerous tissue during surgery, often leading to incomplete tumor resection that necessitates repeat surgery. Sentinel lymph node biopsy by conventional radioactive tracking exposes patients and caregivers to ionizing radiation, while blue-dye tracking stains the tissue, highlighting only superficial lymph nodes. Improper identification of sentinel lymph nodes may lead to misdiagnosis of the cancer stage. Therefore, there is a clinical need for accurate intraoperative tumor and sentinel lymph node visualization. Conventional imaging modalities such as x-ray computed tomography, positron emission tomography, magnetic resonance imaging, and ultrasound are excellent for preoperative cancer diagnosis and surgical planning. However, they are not suitable for intraoperative use, due to bulky, complicated hardware, high cost, non-real-time imaging, severe restrictions to the surgical workflow, and lack of sufficient resolution for tumor boundary assessment. This has propelled interest in fluorescence-guided surgery, due to the availability of simple hardware that can achieve real-time, high-resolution, and sensitive imaging. Near-infrared fluorescence imaging is of particular interest due to low background absorbance by photoactive biomolecules, enabling thick-tissue assessment. As a result, several near-infrared fluorescence-guided surgery systems have been developed. However, they are limited by bulky hardware, disruptive information display, and a field of view not matched to the user. To address these limitations we have developed a compact, lightweight, and wearable goggle augmented imaging and navigation system (GAINS).
It detects the near-infrared fluorescence from a contrast agent accumulated in the tumor, along with the normal color view, and displays accurately aligned color-fluorescence images in real time via a head-mounted display worn by the surgeon. GAINS is a platform technology capable of very sensitive fluorescence detection. Image display options include both video see-through and optical see-through head-mounted displays, for high-contrast image guidance as well as direct visual access to the surgical bed. Image capture options, from a large-field-of-view camera to a high-magnification handheld microscope, ensure macroscopic as well as microscopic assessment of the tumor bed. Aided by tumor-targeted near-infrared contrast agents, GAINS guided complete tumor resection in subcutaneous, metastatic, and spontaneous mouse models of cancer with high sensitivity and specificity, in real time. Using a clinically approved near-infrared contrast agent, GAINS provided real-time image guidance for accurate visualization of lymph nodes in a porcine model and of sentinel lymph nodes in human breast cancer and melanoma patients with high sensitivity. This work has addressed issues that have limited clinical adoption of fluorescence-guided surgery and paved the way for research into developing this approach towards standard-of-care practice that can potentially improve surgical outcomes in cancer.

    On uncertainty propagation in image-guided renal navigation: Exploring uncertainty reduction techniques through simulation and in vitro phantom evaluation

    Image-guided interventions (IGIs) entail the use of imaging to augment or replace direct vision during therapeutic interventions, with the overall goal of providing effective treatment in a less invasive manner, as an alternative to traditional open surgery, while reducing patient trauma and shortening post-procedure recovery time. IGIs rely on pre-operative images, surgical tracking and localization systems, and intra-operative images to provide correct views of the surgical scene. Pre-operative images are used to generate patient-specific anatomical models that are then registered to the patient using the surgical tracking system, and often complemented with real-time, intra-operative images. IGI systems are subject to uncertainty from several sources, including surgical instrument tracking / localization uncertainty, model-to-patient registration uncertainty, user-induced navigation uncertainty, as well as the uncertainty associated with the calibration of various surgical instruments and intra-operative imaging devices (e.g., a laparoscopic camera) instrumented with surgical tracking sensors. All these uncertainties impact the overall targeting accuracy, which represents the error associated with navigating a surgical instrument to a specific target to be treated under image guidance provided by the IGI system. Therefore, understanding the overall uncertainty of an IGI system is paramount to the outcome of the intervention, as procedure success entails achieving accuracy tolerances specific to individual procedures. This work has focused on studying the navigation uncertainty, along with techniques to reduce it, for an IGI platform dedicated to image-guided renal interventions.
We constructed life-size replica patient-specific kidney models from pre-operative images using 3D printing and tissue-emulating materials, and conducted experiments to characterize the uncertainty of both optical and electromagnetic surgical tracking systems, the uncertainty associated with the virtual model-to-physical phantom registration, and the uncertainty associated with live augmented reality (AR) views of the surgical scene achieved by enhancing the pre-procedural model and tracked surgical instrument views with live video views acquired using a camera tracked in real time. To better understand the effects of the tracked instrument calibration, registration fiducial configuration, and tracked camera calibration on the overall navigation uncertainty, we conducted Monte Carlo simulations that enabled us to identify optimal configurations, which were subsequently validated experimentally using patient-specific phantoms in the laboratory. To mitigate the inherent accuracy limitations associated with the pre-procedural model-to-patient registration and their effect on the overall navigation, we also demonstrated the use of tracked video imaging to update the registration, enabling us to restore targeting accuracy to within its acceptable range. Lastly, we conducted several validation experiments using patient-specific kidney-emulating phantoms, with post-procedure CT imaging as reference ground truth, to assess the accuracy of AR-guided navigation in the context of in vitro renal interventions. This work helped answer key questions about uncertainty propagation in image-guided renal interventions and led to the development of key techniques and tools to help reduce the overall navigation / targeting uncertainty.
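    The Monte Carlo approach described above can be sketched as follows, assuming fiducial localization error is modeled as isotropic Gaussian noise and the model-to-patient registration is a point-based rigid fit (Kabsch algorithm). The fiducial coordinates, noise level, and trial count are illustrative assumptions, not values from the thesis.

    ```python
    import numpy as np

    def rigid_register(src, dst):
        """Least-squares rigid fit (rotation R, translation t) mapping
        src points onto dst points (Kabsch / orthogonal Procrustes)."""
        sc, dc = src.mean(axis=0), dst.mean(axis=0)
        H = (src - sc).T @ (dst - dc)
        U, _, Vt = np.linalg.svd(H)
        # Guard against a reflection in the optimal orthogonal matrix.
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = dc - R @ sc
        return R, t

    def monte_carlo_tre(fiducials, target, sigma_mm=0.5, n_trials=2000, rng=None):
        """Monte Carlo estimate of target registration error (TRE):
        perturb the fiducial localizations, re-fit the rigid registration,
        and record the resulting displacement at the target point."""
        rng = np.random.default_rng(rng)
        errors = np.empty(n_trials)
        for i in range(n_trials):
            noisy = fiducials + rng.normal(0.0, sigma_mm, fiducials.shape)
            R, t = rigid_register(fiducials, noisy)
            errors[i] = np.linalg.norm(R @ target + t - target)
        return errors.mean(), errors.std()
    ```

    Running such a simulation over candidate fiducial configurations is one way to compare them before committing to a physical phantom experiment.
    
    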

    Impact of Soft Tissue Heterogeneity on Augmented Reality for Liver Surgery

    This paper presents a method for real-time augmented reality of internal liver structures during minimally invasive hepatic surgery. Vessels and tumors computed from pre-operative CT scans can be overlaid onto the laparoscopic view for surgical guidance. Compared to current methods, our method is able to locate the in-depth positions of the tumors based on partial three-dimensional liver tissue motion using a real-time biomechanical model. This model properly handles the motion of internal structures even in the case of anisotropic or heterogeneous tissues, as is the case for the liver and many anatomical structures. Experiments conducted on a liver phantom measure the accuracy of the augmentation, while real-time augmentation of an in vivo human liver during actual surgery shows the benefits of such an approach for minimally invasive surgery.
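    The final step of such an overlay pipeline, projecting registered pre-operative model points (e.g., vessel or tumor landmarks) into the laparoscopic image, can be sketched with a distortion-free pinhole camera model. The intrinsics and pose below are illustrative assumptions; the paper's biomechanical deformation model, which produces the deformed point positions, is not reproduced here.

    ```python
    import numpy as np

    def project_overlay(points_mm, K, R, t):
        """Project registered model points (Nx3, model frame, mm) into the
        laparoscopic image using pinhole intrinsics K and the model-to-camera
        pose (R, t). Returns Nx2 pixel coordinates."""
        cam = points_mm @ R.T + t      # model frame -> camera frame
        uvw = cam @ K.T                # pinhole projection (homogeneous)
        return uvw[:, :2] / uvw[:, 2:3]
    ```

    The returned pixel coordinates are where the vessel or tumor graphics would be drawn on the live video frame; lens distortion correction would be applied in a real system.
    
    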