691 research outputs found

    Advancements and Breakthroughs in Ultrasound Imaging

    Ultrasonic imaging is a powerful diagnostic tool available to medical practitioners, engineers, and researchers today. Owing to its relative safety and non-invasive nature, ultrasonic imaging has become one of the most rapidly advancing imaging technologies. These rapid advances are directly related to parallel advancements in electronics, computing, and transducer technology, together with sophisticated signal processing techniques. This book focuses on state-of-the-art developments in ultrasonic imaging applications and their underlying technologies, presented by leading practitioners and researchers from many parts of the world.

    Ultrasound Guidance in Perioperative Care


    SMART IMAGE-GUIDED NEEDLE INSERTION FOR TISSUE BIOPSY

    M.S. thesis


    Imaging Sensors and Applications

    In past decades, various sensor technologies have been applied in all areas of our lives, improving our quality of life. In particular, imaging sensors have been widely used in the development of imaging approaches such as optical imaging, ultrasound imaging, X-ray imaging, and nuclear imaging, contributing to high sensitivity, miniaturization, and real-time imaging. These advanced image sensing technologies play an important role not only in the medical field but also in industry. This Special Issue covers broad topics on imaging sensors and their applications. Its scope extends to novel imaging sensors and diverse imaging systems, including hardware and software advancements. Biomedical and nondestructive sensing applications are also welcome.

    A Novel System and Image Processing for Improving 3D Ultrasound-guided Interventional Cancer Procedures

    Image-guided medical interventions are diagnostic and therapeutic procedures that minimize surgical incisions, improving disease management and reducing patient burden relative to conventional techniques. Interventional approaches, such as biopsy, brachytherapy, and ablation procedures, have been used in the management of cancer for many anatomical regions, including the prostate and liver. Needles and needle-like tools are often used to achieve planned clinical outcomes, but the increased dependency on accurate targeting, guidance, and verification can limit the widespread adoption and clinical scope of these procedures. Image-guided interventions that incorporate 3D information intraoperatively have been shown to improve the accuracy and feasibility of these procedures, but clinical needs still exist for improving workflow and reducing physician variability with widely applicable, cost-conscious approaches. The objective of this thesis was to incorporate 3D ultrasound (US) imaging and image processing methods into image-guided cancer interventions in the prostate and liver to provide accessible, fast, and accurate approaches for clinical improvement. An automatic 2D-3D transrectal ultrasound (TRUS) registration algorithm was optimized and implemented in a 3D TRUS-guided system to provide continuous prostate motion corrections with sub-millimeter and sub-degree error in 36 ± 4 ms. An automatic and generalizable 3D TRUS prostate segmentation method was developed on a diverse clinical dataset of patient images from biopsy and brachytherapy procedures, achieving errors at gold-standard accuracy with a computation time of 0.62 s. After validation of mechanical and image reconstruction accuracy, a novel 3D US system for focal liver tumor therapy was developed to guide therapy applicators with 4.27 ± 2.47 mm error. The need to verify applicators post-insertion motivated the development of a 3D US applicator segmentation approach, which was demonstrated to provide clinically feasible assessments in 0.246 ± 0.007 s. Lastly, a general needle and applicator segmentation algorithm was developed to provide accurate intraoperative, real-time insertion feedback for multiple anatomical locations during a variety of clinical interventional procedures. Clinical translation of these approaches has the potential to improve overall patient quality of life and outcomes by improving detection rates and reducing local cancer recurrence in patients with prostate and liver cancer.
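The abstract does not reproduce the registration algorithm itself. As a simplified, hypothetical illustration of the idea behind intensity-based motion correction (restricted to 2D integer translation with normalized cross-correlation over a brute-force search, not the thesis's actual 2D-3D TRUS method):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized images."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def register_translation(fixed, moving, search=5):
    """Brute-force search for the integer (dy, dx) shift of `moving`
    that best aligns it with `fixed` under NCC."""
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            score = ncc(fixed, shifted)
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift, best

# Synthetic demo: recover a simulated rigid in-plane motion of (-2, 3).
rng = np.random.default_rng(0)
fixed = rng.normal(0, 0.05, (64, 64))
fixed[20:30, 30:40] += 1.0                               # bright "anatomy" feature
moving = np.roll(np.roll(fixed, -2, axis=0), 3, axis=1)  # simulated motion
shift, score = register_translation(fixed, moving)
print(shift)  # (2, -3): the correction that undoes the simulated motion
```

A real 2D-3D method additionally optimizes out-of-plane translation and rotation against a reconstructed 3D volume and uses a continuous optimizer rather than exhaustive search; this sketch only shows the similarity-metric-driven search at the core of such approaches.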

    Improving needle visibility in LED-based photoacoustic imaging using deep learning with semi-synthetic datasets

    Photoacoustic imaging has shown great potential for guiding minimally invasive procedures by accurately identifying critical tissue targets and invasive medical devices (such as metallic needles). The use of light-emitting diodes (LEDs) as excitation light sources accelerates clinical translation owing to their affordability and portability. However, needle visibility in LED-based photoacoustic imaging is compromised, primarily because of the low optical fluence these sources deliver. In this work, we propose a deep learning framework based on U-Net to improve the visibility of clinical metallic needles with an LED-based photoacoustic and ultrasound imaging system. To address the difficulty of capturing ground truth for real data and the limited realism of purely simulated data, the framework generates semi-synthetic training datasets that combine simulated data representing needle features with in vivo measurements for the tissue background. The trained neural network was evaluated with needle insertions into blood-vessel-mimicking phantoms and ex vivo pork joint tissue, and with measurements on human volunteers. Compared to conventional reconstruction, this deep learning-based framework substantially improved needle visibility in photoacoustic imaging in vivo by suppressing background noise and image artefacts, achieving 5.8- and 4.5-fold improvements in signal-to-noise ratio and modified Hausdorff distance, respectively. The proposed framework could thus help reduce complications during percutaneous needle insertions by accurately identifying clinical needles in photoacoustic imaging.
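The modified Hausdorff distance used as a localization metric here (following Dubuisson and Jain) replaces the maximum in the classic Hausdorff distance with a mean over nearest-neighbour distances, making it less sensitive to outlier points. A minimal sketch with a toy needle-axis example (not the paper's evaluation code):

```python
import numpy as np

def modified_hausdorff(A, B):
    """Modified Hausdorff distance between point sets A (N,2) and B (M,2):
    the larger of the two directed mean nearest-neighbour distances."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    # Pairwise Euclidean distances, shape (N, M).
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    d_ab = d.min(axis=1).mean()  # mean distance from each point of A to B
    d_ba = d.min(axis=0).mean()  # mean distance from each point of B to A
    return max(d_ab, d_ba)

# Toy example: a segmented needle axis offset from ground truth by 1 pixel.
truth = np.stack([np.arange(10), np.zeros(10)], axis=1)
seg = truth + np.array([0.0, 1.0])     # uniform 1-pixel lateral offset
print(modified_hausdorff(truth, seg))  # 1.0
```

The "4.5-fold improvement" reported above would correspond to the deep-learning segmentation yielding a 4.5× smaller value of this metric against ground truth than conventional reconstruction.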

    Development and Validation of Mechatronic Systems for Image-Guided Needle Interventions and Point-of-Care Breast Cancer Screening with Ultrasound (2D and 3D) and Positron Emission Mammography

    Successful intervention in breast cancer relies on effective early detection and definitive diagnosis. While conventional screening mammography has substantially reduced breast cancer-related mortality, substantial challenges persist for women with dense breasts. Additionally, complex interrelated risk factors and healthcare disparities contribute to breast cancer-related inequities that restrict accessibility, impose cost constraints, and reduce inclusivity in high-quality healthcare. These limitations predominantly stem from the inadequate sensitivity and clinical utility of currently available approaches in increased-risk populations, including women with dense breasts and underserved and vulnerable populations. This PhD dissertation describes the development and validation of alternative, cost-effective, robust, and high-resolution systems for point-of-care (POC) breast cancer screening and image-guided needle interventions. Specifically, 2D and 3D ultrasound (US) and positron emission mammography (PEM) were employed to improve detection independent of breast density, in conjunction with mechatronic and automated approaches for accurate image acquisition and a precise interventional workflow. First, a mechatronic guidance system for US-guided biopsy under high-resolution PEM localization was developed to improve spatial sampling of early-stage breast cancers; validation and phantom studies showed accurate needle positioning and 3D spatial sampling under simulated PEM localization. Subsequently, a whole-breast spatially-tracked 3DUS system for point-of-care screening was developed, optimized, and validated within a clinically relevant workspace and in healthy volunteer studies. To improve robust image acquisition and adaptability to diverse patient populations, an alternative, cost-effective, portable, and patient-dedicated 3D automated breast (AB) US system for point-of-care screening was developed. Validation showed accurate geometric reconstruction, a feasible clinical workflow, and proof-of-concept utility across healthy volunteers and acquisition conditions. Lastly, an orthogonal acquisition and 3D complementary breast (CB) US generation approach was described and experimentally validated to improve spatial resolution uniformity by recovering poor out-of-plane resolution. The systems developed throughout this dissertation show promise as alternative, cost-effective, robust, and high-resolution approaches for improving early detection and definitive diagnosis. These contributions may advance breast cancer-related equity and improve outcomes in increased-risk populations and limited-resource settings.