    Validation Strategies Supporting Clinical Integration of Prostate Segmentation Algorithms for Magnetic Resonance Imaging

    Segmentation of the prostate in medical images is useful for prostate cancer diagnosis and therapy guidance. However, manual segmentation of the prostate is laborious and time-consuming, and subject to inter-observer variability. The focus of this thesis was on measuring accuracy, reproducibility and procedure time for prostate segmentation on T2-weighted endorectal magnetic resonance imaging, and on assessing the potential of a computer-assisted segmentation technique to be translated to clinical practice for prostate cancer management. We collected an image data set from prostate cancer patients with manually delineated prostate borders from one observer on all the images and from two other observers on a subset of images. We used a complementary set of error metrics to measure the different types of observed segmentation errors. We compared expert manual segmentation as well as semi-automatic and automatic segmentation approaches before and after manual editing by expert physicians. We recorded the time needed for user interaction to initialize the semi-automatic algorithm, for algorithm execution, and for manual editing as necessary. The measured errors for the algorithms compared favourably with the observed differences between manual segmentations. The measured average editing times for computer-assisted segmentation were lower than the fully manual segmentation time, and the algorithms reduced inter-observer variability compared to manual segmentation. The accuracy of the computer-assisted approaches was near to or within the range of observed variability in manual segmentation, and the recorded procedure time for prostate segmentation was reduced using computer-assisted segmentation followed by manual editing, compared to the time required for fully manual segmentation.
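    The abstract does not list the specific error metrics used; as an illustration of the kind of complementary metrics commonly applied to prostate segmentation evaluation, the sketch below computes a volumetric overlap (Dice similarity coefficient) and a signed volume difference between two binary masks. The function names and toy masks are placeholders, not the thesis implementation.

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Volumetric overlap between two binary prostate masks (1.0 = identical)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

def volume_difference_pct(mask_a: np.ndarray, mask_b: np.ndarray, voxel_volume_mm3: float) -> float:
    """Signed percent volume difference of mask_a relative to mask_b."""
    va = mask_a.astype(bool).sum() * voxel_volume_mm3
    vb = mask_b.astype(bool).sum() * voxel_volume_mm3
    return 100.0 * (va - vb) / vb

# Toy example: compare an algorithm mask against one observer's manual mask.
algo = np.zeros((64, 64, 32), dtype=bool)
algo[20:44, 20:44, 10:22] = True
manual = np.zeros_like(algo)
manual[22:46, 21:45, 11:23] = True
print(dice_coefficient(algo, manual), volume_difference_pct(algo, manual, 0.5 ** 3))
```

    Overlap metrics such as Dice are insensitive to where on the boundary errors occur, which is why they are typically paired with boundary-distance or volume-based measures.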

    Software and Hardware-based Tools for Improving Ultrasound Guided Prostate Brachytherapy

    Minimally invasive procedures for prostate cancer diagnosis and treatment, including biopsy and brachytherapy, rely on medical imaging such as two-dimensional (2D) and three-dimensional (3D) transrectal ultrasound (TRUS) and magnetic resonance imaging (MRI) for critical tasks such as target definition and diagnosis, treatment guidance, and treatment planning. Use of these imaging modalities introduces challenges, including time-consuming manual prostate segmentation, poor needle tip visualization, and variable MR-US cognitive fusion. The objective of this thesis was to develop, validate, and implement software- and hardware-based tools specifically designed for minimally invasive prostate cancer procedures to overcome these challenges. First, a deep learning-based automatic 3D TRUS prostate segmentation algorithm was developed and evaluated using a diverse dataset of clinical images acquired during prostate biopsy and brachytherapy procedures. The algorithm significantly outperformed state-of-the-art fully 3D CNNs trained on the same dataset, and its segmentation time of 0.62 s was a significant reduction compared to manual segmentation. Next, the impact of dataset size, image quality, and image type on segmentation performance with this algorithm was examined. Segmentation accuracy was shown to plateau with as few as 1000 training images, supporting the use of deep learning approaches even when data are scarce, and the development of an image quality grading scale specific to 3D TRUS images will allow easier comparison between algorithms trained on different datasets. Third, a power Doppler (PD) US-based needle tip localization method was developed and validated in both phantom and clinical cases, demonstrating reduced tip error and variation for obstructed needles compared to conventional US. Finally, a surface-based MRI-3D TRUS deformable image registration algorithm was developed and implemented clinically, demonstrating improved registration accuracy compared to manual rigid registration and reduced variation compared to the current clinical standard of physician cognitive fusion. These generalizable and easy-to-implement tools have the potential to improve workflow efficiency and accuracy for minimally invasive prostate procedures.
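    The abstract does not detail the network architecture behind the 0.62 s segmentation time. As a minimal sketch of one common pattern for fast 3D US segmentation (a trained 2D network applied slice-wise and reassembled into a 3D mask, here batched into a single forward pass), the code below is illustrative only; `segment_volume_slicewise` and the placeholder convolutional layer standing in for a trained model are assumptions, not the thesis method.

```python
import torch

def segment_volume_slicewise(volume: torch.Tensor, net: torch.nn.Module) -> torch.Tensor:
    """Run a trained 2D segmentation network over each slice of a 3D US volume
    and stack the results into a 3D binary mask.

    volume: (D, H, W) float tensor, intensity-normalized.
    net:    2D network mapping (N, 1, H, W) -> (N, 1, H, W) logits.
    """
    with torch.no_grad():
        slices = volume.unsqueeze(1)          # (D, 1, H, W): treat slices as a batch
        logits = net(slices)                  # one forward pass for all slices
        mask = torch.sigmoid(logits) > 0.5    # threshold probabilities
    return mask.squeeze(1)                    # (D, H, W) binary mask

# Placeholder layer standing in for a trained U-Net-style 2D model.
net = torch.nn.Conv2d(1, 1, kernel_size=3, padding=1).eval()
volume = torch.rand(96, 128, 128)
mask = segment_volume_slicewise(volume, net)
print(mask.shape, mask.dtype)
```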

    A Novel System and Image Processing for Improving 3D Ultrasound-guided Interventional Cancer Procedures

    Image-guided medical interventions are diagnostic and therapeutic procedures that focus on minimizing surgical incisions for improving disease management and reducing patient burden relative to conventional techniques. Interventional approaches, such as biopsy, brachytherapy, and ablation procedures, have been used in the management of cancer for many anatomical regions, including the prostate and liver. Needles and needle-like tools are often used for achieving planned clinical outcomes, but the increased dependency on accurate targeting, guidance, and verification can limit the widespread adoption and clinical scope of these procedures. Image-guided interventions that incorporate 3D information intraoperatively have been shown to improve the accuracy and feasibility of these procedures, but clinical needs still exist for improving workflow and reducing physician variability with widely applicable, cost-conscious approaches. The objective of this thesis was to incorporate 3D ultrasound (US) imaging and image processing methods during image-guided cancer interventions in the prostate and liver to provide accessible, fast, and accurate approaches for clinical improvements. An automatic 2D-3D transrectal ultrasound (TRUS) registration algorithm was optimized and implemented in a 3D TRUS-guided system to provide continuous prostate motion corrections with sub-millimeter and sub-degree error in 36 ± 4 ms. An automatic and generalizable 3D TRUS prostate segmentation method was developed on a diverse clinical dataset of patient images from biopsy and brachytherapy procedures, achieving accuracy comparable to the gold standard with a computation time of 0.62 s. After validation of mechanical and image reconstruction accuracy, a novel 3D US system for focal liver tumor therapy was developed to guide therapy applicators with 4.27 ± 2.47 mm error. The need to verify applicator placement post-insertion motivated the development of a 3D US applicator segmentation approach, which was demonstrated to provide clinically feasible assessments in 0.246 ± 0.007 s. Lastly, a general needle and applicator tool segmentation algorithm was developed to provide accurate intraoperative and real-time insertion feedback for multiple anatomical locations during a variety of clinical interventional procedures. Clinical translation of these developed approaches has the potential to improve overall patient quality of life and outcomes by improving detection rates and reducing local cancer recurrence in patients with prostate and liver cancer.
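    As a conceptual illustration of the motion-correction idea only (the thesis algorithm registers 2D TRUS to a 3D volume with sub-millimeter, sub-degree error in 36 ms; the sketch below simplifies this to an in-plane 2D-2D rigid alignment), the code estimates a rotation and translation by maximizing normalized cross-correlation. The function names, similarity metric, and optimizer choice are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import affine_transform
from scipy.optimize import minimize

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two images (higher = more similar)."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def register_rigid_2d(fixed: np.ndarray, moving: np.ndarray) -> np.ndarray:
    """Estimate in-plane rotation (deg) and translation (px) aligning moving to fixed."""
    center = np.array(moving.shape) / 2.0

    def cost(params):
        theta, ty, tx = params
        c, s = np.cos(np.deg2rad(theta)), np.sin(np.deg2rad(theta))
        rot = np.array([[c, -s], [s, c]])
        offset = center - rot @ center + np.array([ty, tx])
        warped = affine_transform(moving, rot, offset=offset, order=1)
        return -ncc(fixed, warped)          # minimize negative similarity

    return minimize(cost, x0=[0.0, 0.0, 0.0], method="Powell").x

fixed = np.random.rand(128, 128)
moving = np.roll(fixed, shift=3, axis=1)    # simulated 3-pixel lateral motion
print(register_rigid_2d(fixed, moving))     # recovered (rotation, row, column) offsets
```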

    Image Guided Robots for Urology

    This dissertation addresses the development of medical image-guided robots and their applications in urology. Image-guided robots integrate medical image information with robotic precision to assist the planning and execution of image-guided interventions. Robots guided by two different imaging modalities, ultrasound and MRI, were developed. Ultrasound image-guided robots manipulate an ultrasound probe and a needle guide that are calibrated with respect to the robot for image-guided targeting. A calibration method was developed and verified through image-guided targeting experiments. Robotic manipulation of the calibrated probe allows acquisition of image slices at precise locations, which can be combined to generate a 3D ultrasound image. Software for 3D ultrasound image acquisition, processing, and segmentation was developed as part of the image-guided robot system. The feasibility of several image-guided intervention procedures using the ultrasound image-guided robot system was tested. The robot was used in a clinical trial of intraoperative transrectal ultrasound (TRUS) guided prostatectomy. The accuracy of TRUS-guided prostate biopsy using the robot was evaluated in a comparative study versus classic manual operation of the probe. Robot-controlled palpation and image processing methods were developed for ultrasound elastography imaging of the prostate. An ultrasound-to-CT image fusion method using the robot as a common reference was developed for percutaneous access of the kidney. MRI-guided robots were developed for transrectal and transperineal prostate biopsy. Extensive in-vitro tests were performed to ensure MRI compatibility and image-guided accuracy of the robots. The transrectal robot was evaluated in an animal study, and the transperineal robot is undergoing a clinical trial. The collection of methods and algorithms presented in this dissertation can contribute to the development of image-guided robots that may provide less invasive and more precise interventions in urology, interventional radiology, and other fields.
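    One way to picture how a calibrated probe enables image-guided targeting is as a chain of homogeneous transforms mapping a target selected in ultrasound image coordinates into robot coordinates. The sketch below is a hypothetical illustration; the transform values, frame names, and functions are placeholders, not calibration results from the dissertation.

```python
import numpy as np

def homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and 3-vector translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Hypothetical transforms (placeholder values):
# image -> probe frame (from calibration) and probe -> robot base (from kinematics).
T_probe_image = homogeneous(np.eye(3), np.array([0.0, 12.5, 40.0]))    # mm
T_base_probe = homogeneous(np.eye(3), np.array([150.0, 0.0, 80.0]))    # mm

def image_to_robot(p_image_mm: np.ndarray) -> np.ndarray:
    """Map a target picked in ultrasound image coordinates to robot base coordinates."""
    p = np.append(p_image_mm, 1.0)                   # homogeneous point
    return (T_base_probe @ T_probe_image @ p)[:3]

print(image_to_robot(np.array([5.0, 10.0, 30.0])))   # target in robot coordinates (mm)
```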

    New Mechatronic Systems for the Diagnosis and Treatment of Cancer

    Both two-dimensional (2D) and three-dimensional (3D) imaging modalities are useful tools for viewing the internal anatomy. Three-dimensional imaging techniques are required for accurate targeting of needles, which improves efficiency and control over the intervention because the high temporal resolution of medical images can be used to validate the locations of the needle and target in real time. Relying on imaging alone, however, means the intervention is still operator dependent because of the difficulty of controlling the location of the needle within the image. The objective of this thesis is to improve the accuracy and repeatability of needle-based interventions over conventional manual and automated techniques, which in turn helps minimize the invasiveness of the procedure. In this thesis, I propose that by combining the remote center of motion concept with spherical linkage components in a passive or semi-automated device, the physician will have a useful tracking and guidance system at their disposal in a package that is less threatening to both the patient and physician than a robot. This design concept offers both the manipulative transparency of a freehand system and the tremor reduction through scaling currently offered in automated systems. In addressing each objective of this thesis, a number of novel mechanical designs incorporating a remote center of motion architecture with varying degrees of freedom are presented. Each of these designs can be deployed in a variety of imaging modalities and clinical applications, ranging from preclinical to human interventions, with control accuracy in the millimeter to sub-millimeter range.
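    The remote center of motion (RCM) concept constrains the needle axis to always pass through a fixed pivot point (for example, the skin entry point), so only the needle's orientation and insertion depth change. A minimal geometric sketch of this constraint, with hypothetical function names, angle conventions, and values:

```python
import numpy as np

def needle_tip(rcm: np.ndarray, yaw_deg: float, pitch_deg: float, depth_mm: float) -> np.ndarray:
    """Tip position of a needle pivoting about a fixed remote center of motion (RCM).

    The needle axis always passes through `rcm`; only its orientation (yaw, pitch)
    and the insertion depth beyond the RCM change, so the entry point stays fixed.
    """
    yaw, pitch = np.deg2rad(yaw_deg), np.deg2rad(pitch_deg)
    direction = np.array([np.cos(pitch) * np.cos(yaw),
                          np.cos(pitch) * np.sin(yaw),
                          -np.sin(pitch)])            # unit vector along the needle
    return rcm + depth_mm * direction

rcm_point = np.array([0.0, 0.0, 0.0])                 # e.g., a perineal entry point (mm)
print(needle_tip(rcm_point, yaw_deg=10.0, pitch_deg=5.0, depth_mm=60.0))
```

    Whether the linkage is passive or semi-automated, the mechanism enforces this pivot constraint mechanically rather than relying on the operator to hold the entry point steady.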

    Image Processing and Analysis for Preclinical and Clinical Applications

    Radiomics is one of the most successful branches of research in the field of image processing and analysis, as it provides valuable quantitative information for personalized medicine. It has the potential to discover features of disease that cannot be appreciated with the naked eye in both preclinical and clinical studies. In general, all quantitative approaches based on biomedical images, such as positron emission tomography (PET), computed tomography (CT) and magnetic resonance imaging (MRI), have a positive clinical impact in the detection of biological processes and diseases as well as in predicting response to treatment. This Special Issue, “Image Processing and Analysis for Preclinical and Clinical Applications”, addresses some gaps in this field to improve the quality of research in clinical and preclinical environments. It consists of fourteen peer-reviewed papers covering a range of topics and applications related to biomedical image processing and analysis.
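    As a small illustration of the kind of quantitative information radiomics extracts from PET, CT, or MRI, the sketch below computes a few first-order features (mean, standard deviation, skewness, histogram entropy) from the voxels inside a region of interest. The feature set, helper names, and synthetic data are illustrative assumptions, not drawn from any paper in the Special Issue.

```python
import numpy as np

def first_order_features(image: np.ndarray, roi_mask: np.ndarray) -> dict:
    """A few first-order radiomic features from voxels inside a region of interest."""
    voxels = image[roi_mask.astype(bool)].astype(np.float64)
    hist, _ = np.histogram(voxels, bins=32)
    p = hist / hist.sum()
    p = p[p > 0]
    return {
        "mean": voxels.mean(),
        "std": voxels.std(),
        "skewness": ((voxels - voxels.mean()) ** 3).mean() / voxels.std() ** 3,
        "entropy": float(-(p * np.log2(p)).sum()),   # Shannon entropy of the intensity histogram
    }

# Toy example: a synthetic volume with a spherical region of interest.
img = np.random.rand(32, 32, 32)
zz, yy, xx = np.ogrid[:32, :32, :32]
roi = (zz - 16) ** 2 + (yy - 16) ** 2 + (xx - 16) ** 2 < 8 ** 2
print(first_order_features(img, roi))
```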

    Erectile dysfunction after external beam radiotherapy for prostate cancer
