
    The impact of a radiologist-led workshop on MRI target volume delineation for radiotherapy

    Introduction: Magnetic resonance imaging (MRI) is increasingly used for target volume delineation in radiotherapy due to its superior soft tissue visualisation compared with computed tomography (CT). The aim of this study was to assess the impact of a radiologist-led workshop on inter-observer variability in volume delineation on MRI. Methods: Data from three separate studies evaluating the impact of MRI in lung, breast and cervix cancer were collated. At pre-workshop evaluation, observers involved with each clinical site were instructed to delineate specified volumes. Radiologists specialising in each cancer site then conducted an interactive workshop on image interpretation and anatomy for that clinical site. At post-workshop evaluation, observers repeated the delineation a minimum of 2 weeks after the workshop. Inter-observer variability was evaluated using the Dice similarity coefficient (DSC) and volume similarity (VOLSIM) index, comparing reference and observer volumes. Results: Post-workshop primary gross tumour volumes (GTV) were smaller than pre-workshop volumes for lung, with a mean percentage reduction of 10.4%. Breast clinical target volumes (CTV) were similar, but seroma volumes were smaller post-workshop on both supine (65% reduction) and prone MRI (73% reduction). Based on DSC scores, inter-observer variability improved for the seroma cavity volume on prone MRI, with the DSC range narrowing from 0.4-0.8 to 0.7-0.9. Breast CTV showed good inter-observer agreement (mean DSC 0.9) both pre- and post-workshop. The post-workshop observer-delineated cervix GTV was 26.9% smaller than the pre-workshop volume. Conclusion: A radiologist-led workshop did not significantly reduce inter-observer variability in volume delineation for the three clinical sites; however, some improvement was noted in delineation of breast CTV, seroma volumes and cervix GTV.
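
    As a concrete illustration of the two agreement metrics reported above, the following is a minimal sketch (not the study's code) that computes a Dice similarity coefficient and a volume-similarity-style index from two binary delineation masks with NumPy; the exact VOLSIM formulation used in the study may differ.

```python
import numpy as np

def dice_similarity(reference: np.ndarray, observer: np.ndarray) -> float:
    """DSC = 2|A ∩ B| / (|A| + |B|); 1.0 indicates perfect overlap."""
    intersection = np.logical_and(reference, observer).sum()
    return 2.0 * intersection / (reference.sum() + observer.sum())

def volume_similarity(reference: np.ndarray, observer: np.ndarray) -> float:
    """A common volume similarity index: 1 - |V_A - V_B| / (V_A + V_B)."""
    v_ref, v_obs = float(reference.sum()), float(observer.sum())
    return 1.0 - abs(v_ref - v_obs) / (v_ref + v_obs)

# Toy example: two overlapping cubic "delineations" on a 3D voxel grid.
ref = np.zeros((10, 10, 10), dtype=bool); ref[2:8, 2:8, 2:8] = True
obs = np.zeros((10, 10, 10), dtype=bool); obs[3:8, 2:8, 2:8] = True
print(dice_similarity(ref, obs), volume_similarity(ref, obs))
```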

    Techniques and software tool for 3D multimodality medical image segmentation

    The era of noninvasive diagnostic radiology and image-guided radiotherapy has witnessed burgeoning interest in applying different imaging modalities to stage and localize complex diseases such as atherosclerosis or cancer. It has been observed that using complementary information from multimodality images often significantly improves the robustness and accuracy of target volume definition in radiotherapy treatment of cancer. In this work, we present techniques and an interactive software tool to support this framework for 3D multimodality medical image segmentation. To demonstrate the methodology, we designed and developed a dedicated open-source software tool for multimodality image analysis, MIASYS. The tool aims to provide a needed solution for 3D image segmentation by integrating automatic algorithms, manual contouring methods, image pre-processing filters, post-processing procedures, user-interactive features, and evaluation metrics. The presented methods and the accompanying software tool have been successfully evaluated for different radiation therapy and diagnostic radiology applications.
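
    To make the idea of combining complementary modalities concrete, here is a purely illustrative sketch (not one of MIASYS's actual algorithms): a per-modality intensity threshold followed by a voxelwise intersection of two co-registered volumes. The threshold values and array names are assumptions.

```python
import numpy as np

def multimodality_threshold(ct: np.ndarray, pet: np.ndarray,
                            ct_min: float = 0.0, pet_min: float = 2.5) -> np.ndarray:
    """Return a binary target mask where both modalities agree."""
    ct_mask = ct > ct_min      # e.g. exclude air/lung on CT (HU)
    pet_mask = pet > pet_min   # e.g. an SUV threshold on PET
    return np.logical_and(ct_mask, pet_mask)

# Toy example with random, already co-registered volumes.
ct = np.random.uniform(-1000, 200, (64, 64, 32))
pet = np.random.uniform(0, 8, (64, 64, 32))
mask = multimodality_threshold(ct, pet)
print(mask.sum(), "voxels selected")
```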

    Artificial Intelligence in Radiation Therapy

    Artificial intelligence (AI) has great potential to transform the clinical workflow of radiotherapy. Since the introduction of deep neural networks, many AI-based methods have been proposed to address challenges in different aspects of radiotherapy. Commercial vendors have started to release AI-based tools that can be readily integrated into the established clinical workflow. To show the recent progress in AI-aided radiotherapy, we have reviewed AI-based studies in five major aspects of radiotherapy: image reconstruction, image registration, image segmentation, image synthesis, and automatic treatment planning. In each section, we summarized and categorized the recently published methods, followed by a discussion of the challenges, concerns, and future development. Given the rapid development of AI-aided radiotherapy, the efficiency and effectiveness of radiotherapy in the future could be substantially improved through intelligent automation of its various aspects.
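
    As a small, hedged illustration of the segmentation aspect discussed in the review (assuming PyTorch; the architecture and sizes are toy choices, not any specific published model), the sketch below defines a tiny encoder-decoder network and runs one training step on synthetic data.

```python
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Toy 2D encoder-decoder for binary organ segmentation."""
    def __init__(self, in_channels: int = 1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # halve spatial resolution
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),                  # per-pixel logit
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# One training step on a synthetic image/mask pair.
net = TinySegNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
image = torch.randn(1, 1, 128, 128)               # stand-in for a CT slice
mask = (torch.rand(1, 1, 128, 128) > 0.5).float() # stand-in for a contour mask
opt.zero_grad()
loss = nn.functional.binary_cross_entropy_with_logits(net(image), mask)
loss.backward()
opt.step()
```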

    Segment Anything Model (SAM) for Radiation Oncology

    In this study, we evaluate the performance of the Segment Anything Model (SAM) in clinical radiotherapy. We collected real clinical cases from four regions at the Mayo Clinic: prostate, lung, gastrointestinal, and head & neck, which are typical treatment sites in radiation oncology. For each case, we selected the organs at risk (OARs) of concern in radiotherapy planning and compared the Dice and Jaccard outcomes between clinical manual delineation, automatic segmentation using SAM's "segment anything" mode, and automatic segmentation using SAM with a box prompt. Our results indicate that SAM performs better in automatic segmentation for the prostate and lung regions, while its performance in the gastrointestinal and head & neck regions was relatively inferior. When considering the size of the organ and the clarity of its boundary, SAM displays better performance for larger organs with clear boundaries, such as the lung and liver, and worse performance for smaller organs with unclear boundaries, such as the parotid and cochlea. These findings align with the generally accepted variation in the difficulty of manual delineation of different organs at different sites in clinical radiotherapy. Given that SAM, a single trained model, could handle the delineation of OARs in four regions, these results also demonstrate SAM's robust generalization capabilities in automatic segmentation for radiotherapy, i.e., achieving delineation of different radiotherapy OARs with a generic automatic segmentation model. SAM's generalization across different regions makes it technically feasible to develop a generic model for automatic segmentation in radiotherapy.
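
    For reference, a hedged sketch of how a single organ could be segmented with SAM's box-prompt mode as described above (this assumes the open-source segment_anything package and a downloaded ViT-H checkpoint; the image and box coordinates are placeholders, not Mayo Clinic data):

```python
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

# Load SAM with the publicly released ViT-H weights (path is a placeholder).
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

# SAM expects an HxWx3 uint8 RGB image; a CT slice would be windowed and
# replicated across channels before this step.
image = np.zeros((512, 512, 3), dtype=np.uint8)
predictor.set_image(image)

# Loose bounding box around the organ of interest, in (x0, y0, x1, y1) pixels.
box = np.array([100, 150, 300, 400])
masks, scores, _ = predictor.predict(box=box, multimask_output=False)
print(masks.shape)  # (1, 512, 512) boolean mask for the prompted organ
```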

    Validation Strategies Supporting Clinical Integration of Prostate Segmentation Algorithms for Magnetic Resonance Imaging

    Segmentation of the prostate in medical images is useful for prostate cancer diagnosis and therapy guidance. However, manual segmentation of the prostate is laborious and time-consuming, and subject to inter-observer variability. The focus of this thesis was on measuring the accuracy, reproducibility and procedure time of prostate segmentation on T2-weighted endorectal magnetic resonance imaging, and on assessing the potential of a computer-assisted segmentation technique to be translated to clinical practice for prostate cancer management. We collected an image data set from prostate cancer patients, with prostate borders manually delineated by one observer on all images and by two other observers on a subset of images. We used a complementary set of error metrics to measure the different types of observed segmentation errors. We compared expert manual segmentation as well as semi-automatic and automatic segmentation approaches before and after manual editing by expert physicians. We recorded the time needed for user interaction to initialize the semi-automatic algorithm, for algorithm execution, and for manual editing as necessary. The measured errors for the algorithms compared favourably with the observed differences between manual segmentations. The measured average editing times for computer-assisted segmentation were lower than the fully manual segmentation time, and the algorithms reduced inter-observer variability compared to manual segmentation. The accuracy of the computer-assisted approaches was close to or within the range of observed variability in manual segmentation. The recorded procedure time for prostate segmentation was reduced using computer-assisted segmentation followed by manual editing, compared to the time required for fully manual segmentation.
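
    One example of a boundary-based error metric of the kind such a complementary set could include is the mean absolute surface distance; a minimal sketch (assuming NumPy and SciPy, and measuring distances in voxels rather than millimetres) is shown below.

```python
import numpy as np
from scipy import ndimage

def surface(mask: np.ndarray) -> np.ndarray:
    """Boundary voxels of a binary mask (the mask minus its erosion)."""
    return np.logical_and(mask, ~ndimage.binary_erosion(mask))

def mean_absolute_surface_distance(reference: np.ndarray,
                                   observer: np.ndarray) -> float:
    # Distance from every voxel to the nearest reference-surface voxel.
    dist_to_ref = ndimage.distance_transform_edt(~surface(reference))
    return float(dist_to_ref[surface(observer)].mean())

# Toy example: two slightly different cubic "prostates".
ref = np.zeros((40, 40, 40), dtype=bool); ref[10:30, 10:30, 10:30] = True
obs = np.zeros((40, 40, 40), dtype=bool); obs[12:30, 10:30, 10:30] = True
print(mean_absolute_surface_distance(ref, obs))
```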

    Modular framework for a breast biopsy smart navigation system

    Master's dissertation in Informatics Engineering. Breast cancer is currently one of the most commonly diagnosed cancers and the fifth leading cause of cancer-related deaths. Its treatment has a higher survivorship rate when the disease is diagnosed in its early stages. The screening procedure uses medical imaging techniques, such as mammography or ultrasound, to discover possible lesions. When a physician finds a lesion that is likely to be malignant, a biopsy is performed to obtain a sample and determine its characteristics. Currently, real-time ultrasound is the preferred medical imaging modality for this procedure. The breast biopsy procedure is highly reliant on the operator's skill and experience, owing to the difficulty of interpreting ultrasound images and correctly aiming the needle. Robotic solutions, together with automatic lesion segmentation in ultrasound imaging and advanced visualization techniques such as augmented reality, can potentially make this process simpler, safer, and faster. The OncoNavigator project, of which this dissertation is part, aims to improve the precision of current breast cancer interventions. To accomplish this objective, various medical training and robotic biopsy aids were developed. An augmented reality ultrasound training solution was created, and the device's tracking capabilities were validated by comparison with an electromagnetic tracking device. Another solution for ultrasound-guided breast biopsy assisted by augmented reality was developed; it displays real-time ultrasound video, automatic lesion segmentation, and the biopsy needle trajectory in the user's field of view. This solution was validated by comparing its usability with the traditional procedure. A modular software framework was also developed, focused on integrating a collaborative medical robot with real-time ultrasound imaging and automatic lesion segmentation. Overall, the developed solutions offered good results: the augmented reality glasses' tracking proved as capable as the electromagnetic system, and the augmented reality assisted breast biopsy made the procedure more accurate and precise than the traditional approach.
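
    A purely illustrative sketch of the kind of module boundaries such a framework could expose is shown below; the class and method names are hypothetical, not the dissertation's actual API, and the navigation step simply aims a needle guide at the segmented lesion centroid.

```python
from abc import ABC, abstractmethod
import numpy as np

class UltrasoundSource(ABC):
    @abstractmethod
    def next_frame(self) -> np.ndarray: ...

class LesionSegmenter(ABC):
    @abstractmethod
    def segment(self, frame: np.ndarray) -> np.ndarray: ...

class RobotController(ABC):
    @abstractmethod
    def aim_at(self, target_px: tuple) -> None: ...

def navigation_step(source: UltrasoundSource,
                    segmenter: LesionSegmenter,
                    robot: RobotController) -> None:
    """One pass of the acquire -> segment -> aim loop."""
    frame = source.next_frame()
    mask = segmenter.segment(frame)
    if mask.any():
        # Aim the needle guide at the lesion centroid (pixel coordinates).
        ys, xs = np.nonzero(mask)
        robot.aim_at((int(xs.mean()), int(ys.mean())))
```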

    Improving Radiotherapy Targeting for Cancer Treatment Through Space and Time

    Radiotherapy is a common medical treatment in which lethal doses of ionizing radiation are preferentially delivered to cancerous tumors. In external beam radiotherapy, radiation is delivered by a remote source which sits several feet from the patient's surface. Although great effort is taken in properly aligning the target to the path of the radiation beam, positional uncertainties and other errors can compromise targeting accuracy. Such errors can lead to a failure to treat the target and can inflict significant toxicity on healthy tissues that are inadvertently exposed to high radiation doses. Tracking the movement of targeted anatomy between and during treatment fractions provides valuable localization information that allows these positional uncertainties to be reduced. Inter- and intra-fraction anatomical localization data not only allow for more accurate treatment setup, but also potentially allow for 1) retrospective treatment evaluation, 2) margin reduction and modification of the dose distribution to accommodate daily anatomical changes (called 'adaptive radiotherapy'), and 3) targeting interventions during treatment (for example, suspending radiation delivery while the target is outside the path of the beam). The research presented here investigates the use of inter- and intra-fraction localization technologies to improve radiotherapy targeting through enhanced spatial and temporal accuracy. These technologies provide significant advancements in cancer treatment compared to standard clinical technologies. Furthermore, work is presented on the use of localization data acquired from these technologies in adaptive treatment planning, an investigational technique in which the planned dose distribution is modified during the course of treatment based on biological and/or geometrical changes in the patient's anatomy. The focus of this research is on abdominal sites, which have historically been central to the problem of motion management in radiation therapy.
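
    The beam-suspension idea mentioned above (gating) can be stated very compactly; the following toy sketch, in which the 3 mm gating window is an assumed value, enables the beam only while the tracked target stays within that window of its planned position.

```python
import numpy as np

def beam_enabled(target_pos_mm: np.ndarray,
                 planned_pos_mm: np.ndarray,
                 gating_window_mm: float = 3.0) -> bool:
    """Return True if the tracked target lies within the gating window."""
    displacement = np.linalg.norm(target_pos_mm - planned_pos_mm)
    return displacement <= gating_window_mm

# Example: a 4 mm superior-inferior excursion suspends the beam.
print(beam_enabled(np.array([0.0, 0.0, 4.0]), np.zeros(3)))  # False
```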

    Tools for improving high-dose-rate prostate cancer brachytherapy using three-dimensional ultrasound and magnetic resonance imaging

    High-dose-rate brachytherapy (HDR-BT) is an interstitial technique for the treatment of intermediate- and high-risk localized prostate cancer that involves placement of a radiation source directly inside the prostate using needles. Dose-escalated whole-gland treatments have led to improvements in survival, and tumour-targeted treatments may offer future improvements in therapeutic ratio. The efficacy of tumour-targeted HDR-BT depends on imaging tools that enable accurate dose delivery to prostate sub-volumes. This thesis focuses on implementing ultrasound tools to improve HDR-BT needle localization accuracy and efficiency, and on evaluating dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) for tumour localization. First, we implemented a device enabling sagittally-reconstructed 3D (SR3D) ultrasound, which provides sub-millimeter resolution in the needle insertion direction. We acquired SR3D and routine clinical images in a cohort of 12 consecutive eligible HDR-BT patients, with a total of 194 needles. The SR3D technique provided needle insertion depth errors within 5 mm for 93% of needles versus 76% for the clinical imaging technique, leading to increased precision in the dose delivered to the prostate. Second, we implemented an algorithm to automatically segment multiple HDR-BT needles in an SR3D image. The algorithm was applied to the SR3D images from the first patient cohort, demonstrating a mean execution time of 11.0 s per patient and successfully segmenting 82% of needles within 3 mm. Third, we augmented SR3D imaging with live-2D sagittal ultrasound for needle tip localization. This combined technique was applied to another cohort of 10 HDR-BT patients, reducing insertion depth errors compared to routine imaging from a range of [-8.1 mm, 7.7 mm] to [-6.2 mm, 5.9 mm]. Finally, we acquired DCE-MRI in 16 patients scheduled to undergo prostatectomy, using either high spatial resolution or high temporal resolution imaging, and compared the images to whole-mount histology. The high spatial resolution images demonstrated improved high-grade cancer classification compared to the high temporal resolution images, with areas under the receiver operating characteristic curve of 0.79 and 0.70, respectively. In conclusion, we have translated and evaluated specialized imaging tools for HDR-BT that are ready to be tested in a clinical trial investigating tumour-targeted treatment.
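
    As a hedged illustration of the receiver operating characteristic comparison reported for the two DCE-MRI protocols (assuming scikit-learn; the labels and classifier scores below are synthetic stand-ins, not the study's data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)               # 1 = high-grade cancer on histology
scores_spatial = labels + rng.normal(0, 0.9, 200)   # stand-in scores, high spatial resolution
scores_temporal = labels + rng.normal(0, 1.3, 200)  # stand-in scores, high temporal resolution

print("high spatial resolution AUC:", roc_auc_score(labels, scores_spatial))
print("high temporal resolution AUC:", roc_auc_score(labels, scores_temporal))
```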