61 research outputs found

    Learning Deep Similarity Metric for 3D MR-TRUS Registration

    Purpose: The fusion of transrectal ultrasound (TRUS) and magnetic resonance (MR) images for guiding targeted prostate biopsy has significantly improved the biopsy yield of aggressive cancers. A key component of MR-TRUS fusion is image registration. However, it is very challenging to obtain a robust automatic MR-TRUS registration due to the large appearance difference between the two imaging modalities. The work presented in this paper aims to tackle this problem by addressing two challenges: (i) the definition of a suitable similarity metric and (ii) the determination of a suitable optimization strategy. Methods: This work proposes the use of a deep convolutional neural network to learn a similarity metric for MR-TRUS registration. We also use a composite optimization strategy that explores the solution space in order to search for a suitable initialization for the second-order optimization of the learned metric. Further, a multi-pass approach is used in order to smooth the metric for optimization. Results: The learned similarity metric outperforms the classical mutual information and also the state-of-the-art MIND feature-based methods. The results indicate that the overall registration framework has a large capture range. The proposed deep similarity metric based approach obtained a mean TRE of 3.86 mm (with an initial TRE of 16 mm) for this challenging problem. Conclusion: A similarity metric that is learned using a deep neural network can be used to assess the quality of any given image registration and can be used in conjunction with the aforementioned optimization framework to perform automatic registration that is robust to poor initialization. Comment: To appear in IJCARS.
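
    Below is a minimal, hypothetical sketch of the core idea: a small 3D CNN that scores how well a fixed MR patch and a TRUS patch resampled under a candidate pose are aligned. The network name, layer sizes and the toy inputs are illustrative assumptions and do not reproduce the architecture or training procedure of the paper; the paper's exploration-then-second-order-refinement loop is only indicated in a comment.

        # Hedged sketch of a learned MR-TRUS similarity metric (assumed architecture).
        import torch
        import torch.nn as nn

        class SimilarityNet(nn.Module):
            """Scores how well an MR patch and a resampled TRUS patch are aligned."""
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv3d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool3d(2),
                    nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool3d(2),
                    nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool3d(1),
                )
                self.score = nn.Linear(64, 1)  # higher output = better alignment

            def forward(self, mr, trus):
                x = torch.cat([mr, trus], dim=1)  # stack the two modalities channel-wise
                return self.score(self.features(x).flatten(1)).squeeze(1)

        if __name__ == "__main__":
            net = SimilarityNet().eval()
            mr = torch.randn(1, 1, 32, 32, 32)    # fixed MR patch (toy data)
            trus = torch.randn(1, 1, 32, 32, 32)  # TRUS patch resampled under a candidate pose
            with torch.no_grad():
                print("similarity score:", net(mr, trus).item())
            # In a full registration framework, many candidate poses would be scored to find
            # an initialization, which is then refined with a second-order optimizer.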

    Meta-Learning Initializations for Interactive Medical Image Registration

    We present a meta-learning framework for interactive medical image registration. Our proposed framework comprises three components: a learning-based medical image registration algorithm, a form of user interaction that refines registration at inference, and a meta-learning protocol that learns a rapidly adaptable network initialization. This paper describes a specific algorithm that implements the registration, interaction and meta-learning protocol for our exemplar clinical application: registration of magnetic resonance (MR) imaging to interactively acquired, sparsely-sampled transrectal ultrasound (TRUS) images. Our approach obtains comparable registration error (4.26 mm) to the best-performing non-interactive learning-based 3D-to-3D method (3.97 mm) while requiring only a fraction of the data and running in real time during acquisition. Applying sparsely sampled data to non-interactive methods yields higher registration errors (6.26 mm), demonstrating the effectiveness of interactive MR-TRUS registration, which may be applied intraoperatively given the real-time nature of the adaptation process. Comment: 11 pages, 10 figures. Paper accepted to IEEE Transactions on Medical Imaging (October 26, 2022).
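
    The sketch below illustrates the general meta-learning pattern described above, i.e. learning an initialization that adapts quickly to a new case with a few gradient steps, using a simple first-order (Reptile-style) update on a toy network. The network, loss and data are stand-ins chosen for illustration and are not the registration model, loss or meta-learning protocol used in the paper.

        # Hedged first-order meta-learning sketch (Reptile-style); all sizes are assumptions.
        import copy
        import torch
        import torch.nn as nn

        def make_net():
            # Stand-in for a registration network (toy MLP, not the paper's model).
            return nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 6))

        def task_loss(net, x, y):
            # Stand-in for a per-case registration loss.
            return ((net(x) - y) ** 2).mean()

        def inner_adapt(net, x, y, steps=5, lr=1e-2):
            """A few gradient steps on one case, starting from the shared initialization."""
            adapted = copy.deepcopy(net)
            opt = torch.optim.SGD(adapted.parameters(), lr=lr)
            for _ in range(steps):
                opt.zero_grad()
                task_loss(adapted, x, y).backward()
                opt.step()
            return adapted

        meta_net, meta_lr = make_net(), 0.1
        for it in range(100):                                  # meta-training iterations
            x, y = torch.randn(32, 16), torch.randn(32, 6)     # synthetic "case" data
            adapted = inner_adapt(meta_net, x, y)
            with torch.no_grad():                              # Reptile meta-update:
                for p, q in zip(meta_net.parameters(), adapted.parameters()):
                    p += meta_lr * (q - p)                     # move init toward adapted weights
        # At inference, a few inner_adapt steps on sparse, interactively acquired data
        # would refine the registration for the current patient.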

    Real-time multimodal image registration with partial intraoperative point-set data

    We present Free Point Transformer (FPT) - a deep neural network architecture for non-rigid point-set registration. Consisting of two modules, a global feature extraction module and a point transformation module, FPT does not assume explicit constraints based on point vicinity, thereby overcoming a common requirement of previous learning-based point-set registration methods. FPT is designed to accept unordered and unstructured point-sets with a variable number of points and uses a "model-free" approach without heuristic constraints. Training FPT is flexible and involves minimizing an intuitive unsupervised loss function, but supervised, semi-supervised, and partially- or weakly-supervised training are also supported. This flexibility makes FPT amenable to multimodal image registration problems where the ground-truth deformations are difficult or impossible to measure. In this paper, we demonstrate the application of FPT to non-rigid registration of prostate magnetic resonance (MR) imaging and sparsely-sampled transrectal ultrasound (TRUS) images. The registration errors were 4.71 mm and 4.81 mm for complete TRUS imaging and sparsely-sampled TRUS imaging, respectively. The results indicate superior accuracy to the alternative rigid and non-rigid registration algorithms tested and substantially lower computation time. The rapid inference possible with FPT makes it particularly suitable for applications where real-time registration is beneficial.
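
    The following sketch shows the two-module pattern described above: a permutation-invariant global feature extractor and a per-point displacement module, trained with an unsupervised Chamfer-distance loss on unordered point sets of different sizes. Layer widths, feature dimensions and the training details are assumptions for illustration and do not reproduce the published FPT implementation.

        # Hedged sketch of a global-feature + point-transformation registration network.
        import torch
        import torch.nn as nn

        def chamfer(a, b):
            """Symmetric Chamfer distance between point sets a (B, N, 3) and b (B, M, 3)."""
            d = torch.cdist(a, b)                              # (B, N, M) pairwise distances
            return d.min(dim=2).values.mean() + d.min(dim=1).values.mean()

        class GlobalFeature(nn.Module):
            """Order-invariant global descriptor of an unstructured point set."""
            def __init__(self, dim=256):
                super().__init__()
                self.mlp = nn.Sequential(nn.Linear(3, 64), nn.ReLU(),
                                         nn.Linear(64, 128), nn.ReLU(),
                                         nn.Linear(128, dim))
            def forward(self, pts):                            # (B, N, 3) -> (B, dim)
                return self.mlp(pts).max(dim=1).values         # max-pool over points

        class PointTransform(nn.Module):
            """Displaces each source point, conditioned on both global descriptors."""
            def __init__(self, dim=256):
                super().__init__()
                self.mlp = nn.Sequential(nn.Linear(3 + 2 * dim, 256), nn.ReLU(),
                                         nn.Linear(256, 128), nn.ReLU(),
                                         nn.Linear(128, 3))
            def forward(self, src, f_src, f_tgt):
                f = torch.cat([f_src, f_tgt], dim=1)           # (B, 2*dim)
                f = f.unsqueeze(1).expand(-1, src.shape[1], -1)
                return src + self.mlp(torch.cat([src, f], dim=2))

        feat, transform = GlobalFeature(), PointTransform()
        opt = torch.optim.Adam(list(feat.parameters()) + list(transform.parameters()), lr=1e-3)
        src, tgt = torch.rand(4, 512, 3), torch.rand(4, 300, 3)  # unordered, variable-size sets
        for step in range(5):                                    # unsupervised training loop
            opt.zero_grad()
            loss = chamfer(transform(src, feat(src), feat(tgt)), tgt)
            loss.backward()
            opt.step()
        print("chamfer loss:", loss.item())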

    Toward optimization of target planning for magnetic resonance image-targeted, 3D transrectal ultrasound-guided fusion prostate biopsy

    The current clinical standard for diagnosis of prostate cancer (PCa) is 2D transrectal ultrasound (TRUS)-guided biopsy. However, this procedure has a false negative rate of 21-47% and therefore many patients return for repeat biopsies. A potential solution for improving upon this problem is “fusion” biopsy, where magnetic resonance imaging (MRI) is used for PCa detection and localization prior to biopsy. In this procedure, tumours are delineated on pre-procedural MRI and registered to the 3D TRUS needle guidance modality. However, fusion biopsy continues to yield false negative results and there remains a gap in knowledge regarding biopsy needle target selection. Within-tumour needle targets are currently chosen ad hoc by the operating clinician without accounting for guidance system and registration errors. The objective of this thesis was to investigate how the choice of target selection strategy and number of biopsy attempts made per lesion may affect PCa diagnosis in the presence of needle delivery error. A fusion prostate biopsy simulation software platform was developed, which allowed for the investigation of how needle delivery error affects PCa diagnosis and cancer burden estimation. Initial work was conducted using 3D lesions contoured on MRI by collaborating radiologists. The results indicated that more than one core must be taken from the majority of lesions to achieve a sampling probability of at least 95% for a biopsy system with needle delivery error ≄ 3.5 mm. Furthermore, it was observed that the optimal targeting scheme depends on the relative levels of systematic and random needle delivery errors inherent to the specific fusion biopsy system. Lastly, PCa tumours contoured on digital histology images by genitourinary pathologists were used to conduct biopsy simulations. The results demonstrated that needle delivery error has a substantial impact on the biopsy core involvement observed, and that targeting of high-grade lesions may result in higher core involvement variability compared with lesions of all grades. This work represents a first step toward improving the manner in which lesions are targeted using fusion biopsy. Successful integration of these findings into current fusion biopsy system operation could lead to earlier PCa diagnosis with the need for fewer repeat biopsy procedures.
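
    As a toy illustration of the kind of question studied in this thesis, the Monte Carlo sketch below estimates the probability that at least one of n cores aimed at a lesion centroid actually samples the lesion, under an assumed Gaussian systematic-plus-random needle delivery error model and a lesion idealised as a 0.5 cm³ sphere. The thesis itself uses patient-specific lesion contours and a full biopsy simulation platform; the error magnitudes below are arbitrary example values.

        # Toy Monte Carlo model of lesion sampling probability under needle delivery error.
        import numpy as np

        rng = np.random.default_rng(0)
        lesion_radius = (3 * 500.0 / (4 * np.pi)) ** (1 / 3)     # mm, sphere of 0.5 cm^3

        def prob_sampled(n_cores, sigma_sys, sigma_rand, n_trials=20000):
            """P(at least one core intersects the lesion) when all cores aim at the centroid."""
            hits = 0
            for _ in range(n_trials):
                bias = rng.normal(0.0, sigma_sys, size=2)        # per-procedure systematic error
                sampled = False
                for _ in range(n_cores):
                    offset = bias + rng.normal(0.0, sigma_rand, size=2)  # per-core random error
                    if np.linalg.norm(offset) < lesion_radius:   # needle axis passes through lesion
                        sampled = True
                hits += sampled
            return hits / n_trials

        for n in (1, 2, 3):
            print(n, "core(s):", round(prob_sampled(n, sigma_sys=2.5, sigma_rand=2.5), 3))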

    Deformable registration of X-ray and MRI for post-implant dosimetry in low-dose-rate prostate brachytherapy

    Purpose: Dosimetric assessment following permanent prostate brachytherapy (PPB) commonly involves seed localization using CT and prostate delineation using coregistered MRI. However, pelvic CT leads to additional imaging dose and requires significant resources to acquire and process both CT and MRI. In this study, we propose an automatic postimplant dosimetry approach that retains MRI for soft-tissue contouring, but eliminates the need for CT and reduces imaging dose while overcoming the inconsistent appearance of seeds on MRI with three projection X-rays acquired using a mobile C-arm. Methods: Implanted seeds are reconstructed using X-rays by solving a combinatorial optimization problem and deformably registered to MRI. Candidate seeds are located in MR images using local hypointensity identification. X-ray-based seeds are registered to these candidate seeds in three steps: (a) rigid registration using a stochastic evolutionary optimizer, (b) affine registration using an iterative closest point optimizer, and (c) deformable registration using a local feature point search and nonrigid coherent point drift. The algorithm was evaluated using 20 PPB patients with X-rays acquired immediately postimplant and T2-weighted MR images acquired the next day at 1.5 T with mean 0.8 × 0.8 × 3.0 mm voxel dimensions. Target registration error (TRE) was computed based on the distance from algorithm results to manually identified seed locations using coregistered CT acquired the same day as the MRI. Dosimetric accuracy was determined by comparing prostate D90 determined using the algorithm and the ground truth CT-based seed locations. Results: The mean ± standard deviation TREs across 20 patients including 1774 seeds were 2.23 ± 0.52 mm (rigid), 1.99 ± 0.49 mm (rigid + affine), and 1.76 ± 0.43 mm (rigid + affine + deformable). The corresponding mean ± standard deviation D90 errors were 5.8 ± 4.8%, 3.4 ± 3.4%, and 2.3 ± 1.9%, respectively. The mean computation time of the registration algorithm was 6.1 s. Conclusion: The registration algorithm accuracy and computation time are sufficient for clinical PPB postimplant dosimetry.
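
    A simplified stand-in for the point-set alignment at the heart of this pipeline is sketched below: a basic rigid iterative closest point (ICP) alignment of synthetically misaligned "x-ray" seed locations to "candidate" seed locations, followed by a TRE computation against known correspondences. The stochastic evolutionary rigid stage, affine ICP and nonrigid coherent point drift of the paper are not reproduced, and all data here are synthetic.

        # Hedged sketch: rigid ICP seed alignment and target registration error (TRE).
        import numpy as np

        def best_rigid(src, dst):
            """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch algorithm)."""
            cs, cd = src.mean(0), dst.mean(0)
            H = (src - cs).T @ (dst - cd)
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
            R = Vt.T @ D @ U.T
            return R, cd - R @ cs

        def icp_rigid(src, dst, iters=30):
            """Iterative closest point: match each seed to its nearest candidate, then re-fit."""
            cur = src.copy()
            for _ in range(iters):
                nn = np.argmin(((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1), axis=1)
                R, t = best_rigid(cur, dst[nn])
                cur = cur @ R.T + t
            return cur

        rng = np.random.default_rng(1)
        mri_seeds = rng.uniform(0, 40, size=(80, 3))                  # candidate seed locations (mm)
        a = np.deg2rad(8)
        Rz = np.array([[np.cos(a), -np.sin(a), 0],
                       [np.sin(a),  np.cos(a), 0],
                       [0, 0, 1]])
        xray_seeds = mri_seeds @ Rz.T + np.array([3.0, -2.0, 1.0])    # misaligned reconstruction
        aligned = icp_rigid(xray_seeds, mri_seeds)
        tre = np.linalg.norm(aligned - mri_seeds, axis=1)             # per-seed TRE
        print("mean TRE (mm):", round(tre.mean(), 2))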

    Image-based registration methods for quantification and compensation of prostate motion during trans-rectal ultrasound (TRUS)-guided biopsy

    Prostate biopsy is the clinical standard for cancer diagnosis and is typically performed under two-dimensional (2D) transrectal ultrasound (TRUS) for needle guidance. Unfortunately, most early stage prostate cancers are not visible on ultrasound and the procedure suffers from high false negative rates due to the lack of visible targets. Fusion of pre-biopsy MRI to 3D TRUS for targeted biopsy could improve cancer detection rates and volume of tumor sampled. In MRI-TRUS fusion biopsy systems, patient or prostate motion during the procedure causes misalignments in the MR targets mapped to the live 2D TRUS images, limiting the targeting accuracy of the biopsy system. In order to sample the smallest clinically significant tumours of 0.5 cm³ with 95% confidence, the root mean square (RMS) error of the biopsy system must be kept small. The target misalignments due to intermittent prostate motion during the procedure can be compensated by registering the live 2D TRUS images acquired during the biopsy procedure to the pre-acquired baseline 3D TRUS image. The registration must be performed both accurately and quickly in order to be useful during the clinical procedure. We developed an intensity-based 2D-3D rigid registration algorithm and validated it by calculating the target registration error (TRE) using manually identified fiducials within the prostate. We discuss two different approaches that can be used to improve the robustness of this registration to meet the clinical requirements. Firstly, we evaluated the impact of intra-procedural 3D TRUS imaging on motion compensation accuracy, since the limited anatomical context available in live 2D TRUS images could limit the robustness of the 2D-3D registration. The results indicated that TRE improved when intra-procedural 3D TRUS images were used in registration, with larger improvements in the base and apex regions as compared with the mid-gland region. Secondly, we developed and evaluated a registration algorithm whose optimization is based on learned prostate motion characteristics. Compared to our initial approach, the updated optimization improved the robustness during 2D-3D registration by reducing the number of registrations with a TRE > 5 mm from 9.2% to 1.2%, with an overall RMS TRE of 2.3 mm. The methods developed in this work were intended to improve the needle targeting accuracy of 3D TRUS-guided biopsy systems. The successful integration of the techniques into current 3D TRUS-guided systems could improve the overall cancer detection rate during the biopsy and help to achieve earlier diagnosis and fewer repeat biopsy procedures in prostate cancer diagnosis.
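
    The toy sketch below conveys the basic mechanism of intensity-based 2D-3D motion compensation: a live 2D frame is matched against re-sliced planes of a baseline 3D volume by maximising normalised cross-correlation (NCC) over a small search range. Only integer translations on synthetic data are searched here; the algorithms developed in this thesis optimise full rigid parameters and, in the second approach, constrain the optimization using learned prostate motion characteristics, neither of which is modelled in this sketch.

        # Toy intensity-based 2D-3D motion compensation by exhaustive NCC search.
        import numpy as np

        def ncc(a, b):
            """Normalised cross-correlation between two equally sized 2D images."""
            a = (a - a.mean()) / (a.std() + 1e-8)
            b = (b - b.mean()) / (b.std() + 1e-8)
            return float((a * b).mean())

        rng = np.random.default_rng(2)
        volume = rng.normal(size=(40, 128, 128))                 # baseline 3D TRUS (z, y, x)
        volume = np.cumsum(np.cumsum(volume, 1), 2)              # smooth-ish synthetic texture
        true_z, true_dy, true_dx = 21, 3, -2
        live2d = np.roll(volume[true_z], (true_dy, true_dx), (0, 1))  # simulated live 2D frame

        best_score, best_pose = -2.0, None
        for z in range(15, 26):                                  # out-of-plane slice search
            for dy in range(-5, 6):                              # in-plane translation search
                for dx in range(-5, 6):
                    s = ncc(np.roll(volume[z], (dy, dx), (0, 1)), live2d)
                    if s > best_score:
                        best_score, best_pose = s, (z, dy, dx)
        print("recovered (z, dy, dx):", best_pose, "true:", (true_z, true_dy, true_dx))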