148 research outputs found

    Medical image computing and computer-aided medical interventions applied to soft tissues. Work in progress in urology

    Until recently, Computer-Aided Medical Interventions (CAMI) and medical robotics have focused on rigid, non-deformable anatomical structures. Nowadays, special attention is paid to soft tissues, which raise complex issues due to their mobility and deformation. Minimally invasive digestive surgery was probably one of the first fields in which soft tissues were handled, through the development of simulators, the tracking of anatomical structures and dedicated assistance robots. However, other clinical domains, for instance urology, are also concerned. Indeed, laparoscopic surgery, new tumour destruction techniques (e.g. HIFU, radiofrequency or cryoablation), increasingly early detection of cancer, and the use of interventional and diagnostic imaging modalities have recently opened new challenges for urologists and the scientists involved in CAMI. Over the last five years, this has led to a very significant increase in research and development of computer-aided urology systems. In this paper, we describe the main problems related to computer-aided diagnosis and therapy of soft tissues and give a survey of the different types of assistance offered to the urologist: robotization, image fusion and surgical navigation. Both research projects and operational industrial systems are discussed.

    Adversarial Deformation Regularization for Training Image Registration Neural Networks

    We describe an adversarial learning approach to constrain convolutional neural network training for image registration, replacing the heuristic smoothness measures of displacement fields often used in these tasks. Using minimally invasive prostate cancer intervention as an example application, we demonstrate the feasibility of using biomechanical simulations to regularize a weakly-supervised, anatomical-label-driven registration network for aligning pre-procedural magnetic resonance (MR) and 3D intra-procedural transrectal ultrasound (TRUS) images. A discriminator network is optimized to distinguish the registration-predicted displacement fields from motion data simulated by finite element analysis. During training, the registration network simultaneously aims to maximize the similarity between anatomical labels, which drives image alignment, and to minimize an adversarial generator loss that measures the divergence between the predicted and simulated deformations. The end-to-end trained network enables efficient and fully automated registration that requires only an MR and TRUS image pair as input, without anatomical labels or simulated data at inference. 108 pairs of labelled MR and TRUS images from 76 prostate cancer patients and 71,500 nonlinear finite-element simulations from 143 different patients were used for this study. We show that, with only gland segmentation as training labels, the proposed method can help predict physically plausible deformation without any other smoothness penalty. Based on cross-validation experiments using 834 pairs of independent validation landmarks, the proposed adversarially regularized registration achieved a target registration error of 6.3 mm, significantly lower than that of several other regularization methods. Comment: Accepted to MICCAI 201
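
    To make the training objective concrete, the sketch below illustrates one way the adversarial regularization described above could be wired up. It is not the authors' code: `reg_net`, `disc_net`, `warp`, `dice_loss` and the weight `alpha` are hypothetical stand-ins for the paper's DDF-predicting registration network, displacement-field discriminator, differentiable warping layer, label similarity loss and loss weighting.

```python
# Illustrative sketch (not the authors' released implementation) of adversarial
# deformation regularization for a registration network.
import torch
import torch.nn.functional as F

def adversarial_training_step(reg_net, disc_net, warp, dice_loss,
                              mr, trus, mr_label, trus_label, fe_ddf,
                              opt_reg, opt_disc, alpha=0.1):
    # 1) Discriminator update: finite-element-simulated fields are "real",
    #    registration-predicted fields are "fake".
    pred_ddf = reg_net(mr, trus).detach()
    d_real, d_fake = disc_net(fe_ddf), disc_net(pred_ddf)
    loss_disc = (F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real)) +
                 F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))
    opt_disc.zero_grad()
    loss_disc.backward()
    opt_disc.step()

    # 2) Registration-network update: maximise label overlap after warping and
    #    minimise an adversarial generator loss so the predicted field looks
    #    plausible to the discriminator trained on FE-simulated motion.
    pred_ddf = reg_net(mr, trus)
    loss_label = dice_loss(warp(mr_label, pred_ddf), trus_label)
    d_fake = disc_net(pred_ddf)
    loss_adv = F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))
    loss_reg = loss_label + alpha * loss_adv   # alpha is a hypothetical weighting
    opt_reg.zero_grad()
    loss_reg.backward()
    opt_reg.step()
    return loss_reg.item(), loss_disc.item()
```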

    Biomechanical Modeling and Inverse Problem Based Elasticity Imaging for Prostate Cancer Diagnosis

    Early detection of prostate cancer plays an important role in successful prostate cancer treatment. This requires screening the prostate periodically after the age of 50. If screening tests raise suspicion of prostate cancer, a prostate needle biopsy is administered, which is still considered the clinical gold standard for prostate cancer diagnosis. Given that needle biopsy is invasive and associated with issues including discomfort and infection, it is desirable to develop a prostate cancer diagnosis system that has high sensitivity and specificity for early detection, with the potential to improve needle biopsy outcomes. Given the complexity and variability of prostate cancer pathologies, many research groups have been pursuing multi-parametric imaging approaches, as no single-modality imaging technique has proven adequate. While imaging additional tissue properties increases the chance of reliable prostate cancer detection and diagnosis, selecting an additional property needs to be done carefully by considering clinical acceptability and cost. Clinical acceptability entails ease of operation for the radiologist as well as patient comfort. In this work, effective tissue-biomechanics-based diagnostic techniques are proposed for prostate cancer assessment, with the aim of early detection and of minimizing the number of prostate biopsies. The techniques take advantage of the low-cost, widely available and well-established TRUS imaging method. The proposed techniques include novel elastography methods formulated within an inverse finite element framework. Conventional finite element analysis is known to have high computational complexity and is therefore time-consuming, which renders the proposed elastography methods unsuitable for real-time applications. To address this issue, an accelerated finite element method was proposed, which proved to be suitable for prostate elasticity reconstruction. In this method, accurate finite element analysis of a large number of prostates undergoing TRUS probe loading was performed. The geometry inputs, together with the displacement and stress field outputs obtained from the analysis, were used to train a neural network mapping function for elastography imaging of prostate cancer patients. The last part of the research presented in this thesis tackles an issue with current 3D TRUS prostate needle biopsy. Current 3D TRUS prostate needle biopsy systems require registering preoperative 3D TRUS to intra-operative 2D TRUS images. Such image registration is time-consuming, and a real-time implementation is yet to be developed. To bypass this registration step, the concept of a robotic system was proposed that can reliably determine the preoperative TRUS probe position relative to the prostate, so that the probe can be placed at the same position relative to the prostate intra-operatively. For this purpose, a contact pressure feedback system is proposed to ensure similar prostate deformation during 3D and 2D image acquisition, thereby bypassing the registration step.
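
    The accelerated finite element idea above amounts to training a regression model as a surrogate for the FE solver. The sketch below shows that idea under stated assumptions: the FE results are taken to have been exported as fixed-length arrays (`geometry_features.npy`, `fe_outputs.npy`), and the network size and file names are hypothetical, not taken from the thesis.

```python
# Minimal sketch of a neural-network surrogate for finite element analysis.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Hypothetical exports: one row per simulated case.
geometry_features = np.load("geometry_features.npy")  # (n_cases, n_geometry_params)
fe_outputs = np.load("fe_outputs.npy")                 # (n_cases, n_displacement_and_stress_values)

X_train, X_test, y_train, y_test = train_test_split(
    geometry_features, fe_outputs, test_size=0.2, random_state=0)

# The trained regressor replaces a full FE solve at run time, which is what
# makes the elasticity reconstruction fast enough for near-real-time use.
surrogate = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=1000, random_state=0)
surrogate.fit(X_train, y_train)
print("held-out R^2:", surrogate.score(X_test, y_test))
```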

    Label-driven weakly-supervised learning for multimodal deformable image registration

    Spatially aligning medical images from different modalities remains a challenging task, especially for intraoperative applications that require fast and robust algorithms. We propose a weakly-supervised, label-driven formulation for learning 3D voxel correspondence from higher-level label correspondence, thereby bypassing classical intensity-based image similarity measures. During training, a convolutional neural network is optimised to output a dense displacement field (DDF) that warps a set of available anatomical labels from the moving image to match their corresponding counterparts in the fixed image. These label pairs, including solid organs, ducts, vessels, point landmarks and other ad hoc structures, are required only at training time and can be spatially aligned by minimising a cross-entropy function of the warped moving label and the fixed label. During inference, the trained network takes a new image pair and predicts an optimal DDF, resulting in fully automatic, label-free, real-time, deformable registration. For interventional applications where a large global transformation prevails, we also propose a neural network architecture that jointly optimises the global and local displacements. Experimental results are presented based on cross-validating registrations of 111 pairs of T2-weighted magnetic resonance images and 3D transrectal ultrasound images from prostate cancer patients, with a total of over 4000 anatomical labels, yielding a median target registration error of 4.2 mm on landmark centroids and a median Dice of 0.88 on prostate glands. Comment: Accepted to ISBI 201
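
    The label-driven loss described above can be illustrated with a short sketch: a predicted DDF resamples the moving label, and the warped label is compared to the fixed label with cross-entropy. This is a simplification (a single binary label, with the DDF assumed to be expressed in normalised [-1, 1] grid coordinates), not the authors' implementation.

```python
# Sketch of a label-driven, weakly-supervised registration loss.
import torch
import torch.nn.functional as F

def warp_with_ddf(moving_label, ddf):
    # moving_label: (N, 1, D, H, W); ddf: (N, 3, D, H, W) as (x, y, z) offsets
    # in normalised [-1, 1] coordinates (assumption for this sketch).
    n, _, d, h, w = moving_label.shape
    zs, ys, xs = torch.meshgrid(torch.linspace(-1, 1, d),
                                torch.linspace(-1, 1, h),
                                torch.linspace(-1, 1, w), indexing="ij")
    identity = torch.stack((xs, ys, zs), dim=-1).unsqueeze(0)  # (1, D, H, W, 3)
    grid = identity.to(ddf) + ddf.permute(0, 2, 3, 4, 1)       # add displacements
    return F.grid_sample(moving_label, grid, mode="bilinear", align_corners=True)

def label_driven_loss(ddf, moving_label, fixed_label):
    # Both labels are float volumes in [0, 1]; only required during training.
    warped = warp_with_ddf(moving_label, ddf).clamp(1e-6, 1 - 1e-6)
    return F.binary_cross_entropy(warped, fixed_label)
```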

    Biomechanical modelling of the pelvic system: improving the accuracy of the location of neoplasms in MRI-TRUS fusion prostate biopsy

    Background: Accurate knowledge of the relocation of prostate neoplasms during biopsy is of great importance to reduce the number of false negative results. Prostate neoplasms are visible in magnetic resonance images (MRI), but it is difficult for the practitioner to locate them at the time of performing a transrectal ultrasound (TRUS) guided biopsy. In this study, we present a new methodology, based on simulation, that predicts both prostate deformation and lesion migration during the biopsy.
    Methods: A three-dimensional (3-D) anatomical model of the pelvic region, based on medical images, is constructed. A finite element (FE) numerical simulation of the motion and deformation of the organs under the pressure exerted by the TRUS probe is carried out using the Code-Aster open-source software. The initial positions of potential prostate lesions prior to biopsy are taken into consideration, and the final location of each lesion is tracked in the FE simulation output.
    Results: Our 3-D FE simulations show that the effect of the pressure exerted by the TRUS probe is twofold, as the prostate experiences both a motion and a deformation of its original shape. We tracked the relocation of five small prostate lesions when the TRUS probe exerts a force of 30 N on the inner rectal wall. The distance travelled by these lesions ranged between 5.6 and 13.9 mm.
    Conclusions: Our new methodology can help predict the location of neoplasms during a prostate biopsy, but further studies are needed to validate our results. Moreover, the methodology is developed entirely with open-source software, which means that its implementation would be affordable to all healthcare providers.
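
    Reading the lesion relocation out of an FE result takes only a small amount of post-processing; the sketch below is a generic illustration of that step, not the paper's code. It assumes the deformed mesh has been exported as node coordinates `nodes_mm` and nodal displacements `nodal_disp_mm` (both in millimetres), and it interpolates each lesion centroid's displacement from its nearest mesh nodes by inverse-distance weighting.

```python
# Generic post-processing sketch: lesion travel distance from an FE displacement field.
import numpy as np

def lesion_travel_mm(lesion_centroids_mm, nodes_mm, nodal_disp_mm, k=8):
    # lesion_centroids_mm: (L, 3); nodes_mm: (N, 3); nodal_disp_mm: (N, 3)
    distances = []
    for c in np.atleast_2d(lesion_centroids_mm):
        d = np.linalg.norm(nodes_mm - c, axis=1)
        nearest = np.argsort(d)[:k]                    # k closest mesh nodes
        w = 1.0 / (d[nearest] + 1e-9)                  # inverse-distance weights
        disp = (nodal_disp_mm[nearest] * w[:, None]).sum(axis=0) / w.sum()
        distances.append(np.linalg.norm(disp))         # travelled distance in mm
    return np.asarray(distances)
```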

    Atlas-Based Prostate Segmentation Using an Hybrid Registration

    Purpose: This paper presents the preliminary results of a semi-automatic method for prostate segmentation in Magnetic Resonance Images (MRI), intended to be incorporated into a navigation system for prostate brachytherapy. Methods: The method is based on the registration of an anatomical atlas, computed from a population of 18 MRI exams, onto a patient image. A hybrid registration framework, which couples an intensity-based registration with a robust point-matching algorithm, is used both for atlas building and for atlas registration. Results: The method has been validated on the same dataset as the one used to construct the atlas, using the leave-one-out method. The results give a mean error of 3.39 mm and a standard deviation of 1.95 mm with respect to expert segmentations. Conclusions: We believe that this segmentation tool may be a very valuable aid to the clinician for routine quantitative image exploitation. Comment: International Journal of Computer Assisted Radiology and Surgery (2008) 000-99
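
    As a rough illustration of atlas-based label propagation, the sketch below registers an atlas image to a patient image with SimpleITK and resamples the atlas segmentation through the resulting transform. It is deliberately simplified: only an intensity-based affine registration is shown, whereas the paper's hybrid framework also includes a robust point-matching component and non-rigid registration, which are not reproduced here.

```python
# Simplified atlas-based segmentation: affine, intensity-only registration
# followed by nearest-neighbour propagation of the atlas label.
import SimpleITK as sitk

def propagate_atlas_label(atlas_img, atlas_label, patient_img):
    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
    reg.SetOptimizerScalesFromPhysicalShift()
    reg.SetInterpolator(sitk.sitkLinear)
    initial = sitk.CenteredTransformInitializer(
        patient_img, atlas_img, sitk.AffineTransform(3),
        sitk.CenteredTransformInitializerFilter.GEOMETRY)
    reg.SetInitialTransform(initial, inPlace=False)
    transform = reg.Execute(sitk.Cast(patient_img, sitk.sitkFloat32),
                            sitk.Cast(atlas_img, sitk.sitkFloat32))
    # Propagate the atlas segmentation into the patient space.
    return sitk.Resample(atlas_label, patient_img, transform,
                         sitk.sitkNearestNeighbor, 0)
```

    In a leave-one-out evaluation such as the one reported above, a propagation like this would be repeated with each of the 18 exams held out in turn and the propagated contour compared against the expert segmentation.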

    Real-time multimodal image registration with partial intraoperative point-set data

    We present Free Point Transformer (FPT), a deep neural network architecture for non-rigid point-set registration. Consisting of two modules, a global feature extraction module and a point transformation module, FPT does not assume explicit constraints based on point vicinity, thereby overcoming a common requirement of previous learning-based point-set registration methods. FPT is designed to accept unordered and unstructured point-sets with a variable number of points, and uses a "model-free" approach without heuristic constraints. Training FPT is flexible and involves minimizing an intuitive unsupervised loss function, but supervised, semi-supervised, and partially- or weakly-supervised training are also supported. This flexibility makes FPT amenable to multimodal image registration problems where the ground-truth deformations are difficult or impossible to measure. In this paper, we demonstrate the application of FPT to non-rigid registration of prostate magnetic resonance (MR) imaging and sparsely-sampled transrectal ultrasound (TRUS) images. The registration errors were 4.71 mm and 4.81 mm for complete and sparsely-sampled TRUS imaging, respectively. The results indicate accuracy superior to that of the alternative rigid and non-rigid registration algorithms tested, together with substantially lower computation time. The rapid inference possible with FPT makes it particularly suitable for applications where real-time registration is beneficial.
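
    One common choice for the kind of intuitive unsupervised point-set loss mentioned above is a symmetric Chamfer distance between the transformed moving points and the fixed points; the sketch below assumes that choice and is not a description of FPT's internals.

```python
# Sketch of a symmetric Chamfer-distance loss for unsupervised point-set registration.
import torch

def chamfer_distance(transformed_moving, fixed):
    # transformed_moving: (N, 3); fixed: (M, 3) -- unordered, possibly different sizes.
    d = torch.cdist(transformed_moving, fixed)          # (N, M) pairwise distances
    return d.min(dim=1).values.mean() + d.min(dim=0).values.mean()
```

    Because such a loss is defined directly on unordered point sets, it needs neither point correspondences nor ground-truth deformations, which is what makes it usable for multimodal registration problems where such ground truth is hard to obtain.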

    Deformable MRI to Transrectal Ultrasound Registration for Prostate Interventions Using Deep Learning

    Prostate cancer is one of the major public health issues in the world. An accurate and early diagnosis of prostate cancer could play a vital role in the treatment of patients. Biopsy procedures are used for diagnostic purposes. In this regard, transrectal ultrasound (TRUS) is considered a standard for imaging the prostate during a biopsy or brachytherapy procedure. This imaging technique is comparatively low-cost, can scan the organ in real time, and is radiation-free. Thus, TRUS scans are used to guide clinicians to the location of a tumor inside the prostate. The major challenge lies in the fact that TRUS images have low resolution and quality, which makes it difficult to distinguish the exact tumor location and the extent of the disease. In addition, the prostate undergoes significant shape variations during a prostate intervention, which makes tumor identification even harder.

    Medical Image Registration Using Deep Neural Networks

    Registration is a fundamental problem in medical image analysis wherein images are transformed spatially to align corresponding anatomical structures in each image. Recently, the development of learning-based methods, which exploit deep neural networks and can outperform classical iterative methods, has received considerable interest from the research community. This interest is due in part to the substantially reduced computational requirements that learning-based methods have during inference, which makes them particularly well suited to real-time registration applications. Despite these successes, learning-based methods can perform poorly when applied to images from different modalities, where intensity characteristics can vary greatly, such as in magnetic resonance and ultrasound imaging. Moreover, registration performance is often demonstrated on well-curated datasets closely matching the distribution of the training data, which makes it difficult to determine whether the demonstrated performance accurately represents the generalization and robustness required for clinical use. This thesis presents learning-based methods that address the aforementioned difficulties by utilizing intuitive point-set-based representations, user interaction, and meta-learning-based training strategies. Primarily, this is demonstrated with a focus on the non-rigid registration of 3D magnetic resonance imaging to sparse 2D transrectal ultrasound images to assist in the delivery of targeted prostate biopsies. While conventional systematic prostate biopsy methods can require many samples to be taken to confidently produce a diagnosis, tumor-targeted approaches have shown improved patient, diagnostic, and disease management outcomes with fewer samples. However, the available intraoperative transrectal ultrasound imaging alone is insufficient for accurate targeted guidance. As such, this exemplar application is used to illustrate the effectiveness of sparse, interactively-acquired ultrasound imaging for real-time, interventional registration. The presented methods are found to improve registration accuracy relative to the state of the art, with substantially lower computation time, while requiring only a fraction of the data at inference. As a result, these methods are particularly attractive given their potential for real-time registration in interventional applications.