Prosper: image and robot-guided prostate brachytherapy
Brachytherapy for localized prostate cancer consists of destroying the cancer
by introducing radioactive iodine seeds into the gland through hollow needles. The
planning of the position of the seeds and their introduction into the prostate
is based on intra-operative ultrasound (US) imaging. We propose to optimize the
global quality of the procedure by: i) using 3D US; ii) enhancing US data with
MRI registration; iii) using a specially designed needle-insertion robot,
connected to the imaging data. The imaging methods have been successfully
tested on patient data while the robot accuracy has been evaluated on a
realistic deformable phantom.
Biopsym: a learning environment for transrectal ultrasound guided prostate biopsies
This paper describes a learning environment for image-guided prostate
biopsies in cancer diagnosis; it is based on an ultrasound probe simulator
virtually exploring real datasets obtained from patients. The aim is to make
the training of young physicians easier and faster with a tool that combines
lectures, biopsy simulations and recommended exercises to master this medical
gesture. It will particularly help them acquire the three-dimensional
representation of the prostate needed for practicing biopsy sequences. The
simulator uses haptic feedback and computes the position of the virtual probe
from three-dimensional (3D) recorded ultrasound data. This paper presents the
current version of this learning environment.
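The core operation of such a probe simulator, resampling a recorded 3D volume along the imaging plane of a virtual probe, can be sketched as below. This is an illustrative reconstruction only, not the Biopsym implementation; all function names, coordinates, and the volume itself are made up.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def virtual_slice(volume, origin, u, v, shape=(64, 64)):
    """Sample a 2-D oblique slice from a recorded 3-D US volume.
    origin: slice corner in voxel coordinates; u, v: in-plane direction
    vectors (voxels per pixel). All geometry here is illustrative."""
    rows, cols = np.mgrid[0:shape[0], 0:shape[1]]
    # Build a (3, H, W) array of voxel coordinates for every slice pixel.
    pts = (np.asarray(origin, float)[:, None, None]
           + np.asarray(u, float)[:, None, None] * rows
           + np.asarray(v, float)[:, None, None] * cols)
    # Trilinear interpolation of the volume at those coordinates.
    return map_coordinates(volume, pts, order=1, mode="nearest")

# Toy volume standing in for a recorded 3-D ultrasound acquisition.
vol = np.random.default_rng(1).random((32, 32, 32))
# Axis-aligned plane through slice 16, i.e. the probe held still.
img = virtual_slice(vol, origin=[16, 0, 0], u=[0, 1, 0], v=[0, 0, 1],
                    shape=(32, 32))
print(img.shape)  # (32, 32)
```

In the real simulator the `origin`, `u`, and `v` vectors would be driven by the pose of the haptic device rather than fixed as here.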
Technical Note: Error metrics for estimating the accuracy of needle/instrument placement during transperineal MR/US-guided prostate interventions
Purpose: Image-guided systems that fuse magnetic resonance imaging (MRI) with three-dimensional (3D) ultrasound (US) images for performing targeted prostate needle biopsy and minimally-invasive treatments for prostate cancer are of increasing clinical interest. To date, a wide range of different accuracy estimation procedures and error metrics have been reported, which makes comparing the performance of different systems difficult.
Methods: A set of nine measures is presented to assess the accuracy of MRI-US image registration, needle positioning, needle guidance, and overall system error, with the aim of providing a methodology for estimating the accuracy of instrument placement using an MR/US-guided transperineal approach.
Results: Using the SmartTarget fusion system, the MRI-US image alignment error was determined to be 2.0 ± 1.0 mm (mean ± SD), and the overall system instrument targeting error to be 3.0 ± 1.2 mm. Three needle deployments for each target phantom lesion were found to result in a 100% lesion hit rate and a median predicted cancer core length of 5.2 mm.
Conclusions: The application of a comprehensive, unbiased validation assessment for MR/TRUS guided systems can provide useful information on system performance for quality assurance and system comparison. Furthermore, such an analysis can be helpful in identifying relationships between these errors, providing insight into the technical behaviour of these systems.
Prostate biopsy tracking with deformation estimation
Transrectal biopsies under 2D ultrasound (US) control are the current
clinical standard for prostate cancer diagnosis. The isoechogenic nature of
prostate carcinoma makes it necessary to sample the gland systematically,
resulting in a low sensitivity. Also, it is difficult for the clinician to
follow the sampling protocol accurately under 2D US control and the exact
anatomical location of the biopsy cores is unknown after the intervention.
Tracking systems for prostate biopsies make it possible to generate biopsy
distribution maps for intra- and post-interventional quality control and 3D
visualisation of histological results for diagnosis and treatment planning.
They can also guide the clinician toward non-ultrasound targets. In this paper,
a volume-swept 3D US based tracking system for fast and accurate estimation of
prostate tissue motion is proposed. The entirely image-based system solves the
patient motion problem with an a priori model of rectal probe kinematics.
Prostate deformations are estimated with elastic registration to maximize
accuracy. The system is robust with only 17 registration failures out of 786
(2%) biopsy volumes acquired from 47 patients during biopsy sessions. Accuracy
was evaluated to 0.76 ± 0.52 mm using manually segmented fiducials on 687
registered volumes stemming from 40 patients. A clinical protocol for assisted
biopsy acquisition was designed and implemented as a biopsy assistance system,
which makes it possible to overcome the drawbacks of the standard biopsy procedure. Comment: Medical Image Analysis (2011), epub ahead of print.
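An a priori model of rectal probe kinematics of the kind mentioned above can be sketched as a rigid transform constrained to rotate about a fixed pivot near the probe base. This is a generic illustration of such a constraint, not the paper's actual motion model; the pivot, axis, and angle below are assumptions.

```python
import numpy as np

def probe_prior_transform(pivot, axis, angle_rad):
    """4x4 homogeneous rigid transform constrained to a rotation by
    angle_rad about `axis` through the fixed point `pivot` (an idealised
    rectal-probe kinematic prior; all parameters are illustrative)."""
    axis = np.asarray(axis, float)
    axis /= np.linalg.norm(axis)
    x, y, z = axis
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    C = 1.0 - c
    # Rodrigues' rotation formula in matrix form.
    R = np.array([[x*x*C + c,   x*y*C - z*s, x*z*C + y*s],
                  [y*x*C + z*s, y*y*C + c,   y*z*C - x*s],
                  [z*x*C - y*s, z*y*C + x*s, z*z*C + c]])
    T = np.eye(4)
    T[:3, :3] = R
    # Translation chosen so that the pivot is a fixed point of T.
    T[:3, 3] = np.asarray(pivot, float) - R @ np.asarray(pivot, float)
    return T

# Example: a 5-degree tilt about the y axis through a pivot 40 mm behind
# the probe tip; the pivot itself does not move.
T = probe_prior_transform(pivot=[0.0, 0.0, -40.0], axis=[0, 1, 0],
                          angle_rad=np.deg2rad(5))
print(np.round(T @ np.array([0.0, 0.0, -40.0, 1.0]), 6))
```

Such a constrained transform gives the image-based registration a low-dimensional search space that matches the physically plausible probe motions.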
MR prior based automatic segmentation of the prostate in TRUS images for MR/TRUS data fusion
The poor signal-to-noise ratio in transrectal ultrasound (TRUS) images makes the fully automatic segmentation of the prostate challenging, and most approaches proposed in the literature still lack robustness and accuracy. However, it is relatively straightforward to obtain high-quality segmentations in magnetic resonance (MR) images. In the context of MR-to-TRUS data fusion, the information gathered in the MR images can hence provide a strong prior for US segmentation. In this paper, we describe a method to non-linearly register a patient-specific mesh of the prostate, built from MR images, to a TRUS volume. The MR prior provides shape and volume constraints that are used to guide the MR-to-TRUS surface deformation, in combination with a US image contour appearance model. The anatomical point correspondences between the MR and TRUS surfaces are obtained implicitly. The method was validated on 30 pairs of MR/TRUS patient exams and achieves a mean Dice value of 0.85 and a mean surface error of 2.0 mm.
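The Dice value reported above is the standard overlap measure between two segmentation masks; a minimal sketch, with toy one-dimensional masks standing in for voxel segmentations:

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two boolean segmentation masks:
    2|A intersect B| / (|A| + |B|), in [0, 1]."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

# Illustrative masks only (real evaluations use full 3-D voxel volumes).
auto   = np.array([0, 1, 1, 1, 0, 0], bool)
manual = np.array([0, 1, 1, 0, 0, 0], bool)
print(f"Dice = {dice(auto, manual):.2f}")  # Dice = 0.80
```

A mean surface error, the other metric quoted, would instead average point-to-surface distances between the two segmentation boundaries.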
Registration of magnetic resonance and ultrasound images for guiding prostate cancer interventions
Prostate cancer is a major international health problem with a large and rising incidence in many parts of the world. Transrectal ultrasound (TRUS) imaging is used routinely to guide surgical procedures, such as needle biopsy and a number of minimally-invasive therapies, but its limited ability to visualise prostate cancer is widely recognised. Magnetic resonance (MR) imaging techniques, on the other hand, have recently been developed that can provide clinically useful diagnostic information. Registration (or alignment) of MR and TRUS images during TRUS-guided surgical interventions potentially provides a cost-effective approach to augment TRUS images with clinically useful, MR-derived information (for example, tumour location, shape and size). This thesis describes a deformable image registration framework that enables automatic and/or semi-automatic alignment of MR and 3D TRUS images of the prostate gland. The method combines two technical developments in the field: First, a method for constructing patient-specific statistical shape models of prostate motion/deformation, based on learning from finite element simulations of gland motion using geometric data from a preoperative MR image, is proposed. Second, a novel “model-to-image” registration framework is developed to register this statistical shape model automatically to an intraoperative TRUS image. This registration approach is implemented using a novel model-to-image vector alignment (MIVA) algorithm, which maximises the likelihood of a particular instance of a statistical shape model given a voxel-intensity-based feature vector that represents an estimate of the surface normal vectors at the boundary of the organ in question. Using real patient data, the MR-TRUS registration accuracy of the new algorithm is validated using intra-prostatic anatomical landmarks. A rigorous and extensive validation analysis is also provided for assessing the image registration experiments. 
The final target registration error after performing 100 MR–TRUS registrations for each patient has a median of 2.40 mm, meaning that over 93% of registrations may successfully hit a target representing a clinically significant lesion. The implemented registration algorithms took less than 30 seconds and 2 minutes for manually defined point and normal-vector features, respectively. The thesis concludes with a summary of potential applications and future research directions.
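The patient-specific statistical shape model at the heart of this framework is, in essence, a principal-component decomposition of simulated gland deformations. The sketch below illustrates that construction on synthetic data; the training vectors, dimensions, and function names are all invented stand-ins, not the thesis's finite-element data or the MIVA algorithm.

```python
import numpy as np

# Toy "training" deformations of a 4-vertex surface (12-D vectors),
# standing in for finite-element simulation results; values are synthetic.
rng = np.random.default_rng(0)
base = rng.normal(size=12)
true_modes = rng.normal(size=(12, 2))
train = base + rng.normal(size=(50, 2)) @ true_modes.T

# Build the statistical shape model: mean shape + principal modes.
mu = train.mean(axis=0)
U, S, Vt = np.linalg.svd(train - mu, full_matrices=False)
modes = Vt[:2]                      # keep the two dominant modes
weights = (train - mu) @ modes.T    # per-example shape coefficients

def instance(b):
    """Shape-model instance for coefficient vector b (the low-dimensional
    quantity a model-to-image registration would optimise against
    image-derived surface-normal features)."""
    return mu + b @ modes

# A training shape is recovered near-exactly from its own coefficients,
# since the synthetic data lie in a 2-D subspace by construction.
print(np.allclose(instance(weights[0]), train[0], atol=1e-6))
```

Registration then amounts to searching over `b` (plus a rigid pose) so that the instantiated surface best explains the intraoperative TRUS image features.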