
    Numerical methods for coupled reconstruction and registration in digital breast tomosynthesis.

    Digital Breast Tomosynthesis (DBT) provides an insight into the fine details of normal fibroglandular tissues and abnormal lesions by reconstructing a pseudo-3D image of the breast. In this respect, DBT overcomes a major limitation of conventional X-ray mammography by reducing the confounding effects caused by the superposition of breast tissue. In a breast cancer screening or diagnostic context, a radiologist is interested in detecting change, which might be indicative of malignant disease. To help automate this task, image registration is required to establish spatial correspondence between time points. Typically, images, such as MRI or CT, are first reconstructed and then registered. This approach can be effective if reconstructing using a complete set of data. However, for ill-posed, limited-angle problems such as DBT, estimating the deformation is complicated by the significant artefacts associated with the reconstruction, leading to severe inaccuracies in the registration. This paper presents a mathematical framework which couples the two tasks and jointly estimates both image intensities and the parameters of a transformation. Under this framework, we compare an iterative method and a simultaneous method, both of which tackle the problem of comparing DBT data by combining reconstruction of a pair of temporal volumes with their registration. We evaluate our methods using various computational digital phantoms, uncompressed breast MR images, and in-vivo DBT simulations. Firstly, we compare both iterative and simultaneous methods to the conventional, sequential method using an affine transformation model. We show that jointly estimating image intensities and parametric transformations gives superior results with respect to reconstruction fidelity and registration accuracy. Also, we incorporate a non-rigid B-spline transformation model into our simultaneous method. The results demonstrate a visually plausible recovery of the deformation with preservation of the reconstruction fidelity.
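
    The coupling described above amounts to minimising a data-fidelity term for each temporal volume together with a term that penalises disagreement between one volume and the transformed other. The sketch below is a minimal, hypothetical illustration of the alternating ("iterative") variant only, under simplifying assumptions that are not from the paper: a random stand-in matrix instead of a limited-angle DBT projector, a pure translation instead of the affine or B-spline transformation models, and invented helper names recon_step and register.
```python
# Toy sketch of alternating coupled reconstruction and registration of two
# temporal volumes. Everything here is a stand-in, not the paper's method.
import numpy as np
from scipy.ndimage import shift
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 16                                            # toy image size (n x n)
A = rng.random((80, n * n))                       # stand-in system matrix
x_true = np.zeros((n, n)); x_true[5:11, 5:11] = 1.0
t_true = (1.5, -2.0)                              # "deformation": a translation
y1 = A @ x_true.ravel()                           # baseline acquisition
y2 = A @ shift(x_true, t_true, order=1).ravel()   # follow-up acquisition

def recon_step(x, y, x_warped, lam=0.1, step=1e-4, iters=20):
    """A few gradient steps on ||A x - y||^2 + lam ||x - x_warped||^2."""
    for _ in range(iters):
        grad = (A.T @ (A @ x.ravel() - y)).reshape(n, n) + lam * (x - x_warped)
        x = np.clip(x - step * grad, 0, None)     # keep intensities non-negative
    return x

def register(x_moving, x_fixed):
    """Estimate the translation mapping x_moving onto x_fixed."""
    cost = lambda t: np.sum((shift(x_moving, t, order=1) - x_fixed) ** 2)
    return minimize(cost, x0=np.zeros(2), method="Nelder-Mead").x

x1, x2, t = np.zeros((n, n)), np.zeros((n, n)), np.zeros(2)
for _ in range(10):                               # alternate the two sub-problems
    x1 = recon_step(x1, y1, shift(x2, -t, order=1))
    x2 = recon_step(x2, y2, shift(x1, t, order=1))
    t = register(x1, x2)
print("estimated translation:", t, " true:", t_true)
```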

    Performance Evaluation of a Small-Animal PET/CT System

    This master project is the first vendor-independent performance evaluation of the new nanoScan PET/CT system at the University of Bergen. A comprehensive performance evaluation of a novel scanner is very important, particularly when quantitative assessments of images are required. The nanoScan PET/CT system is a fully integrated small-animal PET/CT system. An abbreviated performance evaluation of the CT subsystem was done, which included a Hounsfield quality check, a comparison of reconstruction filters and an evaluation of the different scanning methods. The PET subsystem was evaluated according to the NEMA NU 4-2008 standard. This standard includes tests of spatial resolution, counting rate capabilities, sensitivity and image quality. The CT evaluation proved adequate for its intended use. There were only minor differences in the noise measurement of the different reconstruction filters. The scanning method "helical, 1 pitch" would be recommended for most applications, as this scanning method had the lowest dose, good images, and only a few minutes longer scan time than the scanning method with the lowest scan time. The measurements from the PET evaluation were in good agreement with values reported by the vendor and in the literature. The evaluation shows that the scanner has one of the best spatial resolutions available, approximately 1 mm at the center of the field of view (FOV). The sensitivity at the center of the FOV was 8.8%, slightly lower than the highest reported absolute sensitivity at the center of the FOV, i.e. 10%. The counting rate capabilities proved adequate for all applications undertaken to date, and the NEMA image quality phantom studies demonstrated good values of uniformity and recovery coefficients. The procedures and methods in this thesis will make it easier to monitor the scanner performance with periodic testing to check that the scanner is robust, reliable and reproducible.

    Incorporating accurate statistical modeling in PET: reconstruction for whole-body imaging

    Doctoral thesis in Biophysics, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2007. The thesis is devoted to image reconstruction in 3D whole-body PET imaging. OSEM (Ordered Subsets Expectation Maximization) is a statistical algorithm that assumes Poisson data. However, corrections for physical effects (attenuation, scattered and random coincidences) and detector efficiency remove the Poisson characteristics of these data. Fourier Rebinning (FORE), which combines 3D imaging with fast 2D reconstructions, requires corrected data. Thus, if FORE is used, or whenever data are corrected prior to OSEM, the Poisson-like characteristics need to be restored. Restoring Poisson-like data, i.e., making the variance equal to the mean, was achieved through the use of weighted OSEM algorithms. One of them is NECOSEM, relying on the NEC weighting transformation. The distinctive feature of this algorithm is the NEC multiplicative factor, defined as the ratio between the mean and the variance. With real clinical data this is critical, since there is only one value collected for each bin: the data value itself. For simulated data, if we keep track of the values for these two statistical moments, the exact values for the NEC weights can be calculated. We have compared the performance of five different weighted algorithms (FORE+AWOSEM, FORE+NECOSEM, ANWOSEM3D, SPOSEM3D and NECOSEM3D) on the basis of tumor detectability. The comparison was done for simulated and clinical data. In the former case an analytical simulator was used. This is the ideal situation, since all the weighting factors can be exactly determined. For comparing the performance of the algorithms, we used the Non-Prewhitening Matched Filter (NPWMF) numerical observer. With some knowledge obtained from the simulation study, we proceeded to the reconstruction of clinical data. In that case, it was necessary to devise a strategy for estimating the NEC weighting factors. The comparison between reconstructed images was done by a physician largely familiar with whole-body PET imaging.
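
    As an illustration of the NEC weighting transformation mentioned above, the following toy example (my own construction, not the thesis code) shows how scaling each corrected bin by the ratio mean/variance restores the Poisson-like property that the variance equals the mean; the count levels and the randoms-only correction are invented for the example.
```python
# Toy demonstration of NEC weighting: corrections leave data that are no
# longer Poisson (variance > mean); scaling by w = mean / variance makes the
# scaled data Poisson-like again, which weighted OSEM variants rely on.
import numpy as np

rng = np.random.default_rng(1)
trues = 50.0                           # assumed true coincidence rate per bin
randoms = 20.0                         # assumed random coincidence rate per bin
n_realizations = 100_000
prompts = rng.poisson(trues + randoms, n_realizations)
corrected = prompts - randoms          # randoms-corrected bin values

mean = corrected.mean()
var = corrected.var()
w_nec = mean / var                     # the NEC weighting factor
scaled = w_nec * corrected

print(f"before: mean={mean:.1f}, var={var:.1f}")                    # var > mean
print(f"after : mean={scaled.mean():.1f}, var={scaled.var():.1f}")  # var ~ mean
```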

    Statistical reconstruction methods in PET: resolution limit, noise, edge artifacts and considerations for the design of better scanners

    Proceeding of: 2005 IEEE Nuclear Science Symposium Conference Record, Puerto Rico, 23-29 Oct. 2005. Small animal positron emission tomography (PET) scanners are being increasingly used as a basic measurement tool in modern biomedical research. The new designs and technologies of these scanners and the modern reconstruction methods have made it possible to reach high spatial resolution and sensitivity. Despite their successes, some important issues remain to be addressed in high resolution PET imaging. First, iterative reconstruction methods like maximum likelihood-expectation maximization (MLEM) are known to recover resolution, but also to create noisy images and edge artifacts if some kind of regularization is not imposed. Second, the limit of resolution achievable by iterative methods on high resolution scanners is not quantitatively understood. Third, the use of regularization methods like sieves or maximum a posteriori (MAP) requires the determination of the optimal values of several adjustable parameters that may be object-dependent. In this work we review these problems in high resolution PET and establish that their origin is related more to the physical effects involved in the emission and detection of the radiation during the acquisition than to the kind of iterative reconstruction method chosen. These physical effects (positron range, non-collinearity, scatter inside the object and inside the detector materials) cause the tube of response (TOR) that connects the voxels with a line of response (LOR) to be rather thick. This implies that the higher frequencies of the patient's organ structures are not recorded by the scanner and therefore cannot be recovered during the reconstruction. As iterations grow, MLEM algorithms try to recover higher frequencies in the image. Once a certain critical frequency is reached, further iterations only amplify high-frequency noise. Using frequency response analysis techniques, we determine the maximum achievable resolution, before edge artifacts spoil the quality of the image, for a particular scanner as a function of the thickness of the TOR, and independently of the reconstruction method employed. With the same techniques, we can deduce well-defined stopping criteria for reconstruction methods. We also establish criteria for the highest number of subsets that should be used, and for how the design of the scanners can be optimized when statistical reconstruction methods are employed. This work was supported in part by the UCM, the "Fundación para la Investigación Biomédica del Hospital Gregorio Marañón", and MEC project BFM2003-04147-C02-01.
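
    For reference, the iteration whose high-frequency behaviour is analysed above is the standard MLEM update. The sketch below is a generic textbook implementation (not the authors' code) with a random stand-in system matrix, showing the multiplicative update that progressively recovers higher frequencies until a stopping criterion is needed.
```python
# Generic MLEM update (textbook form):
#   x_{k+1} = x_k / (A^T 1) * A^T ( y / (A x_k) )
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """A: (n_bins, n_voxels) system matrix, y: measured counts per bin."""
    x = np.ones(A.shape[1])              # flat, non-negative starting image
    sens = A.sum(axis=0) + eps           # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ x + eps               # forward projection of current image
        x *= (A.T @ (y / proj)) / sens   # multiplicative EM update
    return x

# Toy usage with a stand-in system matrix and Poisson-distributed data.
rng = np.random.default_rng(0)
A = rng.random((200, 64))
x_true = rng.random(64) * 10.0
y = rng.poisson(A @ x_true)
x_hat = mlem(A, y, n_iter=100)
```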

    Compensation for Nonuniform Resolution Using Penalized-Likelihood Reconstruction in Space-Variant Imaging Systems

    Imaging systems that form estimates using a statistical approach generally yield images with nonuniform resolution properties. That is, the reconstructed images possess resolution properties marked by space-variant and/or anisotropic responses. We have previously developed a space-variant penalty for penalized-likelihood (PL) reconstruction that yields nearly uniform resolution properties. We demonstrated how to calculate this penalty efficiently and apply it to an idealized positron emission tomography (PET) system whose geometric response is space-invariant. In this paper, we demonstrate the efficient calculation and application of this penalty to space-variant systems. (The method is most appropriate when the system matrix has been precalculated.) We apply the penalty to a large field of view PET system where crystal penetration effects make the geometric response space-variant, and to a two-dimensional single photon emission computed tomography system whose detector responses are modeled by a depth-dependent Gaussian with linearly varying full-width at half-maximum. We perform a simulation study comparing reconstructions using our proposed PL approach with other reconstruction methods, demonstrate the relative resolution uniformity, and discuss tradeoffs among estimators that yield nearly uniform resolution. We observe similar noise performance for the PL and post-smoothed maximum-likelihood (ML) approaches with carefully matched resolution, so the choice of one estimator over another should be made on other factors, such as the computational complexity and convergence rates of the iterative reconstruction. Additionally, because the post-smoothed ML and the proposed PL approach can outperform one another in terms of resolution uniformity depending on the desired reconstruction resolution, we present and discuss a hybrid approach adopting both a penalty and post-smoothing.
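
    For orientation, the generic penalized-likelihood objective behind this kind of approach has the standard form shown below (not copied from the paper); the space-variant design enters through the penalty weights w_jk, which are chosen per voxel so that the local impulse response becomes nearly uniform across the image, with psi typically a quadratic potential.
```latex
\hat{x} = \arg\max_{x \ge 0}\; L(y \mid x) \;-\; \beta \sum_{j}\sum_{k \in \mathcal{N}_j} w_{jk}\,\psi(x_j - x_k)
```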

    Noise correlation in PET, CT, SPECT and PET/CT data evaluated using autocorrelation function: a phantom study on data, reconstructed using FBP and OSEM

    BACKGROUND: Positron Emission Tomography (PET), Computed Tomography (CT), PET/CT and Single Photon Emission Computed Tomography (SPECT) are non-invasive imaging tools used for creating two-dimensional (2D) cross-section images of three-dimensional (3D) objects. PET and SPECT have the potential of providing functional or biochemical information by measuring the distribution and kinetics of radiolabelled molecules, whereas CT visualizes X-ray density in tissues in the body. PET/CT provides fused images representing both functional and anatomical information with better precision in localization than PET alone. Images generated by these types of techniques are generally noisy, thereby impairing the imaging potential and affecting the precision of quantitative values derived from the images. It is crucial to explore and understand the properties of noise in these imaging techniques. Here we used the autocorrelation function (ACF) specifically to describe noise correlation and its non-isotropic behaviour in experimentally generated images from PET, CT, PET/CT and SPECT. METHODS: Experiments were performed using phantoms with different shapes. In the PET and PET/CT studies, data were acquired in 2D acquisition mode and reconstructed by both analytical filtered back projection (FBP) and iterative, ordered subsets expectation maximisation (OSEM) methods. In the PET/CT studies, different magnitudes of X-ray dose in the transmission part were employed by using different mA settings for the X-ray tube. In the CT studies, data were acquired using different slice thicknesses, with and without the applied dose reduction function, and the images were reconstructed by FBP. SPECT studies were performed in 2D, reconstructed using FBP and OSEM, using post 3D filtering. ACF images were generated from the primary images, and profiles across the ACF images were used to describe the noise correlation in different directions. The variance of noise across the images was visualised as images and with profiles across these images. RESULTS: The most important finding was that the pattern of noise correlation is rotation symmetric or isotropic, independent of object shape, in PET and PET/CT images reconstructed using the iterative method. This is, however, not the case in FBP images when the shape of the phantom is not circular. CT images reconstructed using FBP also show the same non-isotropic pattern, independent of slice thickness and utilization of the care dose function. SPECT images show an isotropic correlation of the noise independent of object shape or applied reconstruction algorithm. Noise in PET/CT images was identical independent of the applied X-ray dose in the transmission part (CT), indicating that the noise from transmission with the applied doses does not propagate into the PET images and showing that the noise from the emission part is dominant. The results indicate that in human studies it is possible to utilize a low dose in the transmission part while maintaining the noise behaviour and the quality of the images. CONCLUSION: The combined effect of noise correlation for asymmetric objects and a varying noise variance across the image field significantly complicates the interpretation of the images when statistical methods are used, such as statistical estimates of precision in average values, statistical parametric mapping methods and principal component analysis. Hence it is recommended that iterative reconstruction methods are used for such applications. However, it is possible to calculate the noise analytically in images reconstructed by FBP, while the same calculation is not possible in images reconstructed by iterative methods. Therefore, for statistical methods of analysis which depend on knowing the noise, FBP would be preferred.
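
    As a concrete illustration of the ACF-based analysis described above, the following sketch (my own construction, not the study's code) estimates the autocorrelation of a noise image via the FFT and extracts horizontal and vertical profiles through its centre; the anisotropically smoothed white noise is only a stand-in for the directional correlation seen in FBP images of non-circular objects.
```python
# Estimating the 2D autocorrelation function (ACF) of image noise via the
# FFT (Wiener-Khinchin relation) and taking profiles through its centre to
# look for non-isotropic, direction-dependent correlation.
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_acf(noise_image):
    """Normalised (circular) autocorrelation of a zero-mean noise image."""
    z = noise_image - noise_image.mean()
    power = np.abs(np.fft.fft2(z)) ** 2        # power spectrum
    acf = np.real(np.fft.ifft2(power))         # autocorrelation
    acf = np.fft.fftshift(acf)                 # zero lag at the centre
    return acf / acf.max()

# Stand-in noise image: white noise smoothed more strongly along one axis,
# mimicking directional noise correlation.
rng = np.random.default_rng(0)
noise = gaussian_filter(rng.standard_normal((128, 128)), sigma=(1.0, 3.0))
acf = noise_acf(noise)
cy, cx = acf.shape[0] // 2, acf.shape[1] // 2
print("horizontal profile:", np.round(acf[cy, cx - 3:cx + 4], 2))
print("vertical   profile:", np.round(acf[cy - 3:cy + 4, cx], 2))
```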

    Assessment of Image Quality of a PET/CT Scanner for a Standardized Image Situation Using a NEMA Body Phantom: “The Impact of Different Image Reconstruction Parameters on Image Quality”

    Radiologists and medical practitioners work daily with images from integrated Positron Emission Tomography/Computed Tomography (PET/CT) scanners in order to detect potentially lethal diseases. It is thus very important to ensure that these images have adequate image quality. For the staff responsible for quality assurance of the applied scanner, it is important to ensure that the reconstruction procedures and image protocols in use enable acquisition of images of high quality with respect to resolution and contrast, while the data sets contain as little noise as possible. The goal of the quality assurance work is to continuously make sure that the data acquisition settings, and especially the reconstruction procedure utilized for routine daily clinical purposes, enable lesions, cancerous tissue and disease to be detected. This master thesis project aims at evaluating a reconstruction algorithm (iterative reconstruction) and some key parameters applied in image reconstruction. These parameters include selected filters (Gaussian, median, Hann and Butterworth), selected full width at half maximum values (FWHM: 3, 5 and 7 mm) and image matrix sizes (128 x 128 and 168 x 168 pixels respectively), in order to provide information on how these key parameters affect image quality. The National Electrical Manufacturers Association (NEMA) International Electrotechnical Commission (IEC) Body Phantom Set was used in this work. It consists of a lid with six fillable spheres (with internal diameters of 37, 28, 22, 17, 13 and 10 mm respectively), a lung insert, a body phantom (which represents the background volume) and a test phantom. The work in this thesis project has been carried out using the radiopharmaceutical tracer F-18 FDG (fluorodeoxyglucose), produced with a General Electric PETtrace 6 cyclotron at the Center for Nuclear Medicine/PET at Haukeland University Hospital in Bergen, Norway. The applied radiopharmaceutical F-18 FDG was produced in a 2.5 ml target volume at the cyclotron. After the production, this volume was delivered from the cyclotron into a 20 ml sealed cylindrical glass already containing 17.5 ml of non-radioactive water. The activity level in this new 20 ml solution of F-18 FDG and water was measured in a dose calibrator (ISOMED 2010TM). The solution was diluted further, in an iterative process, a number of times in order to acquire the necessary activity concentrations for both the selected hot spheres and the background volume. The aim was to obtain activity concentrations corresponding to sphere-to-background ratios of either 4:1 or 8:1. The sphere-to-background ratio in this work is the ratio between the radioactivity level in four small spheres (with diameters of 22, 17, 13 and 10 mm respectively, and a total volume of 9.8 ml for all four spheres) and the radioactivity level in the main body of the applied phantom, the so-called background volume (9708 ml). The two bigger spheres (28 and 37 mm) were filled with non-radioactive water in order to represent areas without radioactivity, i.e. “cold spheres”. When the spheres and volumes under study had been filled with the desired level of activity and the activity level had been measured, the spheres were positioned in the applied body phantom and the phantom was sealed to avoid spillage.
    The prepared NEMA IEC body phantom was placed on the table of a Siemens Biograph 40 PET/CT scanner in a predetermined, reproducible position and scanned using a standard clinical whole body PET/CT protocol. The acquired images were reconstructed. Three repetitive studies were done for each concentration ratio. For each experiment performed, the sphere-to-background ratio was either 4:1 or 8:1. A selection of different standardized reconstruction parameters and different image corrections were applied. This was done in order to study what impact changes of the reconstruction parameters have on the image quality, image quality being defined by a quantification of the measured relative contrast in the images studied. The procedures followed while performing the PET/CT were in compliance with the recommended procedure presented in the NEMA NU 2-2007 manual (from the manufacturer of the NEMA IEC body phantom described above). The reconstructed images were analyzed manually on a PET/CT workstation and also analyzed automatically with Python software specially developed for the purpose of this work. The image quality results obtained from analyses of the reconstructed images with different reconstruction parameters were thereafter compared to the standardized protocol for reconstruction of PET/CT images. Lastly, the results have been compared with other similar work on the same subject by Helmar Bergmann et al. (2005).
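
    For context, the relative contrast referred to above is typically quantified with NEMA NU 2 style percent-contrast figures computed from ROI means in the reconstructed images; the sketch below is my own illustration of those formulas with invented ROI values, not the thesis' Python analysis code.
```python
# Percent contrast in the NEMA NU 2 style, computed from ROI means measured
# in the reconstructed phantom images (all numbers below are made up).
def hot_sphere_contrast(c_hot, c_background, ratio):
    """Percent contrast for a hot sphere.
    c_hot, c_background: mean ROI values; ratio: true sphere-to-background
    activity concentration ratio (4:1 or 8:1 in this work)."""
    return 100.0 * (c_hot / c_background - 1.0) / (ratio - 1.0)

def cold_sphere_contrast(c_cold, c_background):
    """Percent contrast for a cold (water-filled) sphere."""
    return 100.0 * (1.0 - c_cold / c_background)

# Example with invented ROI means for a 4:1 study.
print(hot_sphere_contrast(c_hot=3.1, c_background=1.0, ratio=4.0))   # 70.0
print(cold_sphere_contrast(c_cold=0.35, c_background=1.0))           # 65.0
```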