    Comparison of Quadratic- and Median-Based Roughness Penalties for Penalized-Likelihood Sinogram Restoration in Computed Tomography

    We have compared the performance of two penalty choices for a penalized-likelihood sinogram-restoration strategy we have been developing: a quadratic penalty we have employed previously and a new median-based penalty. We also compared both approaches to a noniterative adaptive filter that models data statistics loosely but not explicitly. We found that the two penalties produced similar resolution-variance tradeoffs and that both outperformed the adaptive filter in the low-dose regime, which suggests that the particular choice of penalty in our approach may matter less than the fact that data statistics are modeled explicitly at all. Since the quadratic penalty allows derivation of an algorithm that is guaranteed to monotonically increase the penalized-likelihood objective function, we find it preferable to the median-based penalty.
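
    The abstract states the comparison but not the objective being optimized. For orientation, penalized-likelihood sinogram restoration generally maximizes a Poisson log-likelihood minus a weighted roughness penalty; the forms below are a minimal sketch, in which the penalty weight beta, the neighborhood weights w_jk, the neighborhoods N_j, and the exact median-based form are illustrative assumptions rather than details taken from the paper.

```latex
% Penalized-likelihood objective: Poisson log-likelihood of measured
% sinogram y given restored sinogram p, minus a roughness penalty.
\Phi(p) = \sum_i \left[ y_i \log \bar{y}_i(p) - \bar{y}_i(p) \right] - \beta R(p)

% Quadratic roughness penalty: pairwise squared differences over a
% neighborhood \mathcal{N}_j. Its constant curvature is what permits
% algorithms with guaranteed monotonic increase of \Phi.
R_{\mathrm{Q}}(p) = \frac{1}{2} \sum_j \sum_{k \in \mathcal{N}_j} w_{jk}\,(p_j - p_k)^2

% One plausible median-based penalty: squared deviation from the local
% median, which adapts to edges but complicates monotonicity proofs.
R_{\mathrm{M}}(p) = \frac{1}{2} \sum_j \Bigl( p_j - \operatorname*{med}_{k \in \mathcal{N}_j} p_k \Bigr)^2
```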

    Incorporating accurate statistical modeling in PET: reconstruction for whole-body imaging

    Doctoral thesis in Biophysics, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2007. The thesis is devoted to image reconstruction in 3D whole-body PET imaging. OSEM (Ordered Subsets Expectation Maximization) is a statistical algorithm that assumes Poisson data. However, corrections for physical effects (attenuation, scattered and random coincidences) and detector efficiency remove the Poisson characteristics of these data. Fourier Rebinning (FORE), which combines 3D imaging with fast 2D reconstructions, requires corrected data. Thus, whenever FORE is used, or whenever data are corrected prior to OSEM, the Poisson-like characteristics need to be restored. Restoring Poisson-like data, i.e., making the variance equal to the mean, was achieved through the use of weighted OSEM algorithms. One of them is NECOSEM, which relies on the NEC weighting transformation. The distinctive feature of this algorithm is the NEC multiplicative factor, defined as the ratio between the mean and the variance. With real clinical data this is critical, since there is only one value collected for each bin: the data value itself. For simulated data, if we keep track of the values of these two statistical moments, the exact values of the NEC weights can be calculated. We have compared the performance of five weighted algorithms (FORE+AWOSEM, FORE+NECOSEM, ANWOSEM3D, SPOSEM3D and NECOSEM3D) on the basis of tumor detectability. The comparison was done for simulated and clinical data. In the former case an analytical simulator was used; this is the ideal situation, since all the weighting factors can be determined exactly. To compare the performance of the algorithms, we used the Non-Prewhitening Matched Filter (NPWMF) numerical observer. With knowledge obtained from the simulation study we proceeded to the reconstruction of clinical data, for which it was necessary to devise a strategy for estimating the NEC weighting factors. The comparison between reconstructed images was done by a physician highly familiar with whole-body PET imaging.
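
    The abstract asserts, but does not show, why scaling by the mean-to-variance ratio restores Poisson-like statistics. The check is one line: if a bin has mean m and variance v, and the NEC factor is w = m/v, then for the transformed value ỹ = w·y,

```latex
\mathbb{E}[\tilde{y}] = w\,m = \frac{m^2}{v},
\qquad
\operatorname{Var}[\tilde{y}] = w^2 v = \frac{m^2}{v},
```

    so the transformed data have variance equal to mean, as a Poisson variable would. A minimal NumPy sketch of this transform follows; `nec_transform` is a hypothetical helper name, and the per-bin `mean` and `var` arrays are assumed inputs (known exactly in simulation, estimated for clinical data, as the abstract notes).

```python
import numpy as np

def nec_transform(y, mean, var, eps=1e-12):
    """Scale each sinogram bin by its NEC factor w = mean / variance.

    After scaling, each bin's expected value equals its variance
    (m^2 / v), i.e. the data are Poisson-like again, which is the
    property the weighted OSEM variants rely on.
    """
    w = mean / np.maximum(var, eps)  # NEC multiplicative factor per bin
    return w * y, w
```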

    Non-uniform resolution and partial volume recovery in tomographic image reconstruction methods

    Acquired data in tomographic imaging systems are subject to physical and detector-based image-degrading effects. These effects need to be considered and modeled in order to optimize resolution recovery. However, even accurate modeling of the physics of the data and acquisition processes still leads to an ill-posed reconstruction problem, because real data are incomplete and noisy. Real images are always a compromise between resolution and noise; therefore, noise processes also need to be fully considered for an optimal bias-variance trade-off. Image-degrading effects and noise are generally modeled within the reconstruction method, and statistical iterative methods can model these effects, together with noise processes, better than analytical methods. Regularization is used to condition the problem, and explicit regularization methods are considered better for modeling various noise processes, offering extended control over reconstructed image quality. Properties of the object's emission distribution are modeled in the form of a prior function. Smoothing and edge-preserving priors have been investigated in detail, and it has been shown that smoothing priors over-smooth images in high-count areas, resulting in a spatially non-uniform and nonlinear resolution response. A uniform resolution response is desirable for image comparison and for other image-processing tasks such as segmentation and registration. This work proposes methods, based on median root priors (MRPs) in maximum a posteriori (MAP) estimators, to obtain images with nearly uniform and linear resolution characteristics, using the nonlinearity of MRPs as a correction tool. Results indicate that MRPs perform better than quadratic priors (QPs) and total variation (TV) priors in terms of response linearity, spatial uniformity and parameter sensitivity. Hybrid priors, composed of MRPs and QPs, have been developed and analyzed for their activity-recovery performance in two popular partial volume correction (PVC) methods and in an analysis of list-mode data reconstruction methods, showing that MRPs perform better than QPs in different situations.
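
    The abstract leans on MRPs without giving an update rule. A common formulation in the PET literature is the one-step-late (OSL) EM update, where the MRP contributes the derivative term (x_j - med_j)/med_j, med_j being the median of the current image over a neighborhood of pixel j; the vanishing of this derivative on locally monotone regions is the nonlinearity the abstract uses as a correction tool. The sketch below implements that standard OSL-MRP update under stated assumptions (dense system matrix, square image); it is an illustration of the technique, not necessarily the exact variant developed in this thesis.

```python
import numpy as np
from scipy.ndimage import median_filter

def mrp_osl_em(A, y, x0, beta=0.3, n_iter=20, window=3, eps=1e-12):
    """One-step-late EM reconstruction with a median root prior (MRP).

    A      : dense system matrix, shape (n_bins, n_pixels)
    y      : measured projection data, shape (n_bins,)
    x0     : initial image estimate, flattened square image
    beta   : prior strength; larger values pull x toward its local median
    window : side length of the square median window

    The MRP derivative (x - med) / med is zero wherever the image is
    locally monotone, so edges and smooth gradients are not penalized,
    while isolated noise spikes are.
    """
    x = x0.copy()
    sens = A.T @ np.ones_like(y)              # sensitivity image A^T 1
    side = int(round(np.sqrt(x.size)))        # assume a square image
    for _ in range(n_iter):
        med = median_filter(x.reshape(side, side), size=window).ravel()
        prior = beta * (x - med) / np.maximum(med, eps)
        ratio = y / np.maximum(A @ x, eps)    # Poisson data-fit ratio
        x = x / np.maximum(sens + prior, eps) * (A.T @ ratio)
    return x
```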