
    Penalized Weighted Least-Squares Image Reconstruction for Positron Emission Tomography

    Presents an image reconstruction method for positron-emission tomography (PET) based on a penalized, weighted least-squares (PWLS) objective. For PET measurements that are precorrected for accidental coincidences, the author argues statistically that a least-squares objective function is at least as appropriate as, if not more so than, the popular Poisson likelihood objective. The author proposes a simple data-based method for determining the weights that accounts for attenuation and detector efficiency. A nonnegativity-constrained successive over-relaxation (+SOR) algorithm converges rapidly to the global minimum of the PWLS objective. Quantitative simulation results demonstrate that the bias/variance tradeoff of the PWLS+SOR method is comparable to that of the maximum-likelihood expectation-maximization (ML-EM) method (but requires fewer iterations) and is improved relative to the conventional filtered backprojection (FBP) method. Qualitative results suggest that the streak artifacts common to the FBP method are nearly eliminated by the PWLS+SOR method, and indicate that the proposed method for weighting the measurements is a significant factor in the improvement over FBP.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85851/1/Fessler105.pd
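
    To make the PWLS+SOR idea concrete, here is a minimal numerical sketch in NumPy. The toy system matrix, weights, penalty strength, and first-difference roughness penalty are illustrative assumptions, not the paper's setup; only the structure of the update (over-relaxed coordinate descent with clipping at zero) follows the +SOR description above.

```python
# A minimal sketch of the PWLS idea, not the paper's implementation: the toy
# system matrix A, data-based weights w, penalty strength beta, and the simple
# first-difference roughness penalty are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: y = A @ x_true + noise, with per-measurement weights w.
n_meas, n_pix = 60, 20
A = rng.random((n_meas, n_pix))
x_true = np.maximum(rng.standard_normal(n_pix), 0.0)
y = A @ x_true + 0.05 * rng.standard_normal(n_meas)
w = rng.uniform(0.5, 2.0, n_meas)   # weights ~ 1/variance (assumed form)
beta = 0.1                          # regularization strength (assumed)

# PWLS objective: 0.5*(y - A x)' W (y - A x) + 0.5*beta*||D x||^2 over x >= 0,
# where D is a first-difference matrix (one common roughness penalty).
D = (np.eye(n_pix) - np.eye(n_pix, k=1))[:-1]
H = A.T @ (w[:, None] * A) + beta * (D.T @ D)   # Hessian of the objective
b = A.T @ (w * y)                               # linear term

def pwls_plus_sor(H, b, omega=1.4, n_iter=200):
    """Minimize 0.5 x'Hx - b'x subject to x >= 0 by successive
    over-relaxation with clipping at zero (a +SOR-style update)."""
    x = np.zeros_like(b)
    for _ in range(n_iter):
        for j in range(len(b)):
            # Exact 1-D minimizer along coordinate j, over-relaxed by omega,
            # then projected onto the nonnegativity constraint.
            x[j] = max(x[j] + omega * (b[j] - H[j] @ x) / H[j, j], 0.0)
    return x

x_hat = pwls_plus_sor(H, b)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

    Because H is symmetric positive definite here, projected SOR with 0 < omega < 2 converges to the unique constrained minimizer of the quadratic, which is consistent with the abstract's claim of rapid convergence to the global minimum.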

    Monotonic Algorithms for Transmission Tomography

    Presents a framework for designing fast and monotonic algorithms for penalized-likelihood image reconstruction in transmission tomography. The new algorithms are based on paraboloidal surrogate functions for the log likelihood. Due to the form of the log-likelihood function, it is possible to find low-curvature surrogate functions that guarantee monotonicity. Unlike previous methods, the proposed surrogate functions lead to monotonic algorithms even for the nonconvex log likelihood that arises from background events, such as scatter and random coincidences. The gradient and the curvature of the likelihood terms are evaluated only once per iteration. Since the problem is simplified at each iteration, the CPU time is less than that of current algorithms that directly minimize the objective, yet the convergence rate is comparable. The simplicity, monotonicity, and speed of the new algorithms are quite attractive. The convergence rates of the algorithms are demonstrated using real and simulated PET transmission scans.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85831/1/Fessler83.pd
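
    The surrogate idea can be sketched in a few lines of NumPy. The example below assumes zero background events (so the log likelihood is convex), uses a simple precomputed maximum-curvature choice, and omits the penalty term; it is meant only to illustrate how a paraboloidal surrogate yields a monotone update with one gradient evaluation per iteration, not to reproduce the paper's algorithms.

```python
# A minimal sketch of a paraboloidal-surrogate update for transmission
# tomography, assuming zero background events (r_i = 0) and omitting the
# penalty term for brevity; A, b, and y below are toy stand-ins.
import numpy as np

rng = np.random.default_rng(1)

# Toy transmission scan: blank-scan counts b_i, line integrals l_i = [A mu]_i,
# measured counts y_i ~ Poisson(b_i * exp(-l_i)).
n_rays, n_pix = 80, 25
A = rng.uniform(0.0, 1.0, (n_rays, n_pix))
mu_true = rng.uniform(0.0, 0.2, n_pix)
b = rng.uniform(50.0, 200.0, n_rays)
y = rng.poisson(b * np.exp(-(A @ mu_true))).astype(float)

def neg_loglik(mu):
    # Per-ray negative log-likelihood h_i(l) = b_i*exp(-l) + y_i*l (constants dropped).
    l = A @ mu
    return np.sum(b * np.exp(-l) + y * l)

# Since h_i''(l) = b_i*exp(-l) <= b_i for l >= 0, the precomputed maximum
# curvatures c_i = b_i give parabolas that lie above each h_i on l >= 0.
c = b.copy()
gamma = A.sum(axis=1)        # row sums of A
d = A.T @ (c * gamma)        # De Pierro-style separable-surrogate denominators

mu = np.zeros(n_pix)
obj = neg_loglik(mu)
for it in range(50):
    l = A @ mu
    grad = A.T @ (y - b * np.exp(-l))        # one gradient evaluation per iteration
    mu_next = np.maximum(mu - grad / d, 0.0) # minimize the separable surrogate
    obj_next = neg_loglik(mu_next)
    assert obj_next <= obj + 1e-8            # monotone by construction
    mu, obj = mu_next, obj_next
print("final negative log-likelihood:", obj)
```

    Note that the fixed curvatures c_i = b_i above only cover the convex r_i = 0 likelihood; handling the nonconvex case with background events, as the abstract describes, requires the paper's more careful curvature choices.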