
    Incorporating accurate statistical modeling in PET: reconstruction for whole-body imaging

    Doctoral thesis in Biophysics, presented to the University of Lisbon through the Faculty of Sciences, 2007. The thesis is devoted to image reconstruction in 3D whole-body PET imaging. OSEM (Ordered Subsets Expectation Maximization) is a statistical algorithm that assumes Poisson data. However, corrections for physical effects (attenuation, scattered and random coincidences) and detector efficiency destroy the Poisson characteristics of these data. Fourier Rebinning (FORE), which combines 3D imaging with fast 2D reconstructions, requires corrected data. Thus, whenever FORE is used, or whenever data are corrected prior to OSEM, the Poisson-like characteristics must be restored. Restoring Poisson-like data, i.e., making the variance equal to the mean, was achieved through weighted OSEM algorithms. One of them is NECOSEM, which relies on the NEC weighting transformation. The distinctive feature of this algorithm is the NEC multiplicative factor, defined as the ratio between the mean and the variance. With real clinical data this is critical, since only one value is collected for each bin: the data value itself. For simulated data, if we keep track of these two statistical moments, the exact values of the NEC weights can be calculated. We compared the performance of five weighted algorithms (FORE+AWOSEM, FORE+NECOSEM, ANWOSEM3D, SPOSEM3D and NECOSEM3D) on the basis of tumor detectability, for both simulated and clinical data. In the former case an analytical simulator was used; this is the ideal situation, since all the weighting factors can be determined exactly. To compare the performance of the algorithms, we used the non-prewhitening matched filter (NPWMF) numerical observer. With knowledge gained from the simulation study we proceeded to the reconstruction of clinical data, for which a strategy for estimating the NEC weighting factors had to be devised. The reconstructed images were compared by a physician experienced in whole-body PET imaging.
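
    As a concrete illustration of the NEC weighting transformation, here is a minimal sketch (function and variable names are hypothetical; with clinical data the two moments must be estimated, since only one value per bin is observed). The weight for each bin is the ratio of its mean to its variance; scaling a bin by w = m/v yields data whose mean and variance both equal m^2/v, i.e., Poisson-like data:

        import numpy as np

        def nec_weights(mean_est, var_est, eps=1e-12):
            # NEC factor per bin: ratio of mean to variance.
            # For w = m/v the weighted data w*y satisfies
            # E[w*y] = m^2/v and var(w*y) = w^2 * v = m^2/v,
            # so the variance again equals the mean.
            return mean_est / np.maximum(var_est, eps)

        # Corrections typically inflate the variance above the mean:
        mean_est = np.array([10.0, 25.0, 4.0])
        var_est = np.array([18.0, 40.0, 9.0])
        w = nec_weights(mean_est, var_est)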

    Hybrid Poisson/Polynomial Objective Functions for Tomographic Image Reconstruction from Transmission Scans

    This paper describes rapidly converging algorithms for computing attenuation maps from Poisson transmission measurements using penalized-likelihood objective functions. We demonstrate that an under-relaxed cyclic coordinate-ascent algorithm converges faster than the convex algorithm of Lange (see ibid., vol.4, no.10, p.1430-1438, 1995), which in turn converges faster than the expectation-maximization (EM) algorithm for transmission tomography. To further reduce computation, one could replace the log-likelihood objective with a quadratic approximation. However, we show with simulations and analysis that the quadratic objective function leads to biased estimates for low-count measurements. Therefore we introduce hybrid Poisson/polynomial objective functions that use the exact Poisson log-likelihood for detector measurements with low counts, but use computationally efficient quadratic or cubic approximations for the high-count detector measurements. We demonstrate that the hybrid objective functions reduce computation time without increasing estimation bias.
    Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/86023/1/Fessler100.pd
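
    To make the hybrid construction concrete, here is a sketch in generic transmission-tomography notation (the threshold T, blank-scan counts b_i, background r_i, and the Taylor point are illustrative assumptions, not the paper's exact prescription). With line integrals \ell_i = [A\mu]_i, the exact Poisson log-likelihood contribution of ray i is

        h_i(\ell) = y_i \log(b_i e^{-\ell} + r_i) - (b_i e^{-\ell} + r_i),

    and a hybrid penalized-likelihood objective keeps h_i for low-count rays while substituting a quadratic or cubic Taylor expansion q_i about a data-based point for high-count rays:

        \Phi(\mu) = \sum_{i:\, y_i < T} h_i([A\mu]_i) + \sum_{i:\, y_i \ge T} q_i([A\mu]_i) - \beta R(\mu).

    The polynomial terms are accurate precisely where counts are high, so the costlier exact likelihood is needed only for the low-count rays that would otherwise bias the estimate.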

    Regularized Emission Image Reconstruction Using Imperfect Side Information

    A spatially variant penalized-likelihood method for tomographic image reconstruction based on a weighted Gibbs penalty was investigated. The penalty weights are determined from structural side information, such as the locations of anatomical boundaries in high-resolution magnetic resonance images. Such side information will be imperfect in practice, and a simple simulation demonstrated the importance of accounting for the errors in boundary locations. Methods are discussed for prescribing the penalty weights when the side information is noisy. Simulation results suggest that even imperfect side information is useful for guiding spatially variant regularization.
    Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/85869/1/Fessler110.pd
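
    One simple way to use such noisy boundary information, sketched below with hypothetical names (a 1-D illustration of the idea, not the paper's prescription), is to make the penalty weights soft functions of an estimated boundary probability rather than hard zeros, so that errors in the boundary locations degrade the reconstruction gracefully:

        import numpy as np

        def penalty_weights(boundary_prob, kappa=1.0):
            # boundary_prob[j] in [0, 1]: estimated probability that an
            # anatomical boundary lies between pixel j and pixel j+1.
            # Soft weights hedge against misplaced boundaries: a likely
            # boundary reduces, but does not eliminate, the smoothing.
            return kappa * (1.0 - boundary_prob)

        def weighted_gibbs_penalty(x, w):
            # R(x) = sum_j w[j] * (x[j+1] - x[j])**2 in one dimension.
            d = np.diff(x)
            return float(np.sum(w * d ** 2))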

    Penalized Weighted Least-Squares Image Reconstruction for Positron Emission Tomography

    Presents an image reconstruction method for positron emission tomography (PET) based on a penalized weighted least-squares (PWLS) objective. For PET measurements that are precorrected for accidental coincidences, the author argues statistically that a least-squares objective function is at least as appropriate as the popular Poisson likelihood objective. The author proposes a simple data-based method for determining the weights that accounts for attenuation and detector efficiency. A nonnegative successive over-relaxation (+SOR) algorithm converges rapidly to the global minimum of the PWLS objective. Quantitative simulation results demonstrate that the bias/variance tradeoff of the PWLS+SOR method is comparable to that of the maximum-likelihood expectation-maximization (ML-EM) method (but with fewer iterations), and improved relative to the conventional filtered backprojection (FBP) method. Qualitative results suggest that the streak artifacts common to the FBP method are nearly eliminated by the PWLS+SOR method, and indicate that the proposed method for weighting the measurements is a significant factor in the improvement over FBP.
    Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/85851/1/Fessler105.pd
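
    In standard notation (assumed here for illustration), the PWLS estimate solves

        \hat{x} = \arg\min_{x \ge 0} \tfrac{1}{2} (y - Ax)^{\mathsf T} W (y - Ax) + \beta R(x),

    where A is the system matrix, R is a roughness penalty, and W = \mathrm{diag}\{w_i\} is a diagonal weighting with w_i inversely proportional to an estimate of \mathrm{var}(y_i). The data-based weighting described in the abstract folds the attenuation and detector-efficiency factors into that variance estimate, so the noisiest rays influence the fit the least.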

    Emission Image Reconstruction for Randoms-Precorrected PET Allowing Negative Sinogram Values

    Most positron emission tomography (PET) emission scans are corrected for accidental coincidence (AC) events by real-time subtraction of delayed-window coincidences, leaving only the randoms-precorrected data available for image reconstruction. The real-time randoms precorrection compensates in mean for AC events but destroys the Poisson statistics. The exact log-likelihood for randoms-precorrected data is inconvenient, so practical approximations are needed for maximum likelihood or penalized-likelihood image reconstruction. Conventional approximations involve setting negative sinogram values to zero, which can induce positive systematic biases, particularly for scans with low counts per ray. We propose new likelihood approximations that allow negative sinogram values without requiring zero-thresholding. With negative sinogram values, the log-likelihood functions can be nonconcave, complicating maximization; nevertheless, we develop monotonic algorithms for the new models by modifying the separable paraboloidal surrogates and the maximum-likelihood expectation-maximization (ML-EM) methods. These algorithms ascend to local maximizers of the objective function. Analysis and simulation results show that the new shifted Poisson (SP) model is nearly free of systematic bias yet keeps low variance. Despite its simpler implementation, the new SP performs comparably to the saddle-point model which has shown the best performance (as to systematic bias and variance) in randoms-precorrected PET emission reconstruction.
    Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/85994/1/Fessler61.pd
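
    For context, the shifted Poisson idea can be stated in standard notation (assumed here). Precorrected data y_i (prompts minus delays) have mean \bar{y}_i(x) but variance \bar{y}_i(x) + 2 r_i, where r_i is the AC rate, so they are no longer Poisson. Matching both moments suggests the model

        y_i + 2 r_i \sim \mathrm{Poisson}(\bar{y}_i(x) + 2 r_i).

    Conventional implementations threshold y_i + 2 r_i at zero to keep the log-likelihood well defined, and that thresholding is the source of the positive bias at low counts which the models proposed here avoid.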

    New Complete-Data Spaces and Faster Algorithms for Penalized-Likelihood Emission Tomography

    The classical expectation-maximization (EM) algorithm for image reconstruction suffers from particularly slow convergence when additive background effects such as accidental coincidences and scatter are included. In addition, when smoothness penalties are included in the objective function, the M-step of the EM algorithm becomes intractable due to parameter coupling. The authors describe the space-alternating generalized EM (SAGE) algorithm, in which the parameters are updated sequentially using a sequence of small “hidden” data spaces rather than one large complete-data space. The sequential update decouples the M-step, so the maximization can typically be performed analytically. By choosing hidden-data spaces with considerably less Fisher information than the conventional complete-data space for Poisson data, the authors obtain significant improvements in convergence rate. This acceleration is due to statistical considerations, not to numerical overrelaxation methods, so monotonic increases in the objective function and global convergence are guaranteed. Due to space constraints, this summary focuses on the unpenalized case and omits derivations similar to those in Lange and Carson, J. Comput. Assist. Tomography, vol. 8, no. 2, pp. 306-16 (1984).
    Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/85834/1/Fessler125.pd
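
    In outline (notation assumed for illustration), one iteration of such a space-alternating scheme cycles over the parameters and, for each index j, (i) chooses a small hidden-data space X^j that is complete only for \theta_j, (ii) forms the surrogate \phi_j(\theta_j) = E[\log f(X^j; \theta) \mid Y = y; \theta^{(n)}], and (iii) sets \theta_j^{(n+1)} = \arg\max_{\theta_j} \phi_j(\theta_j) with the remaining parameters held fixed. The curvature of each surrogate grows with the Fisher information of its hidden-data space, so choosing less informative X^j gives flatter surrogates, larger steps, and faster convergence, while every update still increases the objective monotonically.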

    Spatial Resolution Properties of Penalized-Likelihood Image Reconstruction: Space-Invariant Tomographs

    This paper examines the spatial resolution properties of penalized-likelihood image reconstruction methods by analyzing the local impulse response. The analysis shows that standard regularization penalties induce space-variant local impulse response functions, even for space-invariant tomographic systems. Paradoxically, for emission image reconstruction, the local resolution is generally poorest in high-count regions. We show that the linearized local impulse response induced by quadratic roughness penalties depends on the object only through its projections. This analysis leads naturally to a modified regularization penalty that yields reconstructed images with nearly uniform resolution. The modified penalty also provides a very practical method for choosing the regularization parameter to obtain a specified resolution in images reconstructed by penalized-likelihood methods.
    Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/85890/1/Fessler97.pd
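
    In the notation commonly used for this analysis (assumed here), the linearized local impulse response at pixel j has the form

        l^{j} \approx (A^{\mathsf T} W A + \beta R)^{-1} A^{\mathsf T} W A \, e^{j},

    where e^{j} is the j-th unit vector, A is the system matrix, and W is a diagonal statistical weighting that depends on the object only through its projections. Because the statistical weights shrink where counts (and hence variances) are high while a standard quadratic penalty R stays fixed, the penalty dominates relatively more in high-count regions, which explains the paradox noted above; the modified penalty rescales R locally to compensate.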

    Projected Nesterov’s Proximal-Gradient Signal Recovery from Compressive Poisson Measurements

    We develop a projected Nesterov’s proximal-gradient (PNPG) scheme for reconstructing sparse signals from compressive Poisson-distributed measurements whose mean signal intensity follows an affine model with known intercept. The objective function to be minimized is a sum of a convex data-fidelity term (the negative log-likelihood, NLL) and regularization terms. We apply sparse signal regularization where the signal belongs to a nonempty closed convex set within the domain of the NLL and signal sparsity is imposed using a total-variation (TV) penalty. We present analytical upper bounds on the regularization tuning constant. The proposed PNPG method employs a projected Nesterov acceleration step, function restart, and an adaptive step-size selection scheme that accounts for the varying local Lipschitz constant of the NLL. We establish O(k^{-2}) convergence of the PNPG method with step-size backtracking only and no restart. Numerical examples compare PNPG with the state-of-the-art sparse Poisson-intensity reconstruction algorithm (SPIRAL).
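
    A minimal sketch of the accelerated loop follows, with user-supplied callables (objective, grad_nll, prox_reg, project are hypothetical names) and a fixed step size; the actual PNPG method additionally adapts the step to the local Lipschitz constant by backtracking:

        import numpy as np

        def pnpg(objective, grad_nll, prox_reg, project, x0,
                 step=1.0, n_iter=100):
            # objective(x): NLL(x) + regularizer(x), for the restart test
            # grad_nll(x): gradient of the convex NLL
            # prox_reg(z, t): proximal step of the regularizer with
            #                 step size t (e.g., a TV-denoising step)
            # project(x): projection onto the closed convex constraint set
            x_prev = x = np.asarray(x0, dtype=float)
            t_prev = 1.0
            for _ in range(n_iter):
                t = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_prev ** 2))
                # projected Nesterov extrapolation (momentum) step
                y = project(x + ((t_prev - 1.0) / t) * (x - x_prev))
                x_new = prox_reg(y - step * grad_nll(y), step)
                # function restart: if the objective rose, drop the
                # momentum and take a plain proximal-gradient step
                if objective(x_new) > objective(x):
                    t = 1.0
                    x_new = prox_reg(x - step * grad_nll(x), step)
                x_prev, x, t_prev = x, x_new, t
            return x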