
    New statistical models for randoms-precorrected PET scans

    PET measurements are usually precorrected for accidental coincidence events by real-time subtraction of the delayed-window coincidences. Randoms subtraction compensates in mean for accidental coincidences but destroys the Poisson statistics. We propose and analyze two new approximations to the exact log-likelihood of the precorrected measurements, one based on a "shifted Poisson" model, the other based on saddle-point approximations to the measurement probability mass function (pmf). The methods apply to both emission and transmission tomography; however, in this paper we focus on transmission tomography. We compare the new models to conventional data-weighted least squares (WLS) and conventional maximum likelihood (based on the ordinary Poisson (OP) model) using simulations and analytic approximations. The results demonstrate that the proposed methods avoid the systematic bias of the WLS method and lead to significantly lower variance than the conventional OP method. The saddle-point method provides a more accurate approximation to the exact log-likelihood than the WLS, OP, and shifted Poisson alternatives. However, the simpler shifted Poisson method yielded comparable bias-variance performance in the simulations. The new methods offer improved image reconstruction in PET through more realistic statistical modeling, yet with negligible increase in computation over the conventional OP method.

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85945/1/Fessler92.pd
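As a rough sketch of the "shifted Poisson" idea, the precorrected data plus twice an estimate of the randoms mean is treated as Poisson distributed. The helper below (function name, array inputs, and zero-thresholding convention are illustrative assumptions, not the paper's code) evaluates that approximate log-likelihood:

```python
import numpy as np

def shifted_poisson_loglik(y, ybar, r):
    """Shifted-Poisson approximation to the log-likelihood of
    randoms-precorrected data y (prompts minus delays).

    y + 2r is modeled as Poisson with mean ybar + 2r, where ybar is the
    model mean of the true coincidences and r the mean randoms rate per
    ray (all hypothetical array inputs).
    """
    shifted = np.maximum(y + 2.0 * r, 0.0)  # threshold so counts are nonnegative
    mean = ybar + 2.0 * r
    # Poisson log-pmf summed over rays, up to a constant independent of ybar
    return np.sum(shifted * np.log(mean) - mean)
```

Because the shift `2r` matches the variance of the precorrected data (var = ybar + 2r) rather than just its mean, this model tracks the true second-order statistics more closely than the ordinary Poisson model.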

    Joint Estimation of Attenuation and Emission Images from PET Scans

    In modern PET scanners, image reconstruction is performed sequentially in two steps regardless of the reconstruction method: 1. attenuation correction factor (ACF) computation from transmission scans; 2. emission image reconstruction using the computed ACFs. This reconstruction scheme does not use all the information in the transmission and emission scans. Post-injection transmission scans contain emission contamination, which includes information about the emission parameters. Conversely, emission scans contain information about the attenuating medium. To use all the available information, the authors propose a joint estimation approach that estimates the attenuation map and the emission image from these two scans. The penalized-likelihood objective function is nonconvex for this problem. The authors propose an algorithm based on paraboloidal surrogates that alternates between emission and attenuation parameters and is guaranteed to monotonically decrease the objective function.

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85807/1/Fessler159.pd

    Algorithms for Joint Estimation of Attenuation and Emission Images in PET

    In positron emission tomography (PET), positron emission from radiolabeled compounds yields two high-energy photons emitted in opposing directions. However, often the photons are not detected due to attenuation within the patient. This attenuation is nonuniform and must be corrected to obtain quantitatively accurate emission images. To measure attenuation effects, one typically acquires a PET transmission scan before or after the injection of radiotracer. In commercially available PET scanners, image reconstruction is performed sequentially in two steps regardless of the reconstruction method: 1. attenuation correction factor (ACF) computation from transmission scans; 2. emission image reconstruction using the computed ACFs. This two-step reconstruction scheme does not use all the information in the transmission and emission scans. Post-injection transmission scans contain emission contamination that includes information about the emission parameters. Similarly, emission scans contain information about the attenuating medium. To use all the available information, we propose a joint estimation approach that estimates the attenuation map and the emission image simultaneously from these two scans. The penalized-likelihood objective function is nonconvex for this problem. We propose an algorithm based on paraboloidal surrogates that alternates between updating emission and attenuation parameters and is guaranteed to monotonically decrease the objective function.

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85956/1/Fessler160.pd
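The alternation strategy can be illustrated on a toy biconvex objective standing in for the nonconvex penalized-likelihood objective (the function and its closed-form block minimizers are illustrative assumptions, not the paper's objective). Because each update minimizes the objective exactly over one parameter block with the other fixed, the objective can never increase:

```python
def joint_objective(x, y, d=2.0, eps=0.1):
    # Toy biconvex stand-in for the joint objective:
    # quadratic in x for fixed y, and quadratic in y for fixed x.
    return (x * y - d) ** 2 + eps * (x ** 2 + y ** 2)

def alternate(x, y, d=2.0, eps=0.1, n_iter=20):
    """Alternate exact block minimizations; each step minimizes the
    objective over one block, so the sequence of objective values is
    monotonically nonincreasing."""
    history = [joint_objective(x, y, d, eps)]
    for _ in range(n_iter):
        x = d * y / (y * y + eps)   # argmin over x with y fixed
        y = d * x / (x * x + eps)   # argmin over y with x fixed
        history.append(joint_objective(x, y, d, eps))
    return x, y, history
```

In the paper's setting the exact block minimizations are intractable, which is where the paraboloidal surrogates come in: each block update minimizes a majorizing paraboloid instead, preserving the same monotone-decrease guarantee.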

    Statistical Emission Image Reconstruction for Randoms-Precorrected PET Scans Using Negative Sinogram Values

    Many conventional PET emission scans are corrected for accidental coincidence (AC) events, or randoms, by real-time subtraction of delayed-window coincidences, leaving only the randoms-precorrected data available for image reconstruction. The real-time precorrection compensates in mean for AC events but destroys the Poisson statistics. Since the exact log-likelihood for randoms-precorrected data is inconvenient to maximize, practical approximations are desirable for statistical image reconstruction. Conventional approximations involve setting negative sinogram values to zero, which can induce positive systematic biases, particularly for scans with low counts per ray. We propose new likelihood approximations that allow negative sinogram values without requiring zero-thresholding. We also develop monotonic algorithms for the new models using "optimization transfer" principles. Simulation results show that our new model, SP-, is free of systematic bias yet maintains low variance. Despite its simpler implementation, the new model performs comparably to the saddle-point (SD) model, which has previously shown the best performance (in terms of systematic bias and variance) in randoms-precorrected PET emission reconstruction.

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85893/1/Fessler185.pd

    Penalized-Likelihood Estimators and Noise Analysis for Randoms-Precorrected PET Transmission Scans

    This paper analyzes and compares image reconstruction methods based on practical approximations to the exact log-likelihood of randoms-precorrected positron emission tomography (PET) measurements. The methods apply to both emission and transmission tomography; however, in this paper the authors focus on transmission tomography. The results of experimental PET transmission scans and variance approximations demonstrate that the shifted Poisson (SP) method avoids the systematic bias of the conventional data-weighted least squares (WLS) method and leads to significantly lower variance than conventional statistical methods based on the log-likelihood of the ordinary Poisson (OP) model. The authors develop covariance approximations to analyze the propagation of noise from attenuation maps into emission images via the attenuation correction factors (ACFs). Empirical pixel and region variances from real transmission data agree closely with the analytical predictions. Both the approximations and the empirical results show that the performance differences between the OP model and SP model are even larger when considering noise propagation from the transmission images into the final emission images than the differences in the attenuation maps themselves.

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85852/1/Fessler84.pd

    Emission Image Reconstruction for Randoms-Precorrected PET Allowing Negative Sinogram Values

    Most positron emission tomography (PET) emission scans are corrected for accidental coincidence (AC) events by real-time subtraction of delayed-window coincidences, leaving only the randoms-precorrected data available for image reconstruction. The real-time randoms precorrection compensates in mean for AC events but destroys the Poisson statistics. The exact log-likelihood for randoms-precorrected data is inconvenient, so practical approximations are needed for maximum likelihood or penalized-likelihood image reconstruction. Conventional approximations involve setting negative sinogram values to zero, which can induce positive systematic biases, particularly for scans with low counts per ray. We propose new likelihood approximations that allow negative sinogram values without requiring zero-thresholding. With negative sinogram values, the log-likelihood functions can be nonconcave, complicating maximization; nevertheless, we develop monotonic algorithms for the new models by modifying the separable paraboloidal surrogates and the maximum-likelihood expectation-maximization (ML-EM) methods. These algorithms ascend to local maximizers of the objective function. Analysis and simulation results show that the new shifted Poisson (SP) model is nearly free of systematic bias yet maintains low variance. Despite its simpler implementation, the new SP model performs comparably to the saddle-point model, which has shown the best performance (in terms of systematic bias and variance) in randoms-precorrected PET emission reconstruction.

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85994/1/Fessler61.pd
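The positive bias induced by zero-thresholding can be seen in a quick Monte Carlo sketch (the rates below are hypothetical, chosen so that a large fraction of rays go negative after precorrection):

```python
import numpy as np

# Simulate a low-count, high-randoms ray: zero-thresholding negative
# precorrected sinogram values inflates the sample mean.
rng = np.random.default_rng(0)
ybar, r, n = 1.0, 2.0, 200_000            # true-event mean and randoms rate per ray
prompts = rng.poisson(ybar + r, n)        # prompt-window counts
delays = rng.poisson(r, n)                # delayed-window counts
y = prompts - delays                      # randoms-precorrected data (may be negative)

mean_raw = y.mean()                       # unbiased: E[y] = ybar
mean_thresh = np.maximum(y, 0).mean()     # conventional zero-thresholding
```

Here `mean_raw` is close to the true mean of 1.0, while `mean_thresh` is substantially larger because every negative value discarded by thresholding pushes the average upward.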

    Conjugate-Gradient Preconditioning Methods for Shift-Variant PET Image Reconstruction

    Gradient-based iterative methods often converge slowly for tomographic image reconstruction and image restoration problems, but can be accelerated by suitable preconditioners. Diagonal preconditioners offer some improvement in convergence rate, but do not incorporate the structure of the Hessian matrices in imaging problems. Circulant preconditioners can provide remarkable acceleration for inverse problems that are approximately shift-invariant, i.e., for those with approximately block-Toeplitz or block-circulant Hessians. However, in applications with nonuniform noise variance, such as that arising from Poisson statistics in emission tomography and in quantum-limited optical imaging, the Hessian of the weighted least-squares objective function is quite shift-variant, and circulant preconditioners perform poorly. Additional shift-variance is caused by edge-preserving regularization methods based on nonquadratic penalty functions. This paper describes new preconditioners that approximate more accurately the Hessian matrices of shift-variant imaging problems. Compared to diagonal or circulant preconditioning, the new preconditioners lead to significantly faster convergence rates for the unconstrained conjugate-gradient (CG) iteration. We also propose a new efficient method for the line-search step required by CG methods. Applications to positron emission tomography (PET) illustrate the method.

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85979/1/Fessler85.pd
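For reference, a minimal preconditioned CG iteration with the simplest (diagonal) preconditioner looks like the sketch below; the paper's contribution is better preconditioners, not this baseline. The Hessian is an explicit matrix here, whereas in PET it would be applied matrix-free:

```python
import numpy as np

def pcg(H, g, M_inv_diag, n_iter=50, tol=1e-10):
    """Preconditioned conjugate gradient for H x = g, where H is a
    symmetric positive-definite Hessian (e.g. A'WA + R in weighted
    least squares) and M_inv_diag is the inverse of a diagonal
    preconditioner. Function and argument names are illustrative."""
    x = np.zeros_like(g)
    r = g.copy()                  # residual g - H x for x = 0
    z = M_inv_diag * r            # preconditioned residual
    p = z.copy()                  # search direction
    rz = r @ z
    for _ in range(n_iter):
        Hp = H @ p
        alpha = rz / (p @ Hp)     # exact line search for quadratics
        x += alpha * p
        r -= alpha * Hp
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p # conjugate direction update
        rz = rz_new
    return x
```

A good preconditioner makes `M_inv_diag @ H` (or its non-diagonal analogue) close to the identity; the shift-variant preconditioners described above aim for that where diagonal and circulant choices fall short.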

    Monotonic Algorithms for Transmission Tomography

    This paper presents a framework for designing fast and monotonic algorithms for transmission tomography penalized-likelihood image reconstruction. The new algorithms are based on paraboloidal surrogate functions for the log-likelihood. Due to the form of the log-likelihood function, it is possible to find low-curvature surrogate functions that guarantee monotonicity. Unlike previous methods, the proposed surrogate functions lead to monotonic algorithms even for the nonconvex log-likelihood that arises due to background events, such as scatter and random coincidences. The gradient and the curvature of the likelihood terms are evaluated only once per iteration. Since the problem is simplified at each iteration, the CPU time is less than that of current algorithms which directly minimize the objective, yet the convergence rate is comparable. The simplicity, monotonicity, and speed of the new algorithms are quite attractive. The convergence rates of the algorithms are demonstrated using real and simulated PET transmission scans.

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85831/1/Fessler83.pd
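The surrogate idea can be sketched for the simplest case with no background events and no penalty (a hedged illustration under those assumptions, not the paper's exact algorithm, which handles background terms and uses optimized curvatures):

```python
import numpy as np

def sps_transmission(A, y, b, mu0, n_iter=50):
    """Separable paraboloidal-surrogate updates for transmission
    tomography without background events or penalty.

    Negative log-likelihood per ray: h_i(l) = b_i*exp(-l) + y_i*l with
    l = A @ mu (constants dropped). The curvature bound c_i = b_i
    majorizes h_i'' on l >= 0, giving a monotone majorize-minimize
    update; A is assumed to have nonnegative entries.
    """
    mu = mu0.copy()
    a_row = A.sum(axis=1)                      # row sums, for De Pierro's separable split
    den = A.T @ (a_row * b)                    # precomputed surrogate curvatures per pixel
    for _ in range(n_iter):
        l = A @ mu
        grad = A.T @ (y - b * np.exp(-l))      # gradient of the negative log-likelihood
        mu = np.maximum(mu - grad / den, 0.0)  # separable surrogate step + nonnegativity
    return mu
```

Because the denominator is precomputed, each iteration costs one forward and one back projection, which is the source of the low per-iteration CPU time noted above.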

    Variational Gaussian approximation for Poisson data

    The Poisson model is frequently employed to describe count data, but in a Bayesian context it leads to an analytically intractable posterior probability distribution. In this work, we analyze a variational Gaussian approximation to the posterior distribution arising from the Poisson model with a Gaussian prior. This is achieved by seeking an optimal Gaussian distribution minimizing the Kullback-Leibler divergence from the posterior distribution to the approximation, or equivalently maximizing the lower bound for the model evidence. We derive an explicit expression for the lower bound, and show the existence and uniqueness of the optimal Gaussian approximation. The lower bound functional can be viewed as a variant of classical Tikhonov regularization that also penalizes the covariance. We then develop an efficient alternating direction maximization algorithm for solving the optimization problem, and analyze its convergence. We discuss strategies for reducing the computational complexity via the low-rank structure of the forward operator and the sparsity of the covariance. Further, as an application of the lower bound, we discuss hierarchical Bayesian modeling for selecting the hyperparameter in the prior distribution, and propose a monotonically convergent algorithm for determining the hyperparameter. We present extensive numerical experiments to illustrate the Gaussian approximation and the algorithms.
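The evidence-lower-bound idea can be sketched in one dimension. The setup below uses a log-link Poisson model, an illustrative assumption chosen because it makes the expectations in the bound closed-form (the paper's forward-operator setting is more general); the bound over a Gaussian family q(x) = N(m, s^2) is then maximized by plain gradient ascent rather than the paper's alternating direction algorithm:

```python
import math

# 1-D toy model (assumed for illustration): y ~ Poisson(exp(x)),
# prior x ~ N(mu0, s0^2), variational family q(x) = N(m, s^2).
y, mu0, s0 = 5, 0.0, 1.0

def elbo(m, s):
    """Evidence lower bound F(m, s) = E_q[log p(y|x)] - KL(q || prior).
    With the log link, E_q[exp(x)] = exp(m + s^2/2) is closed form."""
    e_loglik = y * m - math.exp(m + 0.5 * s * s) - math.lgamma(y + 1)
    kl = math.log(s0 / s) + (s * s + (m - mu0) ** 2) / (2 * s0 * s0) - 0.5
    return e_loglik - kl

# Maximize the bound by gradient ascent on (m, s).
m, s, step = 0.0, 1.0, 0.01
for _ in range(4000):
    e = math.exp(m + 0.5 * s * s)
    grad_m = y - e - (m - mu0) / (s0 * s0)          # dF/dm
    grad_s = -s * e - s / (s0 * s0) + 1.0 / s       # dF/ds
    m += step * grad_m
    s += step * grad_s
```

The stationarity condition for `s` reads 1/s^2 = E_q[exp(x)] + 1/s0^2, illustrating the abstract's remark that the bound penalizes the covariance: larger likelihood curvature forces a tighter posterior approximation.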