
    Penalized-Likelihood Estimators and Noise Analysis for Randoms-Precorrected PET Transmission Scans

    This paper analyzes and compares image reconstruction methods based on practical approximations to the exact log-likelihood of randoms-precorrected positron emission tomography (PET) measurements. The methods apply to both emission and transmission tomography; however, in this paper the authors focus on transmission tomography. The results of experimental PET transmission scans and variance approximations demonstrate that the shifted Poisson (SP) method avoids the systematic bias of the conventional data-weighted least squares (WLS) method and leads to significantly lower variance than conventional statistical methods based on the log-likelihood of the ordinary Poisson (OP) model. The authors develop covariance approximations to analyze the propagation of noise from attenuation maps into emission images via the attenuation correction factors (ACFs). Empirical pixel and region variances from real transmission data agree closely with the analytical predictions. Both the approximations and the empirical results show that the performance differences between the OP and SP models are even larger when considering noise propagation from the transmission images into the final emission images than when comparing the attenuation maps themselves. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85852/1/Fessler84.pd
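    To make the noise-propagation mechanism concrete, here is a small schematic sketch (my own illustration with made-up sizes and a hypothetical system matrix A, not code from the paper): the ACF for each ray is the exponential of the line integral of the attenuation map, so noise in a reconstructed attenuation map perturbs the attenuation-corrected emission data multiplicatively.

```python
# Schematic sketch of ACF-based attenuation correction and noise propagation.
# 'A', the toy sizes, and the noise level are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(1)
n_rays, n_pix = 64, 32
A = rng.uniform(0, 0.05, size=(n_rays, n_pix))   # hypothetical line-integral (system) matrix
mu = np.full(n_pix, 0.0096)                       # toy attenuation map values

def acf(mu_map):
    # survival probability along ray i is exp(-[A mu]_i); its reciprocal is the ACF
    return np.exp(A @ mu_map)

emission_sino = np.full(n_rays, 10.0)             # toy (randoms-corrected) emission sinogram
corrected = emission_sino * acf(mu)               # attenuation-corrected emission data

# perturb the attenuation map to mimic transmission noise and observe the effect
mu_noisy = mu + rng.normal(0, 5e-4, size=n_pix)
print(np.abs(emission_sino * acf(mu_noisy) - corrected).mean())
```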

    Emission Image Reconstruction for Randoms-Precorrected PET Allowing Negative Sinogram Values

    Most positron emission tomography (PET) emission scans are corrected for accidental coincidence (AC) events by real-time subtraction of delayed-window coincidences, leaving only the randoms-precorrected data available for image reconstruction. The real-time randoms precorrection compensates in mean for AC events but destroys the Poisson statistics. The exact log-likelihood for randoms-precorrected data is inconvenient, so practical approximations are needed for maximum-likelihood or penalized-likelihood image reconstruction. Conventional approximations involve setting negative sinogram values to zero, which can induce positive systematic biases, particularly for scans with low counts per ray. We propose new likelihood approximations that allow negative sinogram values without requiring zero-thresholding. With negative sinogram values, the log-likelihood functions can be nonconcave, complicating maximization; nevertheless, we develop monotonic algorithms for the new models by modifying the separable paraboloidal surrogates and maximum-likelihood expectation-maximization (ML-EM) methods. These algorithms ascend to local maximizers of the objective function. Analysis and simulation results show that the new shifted Poisson (SP) model is nearly free of systematic bias yet keeps low variance. Despite its simpler implementation, the new SP model performs comparably to the saddle-point model, which has shown the best performance (in terms of systematic bias and variance) in randoms-precorrected PET emission reconstruction. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85994/1/Fessler61.pd
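    The positive bias from zero-thresholding that motivates this work can be seen in a few lines of simulation (a minimal sketch under my own toy parameters, not the paper's experiment): at low counts per ray, clipping negative precorrected values pushes the sample mean above the true coincidence rate.

```python
# Minimal Monte Carlo illustration of why zero-thresholding randoms-precorrected
# data biases the mean upward at low counts per ray.  Parameter values are toy choices.
import numpy as np

rng = np.random.default_rng(0)
ybar, r, n = 0.5, 2.0, 200_000            # low true-coincidence mean, moderate randoms rate
prompts  = rng.poisson(ybar + r, size=n)
delayeds = rng.poisson(r, size=n)
y = prompts - delayeds                     # precorrected data (can be negative)

print("mean of y         :", y.mean())                    # ~ ybar (unbiased)
print("mean of max(y, 0) :", np.maximum(y, 0).mean())      # > ybar (positive bias)
```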

    Statistical Emission Image Reconstruction for Randoms-Precorrected PET Scans Using Negative Sinogram Values

    Many conventional PET emission scans are corrected for accidental coincidence (AC) events, or randoms, by real-time subtraction of delayed-window coincidences, leaving only the randoms-precorrected data available for image reconstruction. The real-time precorrection compensates in mean for AC events but destroys Poisson statistics. Since the exact log-likelihood for randoms-precorrected data is inconvenient to maximize, practical approximations are desirable for statistical image reconstruction. Conventional approximations involve setting negative sinogram values to zero, which can induce positive systematic biases, particularly for scans with low counts per ray. We propose new likelihood approximations that allow negative sinogram values without requiring zero-thresholding. We also develop monotonic algorithms for the new models by using "optimization transfer" principles. Simulation results show that our new model, SP-, is free of systematic bias yet keeps low variance. Despite its simpler implementation, the new model performs comparably to the saddle-point (SD) model, which has previously shown the best performance (in terms of systematic bias and variance) in randoms-precorrected PET emission reconstruction. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85893/1/Fessler185.pd
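    For readers unfamiliar with "optimization transfer", here is a generic one-dimensional sketch of the idea (my own illustration, not the paper's separable paraboloidal surrogate): at each iterate the objective is replaced by a simple quadratic surrogate that lies below it and touches it at the current point, so maximizing the surrogate can never decrease the true objective.

```python
# Generic 1-D optimization-transfer (MM) sketch.  The toy objective, curvature
# bound, and starting point are assumptions chosen for illustration.
import numpy as np

def f(x):                    # toy concave objective with |f''| <= 1 everywhere
    return -np.log(np.cosh(x - 3.0))

def fprime(x):
    return -np.tanh(x - 3.0)

c = 1.0                      # curvature bound, so q(x') = f(x) + f'(x)(x'-x) - (c/2)(x'-x)^2 <= f(x')
x = 0.0
for it in range(20):
    x = x + fprime(x) / c    # exact maximizer of the quadratic surrogate; f never decreases
print(x, f(x))               # x converges toward 3, the maximizer of f
```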

    Objective Functions for Tomographic Reconstruction from Randoms-Precorrected PET Scans

    In PET, the data are usually precorrected for accidental coincidence (AC) events by real-time subtraction of the delayed-window coincidences. Randoms subtraction compensates in mean for AC events but destroys the Poisson statistics. Furthermore, for transmission tomography the weighted least-squares (WLS) method leads to systematic biases, especially at low count rates. We propose a new “shifted” Poisson (SP) model for precorrected PET data, which properly matches the first- and second-order moments of the measurement statistics. Using simulations and analytic approximations, we show that estimators based on the “ordinary” Poisson (OP) model for the precorrected data lead to higher standard deviations than the proposed method. Moreover, if one zero-thresholds the data before applying the maximization algorithm, the OP model results in systematic bias. It is shown that the proposed SP model leads to penalized-likelihood estimates free of systematic bias, even for zero-thresholded data. The proposed SP model does not increase the computational requirements compared to the OP model, and it is robust to errors in the estimates of the AC event rates. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85842/1/Fessler143.pd
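    As a reminder of the moment matching the abstract refers to (a standard derivation written out here, not quoted from the paper): if the prompts and delayed coincidences on a ray are independent Poisson variates, the precorrected measurement has the correct mean but an inflated variance, and adding twice the randoms rate restores a mean/variance pair consistent with a Poisson model.

```latex
% Sketch of the moment matching behind the shifted-Poisson model.
% Assume prompts p_i ~ Poisson(\bar y_i + r_i) and delayeds d_i ~ Poisson(r_i), independent.
\begin{align*}
  y_i = p_i - d_i, \qquad
  \mathbb{E}[y_i] = \bar y_i, \qquad
  \operatorname{Var}[y_i] = \bar y_i + 2 r_i,\\
  \mathbb{E}[y_i + 2 r_i] = \bar y_i + 2 r_i, \qquad
  \operatorname{Var}[y_i + 2 r_i] = \bar y_i + 2 r_i,
\end{align*}
% so modeling  y_i + 2 r_i  as Poisson(\bar y_i + 2 r_i) matches both first and second moments.
```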

    New statistical models for randoms-precorrected PET scans

    PET measurements are usually precorrected for accidental coincidence events by real-time subtraction of the delayed-window coincidences. Randoms subtraction compensates in mean for accidental coincidences but destroys the Poisson statistics. We propose and analyze two new approximations to the exact log-likelihood of the precorrected measurements, one based on a "shifted Poisson" model, the other based on saddle-point approximations to the measurement probability mass function (pmf). The methods apply to both emission and transmission tomography; however, in this paper we focus on transmission tomography. We compare the new models to conventional data-weighted least squares (WLS) and conventional maximum likelihood (based on the ordinary Poisson (OP) model) using simulations and analytic approximations. The results demonstrate that the proposed methods avoid the systematic bias of the WLS method and lead to significantly lower variance than the conventional OP method. The saddle-point method provides a more accurate approximation to the exact log-likelihood than the WLS, OP, and shifted Poisson alternatives. However, the simpler shifted Poisson method yielded comparable bias-variance performance in the simulations. The new methods offer improved image reconstruction in PET through more realistic statistical modeling, yet with negligible increase in computation over the conventional OP method. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85945/1/Fessler92.pd
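    Since the precorrected measurement on a ray is the difference of two independent Poisson variates, its exact pmf is Skellam, which makes a quick numerical comparison with the shifted-Poisson surrogate easy to write down (a toy check under my own parameter choices, not the paper's analysis).

```python
# Toy numerical comparison: exact pmf of precorrected data (Skellam) versus the
# shifted-Poisson surrogate pmf.  ybar, r, and the range of k are assumptions.
import numpy as np
from scipy.stats import skellam, poisson

ybar, r = 5.0, 3.0                 # true coincidence mean and AC (randoms) mean per ray
k = np.arange(-8, 30)              # candidate precorrected values (can be negative)

exact = skellam.pmf(k, mu1=ybar + r, mu2=r)            # prompts ~ Poisson(ybar+r), delayeds ~ Poisson(r)
sp    = poisson.pmf(k + int(2 * r), mu=ybar + 2 * r)   # SP surrogate: y + 2r ~ Poisson(ybar + 2r)

print("exact mean/var :", skellam.mean(ybar + r, r), skellam.var(ybar + r, r))
print("SP    mean/var :", poisson.mean(ybar + 2 * r) - 2 * r, poisson.var(ybar + 2 * r))
print("max |exact - SP| over k:", np.abs(exact - sp).max())
```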

    Maximum-Likelihood Dual-Energy Tomographic Image Reconstruction

    Dual-energy (DE) X-ray computed tomography (CT) has shown promise for material characterization and for providing quantitatively accurate CT values in a variety of applications. However, DE-CT has not been used routinely in medicine to date, primarily due to dose considerations. Most methods for DE-CT have used the filtered backprojection method for image reconstruction, leading to suboptimal noise/dose properties. This paper describes a statistical (maximum-likelihood) method for dual-energy X-ray CT that accommodates a wide variety of potential system configurations and measurement noise models. Regularized methods (such as penalized-likelihood or Bayesian estimation) are straightforward extensions. One version of the algorithm monotonically decreases the negative log-likelihood cost function at each iteration. An ordered-subsets variation of the algorithm provides a fast and practical version. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85934/1/Fessler172.pd
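    For orientation, a highly simplified two-material, two-spectrum forward model is sketched below (my own assumptions throughout, including the effective attenuation coefficients and toy sinogram sizes; the paper's polyenergetic model and estimator are more general): each expected measurement is the blank scan attenuated by a weighted sum of the two basis-material line integrals.

```python
# Simplified dual-energy forward model with Poisson measurement noise.
# 'M', 'blank_counts', and the toy sinograms are illustrative assumptions.
import numpy as np

# hypothetical effective mass-attenuation coefficients [spectrum, material]
M = np.array([[0.25, 0.60],      # low-kVp spectrum:  (soft-tissue-like, bone-like)
              [0.20, 0.35]])     # high-kVp spectrum

def forward(line_integrals, blank_counts):
    """line_integrals: shape (2, n_rays) of basis-material density * path length."""
    return blank_counts * np.exp(-(M @ line_integrals))   # expected transmitted counts

s = np.vstack([np.full(8, 2.0), np.full(8, 0.5)])   # toy sinograms of the two materials
ybar = forward(s, blank_counts=1e5)
y = np.random.default_rng(2).poisson(ybar)           # Poisson measurement model
print(ybar.shape, y[:, :3])
```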

    Conjugate-Gradient Preconditioning Methods for Shift-Variant PET Image Reconstruction

    Gradient-based iterative methods often converge slowly for tomographic image reconstruction and image restoration problems, but can be accelerated by suitable preconditioners. Diagonal preconditioners offer some improvement in convergence rate, but do not incorporate the structure of the Hessian matrices in imaging problems. Circulant preconditioners can provide remarkable acceleration for inverse problems that are approximately shift-invariant, i.e., for those with approximately block-Toeplitz or block-circulant Hessians. However, in applications with nonuniform noise variance, such as arises from Poisson statistics in emission tomography and in quantum-limited optical imaging, the Hessian of the weighted least-squares objective function is quite shift-variant, and circulant preconditioners perform poorly. Additional shift-variance is caused by edge-preserving regularization methods based on nonquadratic penalty functions. This paper describes new preconditioners that approximate more accurately the Hessian matrices of shift-variant imaging problems. Compared to diagonal or circulant preconditioning, the new preconditioners lead to significantly faster convergence rates for the unconstrained conjugate-gradient (CG) iteration. We also propose a new efficient method for the line-search step required by CG methods. Applications to positron emission tomography (PET) illustrate the method. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85979/1/Fessler85.pd
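    To fix ideas, here is a bare-bones preconditioned CG loop for the weighted least-squares normal equations with a diagonal preconditioner (textbook PCG with made-up problem sizes, shown only as a baseline; it is not the paper's shift-variant preconditioner).

```python
# Generic preconditioned conjugate gradient for (A' W A) x = A' W y.
# The matrices, weights, and diagonal preconditioner are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
A = rng.uniform(0, 1, size=(120, 40))          # hypothetical system matrix
W = np.diag(rng.uniform(0.5, 2.0, size=120))   # nonuniform weights (shift-variant Hessian)
y = A @ rng.uniform(0, 1, size=40) + rng.normal(0, 0.01, 120)

H = A.T @ W @ A                                # Hessian of the WLS objective
b = A.T @ W @ y
Mdiag = 1.0 / np.diag(H)                       # diagonal preconditioner

x = np.zeros(40)
r = b - H @ x
z = Mdiag * r
p = z.copy()
for it in range(50):
    Hp = H @ p
    alpha = (r @ z) / (p @ Hp)
    x += alpha * p
    r_new = r - alpha * Hp
    if np.linalg.norm(r_new) < 1e-8:
        break
    z_new = Mdiag * r_new
    beta = (r_new @ z_new) / (r @ z)
    p = z_new + beta * p
    r, z = r_new, z_new
print(it, np.linalg.norm(H @ x - b))
```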

    Fast approach to evaluate MAP reconstruction for lesion detection and localization


    Edge-Preserving Tomographic Reconstruction with Nonlocal Regularization

    Tomographic image reconstruction using statistical methods can provide more accurate system modeling, statistical models, and physical constraints than the conventional filtered backprojection (FBP) method. Because of the ill-posedness of the reconstruction problem, a roughness penalty is often imposed on the solution to control noise. To avoid smoothing of edges, which are important image attributes, various edge-preserving regularization methods have been proposed. Most of these schemes rely on information from local neighborhoods to determine the presence of edges. In this paper, we propose a cost function that incorporates nonlocal boundary information into the regularization method. We use an alternating minimization algorithm with deterministic annealing to minimize the proposed cost function, jointly estimating region boundaries and object pixel values. We apply variational techniques implemented using level-set methods to update the boundary estimates; then, using the most recent boundary estimate, we minimize a space-variant quadratic cost function to update the image estimate. For the positron emission tomography transmission reconstruction application, we compare the bias-variance tradeoff of this method with that of a "conventional" penalized-likelihood algorithm with a local Huber roughness penalty. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85989/1/Fessler73.pd
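    For reference, the "local Huber roughness penalty" used as the comparison baseline has the standard form sketched below (standard Huber function on neighboring-pixel differences; the threshold value and toy image are my own choices).

```python
# Local Huber roughness penalty on horizontal and vertical pixel differences.
# The threshold 'delta' and the toy image are illustrative assumptions.
import numpy as np

def huber(t, delta):
    a = np.abs(t)
    return np.where(a <= delta, 0.5 * t**2, delta * a - 0.5 * delta**2)

def roughness_penalty(img, delta=0.01):
    dh = img[:, 1:] - img[:, :-1]     # horizontal neighbor differences
    dv = img[1:, :] - img[:-1, :]     # vertical neighbor differences
    return huber(dh, delta).sum() + huber(dv, delta).sum()

img = np.zeros((8, 8)); img[:, 4:] = 1.0   # toy piecewise-constant image with one edge
print(roughness_penalty(img))               # large edge differences are penalized ~linearly, not quadratically
```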

    Theoretical Evaluation of the Detectability of Random Lesions in Bayesian Emission Reconstruction

    Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum-likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution-noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time-consuming. Building on recent progress in the theoretical analysis of image properties of statistical reconstructions and the development of numerical observers, we develop here a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results.
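    As a minimal illustration of the kind of detectability index a numerical observer produces (a toy sketch under my own assumptions, much simpler than the channelized observers and object-variability models in the paper): for a known lesion profile and a known background/noise covariance, the ideal linear (prewhitening) observer's SNR follows from a single quadratic form.

```python
# Toy prewhitening-observer detectability: SNR^2 = s' K^{-1} s for a known lesion
# signal s and covariance K.  The signal, sizes, and covariance are assumptions.
import numpy as np

n = 16
lesion = np.zeros(n); lesion[6:10] = 0.5                 # expected lesion signal (difference of class means)
K = 0.02 * np.eye(n) + 0.01 * np.ones((n, n)) / n        # toy background + noise covariance (SPD)

snr2 = lesion @ np.linalg.solve(K, lesion)               # ideal linear observer SNR^2
print("detectability SNR:", np.sqrt(snr2))
```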