334 research outputs found

    Theoretical Evaluation of the Detectability of Random Lesions in Bayesian Emission Reconstruction

    Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution and noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on recent progress in the theoretical analysis of image properties of statistical reconstructions and in the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the hyperparameter that maximizes lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results.
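
    As a minimal illustration of the reconstruction framework described above (a sketch, not the authors' implementation), the following Python snippet runs a one-step-late MAP-EM iteration for Poisson emission data with a quadratic smoothing prior on a toy 1D problem; the system matrix, phantom, and hyperparameter value are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 1D emission problem: n image pixels, m detector bins.
    n, m = 32, 48
    A = rng.uniform(0.0, 1.0, size=(m, n))       # toy system (projection) matrix
    x_true = np.ones(n); x_true[12:18] = 4.0     # background plus a "lesion"
    y = rng.poisson(A @ x_true)                  # Poisson measurements

    beta = 0.1                                   # prior hyperparameter (resolution/noise trade-off)

    def quad_prior_grad(x):
        # Gradient of a quadratic (Gaussian) prior on neighboring-pixel differences.
        return 2.0 * (2.0 * x - np.roll(x, 1) - np.roll(x, -1))

    x = np.ones(n)
    sens = A.sum(axis=0)                         # sensitivity image A^T 1
    for _ in range(200):
        ratio = y / np.clip(A @ x, 1e-12, None)  # Poisson EM ratio
        # One-step-late MAP-EM: the prior gradient is evaluated at the current estimate.
        x = x * (A.T @ ratio) / (sens + beta * quad_prior_grad(x))
        x = np.clip(x, 1e-12, None)

    print("reconstruction near the lesion:", x[12:18].round(2))

    Sweeping beta in such a sketch and scoring each reconstruction with a numerical observer is the brute-force counterpart of the fast theoretical evaluation the paper proposes.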

    Theoretical study of lesion detectability of MAP reconstruction using computer observers


    Comparison of Lesion Detection and Quantification in MAP Reconstruction with Gaussian and Non-Gaussian Priors

    Statistical image reconstruction methods based on the maximum a posteriori (MAP) principle have been developed for emission tomography. The prior distribution of the unknown image plays an important role in MAP reconstruction. The most commonly used priors are Gaussian priors, whose logarithm has a quadratic form. Gaussian priors are relatively easy to analyze. It has been shown that the effect of a Gaussian prior can be approximated by linearly filtering a maximum likelihood (ML) reconstruction; as a result, sharp edges in reconstructed images are not preserved. To preserve sharp transitions, non-Gaussian priors have been proposed. However, their effect on clinical tasks is less obvious. In this paper, we compare MAP reconstruction with Gaussian and non-Gaussian priors for lesion detection and region of interest quantification using computer simulation. We evaluate three representative priors: the Gaussian prior, the Huber prior, and the Geman-McClure prior. We simulate imaging of a prostate tumor using positron emission tomography (PET). The detectability of a known tumor in either a fixed background or a random background is measured using a channelized Hotelling observer. Bias-variance trade-off curves are calculated for quantification of the total tumor activity. The results show that for both the detection and quantification tasks, the Gaussian prior is as effective as the non-Gaussian priors.
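
    For reference, the three potentials compared above have simple closed forms. A minimal sketch in Python (delta is an assumed edge-threshold parameter, not a value from the paper):

    import numpy as np

    def gaussian_potential(t):
        # Quadratic potential (log of a Gaussian prior): smooths edges strongly.
        return t ** 2

    def huber_potential(t, delta=1.0):
        # Quadratic near zero, linear in the tails: preserves large transitions.
        a = np.abs(t)
        return np.where(a <= delta, t ** 2, 2.0 * delta * a - delta ** 2)

    def geman_mcclure_potential(t, delta=1.0):
        # Bounded, non-convex potential: strongly edge-preserving.
        return t ** 2 / (delta ** 2 + t ** 2)

    diffs = np.linspace(-3.0, 3.0, 7)   # neighboring-pixel differences
    for f in (gaussian_potential, huber_potential, geman_mcclure_potential):
        print(f.__name__, f(diffs).round(2))

    The prior enters the MAP objective as the sum of the chosen potential over all neighboring-pixel differences; the flatter tails of the Huber and Geman-McClure potentials are what allow sharp edges to survive.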

    Development and implementation of efficient noise suppression methods for emission computed tomography

    In PET and SPECT imaging, iterative reconstruction is now widely used because it can incorporate into the reconstruction process a physics model and the Bayesian statistics of photon detection. Iterative reconstruction methods rely on regularization terms to suppress image noise and render the radiotracer distribution with good image quality. The choice of regularization method substantially affects the appearance of reconstructed images, and is thus a critical aspect of the reconstruction process. Major contributions of this work include the implementation and evaluation of several new regularization methods. Previously, our group developed a preconditioned alternating projection algorithm (PAPA) to optimize the emission computed tomography (ECT) objective function with the non-differentiable total variation (TV) regularizer. The algorithm was modified to optimize the proposed reconstruction objective functions. First, two novel TV-based regularizers, high-order total variation (HOTV) and infimal convolution total variation (ICTV), were proposed as alternatives to the customary TV regularizer in SPECT reconstruction, to reduce the "staircase" artifacts produced by TV. We evaluated both proposed reconstruction methods (HOTV-PAPA and ICTV-PAPA) and compared them with TV-regularized reconstruction (TV-PAPA) and the clinical standard, Gaussian post-filtered expectation-maximization reconstruction (GPF-EM), using both Monte Carlo-simulated data and anonymized clinical data. Model-observer studies using Monte Carlo-simulated data indicate that ICTV-PAPA reconstructs images with similar or better lesion detectability than the clinical-standard GPF-EM method, but at lower detected count levels. This implies that switching from GPF-EM to ICTV-PAPA could reduce patient dose while maintaining image quality for diagnostic use. Second, the ℓ1 norm of a discrete cosine transform (DCT)-induced framelet regularization was studied. We decomposed the image into high and low spatial-frequency components, and then preferentially penalized the high spatial-frequency components. The DCT-induced framelet transform of the natural radiotracer distribution image is sparse; by using this property, we were able to suppress image noise effectively without overly compromising spatial resolution or image contrast. Finally, the fractional norm of the first-order spatial gradient was introduced as a regularizer. We implemented the ℓ2/3 and ℓ1/2 norms to suppress image spatial variability. Because they strongly penalize small differences between neighboring pixels, fractional-norm regularizers suffer from cartoon-like artifacts similar to those of the TV regularizer. However, when the penalty weights are properly selected, fractional-norm regularizers outperform TV in terms of noise suppression and contrast recovery.
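
    A minimal sketch of the penalty terms discussed above, evaluated on a toy piecewise-constant image; the TV, fractional-norm, and DCT high-frequency definitions below are standard textbook forms, not the exact PAPA formulation, and the cutoff parameter is an illustrative assumption.

    import numpy as np
    from scipy.fft import dctn

    def tv(img):
        # Isotropic total variation: sum of the l2 norms of the discrete gradient.
        dx = np.diff(img, axis=0, append=img[-1:, :])
        dy = np.diff(img, axis=1, append=img[:, -1:])
        return np.sum(np.sqrt(dx ** 2 + dy ** 2))

    def fractional_norm(img, p=0.5):
        # l_p^p (p < 1) penalty on the first-order gradient (here the l_1/2 case).
        dx = np.diff(img, axis=0, append=img[-1:, :])
        dy = np.diff(img, axis=1, append=img[:, -1:])
        return np.sum(np.abs(dx) ** p + np.abs(dy) ** p)

    def dct_highfreq_l1(img, cutoff=4):
        # l1 norm of the high spatial-frequency DCT coefficients only.
        c = dctn(img, norm="ortho")
        c[:cutoff, :cutoff] = 0.0       # leave low frequencies unpenalized
        return np.sum(np.abs(c))

    img = np.zeros((16, 16)); img[4:9, 4:9] = 1.0   # toy piecewise-constant image
    print(tv(img), fractional_norm(img), dct_highfreq_l1(img))

    Penalizing only the high-frequency DCT coefficients is what lets the framelet regularizer suppress noise while leaving the smooth, low-frequency part of the tracer distribution largely untouched.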

    Quantitative Techniques for PET/CT: A Clinical Assessment of the Impact of PSF and TOF

    Tomographic reconstruction has been a challenge for many imaging applications, and it is particularly problematic for count-limited modalities such as Positron Emission Tomography (PET). Recent advances in PET, including the incorporation of time-of-flight (TOF) information and modeling of the variation of the point response across the imaging field (PSF), have resulted in significant improvements in image quality. While the effects of these techniques have been characterized with simulations and mathematical modeling, there has been relatively little work investigating their potential impact in the clinical setting. The objective of this work is to quantify these techniques in the context of realistic lesion detection and localization tasks in a medical environment. Mathematical observers are used first to identify optimal reconstruction parameters and then to evaluate the performance of the reconstructions. The reconstruction algorithms are then evaluated for various patient sizes and imaging conditions. The findings for the mathematical observers are compared to, and validated by, the performance of three experienced nuclear medicine physicians completing the same task.
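
    As a sketch of how such a mathematical observer scores a reconstruction (assuming a channelized Hotelling observer, a common choice; the cosine channels and the toy image ensemble below are illustrative stand-ins for the channel sets and reconstructed images actually used):

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy ensemble of flattened lesion-absent and lesion-present images.
    npix, nsamp = 64, 500
    signal = np.zeros(npix); signal[28:36] = 0.5
    absent = rng.normal(0.0, 1.0, size=(nsamp, npix))
    present = rng.normal(0.0, 1.0, size=(nsamp, npix)) + signal

    # Assumed channel profiles (stand-ins for the usual frequency-band channels).
    k = np.arange(npix)
    channels = np.stack([np.cos(2 * np.pi * f * k / npix) for f in (1, 2, 4, 8)])

    vp = present @ channels.T            # channel outputs, lesion present
    va = absent @ channels.T             # channel outputs, lesion absent
    s = vp.mean(axis=0) - va.mean(axis=0)
    K = 0.5 * (np.cov(vp.T) + np.cov(va.T))
    w = np.linalg.solve(K, s)            # Hotelling template in channel space

    snr = float(s @ w / np.sqrt(w @ K @ w))
    print("CHO detectability SNR:", round(snr, 2))

    Repeating this score across reconstructions from different parameter settings is how optimal parameters can be identified before a human-observer comparison.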

    Master of Science

    Positron emission tomography (PET) images can be reconstructed using a wide variety of techniques. Two aspects of image reconstruction are addressed in this thesis: the number of subsets used for the block-iterative ordered-subsets expectation-maximization (OSEM) reconstruction algorithm, and the use of smaller in-plane pixels. Both of these aspects of PET image reconstruction affect image quality. Although image quality in PET is difficult to quantify, it can be evaluated objectively using task-based assessments such as lesion detection studies. The objective of this work was to evaluate the effects of both the number of OSEM subsets and the pixel size on general oncologic PET lesion detection. Experimental phantom data were taken from the Utah PET Lesion Detection Database Resource, modeling whole-body oncologic 18F-FDG PET imaging of a 92 kg patient. The data comprised multiple scans on a Biograph mCT time-of-flight (TOF) scanner, with up to 23 sources modeling lesions (diam. 6-16 mm) distributed throughout the phantom for each scan. Two observer studies were performed as part of this thesis. In the first study, images were reconstructed with maximum-likelihood expectation-maximization (MLEM) and with OSEM using 12 different numbers of subsets (2-84 subsets). Localization receiver operating characteristic (LROC) analysis was applied using a mathematical observer. The probability of correct localization (PLOC) and the area under the LROC curve (ALROC) were used as figures of merit to quantify lesion detection performance. The results demonstrated an overall decline in lesion detection performance as the number of subsets increased. This loss of image quality can be controlled by using a moderate number of subsets (12-14 or fewer). In the second study, images were reconstructed with 2.036 mm and 4.073 mm in-plane pixels. Similar LROC analysis methods were applied to determine lesion detection performance for each pixel size. The results of this study demonstrated that images with ~2 mm pixels provided higher lesion detection performance than those with ~4 mm pixels. The primary drawback of using smaller pixels was a 4-fold increase in reconstruction time and data storage requirements. Overall, this work demonstrated that reconstructing with a moderate number of subsets or with smaller pixel sizes may provide important benefits for general PET cancer imaging.
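
    The subset mechanics studied in the first experiment can be sketched in a few lines; with one subset the update below reduces to MLEM, while larger subset counts accelerate convergence at the cost of the image-quality loss reported above. The toy system matrix and problem sizes are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    n, m = 32, 96
    A = rng.uniform(0.0, 1.0, size=(m, n))        # toy system matrix
    x_true = np.ones(n); x_true[10:14] = 5.0
    y = rng.poisson(A @ x_true).astype(float)

    def osem(A, y, n_subsets, n_iters):
        m, n = A.shape
        subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]
        x = np.ones(n)
        for _ in range(n_iters):                  # one iteration = one pass over all subsets
            for idx in subsets:
                As, ys = A[idx], y[idx]
                ratio = ys / np.clip(As @ x, 1e-12, None)
                x = x * (As.T @ ratio) / As.sum(axis=0)
        return x

    for n_sub in (1, 4, 12):                      # 1 subset is plain MLEM
        print(n_sub, "subsets:", osem(A, y, n_sub, 10)[10:14].round(2))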

    Fast approach to evaluate MAP reconstruction for lesion detection and localization


    Optimization of Bayesian emission tomographic reconstruction for region-of-interest quantitation

    Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed for emission image reconstruction to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on recent progress in deriving approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE.
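
    A sketch of the parameter selection idea, using a linear-Gaussian toy model in which the bias and variance of the ROI total have closed forms (the paper's Poisson derivation and its ensemble averaging over unknown truths are replaced here by a fixed known truth):

    import numpy as np

    rng = np.random.default_rng(3)
    n, m = 24, 36
    A = rng.uniform(0.0, 1.0, size=(m, n))        # toy system matrix
    x_true = np.ones(n); x_true[8:12] = 3.0       # activity with a hot ROI
    roi = np.zeros(n); roi[8:12] = 1.0            # ROI indicator (total uptake)
    sigma2 = 1.0                                  # assumed noise variance

    # Circular first-difference matrix; R = D^T D is a quadratic roughness prior.
    D = np.eye(n) - np.roll(np.eye(n), 1, axis=1)
    R = D.T @ D

    def emse(beta):
        # Linear-Gaussian MAP estimator: x_hat = F y with F = (A^T A + beta R)^-1 A^T.
        F = np.linalg.solve(A.T @ A + beta * R, A.T)
        bias = roi @ (F @ A @ x_true - x_true)    # bias of the ROI total
        var = sigma2 * roi @ F @ F.T @ roi        # variance of the ROI total
        return bias ** 2 + var

    betas = np.logspace(-3, 2, 30)
    best = betas[int(np.argmin([emse(b) for b in betas]))]
    print("optimum smoothing parameter:", round(float(best), 4))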

    Incorporating accurate statistical modeling in PET: reconstruction for whole-body imaging

    Doctoral thesis in Biophysics, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2007. This thesis is devoted to image reconstruction in 3D whole-body PET imaging. OSEM (ordered-subsets expectation maximization) is a statistical algorithm that assumes Poisson data. However, corrections for physical effects (attenuation, scattered and random coincidences) and detector efficiency remove the Poisson characteristics of these data. Fourier rebinning (FORE), which combines 3D imaging with fast 2D reconstructions, requires corrected data. Thus, whenever FORE is used, or whenever data are corrected prior to OSEM, the Poisson-like characteristics need to be restored. Restoring Poisson-like data, i.e., making the variance equal to the mean, was achieved through the use of weighted OSEM algorithms. One of them is NECOSEM, which relies on the NEC weighting transformation. The distinctive feature of this algorithm is the NEC multiplicative factor, defined as the ratio between the mean and the variance. With real clinical data this is critical, since only one value is collected for each bin: the data value itself. For simulated data, if we keep track of the values of these two statistical moments, the exact values of the NEC weights can be calculated. We compared the performance of five different weighted algorithms (FORE+AWOSEM, FORE+NECOSEM, ANWOSEM3D, SPOSEM3D, and NECOSEM3D) on the basis of tumor detectability. The comparison was done for both simulated and clinical data. In the former case an analytical simulator was used; this is the ideal situation, since all the weighting factors can be determined exactly. To compare the performance of the algorithms, we used the non-prewhitening matched filter (NPWMF) numerical observer. With the knowledge obtained from the simulation study we proceeded to the reconstruction of clinical data, for which it was necessary to devise a strategy for estimating the NEC weighting factors. The comparison between reconstructed images was done by a physician highly familiar with whole-body PET imaging.
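
    The NEC transformation itself is a per-bin rescaling. A sketch under simulated conditions, where both statistical moments are known exactly (the excess variance below is an illustrative stand-in for the effect of the corrections):

    import numpy as np

    rng = np.random.default_rng(4)
    m = 1000

    # "Corrected" data: Poisson counts plus additive noise from corrections,
    # which breaks the Poisson property variance == mean.
    mu = rng.uniform(5.0, 50.0, size=m)           # true mean per bin
    extra_var = rng.uniform(1.0, 20.0, size=m)    # excess variance from corrections
    d = rng.poisson(mu) + rng.normal(0.0, np.sqrt(extra_var), size=m)

    # NEC weight: ratio of mean to variance per bin. Known exactly in
    # simulation; with clinical data it must be estimated from the single
    # measured value per bin, as noted above.
    var = mu + extra_var
    w = mu / var
    d_nec = w * d                                 # NEC-transformed, Poisson-like data

    # After weighting, each bin satisfies variance == mean again:
    print("bin 0 mean, variance:", (w * mu)[0], (w ** 2 * var)[0])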