
    Incorporating accurate statistical modeling in PET: reconstruction for whole-body imaging

    Doctoral thesis in Biophysics, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2007. The thesis is devoted to image reconstruction in 3D whole-body PET imaging. OSEM (Ordered Subsets Expectation Maximization) is a statistical algorithm that assumes Poisson data. However, corrections for physical effects (attenuation, scattered and random coincidences) and detector efficiency remove the Poisson characteristics of these data. Fourier rebinning (FORE), which combines 3D imaging with fast 2D reconstructions, requires corrected data. Thus, whenever FORE is used, or whenever data are corrected prior to OSEM, the Poisson-like characteristics need to be restored. Restoring Poisson-like data, i.e., making the variance equal to the mean, was achieved through the use of weighted OSEM algorithms. One of them is NECOSEM, which relies on the NEC weighting transformation. The distinctive feature of this algorithm is the NEC multiplicative factor, defined as the ratio between the mean and the variance. With real clinical data this is critical, since there is only one value collected for each bin: the data value itself. For simulated data, if we keep track of the values of these two statistical moments, the exact values of the NEC weights can be calculated. We compared the performance of five different weighted algorithms (FORE+AWOSEM, FORE+NECOSEM, ANWOSEM3D, SPOSEM3D and NECOSEM3D) on the basis of tumor detectability. The comparison was done for simulated and clinical data. In the former case an analytical simulator was used; this is the ideal situation, since all the weighting factors can be determined exactly. For comparing the performance of the algorithms, we used the Non-Prewhitening Matched Filter (NPWMF) numerical observer. With some knowledge obtained from the simulation study, we proceeded to the reconstruction of clinical data. In that case, it was necessary to devise a strategy for estimating the NEC weighting factors. The comparison between reconstructed images was done by a physician largely familiar with whole-body PET imaging.
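    The NEC transformation described above is simple enough to sketch. The following is a minimal illustration, not the thesis code: scaling each sinogram bin by the ratio mean/variance makes the scaled data's variance equal to its mean, which is the Poisson-like property weighted OSEM assumes. The function names and the `eps` guard are mine; as the abstract notes, the two moments are only known exactly for simulated data and must be estimated for clinical data.

```python
import numpy as np

def nec_weights(mean, variance, eps=1e-12):
    """NEC multiplicative factor per sinogram bin: mean / variance.

    If y has mean m and variance v, then scaling by c = m / v gives
    E[c * y] = m**2 / v and Var[c * y] = c**2 * v = m**2 / v,
    so the scaled data have variance equal to their mean.
    """
    return mean / np.maximum(variance, eps)  # eps guards empty bins

def nec_transform(y, mean, variance):
    """Apply the NEC weighting transformation to corrected data y."""
    return nec_weights(mean, variance) * y
```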

    Grouped-Coordinate Ascent Algorithms for Penalized-Likelihood Transmission Image Reconstruction

    We present a new class of algorithms for penalized-likelihood reconstruction of attenuation maps from low-count transmission scans. We derive the algorithms by applying to the transmission log-likelihood a version of the convexity technique developed by De Pierro for emission tomography. The new class includes the single-coordinate ascent (SCA) algorithm and Lange's convex algorithm for transmission tomography as special cases. The new grouped-coordinate ascent (GCA) algorithms in the class overcome several limitations associated with previous algorithms: (1) fewer exponentiations are required than in the transmission maximum-likelihood expectation-maximization (ML-EM) algorithm or in the SCA algorithm; (2) the algorithms intrinsically accommodate nonnegativity constraints, unlike many gradient-based methods; (3) the algorithms are easily parallelizable, unlike the SCA algorithm and perhaps line-search algorithms. We show that the GCA algorithms converge faster than the SCA algorithm, even on conventional workstations. An example from a low-count positron emission tomography (PET) transmission scan illustrates the method.
    Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/86021/1/Fessler93.pd
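    As a rough illustration of the grouped-update idea only (not the paper's surrogate-based algorithm, which derives per-group curvatures from De Pierro's convexity technique), here is a generic grouped-coordinate ascent skeleton. The fixed step size and the `grad` callback are simplifying assumptions of mine; the nonnegativity projection reflects property (2) above.

```python
import numpy as np

def grouped_coordinate_ascent(x0, grad, n_groups=4, n_iters=20, step=1.0):
    """Generic grouped-coordinate ascent sketch with nonnegativity.

    grad(x) returns the gradient of the penalized log-likelihood;
    each inner step updates one group of pixels while holding the
    others fixed. A real implementation would use the paper's
    separable surrogate and per-group step sizes instead of a fixed
    step, and would not recompute the full gradient for every group.
    """
    x = x0.copy()
    groups = np.array_split(np.arange(x.size), n_groups)
    for _ in range(n_iters):
        for g in groups:
            # Ascent step on one pixel group, then project onto x >= 0.
            x[g] = np.maximum(x[g] + step * grad(x)[g], 0.0)
    return x
```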

    Joint Image Reconstruction and Segmentation Using the Potts Model

    We propose a new algorithmic approach to the non-smooth and non-convex Potts problem (also called the piecewise-constant Mumford-Shah problem) for inverse imaging problems. We derive a suitable splitting into specific subproblems that can all be solved efficiently. Our method requires no a priori knowledge of the gray levels or of the number of segments of the reconstruction. Further, it avoids anisotropic artifacts such as geometric staircasing. We demonstrate the suitability of our method for joint image reconstruction and segmentation. We focus on Radon data, where in particular we consider limited-data situations. For instance, our method is able to recover all segments of the Shepp-Logan phantom from only 77 angular views. We illustrate the practical applicability on a real PET dataset. As further applications, we consider spherical Radon data as well as blurred data.
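    The Potts functional the paper builds on penalizes the number of segments of a piecewise-constant signal. For intuition only, below is a classical exact dynamic-programming solver for the direct (denoising) 1D Potts problem; the paper itself addresses the much harder inverse-problem setting (Radon and blurred data) via operator splitting. The O(n^2) implementation and the function name are mine.

```python
import numpy as np

def potts_1d(f, gamma):
    """Exact 1D Potts denoising by dynamic programming.

    Minimizes gamma * (number of segments) + sum((u - f)**2) over
    piecewise-constant signals u, using prefix sums so that the
    squared deviation of any interval from its mean is O(1).
    """
    n = len(f)
    s1 = np.concatenate(([0.0], np.cumsum(f)))       # prefix sums
    s2 = np.concatenate(([0.0], np.cumsum(f ** 2)))  # prefix sums of squares
    B = np.zeros(n + 1)          # B[i] = optimal cost of f[:i]
    jump = np.zeros(n + 1, int)  # backpointer: start of the last segment
    for i in range(1, n + 1):
        best, arg = np.inf, 1
        for j in range(1, i + 1):
            # d = squared deviation of f[j-1:i] from its mean
            m = (s1[i] - s1[j - 1]) / (i - j + 1)
            d = (s2[i] - s2[j - 1]) - m * (s1[i] - s1[j - 1])
            cost = B[j - 1] + gamma + d
            if cost < best:
                best, arg = cost, j
        B[i], jump[i] = best, arg
    # Backtrack the segment boundaries; fill each segment with its mean.
    u, i = np.empty(n), n
    while i > 0:
        j = jump[i]
        u[j - 1:i] = np.mean(f[j - 1:i])
        i = j - 1
    return u
```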

    Noise-Robust Image Reconstruction Based on Minimizing Extended Class of Power-Divergence Measures

    The problem of tomographic image reconstruction can be reduced to the optimization problem of finding the unknown pixel values that minimize the difference between the measured and forward projections. Iterative image reconstruction algorithms provide significant improvements over transform methods in computed tomography. In this paper, we present an extended class of power-divergence measures (PDMs), which includes a large set of distance and relative entropy measures, and propose an iterative reconstruction algorithm based on the extended PDM (EPDM) as an objective function for the optimization strategy. For this purpose, we introduce a system of nonlinear differential equations whose Lyapunov function is equivalent to the EPDM. We then derive an iterative formula by multiplicative discretization of the continuous-time system. Since the parameterized EPDM family includes the Kullback–Leibler divergence, the resulting iterative algorithm is a natural extension of the maximum-likelihood expectation-maximization (MLEM) method. We conducted image reconstruction experiments using noisy projection data and found that the proposed algorithm outperformed MLEM, reconstructing high-quality images that were robust to measurement noise when the parameters were properly selected.
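    Since the Kullback–Leibler divergence is a member of the EPDM family, the familiar MLEM multiplicative update is the special case that the proposed algorithm extends. A dense-matrix toy version of that baseline, assuming a system matrix A and measured projections y (variable names are mine), might look like this:

```python
import numpy as np

def mlem(A, y, n_iters=50, eps=1e-12):
    """Classical MLEM, the Kullback-Leibler special case of the EPDM
    family described above:

        x <- x / (A^T 1) * A^T (y / (A x))

    A is the (dense, toy-scale) system matrix, y the measured data.
    """
    x = np.ones(A.shape[1])                 # flat nonnegative start
    sens = A.T @ np.ones(A.shape[0])        # sensitivity image A^T 1
    for _ in range(n_iters):
        ratio = y / np.maximum(A @ x, eps)  # measured / forward projection
        x *= (A.T @ ratio) / np.maximum(sens, eps)
    return x
```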

    Stochastic Optimisation Methods Applied to PET Image Reconstruction

    Positron Emission Tomography (PET) is a medical imaging technique that is used to provide functional information regarding physiological processes. Statistical PET reconstruction attempts to estimate the distribution of radiotracer in the body, but this methodology is generally computationally demanding because of the use of iterative algorithms. These algorithms are often accelerated by the use of data subsets, which may result in convergence to a limit set rather than to the unique solution. Methods exist to relax the update step sizes of subset algorithms, but they introduce additional heuristic parameters that may result in extended reconstruction times. This work investigates novel methods to modify subset algorithms so that they converge to the unique solution while maintaining the acceleration benefits of subset methods.

    This work begins with a study of an automatic method for increasing subset sizes, called AutoSubsets. This algorithm measures the divergence between two distinct data subset update directions and, if it is significant, the subset size is increased for future updates. The algorithm is evaluated using both projection and list-mode data. The algorithm's use of small initial subsets benefits early reconstruction but, unfortunately, at later updates the subset size increases too early, which impedes the convergence rate.

    The main part of this work investigates the application of stochastic variance reduction optimisation algorithms to PET image reconstruction. These algorithms reduce the variance due to the use of subsets by incorporating previously computed subset gradients into the update direction. The algorithms are adapted for application to PET reconstruction. This study evaluates the reconstruction performance of these algorithms when applied to various 3D non-TOF PET simulated, phantom and patient data sets. The impact of a number of algorithm parameters is explored, including subset selection methodology, the number of subsets, step-size methodology and preconditioners. The results indicate that these stochastic variance reduction algorithms demonstrate superior performance after only a few epochs when compared to a standard PET reconstruction algorithm.
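    To make the variance-reduction idea concrete, here is a minimal SVRG-style sketch, stated for minimising a negative log-likelihood with a nonnegativity projection. The fixed step size and the `subset_grads` interface are assumptions of mine; the thesis explores step-size rules and preconditioners that this sketch omits.

```python
import numpy as np

def svrg(x0, subset_grads, n_epochs=10, step=1e-3):
    """Minimal SVRG-style update for subset-based reconstruction.

    subset_grads is a list of callables, one per data subset, each
    returning that subset's gradient. The variance-reduced direction
    combines the current subset gradient with the same subset's
    gradient at a stored anchor image plus the anchor's full
    gradient, so the update noise shrinks as x approaches x_anchor.
    """
    x = x0.copy()
    n = len(subset_grads)
    for _ in range(n_epochs):
        x_anchor = x.copy()
        full_grad = sum(g(x_anchor) for g in subset_grads) / n
        for i in np.random.permutation(n):
            # Variance-reduced gradient estimate for subset i.
            v = subset_grads[i](x) - subset_grads[i](x_anchor) + full_grad
            x = np.maximum(x - step * v, 0.0)  # step + nonnegativity
    return x
```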

    Comparison of different image reconstruction algorithms for Digital Breast Tomosynthesis and assessment of their potential to reduce radiation dose

    Master's thesis in Engineering Physics, 2022, Universidade de Lisboa, Faculdade de Ciências. Digital Breast Tomosynthesis is a three-dimensional medical imaging technique that allows the viewing of sectional parts of the breast. Obtaining multiple slices of the breast is an advantage over conventional mammography, given the increased potential for breast cancer detectability. Conventional mammography, despite being a screening success, has undesirable specificity and sensitivity and high recall rates owing to the overlapping of tissues. Although this new technique promises better diagnostic results, the acquisition methods and image reconstruction algorithms are still under research. Several articles suggest the use of analytic algorithms; however, more recent articles highlight the potential of iterative algorithms to increase image quality compared to the former. The aim of this dissertation was to test the hypothesis that iterative algorithms can achieve higher-quality images at lower doses than analytic algorithms.

    In a first stage, the open-source Tomographic Iterative GPU-based Reconstruction (TIGRE) toolbox for fast and accurate 3D x-ray image reconstruction was used to reconstruct the images acquired with an acrylic phantom. The algorithms used from the toolbox were the Feldkamp, Davis, and Kress algorithm; the Simultaneous Algebraic Reconstruction Technique; and the Maximum Likelihood Expectation Maximization algorithm. In a second and final stage, the possibility of further reducing the radiation dose using image post-processing tools was evaluated. A Total Variation minimization filter was applied to the images reconstructed with the TIGRE toolbox algorithm that provided the best image quality. These were then compared to the images from the commercial unit used for the image acquisitions. Based on image quality parameters, it was found that the Maximum Likelihood Expectation Maximization algorithm performed best of the three at lower radiation doses, especially with the filter. In sum, the results showed the algorithm's potential for obtaining good-quality images at low doses.
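    As an illustration of the post-processing step described above, a Total Variation minimization filter can be applied to a reconstructed slice with scikit-image's denoise_tv_chambolle. The random stand-in image and the weight value are placeholders of mine; the thesis applied the filter to images reconstructed with the TIGRE toolbox and assessed the result against image quality metrics.

```python
import numpy as np
from skimage.restoration import denoise_tv_chambolle

# Hypothetical reconstructed DBT slice (e.g. the output of an MLEM
# reconstruction); a random image stands in for real data here.
slice_mlem = np.random.rand(256, 256).astype(np.float32)

# Total Variation minimization filter as a post-processing step;
# the weight controls the denoising strength and would need tuning
# against the image quality parameters used in the evaluation.
slice_tv = denoise_tv_chambolle(slice_mlem, weight=0.1)
```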