
    Numerical methods for coupled reconstruction and registration in digital breast tomosynthesis.

    Digital Breast Tomosynthesis (DBT) provides an insight into the fine details of normal fibroglandular tissues and abnormal lesions by reconstructing a pseudo-3D image of the breast. In this respect, DBT overcomes a major limitation of conventional X-ray mammography by reducing the confounding effects caused by the superposition of breast tissue. In a breast cancer screening or diagnostic context, a radiologist is interested in detecting change, which might be indicative of malignant disease. To help automate this task, image registration is required to establish spatial correspondence between time points. Typically, images, such as MRI or CT, are first reconstructed and then registered. This approach can be effective if reconstructing using a complete set of data. However, for ill-posed, limited-angle problems such as DBT, estimating the deformation is complicated by the significant artefacts associated with the reconstruction, leading to severe inaccuracies in the registration. This paper presents a mathematical framework, which couples the two tasks and jointly estimates both image intensities and the parameters of a transformation. Under this framework, we compare an iterative method and a simultaneous method, both of which tackle the problem of comparing DBT data by combining reconstruction of a pair of temporal volumes with their registration. We evaluate our methods using various computational digital phantoms, uncompressed breast MR images, and in-vivo DBT simulations. Firstly, we compare both iterative and simultaneous methods to the conventional, sequential method using an affine transformation model. We show that jointly estimating image intensities and parametric transformations gives superior results with respect to reconstruction fidelity and registration accuracy. We also incorporate a non-rigid B-spline transformation model into our simultaneous method. The results demonstrate a visually plausible recovery of the deformation with preservation of the reconstruction fidelity.
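    A minimal sketch of the joint-estimation idea, using a 1-D toy problem with a single translation parameter in place of the paper's 3-D volumes and affine/B-spline transforms; the operator A, the smoothness penalty, and all sizes below are illustrative assumptions, not the authors' implementation.

```python
# Toy illustration of coupled reconstruction/registration: jointly estimate a
# 1-D "image" f and a translation t from two under-determined ("limited-angle")
# measurement sets acquired at two time points.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, m = 64, 24                                   # image size, measurements (m < n: ill-posed)
x = np.arange(n)
f_true = np.exp(-0.5 * ((x - 30) / 4.0) ** 2)   # "time point 1"
t_true = 5.0
f2_true = np.interp(x - t_true, x, f_true)      # "time point 2" = shifted copy

A = rng.standard_normal((m, n)) / np.sqrt(m)    # shared limited projection operator (assumed)
y1, y2 = A @ f_true, A @ f2_true                # noiseless projections

def warp(f, t):
    """Translate f by t samples (linear interpolation)."""
    return np.interp(x - t, x, f)

def objective(p, lam=1e-2):
    f, t = p[:n], p[n]
    data = np.sum((A @ f - y1) ** 2) + np.sum((A @ warp(f, t) - y2) ** 2)
    rough = np.sum(np.diff(f) ** 2)             # simple smoothness prior
    return data + lam * rough

p0 = np.concatenate([np.zeros(n), [0.0]])       # start: empty image, zero shift
res = minimize(objective, p0, method="L-BFGS-B")
print("estimated shift:", res.x[n], "  true shift:", t_true)
```

    The point of the sketch is only that the image and the transformation are optimized against both data sets at once, rather than reconstructing each volume first and registering afterwards.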

    Grouped-Coordinate Ascent Algorithms for Penalized-Likelihood Transmission Image Reconstruction

    Presents a new class of algorithms for penalized-likelihood reconstruction of attenuation maps from low-count transmission scans. We derive the algorithms by applying to the transmission log-likelihood a version of the convexity technique developed by De Pierro for emission tomography. The new class includes the single-coordinate ascent (SCA) algorithm and Lange's convex algorithm for transmission tomography as special cases. The new grouped-coordinate ascent (GCA) algorithms in the class overcome several limitations associated with previous algorithms. (1) Fewer exponentiations are required than in the transmission maximum likelihood-expectation maximization (ML-EM) algorithm or in the SCA algorithm. (2) The algorithms intrinsically accommodate nonnegativity constraints, unlike many gradient-based methods. (3) The algorithms are easily parallelizable, unlike the SCA algorithm and perhaps line-search algorithms. We show that the GCA algorithms converge faster than the SCA algorithm, even on conventional workstations. An example from a low-count positron emission tomography (PET) transmission scan illustrates the method.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/86021/1/Fessler93.pd
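    The grouped-update idea can be illustrated on a toy penalized weighted least-squares problem instead of the transmission log-likelihood; the De Pierro-style separable curvatures below are one standard way to obtain a monotone parallel update within each group, and the sizes, weights, and ridge-style penalty are illustrative assumptions, not the paper's setup.

```python
# Grouped-coordinate descent with a separable (De Pierro-style) surrogate,
# applied to a penalized weighted least-squares objective for brevity.
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_rays = 40, 80
A = rng.random((n_rays, n_pix))          # nonnegative toy "system matrix"
x_true = rng.random(n_pix)
w = np.full(n_rays, 1.0)                 # statistical weights
y = A @ x_true + 0.01 * rng.standard_normal(n_rays)

beta = 0.1                               # penalty strength (simple ridge penalty)
def gradient(x):
    r = A @ x - y
    return A.T @ (w * r) + beta * x

x = np.zeros(n_pix)
groups = np.array_split(np.arange(n_pix), 4)        # update 4 groups per cycle
for _ in range(200):
    for G in groups:
        s = A[:, G].sum(axis=1)                     # row sums within the group
        d = A[:, G].T @ (w * s) + beta              # separable surrogate curvatures
        g = gradient(x)[G]
        x[G] = np.maximum(0.0, x[G] - g / d)        # parallel update + nonnegativity
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

    Updating a whole group at once with separable curvatures is what makes the scheme parallelizable while keeping the simple nonnegativity clamp.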

    Segmented Attenuation Correction for PET

    The authors describe a hybrid measured/calculated method for attenuation correction in positron emission tomography (PET). This unified reconstruction/segmentation method is based on a penalized weighted least-squares objective function that is minimized using iterative coordinate-descent. Two penalty functions are compared: one for a discrete object parameterization, the other for a continuous parameterization. Simulations demonstrate that the methods can reduce the additional emission image variance typically introduced by noisy attenuation correction factors.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85882/1/Fessler121.pd
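    A rough sketch of the two penalty flavours mentioned above, for a single attenuation value: a continuous, roughness-style quadratic pull toward a neighbourhood mean, versus a discrete pull toward the nearest of a few known tissue attenuation coefficients. The class values and weights are illustrative assumptions, not the paper's.

```python
# Two penalty flavours for a single attenuation pixel value mu.
import numpy as np

TISSUE_MU = np.array([0.0, 0.035, 0.096])   # e.g. air, lung, soft tissue (1/cm, approximate)

def continuous_penalty(mu, neighbor_mean, beta=1.0):
    # "continuous" parameterization: quadratic roughness-style pull
    return beta * (mu - neighbor_mean) ** 2

def discrete_penalty(mu, beta=1.0):
    # "discrete" parameterization: distance to the closest tissue class,
    # which encourages a segmented-looking attenuation map
    return beta * np.min((mu - TISSUE_MU) ** 2)

print(continuous_penalty(0.08, 0.09), discrete_penalty(0.08))
```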

    Hybrid Poisson/Polynomial Objective Functions for Tomographic Image Reconstruction from Transmission Scans

    This paper describes rapidly converging algorithms for computing attenuation maps from Poisson transmission measurements using penalized-likelihood objective functions. We demonstrate that an under-relaxed cyclic coordinate-ascent algorithm converges faster than the convex algorithm of Lange (see ibid., vol.4, no.10, p.1430-1438, 1995), which in turn converges faster than the expectation-maximization (EM) algorithm for transmission tomography. To further reduce computation, one could replace the log-likelihood objective with a quadratic approximation. However, we show with simulations and analysis that the quadratic objective function leads to biased estimates for low-count measurements. Therefore we introduce hybrid Poisson/polynomial objective functions that use the exact Poisson log-likelihood for detector measurements with low counts, but use computationally efficient quadratic or cubic approximations for the high-count detector measurements. We demonstrate that the hybrid objective functions reduce computation time without increasing estimation bias.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/86023/1/Fessler100.pd
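    A sketch of the hybrid selection rule under simple assumptions: the exact Poisson transmission negative log-likelihood is kept for rays with low counts, and a quadratic Taylor approximation about the raw line-integral estimate is substituted for high-count rays. The count threshold, blank-scan counts, background rates, and the numerically estimated curvature are all illustrative choices, not the paper's.

```python
# Hybrid per-detector objective: exact Poisson term for low-count rays,
# quadratic approximation for high-count rays.
import numpy as np

def neg_loglik(l, y, b, r):
    """Exact negative Poisson transmission log-likelihood (constants dropped)."""
    ybar = b * np.exp(-l) + r
    return ybar - y * np.log(ybar)

def quad_approx(l, y, b, r):
    """Second-order Taylor expansion of neg_loglik about l_hat = log(b/(y-r))."""
    l_hat = np.log(b / (y - r))
    h = 1e-4                                  # numerical curvature estimate
    c = (neg_loglik(l_hat + h, y, b, r) - 2 * neg_loglik(l_hat, y, b, r)
         + neg_loglik(l_hat - h, y, b, r)) / h**2
    return neg_loglik(l_hat, y, b, r) + 0.5 * c * (l - l_hat) ** 2

def hybrid_objective(l, y, b, r, count_threshold=20):
    """Sum over rays: exact term when counts are low, quadratic term otherwise."""
    low = y <= count_threshold
    return (neg_loglik(l[low], y[low], b[low], r[low]).sum()
            + quad_approx(l[~low], y[~low], b[~low], r[~low]).sum())

# tiny usage example with made-up blank-scan counts b and background r
rng = np.random.default_rng(2)
b = np.full(100, 500.0); r = np.full(100, 2.0)
l_true = rng.uniform(0.5, 4.0, 100)
y = rng.poisson(b * np.exp(-l_true) + r).astype(float)
print(hybrid_objective(l_true, y, b, r))
```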

    Locally adaptive image denoising by a statistical multiresolution criterion

    We demonstrate how one can choose the smoothing parameter in image denoising by a statistical multiresolution criterion, both globally and locally. Using inhomogeneous diffusion and total variation regularization as examples of localized regularization schemes, we present an efficient method for locally adaptive image denoising. As expected, the smoothing parameter serves as an edge detector in this framework. Numerical examples illustrate the usefulness of our approach. We also present an application in confocal microscopy.
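    A loose sketch of locally adaptive smoothing under stated assumptions: quadratic, diffusion-style denoising in which the per-pixel smoothing weight is shrunk wherever the local mean squared residual exceeds the assumed noise variance, so edges receive less smoothing. The adaptation rule is a crude stand-in for the statistical multiresolution criterion, and the phantom, noise level, step size, and window size are illustrative.

```python
# Locally adaptive diffusion-style denoising with a spatially varying weight.
import numpy as np
from scipy.ndimage import uniform_filter, laplace

rng = np.random.default_rng(3)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0      # piecewise-constant phantom
sigma = 0.2
noisy = clean + sigma * rng.standard_normal(clean.shape)

u = noisy.copy()
lam = np.full(u.shape, 0.5)           # spatially varying smoothing parameter
for _ in range(100):
    u = u + 0.1 * (lam * laplace(u) + (noisy - u))        # diffusion step + pull toward data
    local_res = uniform_filter((noisy - u) ** 2, size=7)  # local mean squared residual
    lam = np.where(local_res > sigma**2, lam * 0.95, lam * 1.05)   # less smoothing near edges
    lam = np.clip(lam, 0.01, 2.0)
print("RMSE noisy:", np.sqrt(np.mean((noisy - clean) ** 2)),
      " denoised:", np.sqrt(np.mean((u - clean) ** 2)))
```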

    Joint Estimation of Attenuation and Emission Images from PET Scans

    In modern PET scanners, image reconstruction is performed sequentially in two steps regardless of the reconstruction method: 1. computation of attenuation correction factors (ACFs) from transmission scans; 2. emission image reconstruction using the computed ACFs. This reconstruction scheme does not use all the information in the transmission and emission scans. Post-injection transmission scans contain emission contamination, which includes information about emission parameters. Conversely, emission scans contain information about the attenuating medium. To use all the available information, the authors propose a joint estimation approach that estimates the attenuation map and the emission image from these two scans. The penalized-likelihood objective function is nonconvex for this problem. The authors propose an algorithm based on paraboloidal surrogates that alternates between emission and attenuation parameters and is guaranteed to monotonically decrease the objective function.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85807/1/Fessler159.pd

    Monotonic Algorithms for Transmission Tomography

    Presents a framework for designing fast and monotonic algorithms for penalized-likelihood image reconstruction in transmission tomography. The new algorithms are based on paraboloidal surrogate functions for the log-likelihood. Due to the form of the log-likelihood function, it is possible to find low-curvature surrogate functions that guarantee monotonicity. Unlike previous methods, the proposed surrogate functions lead to monotonic algorithms even for the nonconvex log-likelihood that arises due to background events, such as scatter and random coincidences. The gradient and the curvature of the likelihood terms are evaluated only once per iteration. Since the problem is simplified at each iteration, the CPU time is less than that of current algorithms which directly minimize the objective, yet the convergence rate is comparable. The simplicity, monotonicity, and speed of the new algorithms are quite attractive. The convergence rates of the algorithms are demonstrated using real and simulated PET transmission scans.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85831/1/Fessler83.pd
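    The monotonicity argument can be shown in one dimension: build a parabola that matches the objective's value and gradient at the current point and whose curvature upper-bounds the true curvature over the region of interest, then move to the parabola's minimizer. The test function and the conservative global curvature bound below are invented for illustration; the paper's contribution is deriving much lower curvatures for the transmission log-likelihood, which is what gives the speed-up.

```python
# Majorize-minimize with a parabola surrogate: each step is monotone by construction.
import numpy as np

def f(x):            # a smooth 1-D test objective (not the PET likelihood)
    return np.cosh(x - 1.0) + 0.5 * np.sin(3 * x)

def fprime(x):
    return np.sinh(x - 1.0) + 1.5 * np.cos(3 * x)

def curvature_bound(lo=-4.0, hi=4.0):
    # crude global bound on f'' over the region of interest
    # (assumption: iterates stay inside [lo, hi])
    xs = np.linspace(lo, hi, 2001)
    return np.max(np.cosh(xs - 1.0) + 4.5)      # |d^2/dx^2 of 0.5*sin(3x)| <= 4.5

c = curvature_bound()
x = 3.0
for _ in range(20):
    x_new = x - fprime(x) / c        # minimizer of the surrogate parabola
    assert f(x_new) <= f(x) + 1e-12  # monotone decrease, guaranteed by the majorizer
    x = x_new
print("final x:", x, "f(x):", f(x))
```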

    Algorithms for Joint Estimation of Attenuation and Emission Images in PET

    In positron emission tomography (PET), positron emission from radiolabeled compounds yields two high-energy photons emitted in opposing directions. However, often the photons are not detected due to attenuation within the patient. This attenuation is nonuniform and must be corrected to obtain quantitatively accurate emission images. To measure attenuation effects, one typically acquires a PET transmission scan before or after the injection of radiotracer. In commercially available PET scanners, image reconstruction is performed sequentially in two steps regardless of the reconstruction method: 1. computation of attenuation correction factors (ACFs) from transmission scans; 2. emission image reconstruction using the computed ACFs. This two-step reconstruction scheme does not use all the information in the transmission and emission scans. Post-injection transmission scans contain emission contamination that includes information about emission parameters. Similarly, emission scans contain information about the attenuating medium. To use all the available information, we propose a joint estimation approach that estimates the attenuation map and the emission image simultaneously from these two scans. The penalized-likelihood objective function is nonconvex for this problem. We propose an algorithm based on paraboloidal surrogates that alternates between updating emission and attenuation parameters and is guaranteed to monotonically decrease the objective function.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85956/1/Fessler160.pd
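    A skeleton of the alternating scheme under toy assumptions: hold the attenuation map fixed while improving the emission image, then hold the emission image fixed while improving the attenuation map, so the shared objective does not increase. The quadratic cost, noise-free data, and generic L-BFGS-B inner solver below stand in for the penalized Poisson likelihood and paraboloidal-surrogate updates of the paper.

```python
# Alternating updates of emission (lam) and attenuation (mu) parameters.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n = 30
A = rng.random((50, n))                             # toy "system" matrix
lam_true = rng.random(n)                            # emission image
mu_true = 0.1 * rng.random(n)                       # attenuation map
y_emis = (A @ lam_true) * np.exp(-A @ mu_true)      # attenuated emission data (noise-free)
y_trans = 100.0 * np.exp(-A @ mu_true)              # transmission data (noise-free)

def cost(lam, mu):
    e = (A @ lam) * np.exp(-A @ mu) - y_emis
    t = 100.0 * np.exp(-A @ mu) - y_trans
    return np.sum(e**2) + np.sum(t**2)

lam, mu = np.zeros(n), np.zeros(n)
for outer in range(10):
    # emission update with attenuation fixed
    lam = minimize(lambda v: cost(v, mu), lam, method="L-BFGS-B",
                   bounds=[(0, None)] * n).x
    # attenuation update with emission fixed
    mu = minimize(lambda v: cost(lam, v), mu, method="L-BFGS-B",
                  bounds=[(0, None)] * n).x
    print(f"outer {outer}: cost = {cost(lam, mu):.4g}")
```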