    Enhanced imaging of microcalcifications in digital breast tomosynthesis through improved image-reconstruction algorithms

    PURPOSE: We develop a practical, iterative algorithm for image reconstruction in under-sampled tomographic systems, such as digital breast tomosynthesis (DBT). METHOD: The algorithm controls image regularity by minimizing the image total p-variation (TpV), a function that reduces to the total variation when p=1.0 or to the image roughness when p=2.0. Constraints on the image, such as image positivity and estimated projection-data tolerance, are enforced by projection onto convex sets (POCS). The fact that the tomographic system is under-sampled translates to the mathematical property that many widely varying resultant volumes may correspond to a given data tolerance. Thus the application of image regularity serves two purposes: (1) selection, among the resultant volumes allowed by a fixed data tolerance, of the one with minimum image TpV, and (2) traditional regularization, sacrificing data fidelity for higher image regularity. The present algorithm allows for this dual role of image regularity in under-sampled tomography. RESULTS: The proposed image-reconstruction algorithm is applied to three clinical DBT data sets. The DBT cases include one with microcalcifications and two with masses. CONCLUSION: Results indicate that there may be a substantial advantage in using the present image-reconstruction algorithm for microcalcification imaging. Comment: Submitted to Medical Physics
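    As a rough illustration of the regularity term and the POCS positivity constraint described above (not the authors' implementation), the following NumPy sketch computes one common definition of the total p-variation, the sum over pixels of the gradient magnitude raised to the power p; the function names and the smoothing parameter eps are assumptions.

        import numpy as np

        def total_p_variation(image, p=1.0, eps=1e-8):
            # Sum over pixels of |grad(image)|^p: total variation for p=1,
            # a roughness (squared-gradient) measure for p=2.
            gx = np.diff(image, axis=0, append=image[-1:, :])   # forward differences
            gy = np.diff(image, axis=1, append=image[:, -1:])
            grad_mag = np.sqrt(gx**2 + gy**2 + eps)              # eps avoids a kink at 0
            return np.sum(grad_mag**p)

        def project_positive(image):
            # POCS-style projection onto the convex set {image >= 0}.
            return np.clip(image, 0.0, None)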

    How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray CT

    We introduce phase-diagram analysis, a standard tool in compressed sensing, to the X-ray CT community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In compressed sensing a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT, for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling: First, we demonstrate that there are cases where X-ray CT empirically performs comparably to an optimal compressed sensing strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared to standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization. Comment: 24 pages, 13 figures
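    A minimal sketch of the kind of empirical phase-diagram computation described above; the grid parameters, trial count, success tolerance, and the placeholder reconstruct(A, b) solver are assumptions, not the authors' code.

        import numpy as np

        def empirical_phase_diagram(reconstruct, n=128, n_trials=10, tol=1e-3,
                                    deltas=np.linspace(0.1, 1.0, 10),
                                    rhos=np.linspace(0.1, 1.0, 10)):
            # Empirical recovery rate on a (delta, rho) grid, where delta = m/n is the
            # undersampling ratio and rho = k/m the relative sparsity.
            # `reconstruct(A, b)` stands in for any sparsity-regularized solver.
            success = np.zeros((len(rhos), len(deltas)))
            for j, delta in enumerate(deltas):
                m = max(1, int(round(delta * n)))
                for i, rho in enumerate(rhos):
                    k = max(1, int(round(rho * m)))
                    hits = 0
                    for _ in range(n_trials):
                        x = np.zeros(n)
                        support = np.random.choice(n, k, replace=False)
                        x[support] = np.random.randn(k)            # k-sparse ground truth
                        A = np.random.randn(m, n) / np.sqrt(m)     # Gaussian sensing matrix
                        x_rec = reconstruct(A, A @ x)
                        hits += np.linalg.norm(x_rec - x) <= tol * np.linalg.norm(x)
                    success[i, j] = hits / n_trials
            return success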

    Toward optimal X-ray flux utilization in breast CT

    A realistic computer simulation of a breast computed tomography (CT) system and subject is constructed. The model is used to investigate the optimal number of views for the scan given a fixed total X-ray fluence. The reconstruction algorithm is based on accurate solution of a constrained TV-minimization problem, which has received much interest recently for sparse-view CT data. Comment: accepted to the 11th International Meeting on Fully Three-Dimensional Image Reconstruction in Radiology and Nuclear Medicine, 2011
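    The fixed-fluence trade-off studied above can be mimicked in simulation by dividing the total photon budget evenly over the views before adding Poisson noise; the sketch below is an assumed, simplified forward model, not the authors' simulation code.

        import numpy as np

        def simulate_views(line_integrals, total_fluence, n_views):
            # line_integrals: array of shape (n_views, n_bins) of ideal path integrals.
            # A fixed total fluence is split evenly, so more views means fewer
            # photons (and hence more noise) per view.
            I0 = total_fluence / n_views
            counts = np.random.poisson(I0 * np.exp(-line_integrals))
            counts = np.maximum(counts, 1)          # guard against log(0)
            return -np.log(counts / I0)             # noisy projection data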

    Convex optimization problem prototyping for image reconstruction in computed tomography with the Chambolle-Pock algorithm

    The primal-dual optimization algorithm developed by Chambolle and Pock (CP) in 2011 is applied to various convex optimization problems of interest in computed tomography (CT) image reconstruction. This algorithm allows for rapid prototyping of optimization problems for the purpose of designing iterative image reconstruction algorithms for CT. The primal-dual algorithm is briefly summarized in the article, and its potential for prototyping is demonstrated by explicitly deriving CP algorithm instances for many optimization problems relevant to CT. An example application modeling breast CT with low-intensity X-ray illumination is presented. Comment: Resubmitted to Physics in Medicine and Biology. Text has been modified according to referee comments, and typos in the equations have been corrected
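    For concreteness, here is a minimal NumPy sketch of one CP instance of the kind derived in such prototyping, for the problem min_x (1/2)||Ax - b||^2 subject to x >= 0; the dense matrix A and the step-size choices are illustrative assumptions (a real CT projector would be applied matrix-free).

        import numpy as np

        def chambolle_pock_lsq(A, b, n_iters=200):
            L = np.linalg.norm(A, 2)           # operator norm of A
            tau = sigma = 1.0 / L              # step sizes satisfying tau*sigma*L^2 <= 1
            theta = 1.0
            x = np.zeros(A.shape[1])
            x_bar = x.copy()
            y = np.zeros(A.shape[0])
            for _ in range(n_iters):
                # dual update: prox of the conjugate of (1/2)||. - b||^2
                y = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)
                # primal update: prox of the nonnegativity indicator is a projection
                x_new = np.clip(x - tau * (A.T @ y), 0.0, None)
                # over-relaxation step
                x_bar = x_new + theta * (x_new - x)
                x = x_new
            return x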

    Noise properties of CT images reconstructed by use of constrained total-variation, data-discrepancy minimization

    PURPOSE: The authors develop and investigate iterative image reconstruction algorithms based on data-discrepancy minimization with a total-variation (TV) constraint. The various algorithms are derived with different data-discrepancy measures reflecting the maximum likelihood (ML) principle. Simulations demonstrate the iterative algorithms and the resulting image statistical properties for low-dose CT data acquired with sparse projection view angle sampling. Of particular interest is to quantify the improvement of image statistical properties gained by use of the ML data fidelity term. METHODS: An incremental algorithm framework is developed for this purpose. Instances of the incremental algorithms are derived for solving optimization problems that combine a data fidelity objective function with a constraint on the image TV. For the data fidelity term the authors compare application of the maximum likelihood principle, in the form of weighted least-squares (WLSQ) and Poisson-likelihood (PL), with the use of unweighted least-squares (LSQ). RESULTS: The incremental algorithms are applied to projection data generated by a simulation modeling the breast computed tomography (bCT) imaging application. The only source of data inconsistency in the bCT projections is noise, and a Poisson distribution is assumed for the transmitted X-ray photon intensity. In the simulations involving the incremental algorithms, an ensemble of images, reconstructed from 1000 noise realizations of the X-ray transmission data, is used to estimate the image statistical properties. The WLSQ and PL incremental algorithms are seen to reduce image variance as compared to that of LSQ without sacrificing image bias. The difference is also seen after only a few iterations, short of numerical convergence of the corresponding optimization problems. CONCLUSIONS: The proposed incremental algorithms prove effective and efficient for iterative image reconstruction in low-dose CT applications, particularly with sparse-view projection data.
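    As a small illustration of the weighted least-squares (WLSQ) data-fidelity term discussed above, the gradient used in a gradient-based update can be written as below; the normalization of the weights and the variable names are assumptions.

        import numpy as np

        def wlsq_gradient(A, x, y, counts):
            # Gradient of D(x) = (1/2) * sum_i w_i * (y_i - [A x]_i)^2,
            # with weights w_i proportional to the detected counts, the usual
            # plug-in surrogate for the Poisson (ML) variance of the log data.
            w = counts / counts.max()
            return A.T @ (w * (A @ x - y))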

    Problem Identification and Decomposition within the Requirements Generation Process

    Only recently has the real importance of the requirements generation process and its requisite activities been recognized. That importance is underscored by the evolving partitions and refinements of the once all-encompassing (and somewhat misnamed) Requirements Analysis phase of the software development lifecycle. Continuing along that evolutionary line, we propose an additional refinement to the requirements generation model that focuses on problem identification and its decomposition into an associated set of user needs that drive the requirements generation process. Problem identification stresses the importance of recognizing and identifying the difference between the perceived state of the system and the desired one. We mention pre- and post-conditions that help identify and bound the problem, and then present some methods and techniques that assist in refining that boundary and in recognizing essential characteristics of the problem. We continue by presenting a process by which the identified problem and its characteristics are decomposed and translated into a set of user needs that provide the basis for the solution description, i.e., the set of requirements. Finally, to place problem identification and decomposition in perspective, we present them within the framework of the Requirements Generation Model.

    A parametric level-set method for partially discrete tomography

    This paper introduces a parametric level-set method for tomographic reconstruction of partially discrete images. Such images consist of a continuously varying background and an anomaly with a constant (known) grey-value. We represent the geometry of the anomaly with a level-set function, which in turn is parametrized by radial basis functions. We pose the reconstruction problem as a bi-level optimization problem in terms of the background and the coefficients of the level-set function. To constrain the background reconstruction we impose smoothness through Tikhonov regularization. The bi-level optimization problem is solved in an alternating fashion; in each iteration we first reconstruct the background and then update the level-set function. We test our method on numerical phantoms and show that we can successfully reconstruct the geometry of the anomaly, even from limited data. On these phantoms, our method outperforms Total Variation reconstruction, DART, and P-DART. Comment: Paper submitted to the 20th International Conference on Discrete Geometry for Computer Imagery
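    A minimal sketch of the parametric level-set representation described above, with Gaussian radial basis functions and a smoothed Heaviside; the Gaussian choice, variable names, and the tanh smoothing are assumptions rather than the paper's exact parametrization.

        import numpy as np

        def rbf_level_set(alpha, centers, width, grid):
            # phi(r) = sum_j alpha_j * exp(-||r - c_j||^2 / (2 * width^2))
            # grid: (n_points, 2), centers: (n_rbf, 2), alpha: (n_rbf,)
            d2 = ((grid[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
            return np.exp(-d2 / (2.0 * width**2)) @ alpha

        def partially_discrete_image(background, phi, grey_value, eps=0.1):
            # Continuous background outside the anomaly, constant grey value inside,
            # blended by a smoothed Heaviside of the level-set function.
            heaviside = 0.5 * (1.0 + np.tanh(phi / eps))
            return background * (1.0 - heaviside) + grey_value * heaviside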

    GPU-based Iterative Cone Beam CT Reconstruction Using Tight Frame Regularization

    X-ray imaging dose from serial cone-beam CT (CBCT) scans raises a clinical concern in most image-guided radiation therapy procedures. It is the goal of this paper to develop a fast GPU-based algorithm to reconstruct high quality CBCT images from undersampled and noisy projection data so as to lower the imaging dose. For this purpose, we have developed an iterative tight frame (TF) based CBCT reconstruction algorithm. The condition that a real CBCT image has a sparse representation under a TF basis is imposed in the iteration process as regularization of the solution. To speed up the computation, a multi-grid method is employed. Our GPU implementation has achieved high computational efficiency, and a CBCT image of resolution 512×512×70 can be reconstructed in ~5 min. We have tested our algorithm on a digital NCAT phantom and a physical Catphan phantom. It is found that our TF-based algorithm is able to reconstruct CBCT images in the context of undersampling and low mAs levels. We have also quantitatively analyzed the reconstructed CBCT image quality in terms of the modulation transfer function and contrast-to-noise ratio under various scanning conditions. The results confirm the high CBCT image quality obtained with our TF algorithm. Moreover, our algorithm has also been validated in a real clinical context using a head-and-neck patient case. The developed TF algorithm has also been compared with the current state-of-the-art TV algorithm in the cases studied, in terms of reconstructed image quality and computational efficiency. Comment: 24 pages, 8 figures, accepted by Phys. Med. Biol.
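    A schematic of the tight-frame regularization idea described above: a data-consistency update followed by soft-thresholding of the frame coefficients. The function names and the passed-in operators W (analysis), Wt (synthesis, with Wt(W(x)) = x for a tight frame) and data_step are placeholders, not the paper's GPU implementation.

        import numpy as np

        def soft_threshold(coeffs, mu):
            # Shrinkage operator applied to frame coefficients.
            return np.sign(coeffs) * np.maximum(np.abs(coeffs) - mu, 0.0)

        def tf_regularized_step(x, data_step, W, Wt, mu):
            # One illustrative iteration: enforce data consistency, then promote
            # sparsity of the image in the tight-frame domain.
            x = data_step(x)                    # e.g. a gradient or SART-type update
            return Wt(soft_threshold(W(x), mu))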