    Critical Parameter Values and Reconstruction Properties of Discrete Tomography: Application to Experimental Fluid Dynamics

    We analyze representative ill-posed scenarios of tomographic PIV with a focus on conditions for unique volume reconstruction. Based on sparse random seedings of a region of interest with small particles, the corresponding systems of linear projection equations are probabilistically analyzed in order to determine (i) the possibility of unique reconstruction in terms of the imaging geometry and the critical sparsity parameter, and (ii) the sharpness of the transition to non-unique reconstruction with ghost particles when the sparsity parameter is chosen improperly. The sparsity parameter directly relates to the seeding density used for PIV in experimental fluid dynamics, which to date has been chosen empirically. Our results provide a basic mathematical characterization of the PIV volume reconstruction problem that is an essential prerequisite for any algorithm used to actually compute the reconstruction. Moreover, we connect the sparse volume function reconstruction problem from few tomographic projections to major developments in compressed sensing. Comment: 22 pages, submitted to Fundamenta Informaticae. arXiv admin note: text overlap with arXiv:1208.589
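The non-uniqueness the abstract describes can be seen in a toy example (a textbook discrete-tomography construction, not taken from the paper): two distinct particle configurations on a 2x2 grid that produce identical projections along both axes, the classic "switching component" behind ghost particles.

```python
import numpy as np

# Toy "switching component" illustrating non-unique reconstruction.
# This 2x2 example is a standard discrete-tomography construction,
# not the paper's PIV setup.
vol_a = np.array([[1, 0],
                  [0, 1]])   # particles on the main diagonal
vol_b = np.array([[0, 1],
                  [1, 0]])   # "ghost" configuration on the anti-diagonal

def project(vol):
    """Column-sum and row-sum projections of a binary particle volume."""
    return np.concatenate([vol.sum(axis=0), vol.sum(axis=1)])

# Both configurations produce identical projection data, so no
# reconstruction algorithm can tell them apart from these two views.
print(project(vol_a))   # [1 1 1 1]
print(project(vol_b))   # [1 1 1 1]
```

With denser seeding such switching components become overwhelmingly likely, which is exactly the sharp transition the abstract analyzes.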

    Empirical average-case relation between undersampling and sparsity in X-ray CT

    In x-ray computed tomography (CT) it is generally acknowledged that reconstruction methods exploiting image sparsity allow reconstruction from a significantly reduced number of projections. The use of such reconstruction methods is motivated by recent progress in compressed sensing (CS). However, the CS framework provides neither guarantees of accurate CT reconstruction, nor any relation between sparsity and a sufficient number of measurements for recovery, i.e., perfect reconstruction from noise-free data. We consider reconstruction through 1-norm minimization, as proposed in CS, from data obtained using a standard CT fan-beam sampling pattern. In empirical simulation studies we establish quantitatively a relation between the image sparsity and the sufficient number of measurements for recovery within image classes motivated by tomographic applications. We show empirically that the specific relation depends on the image class and in many cases exhibits a sharp phase transition as seen in CS, i.e., same-sparsity images require the same number of projections for recovery. Finally we demonstrate that th
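The 1-norm minimization studied in the abstract can be posed as a linear program. The sketch below uses a Gaussian sensing matrix as a stand-in for the fan-beam CT system matrix (an assumption made for brevity, not the paper's setup) and solves min ||x||_1 s.t. Ax = b via the standard split x = u - v with u, v >= 0.

```python
import numpy as np
from scipy.optimize import linprog

# Basis pursuit (1-norm minimization) as a linear program.
# Gaussian measurements replace the fan-beam CT matrix for simplicity.
rng = np.random.default_rng(0)
n, m, k = 50, 25, 3                    # signal size, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
b = A @ x_true

c = np.ones(2 * n)                     # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])              # A(u - v) = b
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]
print(np.max(np.abs(x_hat - x_true)))  # essentially zero: exact recovery
```

At this sparsity level (k = 3 with m = 25 measurements) recovery succeeds; the abstract's phase transition concerns how this breaks down as k grows.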

    Testable uniqueness conditions for empirical assessment of undersampling levels in total variation-regularized X-ray CT

    We study recoverability in fan-beam computed tomography (CT) with sparsity and total variation priors: how few underdetermined linear measurements suffice for recovering images of a given sparsity? Results from compressed sensing (CS) establish such conditions for, e.g., random measurements, but not for CT. Recoverability is typically tested by checking whether a computed solution recovers the original. This approach cannot guarantee solution uniqueness, and the recoverability decision therefore depends on the optimization algorithm. We propose new computational methods to test recoverability by verifying solution uniqueness conditions. Using both reconstruction and uniqueness testing we empirically study the number of CT measurements sufficient for recovery on new classes of sparse test images. We demonstrate an average-case relation between sparsity and sufficient sampling and observe a sharp phase transition as known from CS, but never established for CT. In addition to assessing recoverability more reliably, we show that uniqueness tests are often the faster option. Comment: 18 pages, 7 figures, submitte
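A standard testable sufficient condition for uniqueness of an ℓ1 solution, in the spirit of (but not identical to) the tests proposed in the paper, checks injectivity on the support plus a strict dual certificate. A minimal sketch:

```python
import numpy as np

def l1_uniqueness_test(A, x, tol=1e-10):
    """Sufficient condition for x to be the unique minimizer of
    min ||z||_1 subject to A z = A x: the support columns A_S are
    linearly independent and a dual certificate y exists with
    A_S^T y = sign(x_S) and |A_j^T y| < 1 strictly off the support.
    This is the textbook compressed-sensing condition, a stand-in
    for the paper's exact tests."""
    S = np.abs(x) > tol
    A_S = A[:, S]
    if np.linalg.matrix_rank(A_S) < S.sum():
        return False
    # Minimum-norm y solving the (underdetermined) system A_S^T y = sign(x_S)
    y, *_ = np.linalg.lstsq(A_S.T, np.sign(x[S]), rcond=None)
    if not np.allclose(A_S.T @ y, np.sign(x[S]), atol=1e-8):
        return False
    off = np.abs(A[:, ~S].T @ y)
    return off.size == 0 or bool(off.max() < 1 - 1e-8)

# Hand-checkable example: directly sampling the first four of six pixels.
A = np.eye(6)[:4]
x_good = np.array([1.0, 0.0, -2.0, 0.0, 0.0, 0.0])  # support is measured
x_bad = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 0.0])    # support never measured
print(l1_uniqueness_test(A, x_good), l1_uniqueness_test(A, x_bad))  # True False
```

Note the test is one-sided: a failed certificate does not prove non-uniqueness, which is why the paper combines such tests with actual reconstruction.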

    How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray CT

    We introduce phase-diagram analysis, a standard tool in compressed sensing, to the X-ray CT community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In compressed sensing a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT, for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling: First, we demonstrate that there are cases where X-ray CT empirically performs comparably to an optimal compressed sensing strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared to standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization. Comment: 24 pages, 13 figure
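One slice of a phase diagram can be computed empirically, as the abstract suggests: fix the undersampling ratio m/n, sweep the sparsity k, and record how often 1-norm minimization recovers a random k-sparse signal. Gaussian measurements stand in for a CT system matrix here (an assumption for brevity, not the paper's sampling pattern).

```python
import numpy as np
from scipy.optimize import linprog

# Empirical phase-transition sweep: recovery rate of basis pursuit
# as a function of sparsity k, at fixed undersampling m/n = 0.5.
rng = np.random.default_rng(1)
n, m, trials = 40, 20, 5

def recovery_rate(k):
    hits = 0
    for _ in range(trials):
        x0 = np.zeros(n)
        x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
        A = rng.standard_normal((m, n))
        b = A @ x0
        res = linprog(np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=b,
                      bounds=(0, None))
        x_hat = res.x[:n] - res.x[n:]
        hits += np.max(np.abs(x_hat - x0)) < 1e-5
    return hits / trials

rates = [recovery_rate(k) for k in (2, 6, 10, 14, 18)]
print(rates)  # drops from ~1 toward 0 as the sparsity k grows
```

Repeating this sweep over a grid of undersampling ratios yields the full phase diagram; the sharpness of the drop is the phase transition discussed in the abstract.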

    Shearlet-based regularization in statistical inverse learning with an application to X-ray tomography

    Statistical inverse learning theory, a field that lies at the intersection of inverse problems and statistical learning, has lately gained more and more attention. In an effort to steer this interplay more towards the variational regularization framework, convergence rates have recently been proved for a class of convex, p-homogeneous regularizers with p ∈ (1,2], in the symmetric Bregman distance. Following this path, we take a further step towards the study of sparsity-promoting regularization and extend the aforementioned convergence rates to work with ℓ^p-norm regularization, with p ∈ (1,2), for a special class of non-tight Banach frames, called shearlets, and possibly constrained to some convex set. The p = 1 case is approached as the limit case (1,2) ∋ p → 1, by complementing numerical evidence with a (partial) theoretical analysis, based on arguments from Γ-convergence theory. We numerically demonstrate our theoretical results in the context of X-ray tomography, under random sampling of the imaging angles, using both simulated and measured data. This application allows us to effectively verify the theoretical decay, in addition to providing a motivation for the extension to shearlet-based regularization.
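The ℓ^p penalty with p ∈ (1,2) that the abstract studies is differentiable, so a plain gradient method already illustrates the regularized problem. The sketch below minimizes ||Ax - b||² + α·Σ|x_i|^p; the shearlet frame itself is omitted (identity synthesis operator assumed), so this is a simplified stand-in for the paper's setting.

```python
import numpy as np

# Gradient descent on p-norm regularized least squares, p in (1, 2).
# For p > 1 the penalty |t|^p has gradient p * |t|^(p-1) * sign(t),
# so no proximal machinery is needed in this toy version.
def lp_regularized_ls(A, b, p=1.5, alpha=0.1, step=1e-3, iters=5000):
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = (2 * A.T @ (A @ x - b)
                + alpha * p * np.abs(x) ** (p - 1) * np.sign(x))
        x -= step * grad
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 20))
x_true = np.zeros(20)
x_true[:3] = 1.0                      # a sparse ground truth
b = A @ x_true
x_hat = lp_regularized_ls(A, b)
print(np.linalg.norm(x_hat - x_true))  # small: a slightly biased estimate
```

As p → 1 the gradient p·|t|^(p-1) approaches the discontinuous sign function, which is precisely why the p = 1 limit in the abstract requires the separate Γ-convergence argument rather than this direct approach.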

    SparseBeads data: benchmarking sparsity-regularized computed tomography

    Get PDF
    Sparsity regularization (SR) such as total variation (TV) minimization allows accurate image reconstruction in x-ray computed tomography (CT) from fewer projections than analytical methods. Exactly how few projections suffice, and how this number may depend on the image, remain poorly understood. Compressive sensing connects the critical number of projections to the image sparsity but does not cover CT; empirical results, however, suggest a similar connection. The present work establishes for real CT data a connection between gradient sparsity and the sufficient number of projections for accurate TV-regularized reconstruction. A collection of 48 x-ray CT datasets called SparseBeads was designed for benchmarking SR reconstruction algorithms. Beadpacks comprising glass beads of five different sizes, as well as mixtures, were scanned in a micro-CT scanner to provide structured datasets with variable image sparsity levels, numbers of projections and noise levels, allowing the systematic assessment of parameters affecting the performance of SR reconstruction algorithms. Using the SparseBeads data, TV-regularized reconstruction quality was assessed as a function of the number of projections and gradient sparsity. The critical number of projections for satisfactory TV-regularized reconstruction increased almost linearly with the gradient sparsity. This establishes a quantitative guideline from which one may predict how few projections to acquire based on the expected sample sparsity level, as an aid in planning dose- or time-critical experiments. The results are expected to hold for samples of similar characteristics, i.e. consisting of few, distinct phases with relatively simple structure. Such cases are plentiful in porous media, composite materials, foams, as well as non-destructive testing and metrology. For samples of other characteristics the proposed methodology may be used to investigate similar relations
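The TV-regularized reconstruction benchmarked above can be sketched in one dimension: recover a gradient-sparse (piecewise constant) signal from undersampled measurements by penalizing a smoothed total variation. Random Gaussian measurements stand in for CT projections here; the SparseBeads experiments themselves use real micro-CT data.

```python
import numpy as np

# TV-regularized reconstruction of a 1-D piecewise constant signal via
# gradient descent on ||Ax - b||^2 + alpha * sum_i sqrt(d_i^2 + eps),
# where d = diff(x) and eps smooths the absolute value at zero.
def tv_reconstruct(A, b, alpha=0.05, eps=1e-6, step=1e-3, iters=20000):
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        d = np.diff(x)                    # finite differences d_i = x[i+1]-x[i]
        w = d / np.sqrt(d * d + eps)      # derivative of the smoothed |d_i|
        tv_grad = np.concatenate([[-w[0]], w[:-1] - w[1:], [w[-1]]])
        x -= step * (2 * A.T @ (A @ x - b) + alpha * tv_grad)
    return x

rng = np.random.default_rng(3)
n, m = 60, 30
x_true = np.zeros(n)
x_true[20:40] = 1.0                       # gradient-sparse: only 2 jumps
A = rng.standard_normal((m, n))           # stand-in for a projection matrix
b = A @ x_true
x_hat = tv_reconstruct(A, b)
print(np.max(np.abs(x_hat - x_true)))     # small error despite m < n
```

With only two jumps the signal is far below the critical sampling level, matching the abstract's finding that the required number of projections grows roughly linearly with gradient sparsity.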