
    Solving ill-posed inverse problems using iterative deep neural networks

    We propose a partially learned approach for the solution of ill-posed inverse problems with not necessarily linear forward operators. The method builds on ideas from classical regularization theory and recent advances in deep learning to perform learning while making use of prior information about the inverse problem encoded in the forward operator, the noise model, and a regularizing functional. The method results in a gradient-like iterative scheme, where the "gradient" component is learned using a convolutional network that takes the gradients of the data discrepancy and of the regularizer as input in each iteration. We present results of such a partially learned gradient scheme on a non-linear tomographic inversion problem with simulated data from both the Shepp-Logan phantom and a head CT. The outcome is compared against FBP and TV reconstruction; the proposed method gives a 5.4 dB PSNR improvement over the TV reconstruction while being significantly faster, producing reconstructions of 512 x 512 volumes in about 0.4 seconds on a single GPU.
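    The gradient-like iteration described in this abstract (each step produced by a convolutional network that sees the current iterate together with the gradients of the data discrepancy and the regularizer) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the operators `forward_op` and `adjoint_op`, the `grad_regularizer` callable, the per-iteration module list `steps`, and the small three-layer architecture are hypothetical placeholders, and a linear forward operator is assumed when forming the data-fit gradient.

```python
import torch
import torch.nn as nn


class LearnedGradientStep(nn.Module):
    """One iteration of a partially learned gradient scheme (illustrative sketch)."""

    def __init__(self, channels=32):
        super().__init__()
        # Three input channels: current iterate, data-discrepancy gradient,
        # regularizer gradient; one output channel: the learned update.
        self.net = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, 1, kernel_size=3, padding=1),
        )

    def forward(self, x, grad_data, grad_reg):
        # All inputs have shape (batch, 1, H, W).
        return x + self.net(torch.cat([x, grad_data, grad_reg], dim=1))


def reconstruct(y, x0, forward_op, adjoint_op, grad_regularizer, steps):
    """Run x_{k+1} = x_k + Lambda_k(x_k, grad D(T(x_k), y), grad R(x_k))."""
    x = x0
    for step in steps:  # one learned module per iteration (hypothetical interface)
        grad_data = adjoint_op(forward_op(x) - y)  # grad of 0.5*||T(x)-y||^2, linear T assumed
        grad_reg = grad_regularizer(x)
        x = step(x, grad_data, grad_reg)
    return x
```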

    Artifacts and Visible Singularities in Limited Data X-Ray Tomography


    Data-inspired advances in geometric measure theory: generalized surface and shape metrics

    Modern geometric measure theory, developed largely to solve the Plateau problem, has generated a great deal of technical machinery which is unfortunately regarded as inaccessible by outsiders. Some of its tools (e.g., the flat norm distance and decomposition in generalized surface space) hold interest from a theoretical perspective, but computational infeasibility has prevented practical use. Others, like nonasymptotic densities as shape signatures, have been developed independently for data analysis (e.g., the integral area invariant). The flat norm measures distance between currents (generalized surfaces) by decomposing them in a way that is robust to noise. The simplicial deformation theorem shows that currents can be approximated on a simplicial complex, generalizing the classical cubical deformation theorem and proving sharper bounds than Sullivan's convex cellular deformation theorem. Computationally, the discretized flat norm can be expressed as a linear programming problem and solved in polynomial time. Furthermore, the solution is guaranteed to be integral for integral input if the complex satisfies a simple topological condition (absence of relative torsion). This discretized integrality result yields a similar statement for the continuous case: the flat norm decomposition of an integral 1-current in the plane can be taken to be integral, something previously unknown for 1-currents which are not boundaries of 2-currents.

    Nonasymptotic densities (integral area invariants) taken along the boundary of a shape are often enough to reconstruct the shape. This result is easy when the densities are known for arbitrarily small radii, but that is not generally possible in practice. When only a single radius is used, variations on reconstruction results (modulo translation and rotation) for polygons and (a dense set of) smooth curves are presented.

    Comment: 123 pages, dissertation, includes chapters based on arXiv:1105.5104 and arXiv:1308.245
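    The abstract notes that the discretized flat norm can be expressed as a linear program. A minimal sketch of one such formulation, using SciPy, is given below; it is not the dissertation's code, and the function name, the input conventions (a 2-boundary matrix `boundary_2` of shape edges x triangles, nonnegative weight arrays `edge_weights` and `tri_weights`, and a scale `lam`), and the reliance on `scipy.optimize.linprog` are assumptions made for illustration. The integrality guarantee mentioned above is not enforced here; absence of relative torsion is what makes the LP relaxation return integral solutions for integral input.

```python
import numpy as np
from scipy.optimize import linprog


def simplicial_flat_norm(t, boundary_2, edge_weights, tri_weights, lam=1.0):
    """Discretized flat norm of a 1-chain t as a linear program (illustrative sketch).

    Finds s minimizing  w . |t - B s| + lam * v . |s|,  where B = boundary_2 maps
    2-chains to 1-chains; the decomposition is t = x + B s with x = t - B s.
    Signed quantities are split into nonnegative parts to keep the problem linear.
    """
    m, n = boundary_2.shape                      # m edges, n triangles
    w = np.asarray(edge_weights, dtype=float)
    v = np.asarray(tri_weights, dtype=float)
    I = np.eye(m)
    # Variable vector: [x_plus, x_minus, s_plus, s_minus], all nonnegative.
    c = np.concatenate([w, w, lam * v, lam * v])
    A_eq = np.hstack([I, -I, boundary_2, -boundary_2])
    res = linprog(c, A_eq=A_eq, b_eq=t, bounds=(0, None), method="highs")
    x = res.x[:m] - res.x[m:2 * m]                    # 1-current part of the decomposition
    s = res.x[2 * m:2 * m + n] - res.x[2 * m + n:]    # 2-current part
    return res.fun, x, s
```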