
    Low Complexity Regularization of Linear Inverse Problems

    Inverse problems and regularization theory form a central theme in contemporary signal processing, where the goal is to reconstruct an unknown signal from partial, indirect, and possibly noisy measurements of it. A now standard method for recovering the unknown signal is to solve a convex optimization problem that enforces some prior knowledge about its structure. This has proved efficient in many problems routinely encountered in imaging sciences, statistics and machine learning. This chapter delivers a review of recent advances in the field where the regularization prior promotes solutions conforming to some notion of simplicity/low complexity. These priors encompass as popular examples sparsity and group sparsity (to capture the compressibility of natural signals and images), total variation and analysis sparsity (to promote piecewise regularity), and low rank (as a natural extension of sparsity to matrix-valued data). Our aim is to provide a unified treatment of all these regularizations under a single umbrella, namely the theory of partial smoothness. This framework is very general and accommodates all the low-complexity regularizers just mentioned, as well as many others. Partial smoothness turns out to be the canonical way to encode low-dimensional models that can be linear spaces or more general smooth manifolds. This review is intended to serve as a one-stop shop toward the understanding of the theoretical properties of the so-regularized solutions. It covers a large spectrum including: (i) recovery guarantees and stability to noise, both in terms of $\ell^2$-stability and model (manifold) identification; (ii) sensitivity analysis to perturbations of the parameters involved (in particular the observations), with applications to unbiased risk estimation; (iii) convergence properties of the forward-backward proximal splitting scheme, which is particularly well suited to solving the corresponding large-scale regularized optimization problem.
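    The forward-backward scheme mentioned in (iii) alternates a gradient step on the smooth data-fidelity term with a proximal step on the regularizer. Below is a minimal NumPy sketch for the sparsity-regularized least-squares case (ISTA) on synthetic data; the problem sizes, step-size rule, and regularization weight are illustrative choices, not taken from the chapter.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, y, lam, n_iter=500):
    """Forward-backward splitting for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                         # forward (explicit) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
    return x

# Tiny synthetic example: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
n, p, s = 40, 100, 5
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, s, replace=False)] = rng.standard_normal(s)
y = A @ x_true + 0.01 * rng.standard_normal(n)
x_hat = forward_backward(A, y, lam=0.02)
print("recovered support:", np.nonzero(np.round(x_hat, 3))[0])
```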

    Paving the way for transitions --- a case for Weyl geometry

    This paper presents three aspects by which the Weyl geometric generalization of Riemannian geometry, and of Einstein gravity, sheds light on current questions of physics and its philosophical reflection. After introducing the theory's principles, it explains how Weyl geometric gravity relates to Jordan-Brans-Dicke theory. We then discuss the link between gravity and the electroweak sector of elementary particle physics, as it looks from the Weyl geometric perspective. Weyl's hypothesis of a preferred scale gauge, setting the Weyl scalar curvature to a constant, gets new support from the interplay of the gravitational scalar field and the electroweak one (the Higgs field). This has surprising consequences for cosmological models. In particular, it leads to a static (Weyl geometric) spacetime with "inbuilt" cosmological redshift. This may be used for putting central features of the present cosmological model into a wider perspective.
    Comment: 54 pp, 2 figs. To appear in D. Lehmkuhl (ed.), "Towards a Theory of Spacetime Theories" (Einstein Studies, Basel: Birkhaeuser), revised version June 201
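    For readers unfamiliar with the terminology, the fragment below states the standard scale-gauge transformation of Weyl geometry and the gauge condition alluded to above; the notation is a common textbook convention, not necessarily the paper's.

```latex
% Standard scale-gauge transformation of Weyl geometry
% (illustrative textbook notation, not necessarily the paper's):
\[
  g_{\mu\nu} \longmapsto \tilde g_{\mu\nu} = \Omega^2 g_{\mu\nu},
  \qquad
  \varphi_\mu \longmapsto \tilde\varphi_\mu = \varphi_\mu - \partial_\mu \ln \Omega .
\]
% Weyl's hypothesis of a preferred scale gauge fixes this freedom by
% demanding constant Weyl-geometric scalar curvature:
\[
  \bar R = \mathrm{const.}
\]
```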

    Computerized Analysis of Magnetic Resonance Images to Study Cerebral Anatomy in Developing Neonates

    The study of cerebral anatomy in developing neonates is of great importance for the understanding of brain development during the early period of life. This dissertation therefore focuses on three challenges in the modelling of cerebral anatomy in neonates during brain development. The methods that have been developed all use Magnetic Resonance Images (MRI) as source data. To facilitate study of vascular development in the neonatal period, a set of image analysis algorithms is developed to automatically extract and model cerebral vessel trees. The whole process consists of cerebral vessel tracking from automatically placed seed points, vessel tree generation, and vasculature registration and matching. These algorithms have been tested on clinical Time-of-Flight (TOF) MR angiographic datasets. To facilitate study of the neonatal cortex, a complete cerebral cortex segmentation and reconstruction pipeline has been developed. Segmentation of the neonatal cortex is not effectively done by existing algorithms designed for the adult brain because the contrast between grey and white matter is reversed. This causes pixels containing tissue mixtures to be incorrectly labelled by conventional methods. The neonatal cortical segmentation method that has been developed is based on a novel expectation-maximization (EM) method with explicit correction for mislabelled partial volume voxels. Based on the resulting cortical segmentation, an implicit surface evolution technique is adopted for the reconstruction of the cortex in neonates. The performance of the method is investigated by performing a detailed landmark study. To facilitate study of cortical development, a registration algorithm for aligning cortical surfaces is developed. The method first inflates extracted cortical surfaces and then performs a non-rigid surface registration using free-form deformations (FFDs) to remove residual misalignment. Validation experiments using data labelled by an expert observer demonstrate that the method can capture local changes and follow the growth of specific sulci.
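    As a rough illustration of the EM core underlying such intensity-based tissue segmentation (without the dissertation's explicit partial-volume correction, which is its actual contribution), here is a minimal sketch of EM for a one-dimensional Gaussian mixture over voxel intensities; the class count, initialization, and iteration count are arbitrary assumptions.

```python
import numpy as np

def em_tissue_segmentation(intensities, n_classes=3, n_iter=50):
    """Minimal EM for a 1D Gaussian mixture over voxel intensities.

    Sketches only the generic EM core; a real neonatal pipeline would add
    partial-volume handling, spatial priors, and bias-field correction.
    """
    x = intensities.ravel().astype(float)
    # Initialise class means spread over the intensity range.
    mu = np.linspace(x.min(), x.max(), n_classes)
    sigma = np.full(n_classes, x.std())
    pi = np.full(n_classes, 1.0 / n_classes)
    for _ in range(n_iter):
        # E-step: posterior probability of each class per voxel.
        dens = np.stack([
            pi[k] * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2) / sigma[k]
            for k in range(n_classes)
        ])
        resp = dens / dens.sum(axis=0, keepdims=True)
        # M-step: update mixture parameters from the responsibilities.
        nk = resp.sum(axis=1)
        mu = (resp @ x) / nk
        sigma = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
        sigma = np.maximum(sigma, 1e-6)     # guard against degenerate classes
        pi = nk / x.size
    # Hard labels: most probable class per voxel, in the original shape.
    return resp.argmax(axis=0).reshape(intensities.shape)
```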

    On the use of SIFT features for face authentication

    Several pattern recognition and classification techniques have been applied to the biometrics domain. Among them, an interesting technique is the Scale Invariant Feature Transform (SIFT), originally devised for object recognition. Although SIFT features have emerged as very powerful image descriptors, their use in the face analysis context has never been systematically investigated. This paper investigates the application of the SIFT approach in the context of face authentication. In order to determine the real potential and applicability of the method, different matching schemes are proposed and tested using the BANCA database and protocol, showing promising results.
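    One simple matching scheme in this spirit (not necessarily one of the paper's proposed schemes) counts the SIFT correspondences between two face images that survive Lowe's ratio test, using OpenCV; the acceptance threshold and file names in the usage comment are hypothetical.

```python
import cv2

def sift_match_score(img_a, img_b, ratio=0.75):
    """Count ratio-test-filtered SIFT matches between two grayscale images."""
    sift = cv2.SIFT_create()
    _, des_a = sift.detectAndCompute(img_a, None)
    _, des_b = sift.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_a, des_b, k=2)
    # Lowe's ratio test: keep matches clearly better than the runner-up.
    good = [pair[0] for pair in knn
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]
    return len(good)

# Hypothetical verification rule on two face crops:
# probe   = cv2.imread("probe.png", cv2.IMREAD_GRAYSCALE)
# gallery = cv2.imread("gallery.png", cv2.IMREAD_GRAYSCALE)
# accept  = sift_match_score(probe, gallery) >= 12  # threshold tuned on dev data
```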

    Fast reconstruction of 3D blood flows from Doppler ultrasound images and reduced models

    This paper deals with the problem of building fast and reliable 3D reconstruction methods for blood flows for which partial information is given by Doppler ultrasound measurements. This task is of interest in medicine since it could enrich the available information used in the diagnosis of certain diseases, a diagnosis currently based essentially on measurements from ultrasound devices. The fast reconstruction of the full flow can be performed with state estimation methods that have been introduced in recent years and that involve reduced order models. One simple and efficient strategy is the so-called Parametrized Background Data-Weak approach (PBDW). It is a linear mapping consisting of a least-squares fit between the measurement data and a linear reduced model, to which a certain correction term is added. However, in the original approach, the reduced model is built a priori and independently of the reconstruction task (typically with a proper orthogonal decomposition or a greedy algorithm). In this paper, we investigate the construction of other reduced spaces which are built to be better adapted to the reconstruction task and which result in mappings that are sometimes nonlinear. We compare the performance of the different algorithms on numerical experiments involving synthetic Doppler measurements. The results illustrate the superiority of the proposed alternatives to the classical linear PBDW approach.
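    In its simplest algebraic form, the classical linear PBDW estimate that the paper takes as a baseline can be sketched as below, assuming an orthonormal basis W of the observation space and a reduced basis V built offline; this is a generic sketch of the linear variant, not the paper's adapted (sometimes nonlinear) mappings.

```python
import numpy as np

def pbdw_reconstruct(V, W, y):
    """Linear PBDW state estimate from measurements y.

    V : (N, n) reduced-basis matrix (columns span the background model V_n)
    W : (N, m) orthonormal basis of the observation space W_m, with m >= n
    y : (m,)   measurements, y_i = <w_i, u_true>

    Returns u_hat = V c + W eta: a least-squares fit of the measurements by
    the reduced model, plus a correction term in the observation space.
    """
    G = W.T @ V                                # cross-Gramian between spaces
    c, *_ = np.linalg.lstsq(G, y, rcond=None)  # best background coefficients
    eta = y - G @ c                            # residual -> correction term
    return V @ c + W @ eta
```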