Projected gradient descent for non-convex sparse spike estimation
We propose a new algorithm for sparse spike estimation from Fourier
measurements. Building on theoretical results on non-convex optimization
techniques for off-the-grid sparse spike estimation, we present a projected
gradient descent algorithm coupled with a spectral initialization procedure.
Our algorithm makes it possible to estimate the positions of large numbers of
Diracs in 2D from random Fourier measurements. We present, along with the
algorithm, theoretical qualitative insights explaining its success. This
opens a new direction for practical off-the-grid spike estimation with
theoretical guarantees in imaging applications.
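The projected-gradient scheme described above can be sketched in a simplified 1-D setting. Everything below (the grid-based stand-in for the spectral initialization, the step sizes, the clipping projection) is an illustrative assumption, not the paper's exact method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth 1-D spike train: positions in [0, 1], real amplitudes.
t_true = np.array([0.2, 0.5, 0.8])
a_true = np.array([1.0, 0.8, 1.2])
K = len(t_true)

# Random Fourier measurements y_m = sum_k a_k exp(-2i*pi*f_m*t_k).
M = 64
f = rng.uniform(-10, 10, size=M)

def forward_matrix(t):
    return np.exp(-2j * np.pi * np.outer(f, t))   # shape (M, len(t))

y = forward_matrix(t_true) @ a_true

def loss(t, a):
    r = forward_matrix(t) @ a - y
    return 0.5 * np.mean(np.abs(r) ** 2)

# Initialization (a simple stand-in for the spectral procedure):
# correlate y with atoms on a coarse grid and greedily keep the K
# best-separated peaks.
grid = np.linspace(0, 1, 256, endpoint=False)
corr = np.abs(forward_matrix(grid).conj().T @ y)
t0 = []
for g in grid[np.argsort(corr)[::-1]]:
    if all(abs(g - s) > 0.05 for s in t0):
        t0.append(g)
    if len(t0) == K:
        break
t = np.array(sorted(t0))
a = np.zeros(K)

# Projected gradient descent jointly on positions and amplitudes;
# the projection clips positions back into the domain [0, 1].
lr_t, lr_a = 2e-5, 5e-2
losses = [loss(t, a)]
for _ in range(500):
    A = forward_matrix(t)
    r = A @ a - y
    grad_a = np.real(A.conj().T @ r) / M
    grad_t = np.real(np.sum(np.conj(r)[:, None] * (-2j * np.pi * f[:, None])
                            * A * a[None, :], axis=0)) / M
    t = np.clip(t - lr_t * grad_t, 0.0, 1.0)
    a = a - lr_a * grad_a
    losses.append(loss(t, a))
```

The gradients follow by differentiating the half-squared residual through the atoms exp(-2i*pi*f_m*t_k); the same structure extends to 2-D positions with a vector-valued gradient per Dirac.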
The basins of attraction of the global minimizers of non-convex inverse problems with low-dimensional models in infinite dimension
Non-convex methods for linear inverse problems with low-dimensional models
have emerged as an alternative to convex techniques. We propose a theoretical
framework in which both finite-dimensional and infinite-dimensional linear
inverse problems can be studied. We show how the size of the basins of
attraction of the minimizers of such problems is linked to the number of
available measurements. This framework recovers known results about low-rank
matrix estimation and off-the-grid sparse spike estimation, and it provides
new results for Gaussian mixture estimation from linear measurements.
Keywords: low-dimensional models, non-convex methods, low-rank matrix
recovery, off-the-grid sparse recovery, Gaussian mixture model estimation
from linear measurements
PROJECTED GRADIENT DESCENT FOR NON-CONVEX SPARSE SPIKE ESTIMATION
We propose an algorithm to perform sparse spike estimation from Fourier measurements. Based on theoretical results on non-convex optimization techniques for off-the-grid sparse spike estimation, we present a simple projected descent algorithm coupled with an initialization procedure. Our algorithm makes it possible to estimate the positions of large numbers of Diracs in 2D from random Fourier measurements. This opens the way for practical estimation of such signals in imaging applications, as the algorithm scales well with the dimensions of the problem. We present, along with the algorithm, theoretical qualitative insights explaining its success.
Efficient and dimension independent methods for neural network surrogate construction and training
In this dissertation I investigate how to efficiently construct neural network surrogates for parametric maps defined by PDEs, and how to use second-order information to improve solutions of the related neural network training problem. Many-query problems arising in scientific applications (such as optimization, uncertainty quantification, and inference) require repeated evaluation of an input-output map parametrized by a high-dimensional nonlinear PDE model. The cost of these evaluations makes direct solution with the model prohibitive, and efficient, accurate surrogates are the key to solving such problems in practice. In this work I investigate neural network surrogates that use model information to detect informed subspaces of the input and output in which the parametric map can be represented efficiently. These compact representations require relatively little data to train and outperform conventional data-driven approaches, which require large training sets. Once a neural network is designed, training is a major issue: one seeks weights for the network that generalize to data not seen during training. I investigate how second-order information can be efficiently exploited to design optimizers with fast convergence and good generalization properties; these optimizers are shown to outperform conventional methods in numerical experiments.
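The informed-subspace idea can be illustrated with a toy active-subspace computation: sample gradients of the parametric map, form their covariance, and read the subspace dimension off the eigenvalue decay. The map f, its dimensions, and the sample counts below are illustrative assumptions, not the dissertation's construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy parametric map with a hidden 2-D active subspace inside a 20-D
# input: f(x) = sin(w1.x) + (w2.x)^2, so every gradient of f lies in
# span{w1, w2}.
d = 20
w1 = rng.standard_normal(d)
w2 = rng.standard_normal(d)

def f(x):
    return np.sin(x @ w1) + (x @ w2) ** 2

def grad_f(x):
    return np.cos(x @ w1) * w1 + 2.0 * (x @ w2) * w2

# Monte Carlo estimate of the gradient covariance C = E[grad_f grad_f^T];
# its dominant eigenvectors span the informed input subspace on which a
# surrogate can be built with far fewer inputs.
X = rng.standard_normal((500, d))
G = np.stack([grad_f(x) for x in X])
C = G.T @ G / len(X)

eigvals, eigvecs = np.linalg.eigh(C)          # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# A sharp drop after the second eigenvalue reveals that the map is
# effectively 2-dimensional; a surrogate would take U.T @ x as input,
# with U the top-2 eigenvectors.
U = eigvecs[:, :2]
```

A surrogate trained on the 2-D projected inputs `U.T @ x` then needs far less data than one trained on the raw 20-D inputs, which is the point of the informed-subspace construction.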
Information Geometry
This Special Issue of the journal Entropy, titled “Information Geometry I”, contains a collection of 17 papers concerning the foundations and applications of information geometry. Based on a geometrical interpretation of probability, information geometry has become a rich mathematical field employing the methods of differential geometry. It has numerous applications to data science, physics, and neuroscience. Presenting original research, yet written in an accessible, tutorial style, this collection of papers will be useful for scientists who are new to the field, while providing an excellent reference for the more experienced researcher. Several papers are written by authorities in the field, and topics cover the foundations of information geometry, as well as applications to statistics, Bayesian inference, machine learning, complex systems, physics, and neuroscience.