
    Accuracy guarantees for L1-recovery

    We discuss two new methods for the recovery of sparse signals from noisy observations based on ℓ1-minimization. They are closely related to well-known techniques such as the Lasso and the Dantzig Selector; however, these estimators come with efficiently verifiable guarantees of performance. By optimizing these bounds with respect to the method parameters, we are able to construct estimators with better statistical properties than the commonly used ones. We also show how these techniques yield efficiently computable accuracy bounds for the Lasso and the Dantzig Selector. We link our performance estimates to well-known results of Compressive Sensing, and justify the proposed approach with an oracle inequality relating the properties of the recovery algorithms to the best estimation performance achievable when the signal support is known. We demonstrate how the estimates can be computed using the Non-Euclidean Basis Pursuit algorithm.
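
    For concreteness, here is a minimal sketch of one standard way to compute such an ℓ1-recovery, the Lasso solved by iterative soft-thresholding (ISTA); the step size, the regularization level lam, and the random problem data are illustrative assumptions, not parameters from the paper.

        import numpy as np

        def ista_lasso(A, y, lam, n_iter=500):
            """Minimize 0.5*||A x - y||_2^2 + lam*||x||_1 by iterative soft-thresholding."""
            L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                g = A.T @ (A @ x - y)                     # gradient of the smooth part
                z = x - g / L                             # plain gradient step
                x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
            return x

        # Toy sparse-recovery instance: n = 40 noisy observations of an M = 100-dim
        # signal with 5 non-zero entries (all sizes chosen arbitrarily for the demo).
        rng = np.random.default_rng(0)
        A = rng.standard_normal((40, 100)) / np.sqrt(40)
        x_true = np.zeros(100); x_true[:5] = 1.0
        y = A @ x_true + 0.01 * rng.standard_normal(40)
        print("recovery error:", np.linalg.norm(ista_lasso(A, y, lam=0.02) - x_true))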

    Large Deviations of Vector-valued Martingales in 2-Smooth Normed Spaces

    We derive exponential bounds on probabilities of large deviations for "light-tail" martingales taking values in finite-dimensional normed spaces. Our primary emphasis is on the case where the bounds are dimension-independent, or nearly so. We demonstrate that this is the case when the norm on the space can be approximated, within an absolute constant factor, by a norm which is differentiable on the unit sphere with a Lipschitz continuous gradient. We also present various examples of spaces possessing the latter property.
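
    A quick Monte Carlo sketch of the phenomenon, in an assumed toy setting rather than one from the paper: for a martingale with unit-norm increments in (R^d, ||.||_2), a 2-smooth norm, the probability of a large deviation of ||S_N||_2/sqrt(N) stays essentially flat as d grows.

        import numpy as np

        # Martingale S_N = xi_1 + ... + xi_N whose i.i.d. increments are uniform on
        # the unit sphere of R^d (mean zero, ||xi||_2 = 1).  The tail of
        # ||S_N||_2 / sqrt(N) behaves essentially the same way in every dimension d.
        rng = np.random.default_rng(1)
        N, trials = 300, 300
        for d in (2, 10, 100):
            xi = rng.standard_normal((trials, N, d))
            xi /= np.linalg.norm(xi, axis=2, keepdims=True)    # unit-norm increments
            S = xi.sum(axis=1)                                 # martingale at time N
            dev = np.linalg.norm(S, axis=1) / np.sqrt(N)
            print(f"d={d:4d}  P(||S_N||/sqrt(N) > 2) ~ {np.mean(dev > 2):.3f}")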

    Deterministic and stochastic first order algorithms of large-scale convex optimization

    Syllabus:
    - First-order methods of proximal type.
    - Nonsmooth black-box setting: the deterministic and stochastic Mirror Descent (MD) algorithm (a minimal sketch follows below); convex-concave saddle point problems via MD.
    - Utilizing the problem's structure: the Mirror Prox (MP) algorithm; smooth/bilinear saddle point reformulations of convex problems, calculus and examples; favorable-geometry domains and good proximal setups.
    - Conditional Gradient type first-order methods for problems with difficult geometry: convex problems with difficult geometry; smooth minimization; norm-regularized smooth minimization; nonsmooth minimization.
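
    The sketch below shows (deterministic) Mirror Descent with the entropy proximal setup on the probability simplex, where the prox-step becomes a multiplicative update; the objective, step-size rule, and problem sizes are illustrative assumptions.

        import numpy as np

        def mirror_descent_simplex(grad, d, n_iter=500):
            """Mirror Descent with entropy prox on the simplex {x >= 0, sum(x) = 1}.
            With the entropy distance-generating function, the prox-step is the
            exponentiated-gradient (multiplicative) update."""
            x = np.full(d, 1.0 / d)                 # prox-center: uniform distribution
            avg = np.zeros(d)
            for t in range(1, n_iter + 1):
                g = grad(x)
                linf = max(np.linalg.norm(g, np.inf), 1e-12)
                step = np.sqrt(2.0 * np.log(d)) / (linf * np.sqrt(t))
                x = x * np.exp(-step * g)           # multiplicative update ...
                x /= x.sum()                        # ... then renormalize to the simplex
                avg += x
            return avg / n_iter                     # averaged iterate

        # Toy nonsmooth problem: minimize f(x) = ||x - p||_1 over the simplex,
        # using the subgradient sign(x - p); the minimizer is x = p.
        p = np.array([0.7, 0.2, 0.1, 0.0])
        print(mirror_descent_simplex(lambda x: np.sign(x - p), d=4))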

    First Order Methods for Nonsmooth Convex Large-Scale Optimization, II: Utilizing Problem's Structure

    We present several state-of-the-art First Order methods for "well-structured" large-scale nonsmooth convex programs. In contrast to their "black-box-oriented" prototypes considered in Chapter 1, the methods in question utilize the problem structure to convert the original nonsmooth minimization problem into a saddle point problem with a smooth convex-concave cost function. This reformulation allows us to accelerate the solution process significantly. As in Chapter 1, our emphasis is on methods which, under favorable circumstances, exhibit a (nearly) dimension-independent convergence rate. Along with investigating the general "well-structured" situation, we outline possibilities to further accelerate First Order methods by randomization.
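
    As an illustration of the saddle-point route, here is a minimal extragradient sketch (Mirror Prox with the Euclidean proximal setup) for a bilinear saddle point min_x max_y <y, Ax> over unit Euclidean balls; the problem data, step size, and duality-gap check are illustrative assumptions.

        import numpy as np

        def proj_ball(v, r=1.0):
            """Euclidean projection onto the ball of radius r."""
            n = np.linalg.norm(v)
            return v if n <= r else v * (r / n)

        def mirror_prox_bilinear(A, n_iter=2000):
            """Extragradient method for min_x max_y <y, A x> over unit balls."""
            m, n = A.shape
            x, y = np.zeros(n), np.zeros(m)
            xa, ya = np.zeros(n), np.zeros(m)
            gamma = 1.0 / np.linalg.norm(A, 2)          # step <= 1/Lipschitz constant
            for _ in range(n_iter):
                xh = proj_ball(x - gamma * (A.T @ y))   # extrapolation step
                yh = proj_ball(y + gamma * (A @ x))
                x = proj_ball(x - gamma * (A.T @ yh))   # update at extrapolated point
                y = proj_ball(y + gamma * (A @ xh))
                xa += x; ya += y
            return xa / n_iter, ya / n_iter             # averaged iterates

        rng = np.random.default_rng(2)
        A = rng.standard_normal((5, 7))
        x_bar, y_bar = mirror_prox_bilinear(A)
        # Duality gap of the averages: max_y <y, A x_bar> - min_x <y_bar, A x>
        print("duality gap:", np.linalg.norm(A @ x_bar) + np.linalg.norm(A.T @ y_bar))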

    Discovery of harmonic oscillations in white noise


    Nonparametric denoising signals of unknown local structure, II: Nonparametric function recovery (Applied and Computational Harmonic Analysis)

    Convex optimization for nonparametric estimation and tests

    We study the following problem: given 1) a compact convex set X, an affine mapping x -> A(x), and a parametric family of probability densities (p_z(.)); 2) N i.i.d. observations of a random variable W distributed with density p_y(.), where y = A(x) for some (unknown) x in X, estimate the value g(x) of a linear form g at x. For several important families of densities p_y, and with no further assumptions on X and A, we develop efficient algorithms computing estimators which are minimax optimal up to an absolute constant factor. We show how algorithms of this type can be used in other estimation and testing problems, notably in change detection in inverse problems and in estimation of a signal in Euclidean norm.
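
    For the Gaussian observation scheme, one classical route (used here purely as an illustration; the sketch below is not claimed to be the paper's construction) is to search over estimators affine in the observation: the worst-case risk over a polytope X is attained at a vertex and is convex in the estimator's coefficients, so the design reduces to a small convex program. A minimal cvxpy sketch, with all data invented for the demo:

        import cvxpy as cp
        import numpy as np

        # Estimate g^T x from w = A x + sigma*xi, xi ~ N(0, I_n), knowing only
        # that x lies in the polytope X = conv{v_1, ..., v_k} (rows of V).
        rng = np.random.default_rng(3)
        n, m, k, sigma = 4, 3, 5, 0.1
        A = rng.standard_normal((n, m))
        g = rng.standard_normal(m)
        V = rng.standard_normal((k, m))

        phi, c = cp.Variable(n), cp.Variable()     # affine estimate: phi^T w + c
        # Risk at x = squared bias + variance; the bias is affine in x, so its
        # square is convex and the worst case over X is attained at a vertex.
        risks = [cp.square(phi @ (A @ v) + c - g @ v) + sigma**2 * cp.sum_squares(phi)
                 for v in V]
        prob = cp.Problem(cp.Minimize(cp.maximum(*risks)))
        prob.solve()
        print("worst-case risk of the best affine estimator:", prob.value)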

    Nonparametric estimation by convex optimization


    On statistical applications of l1-recovery

    We consider the problem of recovery of a sparse signal x ∈ R^M from the noisy observation y = Ax + σξ, where A ∈ R^{n×M}, ξ ∼ N(0, I_n), and n may be (much) smaller than M. We propose new methods for the recovery of x based on ℓ1-minimization. Though they are intimately related to well-known techniques such as the Lasso and the Dantzig Selector, they often possess better statistical properties. These new procedures are based on verifiable sufficient conditions of exact recovery in Compressive Sensing and come with efficiently verifiable guarantees of performance. We also discuss fast implementation of the estimation routines, based on the Non-Euclidean Basis Pursuit algorithm.
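
    For concreteness, the Dantzig Selector mentioned above is the linear program min ||x||_1 s.t. ||A^T (y - A x)||_inf <= lam, and can be solved with an off-the-shelf LP solver after the standard split x = u - v with u, v >= 0. A minimal sketch via scipy.optimize.linprog (the data and the level lam are illustrative assumptions):

        import numpy as np
        from scipy.optimize import linprog

        def dantzig_selector(A, y, lam):
            """min ||x||_1  s.t.  ||A^T (y - A x)||_inf <= lam, via x = u - v."""
            M = A.shape[1]
            G, b = A.T @ A, A.T @ y
            c = np.ones(2 * M)                       # objective: sum(u) + sum(v)
            A_ub = np.block([[G, -G], [-G, G]])      # +/- A^T A (u - v) <= b +/- lam
            b_ub = np.concatenate([b + lam, lam - b])
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
            return res.x[:M] - res.x[M:]

        rng = np.random.default_rng(4)
        A = rng.standard_normal((30, 80)) / np.sqrt(30)
        x_true = np.zeros(80); x_true[:4] = 1.0
        y = A @ x_true + 0.01 * rng.standard_normal(30)
        print("recovery error:", np.linalg.norm(dantzig_selector(A, y, lam=0.05) - x_true))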