
    Discussion: The Dantzig selector: Statistical estimation when p is much larger than n

    Discussion of "The Dantzig selector: Statistical estimation when p is much larger than n" [math/0506081]. Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics; DOI: http://dx.doi.org/10.1214/009053607000000442.
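
For context, the program under discussion is the Dantzig selector of Candès and Tao: given noisy observations y = Xβ + z from an n × p design with p ≫ n, it estimates β by solving

```latex
\hat{\beta} = \arg\min_{\tilde{\beta} \in \mathbb{R}^p} \|\tilde{\beta}\|_1
\quad \text{subject to} \quad
\|X^\top (y - X\tilde{\beta})\|_\infty \le \lambda_p \, \sigma ,
```

where σ is the noise level and λ_p is a threshold of order √(2 log p).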

    Generalized Dantzig Selector: Application to the k-support norm

    We propose a Generalized Dantzig Selector (GDS) for linear models, in which any norm encoding the parameter structure can be leveraged for estimation. We investigate both computational and statistical aspects of the GDS. Based on the conjugate proximal operator, a flexible inexact ADMM framework is designed for solving the GDS, and non-asymptotic high-probability bounds are established on the estimation error; these rely on the Gaussian widths of the unit norm ball and of a suitable set encompassing the estimation error. Further, we consider a non-trivial example of the GDS using the k-support norm. We derive an efficient method to compute the proximal operator for the k-support norm, since existing methods are inapplicable in this setting. For the statistical analysis, we provide upper bounds for the Gaussian widths needed in the GDS analysis, yielding the first statistical recovery guarantee for estimation with the k-support norm. The experimental results confirm our theoretical analysis.
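
As a concrete instance, when the structure-encoding norm is ℓ1 (whose dual norm is ℓ∞), the GDS reduces to the classical Dantzig selector above. Below is a minimal sketch of that special case using a generic convex solver (cvxpy) rather than the paper's inexact ADMM; the data, dimensions, and λ are illustrative assumptions:

```python
# Sketch: the classical Dantzig selector as a special case of the GDS,
# with R = l1-norm (dual norm = l_infinity). Solved with a generic convex
# solver, NOT the paper's inexact ADMM; data and lambda are illustrative.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, p, s = 50, 200, 5                          # samples, features, sparsity
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = rng.standard_normal(s)
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam = 0.1 * np.sqrt(2 * np.log(p) * n)        # sigma * sqrt(2 log p) scaling (illustrative)
beta = cp.Variable(p)
# GDS constraint: the dual norm of the correlated residual is small.
problem = cp.Problem(cp.Minimize(cp.norm1(beta)),
                     [cp.norm(X.T @ (y - X @ beta), "inf") <= lam])
problem.solve()
print("estimated support:", np.flatnonzero(np.abs(beta.value) > 1e-4))
```

Swapping in another norm and its dual in the objective and constraint gives other members of the GDS family; the k-support norm case additionally needs the specialized proximal operator the paper derives.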

    Accuracy guarantees for L1-recovery

    We discuss two new methods for the recovery of sparse signals from noisy observations based on ℓ1-minimization. They are closely related to well-known techniques such as the Lasso and the Dantzig selector; however, these estimators come with efficiently verifiable guarantees of performance. By optimizing these bounds with respect to the method parameters, we are able to construct estimators with better statistical properties than the commonly used ones. We also show how these techniques provide efficiently computable accuracy bounds for the Lasso and the Dantzig selector. We link our performance estimates to well-known results in compressive sensing and justify the proposed approach with an oracle inequality relating the properties of the recovery algorithms to the best estimation performance achievable when the signal support is known. We also demonstrate how the estimates can be computed using the Non-Euclidean Basis Pursuit algorithm.
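
For comparison with the Dantzig selector program above, the Lasso referenced here keeps the ℓ1 term but moves the data fidelity into the objective, as a penalized least-squares problem:

```latex
\hat{\beta}_{\mathrm{Lasso}} = \arg\min_{\beta \in \mathbb{R}^p}
\tfrac{1}{2} \|y - X\beta\|_2^2 + \lambda \|\beta\|_1 .
```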

    Templates for Convex Cone Problems with Applications to Sparse Signal Recovery

    This paper develops a general framework for solving a variety of convex cone problems that frequently arise in signal processing, machine learning, statistics, and other fields. The approach works as follows: first, determine a conic formulation of the problem; second, determine its dual; third, apply smoothing; and fourth, solve using an optimal first-order method. A merit of this approach is its flexibility: for example, all compressed sensing problems can be solved via this approach, including models with objective functionals such as the total-variation norm, ||Wx||_1 where W is arbitrary, or a combination thereof. In addition, the paper introduces a number of technical contributions, such as a novel continuation scheme, a novel approach for controlling the step size, and new results showing that the smoothed and unsmoothed problems are sometimes formally equivalent. Combined with our framework, these lead to novel, stable, and computationally efficient algorithms. For instance, our general implementation is competitive with state-of-the-art methods for solving intensively studied problems such as the LASSO. Further, numerical experiments show that one can solve the Dantzig selector problem, for which no efficient large-scale solvers previously existed, in a few hundred iterations. Finally, the paper is accompanied by a software release: not a single, monolithic solver, but a suite of programs and routines designed to serve as building blocks for constructing complete algorithms. The TFOCS software is available at http://tfocs.stanford.edu.
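
The "optimal first-order method" in step four of this recipe is, in the simplest instances, an accelerated proximal-gradient scheme. Below is a minimal sketch of one such scheme (FISTA) applied to the LASSO, illustrating the kind of building block the framework assembles; it is an assumption-level illustration, not the TFOCS implementation:

```python
# Sketch: accelerated proximal gradient (FISTA) for the LASSO,
# min_b 0.5*||y - X b||_2^2 + lam*||b||_1. Illustrative only; TFOCS
# applies this kind of step to a smoothed dual of a conic formulation.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(X, y, lam, n_iter=500):
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    b = z = np.zeros(X.shape[1])
    t = 1.0
    for _ in range(n_iter):
        # Gradient step on the smooth part, then prox of the l1 penalty.
        b_next = soft_threshold(z - X.T @ (X @ z - y) / L, lam / L)
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = b_next + ((t - 1) / t_next) * (b_next - b)   # momentum step
        b, t = b_next, t_next
    return b

# Example use (synthetic data): b_hat = fista_lasso(X, y, lam=0.1)
```

In the framework described in the abstract, the analogous first-order iteration is run on a smoothed dual of the conic formulation, which is what makes problems like the Dantzig selector tractable at scale.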