37 research outputs found

    Improved bounds for Hadwiger's covering problem via thin-shell estimates

    A central problem in discrete geometry, known as Hadwiger's covering problem, asks for the smallest natural number $N(n)$ such that every convex body in $\mathbb{R}^n$ can be covered by a union of the interiors of at most $N(n)$ of its translates. Despite continuous efforts, the best general upper bound known for this number remains as it was more than sixty years ago, of the order of $\binom{2n}{n} n \ln n$. In this note, we improve this bound by a sub-exponential factor. That is, we prove a bound of the order of $\binom{2n}{n} e^{-c\sqrt{n}}$ for some universal constant $c > 0$. Our approach combines ideas from previous work by Artstein-Avidan and the second named author with tools from Asymptotic Geometric Analysis. One of the key steps is proving a new lower bound for the maximum volume of the intersection of a convex body $K$ with a translate of $-K$; in fact, we get the same lower bound for the volume of the intersection of $K$ and $-K$ when they both have barycenter at the origin. To do so, we make use of measure concentration, and in particular of thin-shell estimates for isotropic log-concave measures. Using the same ideas, we establish an exponentially better bound for $N(n)$ when restricting our attention to convex bodies that are $\psi_2$. By a slightly different approach, an exponential improvement is also established for classes of convex bodies with positive modulus of convexity.
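
    For quick comparison, here are the two bounds from the abstract in display form. The asymptotics of the central binomial coefficient, $\binom{2n}{n} \sim 4^n/\sqrt{\pi n}$, is a standard fact added here only for scale; it is not stated in the abstract.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Previous best general bound (in order of magnitude), essentially
% unchanged for more than sixty years:
\[ N(n) \;\le\; \binom{2n}{n}\, n \ln n \]
% The bound proved in this note, for some universal constant c > 0:
\[ N(n) \;\le\; \binom{2n}{n}\, e^{-c\sqrt{n}} \]
% Since \binom{2n}{n} \sim 4^n / \sqrt{\pi n}, the improvement is by
% the sub-exponential factor e^{-c\sqrt{n}}.
\end{document}
```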

    Tyler's and Maronna's M-estimators: Non-Asymptotic Concentration Results

    Tyler's and Maronna's M-estimators, as well as their regularized variants, are popular robust methods to estimate the scatter or covariance matrix of a multivariate distribution. In this work, we study the non-asymptotic behavior of these estimators, for data sampled from a distribution that satisfies one of the following properties: 1) independent sub-Gaussian entries, up to a linear transformation; 2) log-concave distributions; 3) distributions satisfying a convex concentration property. Our main contribution is the derivation of tight non-asymptotic concentration bounds of these M-estimators around a suitably scaled version of the data sample covariance matrix. Prior to our work, non-asymptotic bounds were derived only for elliptical and Gaussian distributions. Our proof uses a variety of tools from non-asymptotic random matrix theory and high-dimensional geometry. Finally, we illustrate the utility of our results on two examples of practical interest: sparse covariance and sparse precision matrix estimation.
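
    The paper's contribution is the concentration analysis, not the algorithm, but it may help to recall what Tyler's M-estimator is. Below is a minimal NumPy sketch of the classical (unregularized) fixed-point iteration defining it; the function name, iteration count, and tolerance are illustrative choices, and the data are assumed centered with more samples than dimensions so the inverses exist.

```python
import numpy as np

def tyler_m_estimator(X, n_iter=200, tol=1e-9):
    """Classical fixed-point iteration for Tyler's M-estimator of scatter.

    X : (n, p) array of n centered samples in dimension p (n > p assumed).
    Returns a (p, p) scatter matrix normalized to have trace p.
    """
    n, p = X.shape
    sigma = np.eye(p)
    for _ in range(n_iter):
        inv_sigma = np.linalg.inv(sigma)
        # Weights x_i^T Sigma^{-1} x_i, one per sample.
        w = np.einsum("ij,jk,ik->i", X, inv_sigma, X)
        # Update: Sigma <- (p/n) * sum_i x_i x_i^T / w_i
        sigma_new = (p / n) * (X.T * (1.0 / w)) @ X
        sigma_new *= p / np.trace(sigma_new)  # fix the scale ambiguity
        if np.linalg.norm(sigma_new - sigma, ord="fro") < tol:
            return sigma_new
        sigma = sigma_new
    return sigma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Heavy-tailed, elliptical-flavored data: Gaussian scaled by random radii.
    Z = rng.standard_normal((2000, 4)) @ np.diag([2.0, 1.0, 1.0, 0.5])
    X = Z * rng.exponential(size=(2000, 1))
    print(np.round(tyler_m_estimator(X), 2))
```

    Because the weights divide out the norm of each sample, the estimator depends only on the directions $x_i/\|x_i\|$, which is the source of its robustness to heavy tails; the trace normalization resolves the resulting scale ambiguity.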

    A Unified Approach to Discrepancy Minimization

    We study a unified approach and algorithm for constructive discrepancy minimization based on a stochastic process. By varying the parameters of the process, one can recover various state-of-the-art results. We demonstrate the flexibility of the method by deriving a discrepancy bound for smoothed instances, which interpolates between known bounds for worst-case and random instances.
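
    The abstract does not spell out the stochastic process, so the following is only a toy NumPy sketch of the general template such methods share: each coordinate of the coloring performs a small random walk in $[-1,1]$ and freezes when it hits $\pm 1$. It omits the correlated or projected steps that the paper's analysis actually relies on, and every name and parameter below is illustrative.

```python
import numpy as np

def random_walk_coloring(A, step=0.02, n_steps=50_000, rng=None):
    """Toy random-walk coloring, a caricature of stochastic-process
    discrepancy minimization.

    A : (m, n) matrix whose rows are set-indicator (or weight) vectors.
    Returns x in {-1, +1}^n; the discrepancy of x is max_i |(A @ x)_i|.
    """
    rng = rng if rng is not None else np.random.default_rng()
    n = A.shape[1]
    x = np.zeros(n)
    alive = np.ones(n, dtype=bool)  # coordinates still free to move
    for _ in range(n_steps):
        if not alive.any():
            break
        g = np.zeros(n)
        g[alive] = step * rng.standard_normal(alive.sum())
        x = np.clip(x + g, -1.0, 1.0)
        alive &= np.abs(x) < 1.0  # freeze coordinates that reach +/-1
    return np.where(x >= 0, 1.0, -1.0)  # round any leftover coordinates

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.integers(0, 2, size=(50, 200)).astype(float)  # random set system
    x = random_walk_coloring(A, rng=rng)
    print("discrepancy:", int(np.max(np.abs(A @ x))))
```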