
    Concentration inequalities for order statistics

    This note describes non-asymptotic variance and tail bounds for order statistics of samples of independent, identically distributed random variables. These bounds are shown to be asymptotically tight when the sampling distribution belongs to a maximum domain of attraction. If the sampling distribution has a non-decreasing hazard rate (this includes the Gaussian distribution), we derive an exponential Efron-Stein inequality for order statistics: an inequality connecting the logarithmic moment generating function of centered order statistics with exponential moments of Efron-Stein (jackknife) estimates of variance. We use this general connection to derive variance and tail bounds for order statistics of a Gaussian sample. These bounds are not within the scope of the Tsirelson-Ibragimov-Sudakov Gaussian concentration inequality. Proofs are elementary and combine Rényi's representation of order statistics with the so-called entropy approach to concentration inequalities popularized by M. Ledoux. Comment: 13 pages
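
    For context, here are the classical forms of the two tools named above, as they are usually stated (the paper's exact normalizations may differ). Rényi's representation: if $E_1, \dots, E_n$ are i.i.d. standard exponential random variables, the order statistics $Y_{(1)} \leq \cdots \leq Y_{(n)}$ of an exponential sample satisfy
    $$ \big(Y_{(k)}\big)_{1 \leq k \leq n} \stackrel{d}{=} \Big( \sum_{i=1}^{k} \frac{E_i}{n-i+1} \Big)_{1 \leq k \leq n}, $$
    and order statistics of a general distribution function $F$ are recovered through the quantile transform $X_{(k)} \stackrel{d}{=} F^{\leftarrow}\big(1 - e^{-Y_{(k)}}\big)$. The (non-exponential) Efron-Stein inequality bounds the variance of $Z = f(X_1, \dots, X_n)$ by jackknife estimates: $\operatorname{Var}(Z) \leq \tfrac{1}{2} \sum_{i=1}^{n} \mathbb{E}\big[(Z - Z^{(i)})^2\big]$, where $Z^{(i)}$ is obtained by replacing $X_i$ with an independent copy.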

    Concentration inequalities for random tensors

    We show how to extend several basic concentration inequalities for simple random tensors $X = x_1 \otimes \cdots \otimes x_d$, where all $x_k$ are independent random vectors in $\mathbb{R}^n$ with independent coefficients. The new results have optimal dependence on the dimension $n$ and the degree $d$. As an application, we show that random tensors are well conditioned: $(1 - o(1))\, n^d$ independent copies of the simple random tensor $X \in \mathbb{R}^{n^d}$ are far from being linearly dependent with high probability. We prove this fact for any degree $d = o(\sqrt{n / \log n})$ and conjecture that it is true for any $d = O(n)$. Comment: A few more typos were corrected
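
    In coordinates (standard notation, spelled out here only for orientation), the simple (rank-one) random tensor built from $x_1, \dots, x_d \in \mathbb{R}^n$ is the element of $\mathbb{R}^{n^d}$ with entries
    $$ X_{i_1, \dots, i_d} = (x_1)_{i_1} (x_2)_{i_2} \cdots (x_d)_{i_d}, \qquad 1 \leq i_1, \dots, i_d \leq n, $$
    so that, in particular, $\|X\|_2 = \prod_{k=1}^{d} \|x_k\|_2$.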

    Concentration inequalities for disordered models

    We use a generalization of Hoeffding's inequality to show concentration results for the free energy of disordered pinning models, assuming only that the disorder has a finite exponential moment. We also prove some concentration inequalities for directed polymers in a random environment, which we use to establish a large deviations result for the end position of the polymer under the polymer measure. Comment: Revised version
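
    For reference, the prototype of the Hoeffding-type bounds in question is the bounded differences (McDiarmid) inequality, stated here only as orientation (the paper's actual generalization requires just a finite exponential moment for the disorder, a weaker assumption): if changing the $i$-th coordinate of $f(\omega_1, \dots, \omega_N)$ changes its value by at most $c_i$, then
    $$ \mathbb{P}\big( |f - \mathbb{E} f| \geq t \big) \leq 2 \exp\Big( -\frac{2 t^2}{\sum_{i=1}^{N} c_i^2} \Big). $$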

    Second-Order Matrix Concentration Inequalities

    Matrix concentration inequalities give bounds for the spectral-norm deviation of a random matrix from its expected value. These results have a weak dimensional dependence that is sometimes, but not always, necessary. This paper identifies one of the sources of the dimensional term and exploits this insight to develop sharper matrix concentration inequalities. In particular, this analysis delivers two refinements of the matrix Khintchine inequality that use information beyond the matrix variance to reduce or eliminate the dimensional dependence. Comment: 27 pages. Revision corrects technical errors in several places
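
    For orientation, the matrix Khintchine inequality in its usual form (constants vary across references): for fixed Hermitian $d \times d$ matrices $A_1, \dots, A_n$ and independent Rademacher signs $\varepsilon_1, \dots, \varepsilon_n$,
    $$ \mathbb{E}\, \Big\| \sum_{i=1}^{n} \varepsilon_i A_i \Big\| \leq C \sqrt{1 + \log d}\; \Big\| \Big( \sum_{i=1}^{n} A_i^2 \Big)^{1/2} \Big\|, $$
    with $\|\cdot\|$ the spectral norm; the $\sqrt{\log d}$ factor is exactly the dimensional dependence that the refinements mentioned above aim to reduce or eliminate.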

    Concentration Inequalities from Likelihood Ratio Method

    We explore applications of our previously established likelihood-ratio method for deriving concentration inequalities for a wide variety of univariate and multivariate distributions. New concentration inequalities for various distributions are developed without resorting to the minimization of moment generating functions. Comment: 43 pages, no figures
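
    As a rough sketch of the underlying principle (the precise statements used in the paper may differ), the likelihood-ratio approach bounds a tail probability by lower-bounding a likelihood ratio on the tail event instead of optimizing a moment generating function: if $\Lambda = \mathrm{d}Q/\mathrm{d}P$ for some probability measure $Q$ and $\Lambda(x) \geq \lambda > 0$ for all $x$ in an event $A$, then
    $$ P(A) = \mathbb{E}_P[\mathbf{1}_A] \leq \frac{1}{\lambda}\, \mathbb{E}_P[\Lambda\, \mathbf{1}_A] = \frac{Q(A)}{\lambda} \leq \frac{1}{\lambda}. $$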

    Concentration Inequalities for Bounded Random Vectors

    We derive simple concentration inequalities for bounded random vectors, which generalize Hoeffding's inequalities for bounded scalar random variables. As applications, we specialize the general results to the multinomial and Dirichlet distributions to obtain multivariate concentration inequalities. Comment: 9 pages, no figures
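
    For reference, the scalar inequality being generalized is Hoeffding's: if $X_1, \dots, X_n$ are independent with $X_i \in [a_i, b_i]$ almost surely and $S_n = \sum_{i=1}^{n} X_i$, then
    $$ \mathbb{P}\big( |S_n - \mathbb{E} S_n| \geq t \big) \leq 2 \exp\Big( -\frac{2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \Big); $$
    the paper's vector-valued versions concern analogous bounds for bounded random vectors, with the multinomial and Dirichlet applications obtained as special cases.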