
    Moment inequalities for functions of independent random variables

    A general method for obtaining moment inequalities for functions of independent random variables is presented. It is a generalization of the entropy method, which has been used to derive concentration inequalities for such functions [Boucheron, Lugosi and Massart, Ann. Probab. 31 (2003) 1583-1614], and is based on a generalized tensorization inequality due to Latała and Oleszkiewicz [Lecture Notes in Math. 1745 (2000) 147-168]. The new inequalities prove to be a versatile tool in a wide range of applications. We illustrate the power of the method by showing how it can be used to effortlessly re-derive classical inequalities, including Rosenthal and Kahane-Khinchine-type inequalities for sums of independent random variables, moment inequalities for suprema of empirical processes, and moment inequalities for Rademacher chaos and U-statistics. Some of these corollaries are apparently new. In particular, we generalize Talagrand's exponential inequality for Rademacher chaos of order 2 to any order. We also discuss applications to other complex functions of independent random variables, such as suprema of Boolean polynomials, which include, as special cases, subgraph counting problems in random graphs. (Published at http://dx.doi.org/10.1214/009117904000000856 in the Annals of Probability, http://www.imstat.org/aop/, by the Institute of Mathematical Statistics, http://www.imstat.org.)
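    As a reference point, the moment inequalities in this line of work are typically phrased through an Efron-Stein-type variance proxy. The following is a schematic statement of one representative form (the exact constants and conditions are in the paper, not reproduced here): with $Z = f(X_1,\dots,X_n)$, $X_i'$ an independent copy of $X_i$, and $Z_i' = f(X_1,\dots,X_i',\dots,X_n)$,

```latex
% Schematic form of a moment inequality via the Efron--Stein proxy V^+;
% \kappa denotes an absolute constant (value as in the paper).
\[
  V^{+} = \mathbb{E}\Big[\sum_{i=1}^{n} (Z - Z_i')_{+}^{2} \,\Big|\, X_1,\dots,X_n\Big],
  \qquad
  \big\| (Z - \mathbb{E} Z)_{+} \big\|_{q}
    \le \sqrt{\kappa\, q\, \big\| V^{+} \big\|_{q/2}}
  \quad \text{for } q \ge 2 .
\]
```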

    Tail index estimation, concentration and adaptivity

    This paper presents an adaptive version of the Hill estimator based on Lepski's model selection method. This simple data-driven index selection method is shown to satisfy an oracle inequality and to achieve the lower bound recently derived by Carpentier and Kim. In order to establish the oracle inequality, we derive non-asymptotic variance bounds and concentration inequalities for Hill estimators. These concentration inequalities follow from Talagrand's concentration inequality for smooth functions of independent exponentially distributed random variables, combined with three tools of Extreme Value Theory: the quantile transform, Karamata's representation of slowly varying functions, and Rényi's characterisation of the order statistics of exponential samples. The performance of this computationally and conceptually simple method is illustrated using Monte-Carlo simulations.
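    To make the mechanics concrete, here is a minimal Python sketch of the Hill estimator together with a Lepski-type selection rule: keep the largest number of order statistics whose estimate stays within a concentration-width band around all estimates built from fewer order statistics. The constant `c`, the band width, and the grid of candidate `k` values are illustrative assumptions, not the paper's tuned choices.

```python
import numpy as np

def hill(x_sorted_desc, k):
    """Hill estimator of the tail index using the k largest observations."""
    return np.mean(np.log(x_sorted_desc[:k]) - np.log(x_sorted_desc[k]))

def adaptive_hill(x, ks=None, c=1.5):
    """Lepski-type selection (illustrative): accept the largest k whose
    estimate agrees with every smaller-k estimate up to a band of width
    c * sqrt(log log n / k'). Constant and grid are assumptions."""
    x = np.sort(x)[::-1]
    n = len(x)
    if ks is None:
        ks = np.unique(np.geomspace(10, n // 2, 30).astype(int))
    est = {k: hill(x, k) for k in ks}
    best = ks[0]
    for k in ks:
        if all(abs(est[k] - est[kp]) <= c * np.sqrt(np.log(np.log(n)) / kp)
               for kp in ks if kp <= k):
            best = k
    return est[best], best

# Monte-Carlo check on a Pareto sample with alpha = 2, i.e. gamma = 1/2.
rng = np.random.default_rng(0)
sample = rng.pareto(2.0, size=20_000) + 1.0
gamma_hat, k_star = adaptive_hill(sample)
print(f"gamma_hat = {gamma_hat:.3f} using k = {k_star}")
```

    On a pure Pareto sample the Hill estimator is unbiased at every k, so the rule should select a large k; bias from slowly varying perturbations is what makes the data-driven choice non-trivial.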

    Modified log-Sobolev inequalities for convex functions on the real line. Sufficient conditions

    We provide a mild sufficient condition for a probability measure on the real line to satisfy a modified log-Sobolev inequality for convex functions, interpolating between the classical log-Sobolev inequality and a Bobkov-Ledoux type inequality. As a consequence we obtain dimension-free two-level concentration results for convex functions of independent random variables with sufficiently regular tail decay. We also provide a link between modified log-Sobolev inequalities for convex functions and weak transport-entropy inequalities, complementing recent work by Gozlan, Roberto, Samson, and Tetali. (25 pages; accepted for publication in Studia Mathematica.)
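    For orientation, the two endpoints of the interpolation can be written schematically as follows (notation assumed here, not taken from the paper): the classical log-Sobolev inequality for a measure $\mu$ on the line, versus a Bobkov-Ledoux type modification where the entropy of $e^f$ is controlled and the test functions $f$ are restricted to be convex.

```latex
% Classical log-Sobolev inequality (all smooth g), with
% Ent_mu(h) = int h log h dmu - (int h dmu) log(int h dmu):
\[
  \operatorname{Ent}_{\mu}\!\big(g^{2}\big) \le C \int |g'|^{2} \, d\mu ,
\]
% versus a modified, Bobkov--Ledoux type inequality restricted to convex f:
\[
  \operatorname{Ent}_{\mu}\!\big(e^{f}\big) \le C \int |f'|^{2} e^{f} \, d\mu .
\]
```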

    Concentration inequalities under sub-Gaussian and sub-exponential conditions

    We prove analogues of the popular bounded difference inequality (also called McDiarmid’s inequality) for functions of independent random variables under sub-Gaussian and sub-exponential conditions. Applied to vector-valued concentration and the method of Rademacher complexities, these inequalities allow an easy extension of uniform convergence results for PCA and linear regression to the case of potentially unbounded input and output variables.
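    The classical bounded difference inequality that this paper extends is easy to check numerically. The sketch below applies it to the sample mean of n variables in [0, 1], where changing one coordinate moves the mean by at most 1/n, so the bound is $2\exp(-2t^2 n)$; the setup is a textbook illustration, not the paper's sub-Gaussian analogue.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials, t = 200, 20_000, 0.05

# f(X_1,...,X_n) = sample mean of [0,1]-valued variables: each c_i = 1/n,
# so sum c_i^2 = 1/n and McDiarmid gives P(|f - Ef| >= t) <= 2 exp(-2 t^2 n).
means = rng.uniform(0, 1, size=(trials, n)).mean(axis=1)
empirical = np.mean(np.abs(means - 0.5) >= t)
mcdiarmid = 2 * np.exp(-2 * t**2 * n)

print(f"empirical tail P(|mean - 1/2| >= {t}) ~ {empirical:.4f}")
print(f"McDiarmid bound                        {mcdiarmid:.4f}")
```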

    Concentration inequalities for non-Lipschitz functions with bounded derivatives of higher order

    Building on the inequalities for homogeneous tetrahedral polynomials in independent Gaussian variables due to R. Latała, we provide a concentration inequality for functions $f\colon \mathbb{R}^n \to \mathbb{R}$, not necessarily Lipschitz, with bounded derivatives of higher orders, which holds when the underlying measure satisfies a family of Sobolev-type inequalities $\|g - \mathbb{E} g\|_p \le C(p)\|\nabla g\|_p$. Such Sobolev-type inequalities hold, e.g., if the underlying measure satisfies the log-Sobolev inequality (in which case $C(p) \le C\sqrt{p}$) or the Poincaré inequality (then $C(p) \le Cp$). Our concentration estimates are expressed in terms of tensor-product norms of the derivatives of $f$. When the underlying measure is Gaussian and $f$ is a polynomial (not necessarily tetrahedral or homogeneous), our estimates can be reversed (up to a constant depending only on the degree of the polynomial). We also show that for polynomial functions, analogous estimates hold for arbitrary random vectors with independent sub-Gaussian coordinates. We apply our inequalities to general additive functionals of random vectors (in particular linear eigenvalue statistics of random matrices) and to the problem of counting cycles of fixed length in Erdős-Rényi random graphs, obtaining new estimates that are optimal in a certain range of parameters.
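    The cycle-counting application is easy to set up empirically. The sketch below samples Erdős-Rényi graphs and counts triangles (cycles of length 3, a degree-3 polynomial of the independent edge indicators) via $\operatorname{tr}(A^3)/6$, comparing the sample mean to $\binom{n}{3}p^3$; the concentration estimates themselves are the paper's contribution and are not reproduced here.

```python
import numpy as np

def sample_gnp(n, p, rng):
    """Adjacency matrix of an Erdos-Renyi G(n, p) graph."""
    upper = np.triu(rng.random((n, n)) < p, k=1)
    return (upper + upper.T).astype(float)

def triangle_count(adj):
    """Number of triangles = trace(A^3) / 6 for a simple undirected graph."""
    return np.trace(adj @ adj @ adj) / 6

rng = np.random.default_rng(2)
n, p, trials = 100, 0.1, 500
counts = np.array([triangle_count(sample_gnp(n, p, rng)) for _ in range(trials)])

expected = (n * (n - 1) * (n - 2) / 6) * p**3   # E[# triangles] = C(n,3) p^3
print(f"mean count  {counts.mean():.1f}  (theory {expected:.1f})")
print(f"std / mean  {counts.std() / counts.mean():.3f}")
```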