10,817 research outputs found

    Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)

    During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory, information theory, theoretical computer science, and learning theory. This monograph focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, this monograph also includes various new recent results derived by the authors. The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach is exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication. The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities. The basic ingredients of the entropy method are discussed first in the context of logarithmic Sobolev inequalities, which underlie the so-called functional approach to concentration of measure, and then from a complementary information-theoretic viewpoint based on transportation-cost inequalities and probability in metric spaces. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, we discuss several applications of the entropy method to problems in communications and coding, including strong converses, empirical distributions of good channel codes, and an information-theoretic converse for concentration of measure.
    Comment: Foundations and Trends in Communications and Information Theory, vol. 10, no. 1-2, pp. 1-248, 2013. Second edition was published in October 2014. ISBN of printed book: 978-1-60198-906-
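    A standard example of the classical martingale inequalities treated in the first part (stated here only as orientation, not quoted from the monograph) is the Azuma--Hoeffding bound: if $X_0, X_1, \ldots, X_n$ is a martingale with bounded differences $|X_k - X_{k-1}| \le d_k$, then
    \[ \mathbb{P}\big(|X_n - X_0| \ge r\big) \le 2 \exp\!\Big(-\tfrac{r^2}{2 \sum_{k=1}^n d_k^2}\Big), \qquad r \ge 0. \]
    Applying this to the Doob martingale of a function of independent random variables yields a bounded-differences inequality, which is the workhorse behind the concentration arguments for codes on graphs and iterative decoding mentioned above.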

    Modified log-Sobolev inequalities and two-level concentration

    We consider a generic modified logarithmic Sobolev inequality (mLSI) of the form $\mathrm{Ent}_{\mu}(e^f) \le \tfrac{\rho}{2}\, \mathbb{E}_\mu\!\big[e^f\, \Gamma(f)^2\big]$ for some difference operator $\Gamma$, and show how it implies two-level concentration inequalities akin to the Hanson--Wright or Bernstein inequality. This can be applied in the continuous setting (e.g. the sphere or bounded perturbations of product measures) as well as the discrete setting (the symmetric group, finite measures satisfying an approximate tensorization property, ...). Moreover, we use modified logarithmic Sobolev inequalities on the symmetric group $S_n$ and for slices of the hypercube to prove Talagrand's convex distance inequality, and provide concentration inequalities for locally Lipschitz functions on $S_n$. Some examples of known statistics are worked out, for which we obtain the correct order of fluctuations, consistent with central limit theorems.
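    For orientation (this shape is standard and not specific to the paper's exact constants or norms): a "two-level" or Bernstein-type concentration inequality has the form
    \[ \mathbb{P}\big(|f - \mathbb{E}_\mu f| \ge t\big) \le 2 \exp\!\Big(-\tfrac{1}{C}\,\min\big(\tfrac{t^2}{a^2},\, \tfrac{t}{b}\big)\Big), \qquad t \ge 0, \]
    where the Gaussian term $t^2/a^2$ controls moderate deviations and the exponential term $t/b$ controls large ones; $a$ and $b$ are first- and second-order measures of the variation of $f$ (e.g. built from $\Gamma(f)$ and a second-order difference), and the Hanson--Wright inequality for quadratic forms is the prototypical instance of this profile.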

    Learning from networked examples

    Many machine learning algorithms are based on the assumption that training examples are drawn independently. However, this assumption no longer holds when learning from a networked sample, because two or more training examples may share some common objects and hence share the features of those objects. We show that the classic approach of ignoring this problem can potentially have a harmful effect on the accuracy of the resulting statistics, and then consider alternatives. One of these is to use only independent examples, discarding the other information, which is clearly suboptimal. We analyze sample error bounds in this networked setting, providing significantly improved results. An important component of our approach is a family of efficient sample weighting schemes, which lead to novel concentration inequalities.
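    As a concrete, purely illustrative sketch of sample weighting on a networked sample: the rule below gives each example a weight equal to one over the largest number of examples sharing any of its objects, so that the total weight of the examples touching any single object is at most 1. This particular rule and the helper names are assumptions made for illustration, not the scheme from the paper.

    from collections import defaultdict

    def fractional_weights(examples):
        # examples[i] is the set of object ids that example i is built from.
        # Illustrative rule (not the paper's scheme): weight each example by
        # 1 / (largest number of examples sharing any of its objects), so that
        # for every object the weights of the examples touching it sum to <= 1.
        degree = defaultdict(int)  # object id -> number of examples using it
        for objs in examples:
            for o in objs:
                degree[o] += 1
        return [1.0 / max(degree[o] for o in objs) for objs in examples]

    def weighted_mean(values, weights):
        # Weighted empirical mean; sum(weights) acts as an effective sample size.
        return sum(w * v for w, v in zip(weights, values)) / sum(weights)

    # Toy usage: the first two examples share object "b", so each gets weight 0.5.
    examples = [{"a", "b"}, {"b", "c"}, {"d"}]
    losses = [0.2, 0.4, 0.1]
    weights = fractional_weights(examples)   # [0.5, 0.5, 1.0]
    print(weights, weighted_mean(losses, weights))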

    Transport Inequalities. A Survey

    This is a survey of recent developments in the area of transport inequalities. We investigate their consequences in terms of concentration and deviation inequalities and sketch their links with other functional inequalities and also large deviation theory.
    Comment: Proceedings of the conference Inhomogeneous Random Systems 2009; 82 pages
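    A prototypical statement from this area (standard background, added only as orientation): a probability measure $\mu$ on a metric space $(X, d)$ satisfies the transport inequality $T_1(C)$ if
    \[ W_1(\nu, \mu) \le \sqrt{2\, C\, H(\nu \,|\, \mu)} \qquad \text{for every probability measure } \nu, \]
    where $W_1$ is the Kantorovich--Wasserstein distance and $H(\nu\,|\,\mu)$ the relative entropy. By the Bobkov--Götze dual formulation, $T_1(C)$ is equivalent to the Gaussian deviation bound $\mu\big(f \ge \int f\, d\mu + r\big) \le e^{-r^2/(2C)}$ for every 1-Lipschitz $f$ and $r \ge 0$, which is exactly the kind of concentration consequence such a survey develops.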