
    Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)

    During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory, information theory, theoretical computer science, and learning theory. This monograph focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, this monograph also includes several new results derived by the authors. The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach are exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication. The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities. The basic ingredients of the entropy method are discussed first in the context of logarithmic Sobolev inequalities, which underlie the so-called functional approach to concentration of measure, and then from a complementary information-theoretic viewpoint based on transportation-cost inequalities and probability in metric spaces. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, we discuss several applications of the entropy method to problems in communications and coding, including strong converses, empirical distributions of good channel codes, and an information-theoretic converse for concentration of measure.
    Comment: Foundations and Trends in Communications and Information Theory, vol. 10, no. 1-2, pp. 1-248, 2013. Second edition was published in October 2014. ISBN of the printed book: 978-1-60198-906-
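    For orientation, the classical martingale inequality that anchors the first part is the Azuma-Hoeffding bound; a standard statement (the difference bounds d_k are our notation, not taken from the monograph) is
    \[
      \Pr\bigl(|X_n - X_0| \ge t\bigr) \;\le\; 2\exp\!\left(-\frac{t^2}{2\sum_{k=1}^{n} d_k^2}\right),
      \qquad |X_k - X_{k-1}| \le d_k \ \text{a.s.},
    \]
    valid for any discrete-time martingale (X_k)_{k=0}^n and any t > 0. The refinements and the entropy-method tools surveyed in the monograph sharpen this basic exponential bound or extend it beyond the martingale setting.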

    User-friendly tail bounds for sums of random matrices

    This paper presents new probability inequalities for sums of independent, random, self-adjoint matrices. These results place simple and easily verifiable hypotheses on the summands, and they deliver strong conclusions about the large-deviation behavior of the maximum eigenvalue of the sum. Tail bounds for the norm of a sum of random rectangular matrices follow as an immediate corollary. The proof techniques also yield some information about matrix-valued martingales. In other words, this paper provides noncommutative generalizations of the classical bounds associated with the names Azuma, Bennett, Bernstein, Chernoff, Hoeffding, and McDiarmid. The matrix inequalities promise the same diversity of application, ease of use, and strength of conclusion that have made the scalar inequalities so valuable.
    Comment: Current paper is the version of record. The material on Freedman's inequality has been moved to a separate note; other martingale bounds are described in Caltech ACM Report 2011-0
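    To illustrate the flavor of these results, a representative Bernstein-type bound on the maximum eigenvalue (the notation sigma^2 and R below is ours) is
    \[
      \Pr\Bigl\{\lambda_{\max}\Bigl(\sum_{k} X_k\Bigr) \ge t\Bigr\}
      \;\le\; d\,\exp\!\left(\frac{-t^2/2}{\sigma^2 + Rt/3}\right),
      \qquad \sigma^2 = \Bigl\|\sum_{k}\mathbb{E}\,X_k^2\Bigr\|,
    \]
    for independent, zero-mean, self-adjoint d-by-d random matrices X_k whose maximum eigenvalue is at most R almost surely. The dimensional factor d in front is the price of passing from scalar to matrix summands.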

    Moderate Deviations Analysis of Binary Hypothesis Testing

    This paper is focused on the moderate-deviations analysis of binary hypothesis testing. The analysis relies on a concentration inequality for discrete-parameter martingales with bounded jumps, which refines the Azuma-Hoeffding inequality. Relations of the analysis to the moderate deviations principle for i.i.d. random variables and to the relative entropy are considered.
    Comment: Presented at the 2012 IEEE International Symposium on Information Theory (ISIT 2012) at MIT, Boston, July 2012. It appears in the Proceedings of ISIT 2012 on pages 826-83
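    For context, a Bennett-type refinement of the Azuma-Hoeffding inequality of the kind invoked here (stated in our notation: jumps bounded by d, conditional variance at most gamma*d^2, and delta = alpha/d <= 1) reads
    \[
      \Pr\bigl(|X_n - X_0| \ge \alpha n\bigr)
      \;\le\; 2\exp\!\left(-n\, D\!\left(\frac{\delta+\gamma}{1+\gamma}\,\Big\|\,\frac{\gamma}{1+\gamma}\right)\right),
      \qquad D(p\|q) = p\ln\frac{p}{q} + (1-p)\ln\frac{1-p}{1-q},
    \]
    where D is the binary relative entropy. When the conditional variance is small (gamma < 1), the exponent improves on the plain Azuma-Hoeffding exponent n*delta^2/2, which is the kind of sharpening the moderate-deviations analysis relies on.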