
    On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method

    This paper considers the entropy of the sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds are derived on the error that follows from approximating this entropy by the entropy of a Poisson random variable with the same mean. The derivation of these bounds combines elements of information theory with the Chen-Stein method for Poisson approximation. The resulting bounds are easy to compute, and their applicability is exemplified. This conference paper presents in part the first half of the paper entitled "An information-theoretic perspective of the Poisson approximation via the Chen-Stein method" (see arXiv:1206.6811). A generalization of the bounds that considers the accuracy of the Poisson approximation for the entropy of a sum of non-negative, integer-valued and bounded random variables is introduced in the full paper, which also derives lower bounds on the total variation distance, relative entropy and other measures that are not considered in this conference paper. Comment: A conference paper of 5 pages that appears in the Proceedings of the 2012 IEEE International Workshop on Information Theory (ITW 2012), pp. 542-546, Lausanne, Switzerland, September 2012.
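
    Below is a minimal numerical sketch of the approximation this abstract describes: it compares the entropy of Binomial(n, p), the distribution of a sum of n i.i.d. Bernoulli(p) variables, with the entropy of the mean-matched Poisson law. The code is not from the paper; n, p and the Poisson truncation point K are illustrative choices.

```python
# Entropy (in nats) of a Bernoulli sum vs. a mean-matched Poisson.
# Illustrative sketch only: n, p and the truncation K are arbitrary choices.
import math

def binomial_entropy(n: int, p: float) -> float:
    """Entropy of Binomial(n, p), from exact pmf values."""
    h = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
        if pk > 0:
            h -= pk * math.log(pk)
    return h

def poisson_entropy(lam: float, K: int = 1000) -> float:
    """Entropy of Poisson(lam), truncating the infinite sum at K terms."""
    h = 0.0
    for k in range(K):
        log_pk = -lam + k * math.log(lam) - math.lgamma(k + 1)
        pk = math.exp(log_pk)
        if pk > 0:
            h -= pk * log_pk
    return h

n, p = 100, 0.03          # small p: the regime where the approximation is good
lam = n * p
print(f"H(Binomial({n}, {p})) = {binomial_entropy(n, p):.6f} nats")
print(f"H(Poisson({lam:.2f}))   = {poisson_entropy(lam):.6f} nats")
```

    For small p the two entropies are close, and shrinking p with the mean held fixed drives the gap to zero; the paper's bounds control exactly this error.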

    Relaxation of monotone coupling conditions: Poisson approximation and beyond

    It is well known that assumptions of monotonicity in size-bias couplings may be used to prove simple, yet powerful, Poisson approximation results. Here we show how these assumptions may be relaxed, establishing explicit Poisson approximation bounds (depending on the first two moments only) for random variables which satisfy an approximate version of these monotonicity conditions. These are shown to be effective for models where an underlying random variable of interest is contaminated with noise. We also give explicit Poisson approximation bounds for sums of associated or negatively associated random variables. Applications are given to epidemic models, extremes, and random sampling. Finally, we show how similar techniques may be used to relax the assumptions needed in a Poincaré inequality and in a normal approximation result. Comment: 19 pages.
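
    As a point of reference for such moment-based bounds, here is a minimal sketch (my own; the paper's relaxed-coupling bounds are not reproduced) comparing the exact total variation distance between a sum W of independent Bernoulli(p_i) indicators and its Poisson approximation with the classical Stein-Chen bound d_TV(L(W), Po(lambda)) <= lambda^{-1}(1 - e^{-lambda}) * sum_i p_i^2, which likewise depends only on the first two moments. The p_i values are illustrative.

```python
# Exact d_TV between a Bernoulli sum and Poisson(lambda) vs. the classical
# Stein-Chen bound for independent indicators. Illustrative p_i values.
import math

def bernoulli_sum_pmf(ps):
    """Exact pmf of W = sum of independent Bernoulli(p_i), by convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)
            new[k + 1] += q * p
        pmf = new
    return pmf

def poisson_pmf(lam, K):
    """Poisson(lam) pmf on {0, ..., K-1}."""
    return [math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1)) for k in range(K)]

ps = [0.05, 0.02, 0.08, 0.01, 0.04] * 10      # 50 small, unequal probabilities
lam = sum(ps)
w_pmf = bernoulli_sum_pmf(ps)
po_pmf = poisson_pmf(lam, len(w_pmf))

tail = max(0.0, 1.0 - sum(po_pmf))            # Poisson mass beyond W's support
tv = 0.5 * (sum(abs(a - b) for a, b in zip(w_pmf, po_pmf)) + tail)
bound = (1 - math.exp(-lam)) / lam * sum(p * p for p in ps)
print(f"exact d_TV = {tv:.6f}, Stein-Chen bound = {bound:.6f}")
```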

    Entropy and the Law of Small Numbers

    Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques it is shown that, when $S_n=\sum_{i=1}^n X_i$ is the sum of the (possibly dependent) binary random variables $X_1,X_2,\ldots,X_n$, with $E(X_i)=p_i$ and $E(S_n)=\lambda$, then
    $$D(P_{S_n}\|\mathrm{Po}(\lambda)) \le \sum_{i=1}^n p_i^2 + \Big[\sum_{i=1}^n H(X_i) - H(X_1,X_2,\ldots,X_n)\Big],$$
    where $D(P_{S_n}\|\mathrm{Po}(\lambda))$ is the relative entropy between the distribution of $S_n$ and the Poisson($\lambda$) distribution. The first term in this bound measures the individual smallness of the $X_i$ and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution. Second, in the particular case when the $X_i$ are independent, the following sharper bound is established,
    $$D(P_{S_n}\|\mathrm{Po}(\lambda)) \le \frac{1}{\lambda} \sum_{i=1}^n \frac{p_i^3}{1-p_i},$$
    and it is also generalized to the case when the $X_i$ are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and uses a recent logarithmic Sobolev inequality for the Poisson distribution. Comment: 15 pages. To appear, IEEE Trans Inform Theory.
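
    In the simplest i.i.d. setting both bounds can be evaluated directly: $S_n \sim$ Binomial(n, p), $\lambda = np$, the dependence term $\sum_i H(X_i) - H(X_1,\ldots,X_n)$ vanishes, and $D(P_{S_n}\|\mathrm{Po}(\lambda))$ is computable exactly from the two pmfs. A minimal sketch with illustrative n and p (everything in nats):

```python
# D(Binomial(n,p) || Poisson(np)) against the abstract's two upper bounds,
# in the i.i.d. case where the dependence term is zero. Illustrative n, p.
import math

def kl_binomial_poisson(n, p):
    """Relative entropy D(Binomial(n,p) || Poisson(np)) in nats."""
    lam = n * p
    d = 0.0
    for k in range(n + 1):
        bk = math.comb(n, k) * p**k * (1 - p)**(n - k)
        if bk > 0:
            log_po = -lam + k * math.log(lam) - math.lgamma(k + 1)
            d += bk * (math.log(bk) - log_po)
    return d

n, p = 50, 0.04
lam = n * p
print(f"exact D       = {kl_binomial_poisson(n, p):.6f}")
print(f"first bound   = {n * p**2:.6f}")                        # sum of p_i^2
print(f"sharper bound = {(1 / lam) * n * p**3 / (1 - p):.6f}")  # independent case
```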

    Approximating stationary distributions of fast mixing Glauber dynamics, with applications to exponential random graphs

    We provide a general bound on the Wasserstein distance between two arbitrary distributions of sequences of Bernoulli random variables. The bound is in terms of a mixing quantity for the Glauber dynamics of one of the sequences, and a simple expectation of the other. The result is applied to estimate, with explicit error, expectations of functions of random vectors for some Ising models and exponential random graphs in "high temperature" regimes. Comment: Ver3: 24 pages, major revision with new results; Ver2: updated reference; Ver1: 19 pages, 1 figure.
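
    To make the key object concrete, here is a minimal sketch of heat-bath Glauber dynamics for an Ising model on a cycle, the kind of fast-mixing single-site chain whose contraction drives bounds of this type. The Wasserstein bound itself is not computed, and all parameters are illustrative choices.

```python
# Heat-bath Glauber dynamics for a +/-1 Ising model on a cycle of n sites.
# At each step one uniformly random site is resampled from its conditional
# law given its two neighbours. Small beta means "high temperature" and
# fast mixing. Illustrative parameters throughout.
import math
import random

def glauber_step(spins, beta):
    n = len(spins)
    i = random.randrange(n)
    field = spins[(i - 1) % n] + spins[(i + 1) % n]
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * field))   # P(spin_i = +1 | rest)
    spins[i] = 1 if random.random() < p_plus else -1

random.seed(0)
n, beta = 100, 0.2
spins = [random.choice([-1, 1]) for _ in range(n)]
for _ in range(20 * n):                                    # ~20 sweeps of burn-in
    glauber_step(spins, beta)
print(f"magnetisation after burn-in: {sum(spins) / n:+.3f}")
```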

    Compound Poisson approximation via information functionals

    An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are derived for the distance between the distribution of a sum of independent integer-valued random variables and an appropriately chosen compound Poisson law. In the case where all summands have the same conditional distribution given that they are non-zero, a bound on the relative entropy distance between their sum and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the summands have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals", and the analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.
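
    The setting of the same-conditional-law case can be simulated exactly in a toy example: i.i.d. summands that are zero with high probability and share a conditional law Q on {1, 2} given that they are non-zero, compared in total variation with the mean-matched compound Poisson law CP(np, Q). The sketch below (illustrative parameters, not from the paper) evaluates the compound Poisson pmf with the standard Panjer recursion:

```python
# d_TV between a sum of i.i.d. integer-valued summands and the matching
# compound Poisson law CP(lam, Q). Illustrative toy parameters.
import math

p, q = 0.05, [0.0, 0.7, 0.3]      # P(X != 0) = p; Q on {1, 2} given non-zero
n = 40
x_pmf = [1 - p, p * q[1], p * q[2]]

# Exact pmf of S_n by repeated convolution.
s_pmf = [1.0]
for _ in range(n):
    new = [0.0] * (len(s_pmf) + len(x_pmf) - 1)
    for i, a in enumerate(s_pmf):
        for j, b in enumerate(x_pmf):
            new[i + j] += a * b
    s_pmf = new

# Compound Poisson pmf via the Panjer recursion for a Poisson(lam) frequency:
#   g_0 = exp(-lam),   g_k = (lam / k) * sum_{j>=1} j * q_j * g_{k-j}.
lam = n * p
g = [math.exp(-lam)]
for k in range(1, len(s_pmf)):
    g.append((lam / k) * sum(j * q[j] * g[k - j] for j in range(1, min(k, 2) + 1)))

tail = max(0.0, 1.0 - sum(g))     # compound Poisson mass beyond S_n's support
tv = 0.5 * (sum(abs(a - b) for a, b in zip(s_pmf, g)) + tail)
print(f"d_TV(S_n, CP) = {tv:.6f}")
```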

    Nonnormal approximation by Stein's method of exchangeable pairs with application to the Curie-Weiss model

    Let $(W,W')$ be an exchangeable pair. Assume that $E(W-W'|W)=g(W)+r(W)$, where $g(W)$ is a dominated term and $r(W)$ is negligible. Let $G(t)=\int_0^t g(s)\,ds$ and define $p(t)=c_1 e^{-c_0 G(t)}$, where $c_0$ is a properly chosen constant and $c_1=1/\int_{-\infty}^{\infty}e^{-c_0 G(t)}\,dt$. Let $Y$ be a random variable with the probability density function $p$. It is proved that $W$ converges to $Y$ in distribution when the conditional second moment of $(W-W')$ given $W$ satisfies a law of large numbers. A Berry-Esseen type bound is also given. We use this technique to obtain a Berry-Esseen error bound of order $1/\sqrt{n}$ in the noncentral limit theorem for the magnetization in the Curie-Weiss ferromagnet at the critical temperature. Exponential approximation with application to the spectrum of the Bernoulli-Laplace Markov chain is also discussed. Comment: Published at http://dx.doi.org/10.1214/10-AAP712 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org).
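
    The density recipe above can be instantiated numerically. A minimal sketch, assuming the illustrative choice $g(w) = w^3$ with $c_0 = 1/3$ (my choices, made to reproduce the quartic density proportional to $e^{-t^4/12}$ that arises for the critical Curie-Weiss magnetization; nothing below is taken from the paper itself):

```python
# Build p(t) = c_1 * exp(-c_0 * G(t)) from g via the abstract's recipe,
# with the illustrative choice g(w) = w^3, so G(t) = t^4 / 4.
import math

def g(w):
    return w ** 3

def G(t, steps=1000):
    """G(t) = integral of g from 0 to t, by the trapezoid rule."""
    h = t / steps
    return h * (0.5 * g(0.0) + sum(g(k * h) for k in range(1, steps)) + 0.5 * g(t))

c0 = 1.0 / 3.0                                  # so c0 * G(t) = t^4 / 12
ts = [-6 + 12 * k / 2000 for k in range(2001)]  # grid wide enough for the tails
vals = [math.exp(-c0 * G(abs(t))) for t in ts]  # G is even here since g is odd
dt = ts[1] - ts[0]
c1 = 1.0 / (dt * sum(vals))                     # normalising constant
print(f"c1 = {c1:.6f}; density at the origin is p(0) = c1")
```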

    Entry and return times distribution

    This is a review article on the distributions of entry and return times in dynamical systems, which discusses recent results for systems of positive entropy. Comment: To appear in "Dynamical Systems: An International Journal dedicated to the Statistical Properties of Dynamical Systems".