
    Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

    Sufficient conditions are developed under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, {\em Stoch. Proc. Appl.}, 2007] used a semigroup approach to show that the Poisson has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-trivial extension of this semigroup approach that the natural analog of the Poisson maximum entropy property remains valid if the compound Poisson distributions under consideration are log-concave, but that it fails in general. A parallel maximum entropy result is established for the family of compound binomial measures. Sufficient conditions for compound distributions to be log-concave are discussed, and applications to combinatorics are examined: new bounds are derived on the entropy of the cardinality of a random independent set in a claw-free graph, and a connection is drawn to Mason's conjecture for matroids. The present results are primarily motivated by the desire to provide an information-theoretic foundation for compound Poisson approximation and associated limit theorems, analogous to the corresponding developments for the central limit theorem and for Poisson approximation. Our results also demonstrate new links between some probabilistic methods and the combinatorial notions of log-concavity and ultra-log-concavity, and they add to the growing body of work exploring the applications of maximum entropy characterizations to problems in discrete mathematics.

    Comment: 30 pages. This submission supersedes arXiv:0805.4112v1. Changes in v2: updated references, typos corrected.
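
    For concreteness: a pmf $(p_k)$ on the nonnegative integers is log-concave if $p_k^2 \geq p_{k-1} p_{k+1}$ for all interior $k$, and ultra-log-concave if the stronger condition $k\, p_k^2 \geq (k+1)\, p_{k+1} p_{k-1}$ holds, i.e., if the ratio of $p$ to a Poisson pmf is log-concave. A minimal sketch of both checks in plain Python (the function names and tolerance handling are ours, not the paper's):

        import math

        def is_log_concave(p, tol=1e-12):
            # Discrete log-concavity: p[k]^2 >= p[k-1] * p[k+1] for interior k.
            return all(p[k]**2 >= p[k-1] * p[k+1] - tol
                       for k in range(1, len(p) - 1))

        def is_ultra_log_concave(p, tol=1e-12):
            # Log-concavity relative to a Poisson pmf:
            # k * p[k]^2 >= (k+1) * p[k+1] * p[k-1] for interior k.
            return all(k * p[k]**2 >= (k + 1) * p[k + 1] * p[k - 1] - tol
                       for k in range(1, len(p) - 1))

        # A binomial pmf is ultra-log-concave; a geometric pmf is
        # log-concave but not ultra-log-concave.
        n, q = 10, 0.3
        binom = [math.comb(n, k) * q**k * (1 - q)**(n - k) for k in range(n + 1)]
        geom = [0.6 * 0.4**k for k in range(30)]
        print(is_ultra_log_concave(binom))                       # True
        print(is_log_concave(geom), is_ultra_log_concave(geom))  # True False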

    On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method

    This paper considers the entropy of a sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds are derived on the error that follows from approximating this entropy by the entropy of a Poisson random variable with the same mean. The derivation of these bounds combines elements of information theory with the Chen-Stein method for Poisson approximation. The resulting bounds are easy to compute, and their applicability is exemplified. This conference paper presents in part the first half of the paper entitled "An information-theoretic perspective of the Poisson approximation via the Chen-Stein method" (see arXiv:1206.6811). A generalization of the bounds that considers the accuracy of the Poisson approximation for the entropy of a sum of non-negative, integer-valued and bounded random variables is introduced in the full paper, which also derives lower bounds on the total variation distance, relative entropy and other measures that are not considered in this conference paper.

    Comment: A 5-page conference paper that appears in the Proceedings of the 2012 IEEE International Workshop on Information Theory (ITW 2012), pp. 542--546, Lausanne, Switzerland, September 2012.
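
    In the independent case the two entropies being compared can be computed exactly, which makes the quality of the Poisson approximation easy to probe numerically. A sketch under that assumption (independent summands only; the paper's bounds also cover dependent summands via the Chen-Stein method, which this toy comparison does not implement):

        import math

        def bernoulli_sum_pmf(ps):
            # pmf of S = X_1 + ... + X_n for independent X_i ~ Bernoulli(p_i),
            # built by iterative convolution.
            pmf = [1.0]
            for p in ps:
                nxt = [0.0] * (len(pmf) + 1)
                for k, mass in enumerate(pmf):
                    nxt[k] += mass * (1 - p)
                    nxt[k + 1] += mass * p
                pmf = nxt
            return pmf

        def entropy(pmf):
            # Shannon entropy in nats.
            return -sum(m * math.log(m) for m in pmf if m > 0)

        def poisson_pmf(lam, terms=100):
            out, q = [], math.exp(-lam)
            for k in range(terms):
                out.append(q)
                q *= lam / (k + 1)
            return out

        ps = [0.02, 0.05, 0.01, 0.03, 0.04] * 10   # small success probabilities
        lam = sum(ps)                              # matching mean, here 1.5
        print(entropy(bernoulli_sum_pmf(ps)))      # entropy of the Bernoulli sum
        print(entropy(poisson_pmf(lam)))           # entropy of Poisson(1.5): close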

    Relative log-concavity and a pair of triangle inequalities

    The relative log-concavity ordering $\leq_{\mathrm{lc}}$ between probability mass functions (pmfs) on the non-negative integers is studied. Given three pmfs $f, g, h$ that satisfy $f \leq_{\mathrm{lc}} g \leq_{\mathrm{lc}} h$, we present a pair of (reverse) triangle inequalities: if $\sum_i i f_i = \sum_i i g_i < \infty$, then $D(f\|h) \geq D(f\|g) + D(g\|h)$, and if $\sum_i i g_i = \sum_i i h_i < \infty$, then $D(h\|f) \geq D(h\|g) + D(g\|f)$, where $D(\cdot\|\cdot)$ denotes the Kullback--Leibler divergence. These inequalities, interesting in themselves, are also applied to several problems, including maximum entropy characterizations of the Poisson and binomial distributions and the best binomial approximation in relative entropy. We also present parallel results for continuous distributions and discuss the behavior of $\leq_{\mathrm{lc}}$ under convolution.

    Comment: Published at http://dx.doi.org/10.3150/09-BEJ216 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
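
    One natural chain of this kind is $\mathrm{Bin}(n, \lambda/n) \leq_{\mathrm{lc}} \mathrm{Bin}(m, \lambda/m) \leq_{\mathrm{lc}} \mathrm{Po}(\lambda)$ for $n \leq m$, with all three means equal to $\lambda$, so the hypothesis of the first inequality holds. A numerical check of that inequality in plain Python (the helper names and the truncation length are ours; this verifies the claim only for one concrete triple):

        import math

        def kl(f, g):
            # Kullback--Leibler divergence D(f || g) in nats;
            # assumes the support of f is contained in the support of g.
            return sum(a * math.log(a / b) for a, b in zip(f, g) if a > 0)

        def binom_pmf(n, p, terms):
            return [math.comb(n, k) * p**k * (1 - p)**(n - k) if k <= n else 0.0
                    for k in range(terms)]

        def poisson_pmf(lam, terms):
            out, q = [], math.exp(-lam)
            for k in range(terms):
                out.append(q)
                q *= lam / (k + 1)
            return out

        N = 60                        # truncation point for the Poisson tail
        f = binom_pmf(10, 0.30, N)    # mean 3
        g = binom_pmf(20, 0.15, N)    # mean 3
        h = poisson_pmf(3.0, N)       # mean 3
        print(kl(f, h))               # should dominate the sum below
        print(kl(f, g) + kl(g, h))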

    On the inclusion probabilities in some unequal probability sampling plans without replacement

    Comparison results are obtained for the inclusion probabilities in some unequal probability sampling plans without replacement. For either successive sampling or H\'{a}jek's rejective sampling, the larger the sample size, the more uniform the inclusion probabilities in the sense of majorization. In particular, the inclusion probabilities are more uniform than the drawing probabilities. For the same sample size and the same set of drawing probabilities, the inclusion probabilities are more uniform for rejective sampling than for successive sampling. This last result confirms a conjecture of H\'{a}jek (Sampling from a Finite Population (1981) Dekker). Results are also presented in terms of the Kullback--Leibler divergence, showing that the inclusion probabilities for successive sampling are more proportional to the drawing probabilities.

    Comment: Published at http://dx.doi.org/10.3150/10-BEJ337 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
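
    The majorization claim is easy to see numerically for successive sampling, where units are drawn one at a time, without replacement, proportionally to the drawing probabilities of the units still available. A Monte Carlo sketch (our own simulation; the drawing probabilities and sample size are illustrative, not from the paper):

        import random

        def successive_sample(p, n):
            # Draw a size-n sample without replacement: each draw picks among
            # the remaining units proportionally to their drawing probabilities.
            remaining = list(range(len(p)))
            weights = list(p)
            chosen = []
            for _ in range(n):
                i = random.choices(range(len(remaining)), weights=weights)[0]
                chosen.append(remaining.pop(i))
                weights.pop(i)
            return chosen

        random.seed(0)
        p = [0.50, 0.25, 0.15, 0.10]   # drawing probabilities
        n, reps = 2, 100_000
        counts = [0] * len(p)
        for _ in range(reps):
            for i in successive_sample(p, n):
                counts[i] += 1
        incl = [c / reps for c in counts]
        print(incl)                 # estimated inclusion probabilities (sum to n)
        print([n * q for q in p])   # n times drawing probs: less uniform, same sum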