Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
Sufficient conditions are developed, under which the compound Poisson
distribution has maximal entropy within a natural class of probability measures
on the nonnegative integers. Recently, one of the authors [O. Johnson, {\em
Stoch. Proc. Appl.}, 2007] used a semigroup approach to show that the Poisson
has maximal entropy among all ultra-log-concave distributions with fixed mean.
We show via a non-trivial extension of this semigroup approach that the natural
analog of the Poisson maximum entropy property remains valid if the compound
Poisson distributions under consideration are log-concave, but that it fails in
general. A parallel maximum entropy result is established for the family of
compound binomial measures. Sufficient conditions for compound distributions to
be log-concave are discussed and applications to combinatorics are examined;
new bounds are derived on the entropy of the cardinality of a random
independent set in a claw-free graph, and a connection is drawn to Mason's
conjecture for matroids. The present results are primarily motivated by the
desire to provide an information-theoretic foundation for compound Poisson
approximation and associated limit theorems, analogous to the corresponding
developments for the central limit theorem and for Poisson approximation. Our
results also demonstrate new links between some probabilistic methods and the
combinatorial notions of log-concavity and ultra-log-concavity, and they add to
the growing body of work exploring the applications of maximum entropy
characterizations to problems in discrete mathematics.
Comment: 30 pages. This submission supersedes arXiv:0805.4112v1. Changes in v2: updated references, typos corrected.
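As a quick numerical illustration of the maximum entropy property above (our own sketch, not the authors' semigroup argument; the distributions and the truncation point are illustrative assumptions), the binomial law Bin(n, lam/n) is ultra-log-concave with mean lam, so its entropy should not exceed that of Poisson(lam):

```python
# Minimal sketch: entropy of an ultra-log-concave law vs. the Poisson bound.
# Assumptions: lam = 3, support truncated at 60 (negligible Poisson tail mass).
import numpy as np
from scipy.stats import poisson, binom

def entropy(pmf):
    """Shannon entropy in nats, ignoring zero-probability atoms."""
    p = pmf[pmf > 0]
    return -np.sum(p * np.log(p))

lam = 3.0
h_poisson = entropy(poisson.pmf(np.arange(60), lam))

for n in (4, 10, 50):
    # Bin(n, lam/n) is ultra-log-concave with the same mean lam
    h_binom = entropy(binom.pmf(np.arange(n + 1), n, lam / n))
    assert h_binom <= h_poisson
    print(f"n={n:3d}: H(Bin) = {h_binom:.4f} <= H(Poi) = {h_poisson:.4f}")
```

Increasing n shows the binomial entropy approaching the Poisson value from below, consistent with the Poisson being the maximum entropy limit in this class.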
Relative log-concavity and a pair of triangle inequalities
The relative log-concavity ordering $\preceq_{\mathrm{lc}}$ between probability mass functions (pmf's) on non-negative integers is studied. Given three pmf's $f$, $g$, $h$ that satisfy $f \preceq_{\mathrm{lc}} g \preceq_{\mathrm{lc}} h$, we present a pair of (reverse) triangle inequalities: if $\sum_i i f_i = \sum_i i g_i$ then $D(f\|h) \geq D(f\|g) + D(g\|h)$, and if $\sum_i i g_i = \sum_i i h_i$ then $D(h\|f) \geq D(h\|g) + D(g\|f)$, where $D(\cdot\|\cdot)$ denotes the Kullback--Leibler divergence. These inequalities, interesting in themselves, are also applied to several problems, including maximum entropy characterizations of Poisson and binomial distributions and the best binomial approximation in relative entropy. We also present parallel results for continuous distributions and discuss the behavior of $\preceq_{\mathrm{lc}}$ under convolution.
Comment: Published at http://dx.doi.org/10.3150/09-BEJ216 in the Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
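The chain $\mathrm{Bin}(n, \lambda/n) \preceq_{\mathrm{lc}} \mathrm{Poisson}(\lambda) \preceq_{\mathrm{lc}} \mathrm{Geometric}$ is a standard instance of this ordering, and the first two laws share mean $\lambda$ as the first inequality requires. The sketch below (our illustration with assumed parameters, not taken from the paper) checks that inequality numerically:

```python
# Hedged numerical check of D(f||h) >= D(f||g) + D(g||h) for
# f = Bin(n, lam/n), g = Poisson(lam), h = Geometric with mean lam.
# Assumptions: lam = 2, n = 8, support truncated at 120.
import numpy as np
from scipy.stats import binom, poisson, geom

def kl(p, q):
    """Kullback--Leibler divergence D(p||q) in nats over a common support."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

lam, n, N = 2.0, 8, 120
k = np.arange(N)
f = binom.pmf(k, n, lam / n)                 # f <=lc g, same mean lam
g = poisson.pmf(k, lam)
h = geom.pmf(k, 1.0 / (1.0 + lam), loc=-1)   # geometric on {0,1,...}, mean lam

lhs = kl(f, h)
rhs = kl(f, g) + kl(g, h)
print(f"D(f||h) = {lhs:.4f} >= D(f||g) + D(g||h) = {rhs:.4f}")
assert lhs >= rhs
```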
On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method
This paper considers the entropy of the sum of (possibly dependent and
non-identically distributed) Bernoulli random variables. Upper bounds are derived on the error that follows from approximating this entropy by the entropy of a Poisson random variable with the same mean. The derivation of these bounds combines elements of information theory with the Chen-Stein method for Poisson approximation. The resulting bounds are easy to compute, and their applicability is exemplified. This conference paper presents in part the first half of the paper entitled "An information-theoretic perspective of the Poisson approximation via the Chen-Stein method" (see arXiv:1206.6811). A
generalization of the bounds that considers the accuracy of the Poisson
approximation for the entropy of a sum of non-negative, integer-valued and
bounded random variables is introduced in the full paper. It also derives lower
bounds on the total variation distance, relative entropy and other measures
that are not considered in this conference paper.
Comment: A conference paper of 5 pages that appears in the Proceedings of the 2012 IEEE International Workshop on Information Theory (ITW 2012), pp. 542--546, Lausanne, Switzerland, September 2012.
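The setting is easy to reproduce numerically. The sketch below (an illustration under assumed parameters, not the paper's bounds) computes the exact entropy of a sum of independent, non-identically distributed Bernoulli variables and compares it with the entropy of a Poisson of the same mean, together with the total variation distance that the Chen-Stein method controls:

```python
# Exact entropy of S = X_1 + ... + X_m for independent Bernoulli(p_i),
# versus the entropy of Poisson(sum p_i).  The p_i below are assumptions;
# small p_i favour the Poisson approximation.
import numpy as np
from scipy.stats import poisson

def bernoulli_sum_pmf(ps):
    """Exact pmf of a sum of independent Bernoulli(p_i) via convolution."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def entropy(pmf):
    p = pmf[pmf > 0]
    return -np.sum(p * np.log(p))

ps = [0.05, 0.10, 0.02, 0.08, 0.15, 0.04]
pmf_s = bernoulli_sum_pmf(ps)
lam = sum(ps)
pmf_poi = poisson.pmf(np.arange(40), lam)   # truncated Poisson tail

print(f"H(S)        = {entropy(pmf_s):.4f} nats")
print(f"H(Poi(lam)) = {entropy(pmf_poi):.4f} nats")
m = len(pmf_s)
dtv = 0.5 * (np.sum(np.abs(pmf_s - pmf_poi[:m])) + np.sum(pmf_poi[m:]))
print(f"total variation distance ~ {dtv:.4f}")
```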
Moments, Concentration, and Entropy of Log-Concave Distributions
We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in the convex order, to derive moment, concentration, and entropy inequalities for certain classes of log-concave distributions.
Comment: 19 pages.
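One classical consequence of this kind of exponential comparison for log-concave laws on $[0,\infty)$ is the moment bound $\mathbb{E}[X^p] \leq p! \, (\mathbb{E}X)^p$, with equality for the exponential itself. The following sketch (our own check, not the paper's proofs) verifies the bound for a few log-concave distributions:

```python
# Hedged check of E[X^p] <= p! * (E[X])^p for nonnegative log-concave X.
# The distributions below are illustrative; equality holds for Exponential.
from math import factorial
from scipy.stats import gamma, halfnorm, uniform, expon

for name, dist in [("Gamma(2)", gamma(2)), ("half-normal", halfnorm()),
                   ("Uniform[0,1]", uniform()), ("Exponential", expon())]:
    mean = dist.moment(1)
    for p in (2, 3, 4):
        lhs, rhs = dist.moment(p), factorial(p) * mean ** p
        assert lhs <= rhs + 1e-6  # tolerance for numerical integration
        print(f"{name:13s} p={p}: E[X^p] = {lhs:8.4f} <= p!(EX)^p = {rhs:8.4f}")
```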