On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method
This paper considers the entropy of the sum of (possibly dependent and
non-identically distributed) Bernoulli random variables. Upper bounds are
derived on the error incurred when this entropy is approximated by the entropy
of a Poisson random variable with the same mean. The derivation of these
bounds combines elements of information theory with the Chen-Stein method for
Poisson approximation. The resulting bounds are easy to compute, and their
applicability is exemplified. This conference paper presents in part the first
half of the paper entitled "An information-theoretic perspective of the Poisson
approximation via the Chen-Stein method" (see arXiv:1206.6811). A
generalization of the bounds that considers the accuracy of the Poisson
approximation for the entropy of a sum of non-negative, integer-valued and
bounded random variables is introduced in the full paper. It also derives lower
bounds on the total variation distance, relative entropy and other measures
that are not considered in this conference paper.
Comment: A conference paper of 5 pages that appears in the Proceedings of the
2012 IEEE International Workshop on Information Theory (ITW 2012), pp.
542-546, Lausanne, Switzerland, September 2012
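As a quick numerical illustration of the quantity this abstract studies, the entropy of a sum of independent Bernoulli variables can be compared against the entropy of a Poisson variable with the same mean. A minimal sketch (the probabilities `ps` are an arbitrary illustrative choice, and only the independent case is treated, not the dependent one covered by the paper):

```python
import math

def bernoulli_sum_pmf(ps):
    # pmf of the sum of independent Bernoulli(p_i), by iterative convolution
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)
            new[k + 1] += q * p
        pmf = new
    return pmf

def entropy(pmf):
    # Shannon entropy in nats
    return -sum(q * math.log(q) for q in pmf if q > 0)

def poisson_entropy(lam, tol=1e-12):
    # entropy of Poisson(lam), truncating once the tail mass is negligible
    h, k, pk, total = 0.0, 0, math.exp(-lam), 0.0
    while total < 1 - tol:
        if pk > 0:
            h -= pk * math.log(pk)
        total += pk
        k += 1
        pk *= lam / k
    return h

ps = [0.05, 0.02, 0.1, 0.03, 0.07]      # illustrative success probabilities
lam = sum(ps)                           # Poisson approximation matches the mean
print(entropy(bernoulli_sum_pmf(ps)), poisson_entropy(lam))
```

With small `p_i` the two entropies agree closely, consistent with the error bounds the paper derives.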
Relaxation of monotone coupling conditions: Poisson approximation and beyond
It is well-known that assumptions of monotonicity in size-bias couplings may
be used to prove simple, yet powerful, Poisson approximation results. Here we
show how these assumptions may be relaxed, establishing explicit Poisson
approximation bounds (depending on the first two moments only) for random
variables which satisfy an approximate version of these monotonicity
conditions. These are shown to be effective for models where an underlying
random variable of interest is contaminated with noise. We also give explicit
Poisson approximation bounds for sums of associated or negatively associated
random variables. Applications are given to epidemic models, extremes, and
random sampling. Finally, we also show how similar techniques may be used to
relax the assumptions needed in a Poincaré inequality and in a normal
approximation result.
Comment: 19 pages
Entropy and the Law of Small Numbers
Two new information-theoretic methods are introduced for establishing Poisson
approximation inequalities. First, using only elementary information-theoretic
techniques it is shown that, when $S_n$ is the sum of the
(possibly dependent) binary random variables $X_1, X_2, \dots, X_n$, with
$E(X_i) = p_i$ and $E(S_n) = \lambda$, then
$$D(P_{S_n} \| \mathrm{Po}(\lambda)) \leq \sum_{i=1}^n p_i^2 +
\Big[\sum_{i=1}^n H(X_i) - H(X_1, X_2, \dots, X_n)\Big],$$
where $D(P_{S_n} \| \mathrm{Po}(\lambda))$ is the relative entropy between the
distribution of $S_n$ and the Poisson($\lambda$) distribution. The first term in this bound
measures the individual smallness of the $X_i$ and the second term measures
their dependence. A general method is outlined for obtaining corresponding
bounds when approximating the distribution of a sum of general discrete random
variables by an infinitely divisible distribution.
Second, in the particular case when the $X_i$ are independent, the following
sharper bound is established,
$$D(P_{S_n} \| \mathrm{Po}(\lambda)) \leq \frac{1}{\lambda}
\sum_{i=1}^n \frac{p_i^3}{1-p_i},$$
and it is also generalized to the case when the $X_i$ are general integer-valued
random variables. Its proof is based on the derivation of a subadditivity property for
a new discrete version of the Fisher information, and uses a recent logarithmic
Sobolev inequality for the Poisson distribution.
Comment: 15 pages. To appear, IEEE Trans. Inform. Theory
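Both bounds in this abstract, the relative-entropy bound whose dependence term vanishes for independent summands and the sharper independent-case bound, can be checked numerically by computing the exact relative entropy of a Bernoulli sum against the matched Poisson law. A small sketch with an arbitrary illustrative choice of the `p_i`:

```python
import math

def bernoulli_sum_pmf(ps):
    # exact pmf of a sum of independent Bernoulli(p_i) via convolution
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)
            new[k + 1] += q * p
        pmf = new
    return pmf

def rel_entropy_vs_poisson(pmf, lam):
    # D(P || Po(lam)) in nats; the sum runs over the (finite) support of P
    d = 0.0
    for k, q in enumerate(pmf):
        if q > 0:
            po = math.exp(-lam) * lam ** k / math.factorial(k)
            d += q * math.log(q / po)
    return d

ps = [0.1, 0.05, 0.2, 0.15]            # illustrative success probabilities
lam = sum(ps)
d = rel_entropy_vs_poisson(bernoulli_sum_pmf(ps), lam)
bound1 = sum(p * p for p in ps)                 # first bound (dependence term = 0)
bound2 = sum(p ** 3 / (1 - p) for p in ps) / lam  # sharper independent-case bound
print(d, bound1, bound2)
```

The computed relative entropy sits below both bounds, with the second visibly tighter when the `p_i` are small.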
Approximating stationary distributions of fast mixing Glauber dynamics, with applications to exponential random graphs
We provide a general bound on the Wasserstein distance between two arbitrary
distributions of sequences of Bernoulli random variables. The bound is in terms
of a mixing quantity for the Glauber dynamics of one of the sequences, and a
simple expectation of the other. The result is applied to estimate, with
explicit error, expectations of functions of random vectors for some Ising
models and exponential random graphs in "high temperature" regimes.
Comment: Ver3: 24 pages, major revision with new results; Ver2: updated
reference; Ver1: 19 pages, 1 figure
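For concreteness, a single step of the Glauber dynamics for a sequence of ±1 spins resamples one uniformly chosen site from its conditional distribution given the rest. A minimal sketch for an Ising-type model (the 4-cycle graph and the inverse temperature are illustrative choices, not taken from the paper):

```python
import math
import random

def glauber_step(spins, beta, neighbors, rng):
    """One Glauber update: pick a site uniformly at random and resample its
    spin from the conditional law given all other spins."""
    i = rng.randrange(len(spins))
    field = sum(spins[j] for j in neighbors[i])          # local field at site i
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * field))  # P(spin_i = +1 | rest)
    spins[i] = 1 if rng.random() < p_plus else -1
    return spins

# tiny 4-cycle instance, purely for illustration
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
rng = random.Random(0)
spins = [1, -1, 1, -1]
for _ in range(1000):
    glauber_step(spins, 0.2, neighbors, rng)
print(spins)
```

Iterating this kernel drives the configuration toward the stationary Ising measure; the paper's bound is phrased in terms of how fast such a chain mixes.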
Compound Poisson approximation via information functionals
An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are derived for the distance between the distribution of a sum of independent integer-valued random variables and an appropriately chosen compound Poisson law. In the case where all summands have the same conditional distribution given that they are non-zero, a bound on the relative entropy distance between their sum and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the summands have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals," and the analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.
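The matched compound Poisson law can be built directly from its definition as a Poisson mixture of convolution powers, and its total variation distance from a sum of i.i.d. integer-valued summands computed exactly on a truncated support. A sketch under illustrative parameter choices (the distribution `px` and the truncation levels are assumptions for the example):

```python
import math

def convolve(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def iid_sum_pmf(pmf, n):
    # pmf of the sum of n i.i.d. copies, by repeated convolution
    out = [1.0]
    for _ in range(n):
        out = convolve(out, pmf)
    return out

def compound_poisson_pmf(lam, q, kmax, mmax=60):
    # CP(lam, q) = sum_m e^{-lam} lam^m / m! * q^{*m}, truncated at mmax
    out = [0.0] * (kmax + 1)
    conv = [1.0]                      # q^{*0} = point mass at 0
    w = math.exp(-lam)                # Poisson(lam) weight of m = 0
    for m in range(mmax + 1):
        for k, v in enumerate(conv[:kmax + 1]):
            out[k] += w * v
        conv = convolve(conv, q)
        w *= lam / (m + 1)
    return out

def total_variation(p, q):
    n = max(len(p), len(q))
    p = p + [0.0] * (n - len(p))
    q = q + [0.0] * (n - len(q))
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# summands take values 0, 1, 2; match the rate and the conditional law:
px = [0.9, 0.07, 0.03]                 # P(X=0), P(X=1), P(X=2) (illustrative)
n = 20
lam = n * (1 - px[0])                  # rate = n * P(X != 0)
q = [0.0, px[1] / (1 - px[0]), px[2] / (1 - px[0])]  # law of X given X != 0
tv = total_variation(iid_sum_pmf(px, n), compound_poisson_pmf(lam, q, 3 * n))
print(tv)
```

When `P(X != 0)` is small the distance is small, which is the regime the paper's nonasymptotic bounds quantify.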
Nonnormal approximation by Stein's method of exchangeable pairs with application to the Curie--Weiss model
Let $(W, W')$ be an exchangeable pair. Assume that $E(W - W' \mid W) = g(W) + r(W)$,
where $g(W)$ is a dominated term and $r(W)$ is negligible. Let
$G(t) = \int_0^t g(s)\,ds$ and define $p(t) = c_1 e^{-c_0 G(t)}$, where $c_0$ is a
properly chosen constant and $c_1 = 1 / \int_{-\infty}^{\infty} e^{-c_0 G(t)}\,dt$.
Let $Y$ be a random variable with the probability density function $p$. It is
proved that $W$ converges to $Y$ in distribution when the conditional second
moment of $(W - W')$ given $W$ satisfies a law of large numbers. A Berry-Esseen
type bound is also given. We use this technique to obtain a Berry-Esseen error
bound of order $1/\sqrt{n}$ in the noncentral limit theorem for the
magnetization in the Curie-Weiss ferromagnet at the critical temperature.
Exponential approximation with application to the spectrum of the
Bernoulli-Laplace Markov chain is also discussed.
Comment: Published in the Annals of Applied Probability
(http://www.imstat.org/aap/) by the Institute of Mathematical Statistics
(http://www.imstat.org) at http://dx.doi.org/10.1214/10-AAP712
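For the textbook exchangeable pair obtained by resampling a single coordinate of a sum of i.i.d. Rademacher (±1) variables, the conditional drift E(W − W' | W) can be computed exactly and is linear in W. A minimal sketch (function names and the test configuration are illustrative):

```python
def resample_pair(xs, i, y):
    """Replace coordinate i of xs by y; return (W, W') = (sum before, sum after)."""
    ys = list(xs)
    ys[i] = y
    return sum(xs), sum(ys)

def conditional_drift(xs):
    """E[W - W' | X = xs] for the pair built by resampling a uniformly chosen
    coordinate with a fresh Rademacher draw (P(+1) = P(-1) = 1/2)."""
    n = len(xs)
    total = 0.0
    for i in range(n):            # uniform choice of coordinate
        for y in (-1, 1):         # fresh Rademacher value
            w, wp = resample_pair(xs, i, y)
            total += (w - wp) / (2 * n)
    return total

xs = [1, 1, -1, 1, -1, -1, 1]
print(conditional_drift(xs), sum(xs) / len(xs))  # both are W/n (up to rounding)
```

Here the drift is exactly W/n, i.e. the linear case with no remainder term; the non-normal limits studied in the paper arise when this drift is genuinely nonlinear, as for the critical Curie-Weiss magnetization.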
Entry and Return times distribution
This is a review article on the distributions of entry and return times in
dynamical systems which discusses recent results for systems of positive
entropy.
Comment: To appear in "Dynamical Systems: An International Journal dedicated
to the Statistical Properties of Dynamical Systems"