Preservation of log-concavity on summation
We extend Hoggar's theorem that the sum of two independent discrete-valued
log-concave random variables is itself log-concave. We introduce conditions
under which the result still holds for dependent variables. We argue that these
conditions are natural by giving some applications. Firstly, we use our main
theorem to give simple proofs of the log-concavity of the Stirling numbers of
the second kind and of the Eulerian numbers. Secondly, we prove results
concerning the log-concavity of the sum of independent (not necessarily
log-concave) random variables.
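
As a minimal numerical illustration of the discrete log-concavity notion and of Hoggar's theorem (not the paper's own argument; the helper names and the choice of distributions below are ours), one can check that the pmf of an independent sum, computed by convolution, satisfies $p_k^2 \ge p_{k-1}p_{k+1}$:

    import numpy as np
    from scipy.stats import binom, geom

    def is_log_concave(p, tol=1e-12):
        # A nonnegative sequence is log-concave if p[k]^2 >= p[k-1]*p[k+1]
        # for all interior k; tol absorbs floating-point error.
        return all(p[k] ** 2 + tol >= p[k - 1] * p[k + 1]
                   for k in range(1, len(p) - 1))

    p = binom.pmf(np.arange(11), n=10, p=0.3)   # binomial pmf, log-concave
    q = geom.pmf(np.arange(1, 41), p=0.4)       # (truncated) geometric pmf, log-concave
    r = np.convolve(p, q)                       # pmf of the independent sum
    print(is_log_concave(p), is_log_concave(q), is_log_concave(r))  # True True True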
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
Sufficient conditions are developed, under which the compound Poisson
distribution has maximal entropy within a natural class of probability measures
on the nonnegative integers. Recently, one of the authors [O. Johnson, {\em
Stoch. Proc. Appl.}, 2007] used a semigroup approach to show that the Poisson
has maximal entropy among all ultra-log-concave distributions with fixed mean.
We show via a non-trivial extension of this semigroup approach that the natural
analog of the Poisson maximum entropy property remains valid if the compound
Poisson distributions under consideration are log-concave, but that it fails in
general. A parallel maximum entropy result is established for the family of
compound binomial measures. Sufficient conditions for compound distributions to
be log-concave are discussed and applications to combinatorics are examined;
new bounds are derived on the entropy of the cardinality of a random
independent set in a claw-free graph, and a connection is drawn to Mason's
conjecture for matroids. The present results are primarily motivated by the
desire to provide an information-theoretic foundation for compound Poisson
approximation and associated limit theorems, analogous to the corresponding
developments for the central limit theorem and for Poisson approximation. Our
results also demonstrate new links between some probabilistic methods and the
combinatorial notions of log-concavity and ultra-log-concavity, and they add to
the growing body of work exploring the applications of maximum entropy
characterizations to problems in discrete mathematics.
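
To make the objects concrete: the pmf of a compound Poisson distribution CP($\lambda$, $Q$) can be computed with the standard Panjer recursion, after which its entropy and log-concavity can be checked numerically. This is a generic sketch under our own choice of parameters, not the authors' method; all names below are ours.

    import numpy as np

    def compound_poisson_pmf(lam, q, nmax):
        # S = X_1 + ... + X_N with N ~ Poisson(lam) and X_i iid, P(X = j) = q[j-1].
        # Panjer recursion: p_0 = exp(-lam), p_n = (lam/n) * sum_j j*q_j*p_{n-j}.
        p = np.zeros(nmax + 1)
        p[0] = np.exp(-lam)
        for n in range(1, nmax + 1):
            p[n] = (lam / n) * sum(j * q[j - 1] * p[n - j]
                                   for j in range(1, min(n, len(q)) + 1))
        return p

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def is_log_concave(p, tol=1e-12):
        return all(p[k] ** 2 + tol >= p[k - 1] * p[k + 1]
                   for k in range(1, len(p) - 1))

    q = np.array([0.7, 0.3])                 # X takes value 1 or 2
    p = compound_poisson_pmf(2.0, q, 60)
    print(entropy(p), is_log_concave(p))     # Shannon entropy and a log-concavity check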
A remarkable sequence of integers
A survey of properties of a sequence of coefficients appearing in the
evaluation of a quartic definite integral is presented. These properties are of
analytical, combinatorial and number-theoretical nature.
Log-concavity and LC-positivity
A triangle $\{a(n,k)\}_{0\le k\le n}$ of nonnegative numbers is LC-positive
if for each $r$, the sequence of polynomials $\sum_{k=r}^{n}a(n,k)q^k$ is
$q$-log-concave. It is double LC-positive if both triangles $\{a(n,k)\}$ and
$\{a(n,n-k)\}$ are LC-positive. We show that if $\{a(n,k)\}$ is LC-positive
then the log-concavity of the sequence $\{x_k\}$ implies that of the sequence
$\{z_n\}$ defined by $z_n=\sum_{k=0}^{n}a(n,k)x_k$, and if $\{a(n,k)\}$ is
double LC-positive then the log-concavity of the sequences $\{x_k\}$ and $\{y_k\}$
implies that of the sequence $\{z_n\}$ defined by
$z_n=\sum_{k=0}^{n}a(n,k)x_ky_{n-k}$. Examples of double LC-positive triangles
include the constant triangle and the Pascal triangle. We also give a
generalization of a result of Liggett that is used to prove a conjecture of
Pemantle on characteristics of negative dependence.
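
Since the Pascal triangle is double LC-positive, the binomial transform $z_n=\sum_{k=0}^{n}\binom{n}{k}x_k$ of a log-concave sequence is again log-concave; the following quick numerical check (our own sketch, with an arbitrary log-concave input sequence) illustrates this:

    from math import comb

    def is_log_concave(x, tol=1e-12):
        return all(x[k] ** 2 + tol >= x[k - 1] * x[k + 1]
                   for k in range(1, len(x) - 1))

    x = [1.0, 3.0, 5.0, 6.0, 5.0, 3.0, 1.0]   # log-concave input sequence
    z = [sum(comb(n, k) * x[k] for k in range(n + 1)) for n in range(len(x))]
    print(is_log_concave(x), is_log_concave(z))  # True True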
A proof of the Shepp-Olkin entropy monotonicity conjecture
Consider tossing a collection of coins, each fair or biased towards heads,
and take the distribution of the total number of heads that result. It is
natural to conjecture that this distribution should be 'more random' when each
coin is fairer. Indeed, Shepp and Olkin conjectured that the Shannon entropy of
this distribution is monotonically increasing in this case. We resolve this
conjecture, by proving that this intuition is correct. Our proof uses a
construction which was previously developed by the authors to prove a related
conjecture of Shepp and Olkin concerning concavity of entropy. We discuss
whether this result can be generalized to $q$-R\'{e}nyi and $q$-Tsallis
entropies, for a range of values of $q$.
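
The statement itself is easy to probe numerically (this sketch is our own illustration of the theorem's content, not the authors' proof construction; the function names and the parameter path are ours). The distribution of the number of heads is the Poisson-binomial law, obtained by convolving the individual Bernoulli pmfs, and its Shannon entropy should increase as each coin moves towards fairness:

    import numpy as np

    def poisson_binomial_pmf(ps):
        # Pmf of the number of heads from independent coins with head
        # probabilities ps, built by convolving the Bernoulli pmfs.
        pmf = np.array([1.0])
        for p in ps:
            pmf = np.convolve(pmf, [1 - p, p])
        return pmf

    def entropy(pmf):
        pmf = pmf[pmf > 0]
        return -np.sum(pmf * np.log(pmf))

    ps = np.array([0.1, 0.25, 0.4, 0.45])        # all biased the same way
    for t in np.linspace(0.0, 1.0, 5):
        interp = (1 - t) * ps + t * 0.5          # move every coin towards p = 1/2
        print(f"t={t:.2f}  H={entropy(poisson_binomial_pmf(interp)):.4f}")

Each coordinate of the interpolated parameter vector increases monotonically towards 1/2, so by the Shepp-Olkin monotonicity theorem the printed entropies increase in t.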