1,950 research outputs found

    Preservation of log-concavity on summation

    We extend Hoggar's theorem that the sum of two independent discrete-valued log-concave random variables is itself log-concave. We introduce conditions under which the result still holds for dependent variables. We argue that these conditions are natural by giving some applications. Firstly, we use our main theorem to give simple proofs of the log-concavity of the Stirling numbers of the second kind and of the Eulerian numbers. Secondly, we prove results concerning the log-concavity of the sum of independent (not necessarily log-concave) random variables.
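
    Here log-concavity of a pmf $p$ on the integers means $p(k)^2 \ge p(k-1)\,p(k+1)$ for all $k$. The independent case (Hoggar's theorem) is easy to check numerically; the sketch below is only an illustration, assuming NumPy and SciPy, and the two pmfs and the tolerance are arbitrary choices rather than anything taken from the paper.

```python
import numpy as np
from scipy.stats import binom

def is_log_concave(p, tol=1e-12):
    """Check p[k]^2 >= p[k-1] * p[k+1] at every interior index."""
    p = np.asarray(p, dtype=float)
    return bool(np.all(p[1:-1] ** 2 >= p[:-2] * p[2:] - tol))

n = 20
p1 = binom.pmf(np.arange(n + 1), n, 0.3)   # Binomial(20, 0.3): log-concave
p2 = 0.6 * 0.4 ** np.arange(n + 1)         # truncated geometric: log-concave
p2 /= p2.sum()

conv = np.convolve(p1, p2)                 # pmf of the sum of the two independent variables
print(is_log_concave(p1), is_log_concave(p2), is_log_concave(conv))  # True True True
```

    For dependent variables the sum's pmf is no longer a convolution of the marginals, which is where the paper's extra conditions come in.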

    Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

    Sufficient conditions are developed under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, {\em Stoch. Proc. Appl.}, 2007] used a semigroup approach to show that the Poisson distribution has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-trivial extension of this semigroup approach that the natural analog of the Poisson maximum entropy property remains valid if the compound Poisson distributions under consideration are log-concave, but that it fails in general. A parallel maximum entropy result is established for the family of compound binomial measures. Sufficient conditions for compound distributions to be log-concave are discussed and applications to combinatorics are examined; new bounds are derived on the entropy of the cardinality of a random independent set in a claw-free graph, and a connection is drawn to Mason's conjecture for matroids. The present results are primarily motivated by the desire to provide an information-theoretic foundation for compound Poisson approximation and associated limit theorems, analogous to the corresponding developments for the central limit theorem and for Poisson approximation. Our results also demonstrate new links between some probabilistic methods and the combinatorial notions of log-concavity and ultra-log-concavity, and they add to the growing body of work exploring the applications of maximum entropy characterizations to problems in discrete mathematics. Comment: 30 pages. This submission supersedes arXiv:0805.4112v1. Changes in v2: updated references, typos corrected.
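
    As a minimal numerical illustration of the ultra-log-concave case that this work extends: Binomial(n, λ/n) laws are ultra-log-concave with mean λ, so by the semigroup result cited above their entropy is at most that of the Poisson(λ) law. The sketch only checks that inequality numerically; the parameter values are arbitrary and SciPy is assumed.

```python
import numpy as np
from scipy.stats import binom, poisson

def entropy(pmf):
    """Shannon entropy in nats, ignoring zero-probability terms."""
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log(pmf))

lam = 3.0
h_poisson = entropy(poisson.pmf(np.arange(200), lam))   # support truncated deep in the tail

# Binomial(n, lam/n) is ultra-log-concave with mean lam whenever n >= lam.
for n in (5, 10, 50):
    h_n = entropy(binom.pmf(np.arange(n + 1), n, lam / n))
    print(n, round(h_n, 6), h_n <= h_poisson)            # approaches h_poisson from below
```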

    A remarkable sequence of integers

    A survey of properties of a sequence of coefficients appearing in the evaluation of a quartic definite integral is presented. These properties are of analytical, combinatorial and number-theoretical nature. Comment: 20 pages, 5 figures.

    Log-concavity and LC-positivity

    A triangle $\{a(n,k)\}_{0\le k\le n}$ of nonnegative numbers is LC-positive if for each $r$, the sequence of polynomials $\sum_{k=r}^{n}a(n,k)q^k$ is $q$-log-concave. It is double LC-positive if both triangles $\{a(n,k)\}$ and $\{a(n,n-k)\}$ are LC-positive. We show that if $\{a(n,k)\}$ is LC-positive then the log-concavity of the sequence $\{x_k\}$ implies that of the sequence $\{z_n\}$ defined by $z_n=\sum_{k=0}^{n}a(n,k)x_k$, and if $\{a(n,k)\}$ is double LC-positive then the log-concavity of the sequences $\{x_k\}$ and $\{y_k\}$ implies that of the sequence $\{z_n\}$ defined by $z_n=\sum_{k=0}^{n}a(n,k)x_ky_{n-k}$. Examples of double LC-positive triangles include the constant triangle and the Pascal triangle. We also give a generalization of a result of Liggett that is used to prove a conjecture of Pemantle on characteristics of negative dependence. Comment: 16 pages.
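
    A small sketch of the Pascal-triangle instance of the first statement: if $\{x_k\}$ is log-concave, its binomial transform $z_n=\sum_{k=0}^{n}\binom{n}{k}x_k$ should be log-concave as well. The sequence below is an arbitrary example, not one from the paper.

```python
from math import comb

def log_concave(seq):
    """seq[i]^2 >= seq[i-1] * seq[i+1] at every interior index."""
    return all(seq[i] ** 2 >= seq[i - 1] * seq[i + 1] for i in range(1, len(seq) - 1))

# An arbitrary log-concave sequence {x_k}, in exact integer arithmetic.
x = [1, 3, 6, 8, 8, 6, 3, 1]
assert log_concave(x)

# Pascal-triangle case: z_n = sum_{k=0}^{n} C(n, k) * x_k.
z = [sum(comb(n, k) * x[k] for k in range(n + 1)) for n in range(len(x))]
print(z, log_concave(z))   # the binomial transform stays log-concave
```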

    A proof of the Shepp-Olkin entropy monotonicity conjecture

    Consider tossing a collection of coins, each fair or biased towards heads, and take the distribution of the total number of heads that result. It is natural to conjecture that this distribution should be 'more random' when each coin is fairer. Indeed, Shepp and Olkin conjectured that the Shannon entropy of this distribution is monotonically increasing in this case. We resolve this conjecture by proving that this intuition is correct. Our proof uses a construction which was previously developed by the authors to prove a related conjecture of Shepp and Olkin concerning concavity of entropy. We discuss whether this result can be generalized to $q$-Rényi and $q$-Tsallis entropies, for a range of values of $q$. Comment: 16 pages.
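
    A quick numerical illustration of the monotone statement (not of the proof): start from biases $p_i \ge 1/2$ and slide each one linearly towards $1/2$; the entropy of the number of heads should increase along the way. The initial biases and the number of steps below are arbitrary, and NumPy is assumed.

```python
import numpy as np

def heads_pmf(ps):
    """pmf of the total number of heads, one independent coin per entry of ps."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def entropy(pmf):
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log(pmf))

p0 = np.array([0.55, 0.7, 0.9, 0.99])                       # all biased towards heads
ts = np.linspace(0.0, 1.0, 11)
hs = [entropy(heads_pmf(p0 + t * (0.5 - p0))) for t in ts]  # every bias moves towards 1/2
print(bool(np.all(np.diff(hs) > 0)), round(hs[0], 4), round(hs[-1], 4))  # True: entropy grows
```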