
    Negative correlation and log-concavity

    We give counterexamples and a few positive results related to several conjectures of R. Pemantle and D. Wagner concerning negative correlation and log-concavity properties for probability measures and relations between them. Most of the negative results have also been obtained, independently but somewhat earlier, by Borcea et al. We also give short proofs of a pair of results due to Pemantle and Borcea et al.; prove that "almost exchangeable" measures satisfy the "Feder-Mihail" property, thus providing a "non-obvious" example of a class of measures for which this important property can be shown to hold; and mention some further questions. Comment: 21 pages; only minor changes since previous version; accepted for publication in Random Structures and Algorithms
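
    As a rough illustration of the property in question (not taken from the paper), pairwise negative correlation for a measure $\mu$ on subsets $S$ of a ground set asks that $\mu(i \in S,\, j \in S) \le \mu(i \in S)\,\mu(j \in S)$ for all distinct elements $i, j$. The Python sketch below brute-forces this check for a toy measure, the uniform measure on the 2-element subsets of a 3-element set; all names are illustrative.

        from itertools import combinations

        def is_negatively_correlated(mu, n):
            # mu: dict mapping frozensets of {0, ..., n-1} to probabilities
            def prob(event):
                return sum(p for s, p in mu.items() if event(s))
            for i, j in combinations(range(n), 2):
                p_both = prob(lambda s: i in s and j in s)
                if p_both > prob(lambda s: i in s) * prob(lambda s: j in s) + 1e-12:
                    return False
            return True

        # Toy example: uniform measure on the 2-element subsets of {0, 1, 2}
        # (the bases of the uniform matroid U(2,3)).
        mu = {frozenset(b): 1/3 for b in combinations(range(3), 2)}
        print(is_negatively_correlated(mu, 3))   # True: 1/3 <= (2/3) * (2/3)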

    Correlation bounds for fields and matroids

    Let $G$ be a finite connected graph, and let $T$ be a spanning tree of $G$ chosen uniformly at random. The work of Kirchhoff on electrical networks can be used to show that the events $e_1 \in T$ and $e_2 \in T$ are negatively correlated for any distinct edges $e_1$ and $e_2$. What can be said for such events when the underlying matroid is not necessarily graphic? We use Hodge theory for matroids to bound the correlation between the events $e \in B$, where $B$ is a randomly chosen basis of a matroid. As an application, we prove Mason's conjecture that the number of $k$-element independent sets of a matroid forms an ultra-log-concave sequence in $k$. Comment: 16 pages. Supersedes arXiv:1804.0307
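
    As a concrete check of the two statements quoted above (a rough sketch, not code from the paper), one can enumerate the 16 spanning trees of the complete graph $K_4$, verify that two fixed edges are negatively correlated under the uniform spanning tree, and verify Mason's ultra-log-concavity inequality for the forest counts $I_k$ of the same graph; all names below are illustrative.

        from itertools import combinations
        from math import comb

        vertices = range(4)
        edges = list(combinations(vertices, 2))            # the 6 edges of K_4

        def is_forest(edge_set):
            # union-find cycle detection
            parent = {v: v for v in vertices}
            def find(v):
                while parent[v] != v:
                    v = parent[v]
                return v
            for u, v in edge_set:
                ru, rv = find(u), find(v)
                if ru == rv:
                    return False
                parent[ru] = rv
            return True

        trees = [t for t in combinations(edges, 3) if is_forest(t)]   # 16 spanning trees

        def p(*required):
            return sum(1 for t in trees if all(e in t for e in required)) / len(trees)

        e1, e2 = (0, 1), (0, 2)
        print(p(e1, e2) <= p(e1) * p(e2))   # True: 3/16 <= 1/2 * 1/2

        # Mason's inequality: the forest counts I_k are ultra-log-concave,
        # i.e. I_k / C(n, k) is a log-concave sequence (n = number of edges).
        I = [sum(1 for s in combinations(edges, k) if is_forest(s)) for k in range(4)]
        n = len(edges)
        print(all(I[k]**2 * comb(n, k - 1) * comb(n, k + 1)
                  >= I[k - 1] * I[k + 1] * comb(n, k)**2 for k in range(1, 3)))   # True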

    A BK inequality for randomly drawn subsets of fixed size

    The BK inequality (\cite{BK85}) says that, for product measures on $\{0,1\}^n$, the probability that two increasing events $A$ and $B$ `occur disjointly' is at most the product of the two individual probabilities. The conjecture in \cite{BK85} that this holds for {\em all} events was proved by Reimer (\cite{R00}). Several other problems in this area remained open. For instance, although it is easy to see that non-product measures cannot satisfy the above inequality for {\em all} events, there are several such measures which, intuitively, should satisfy the inequality for all {\em increasing} events. One of the most natural candidates is the measure assigning equal probabilities to all configurations with exactly $k$ 1's (and probability 0 to all other configurations). The main contribution of this paper is a proof for these measures. We also point out how our result extends to weighted versions of these measures, and to products of such measures. Comment: Revised version for PTRF. Equation (13) corrected. Several, mainly stylistic, changes; more compact
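
    As a sanity check of the statement in a tiny case (a rough sketch, not the paper's argument), the Python snippet below enumerates every increasing event on $\{0,1\}^4$, computes disjoint occurrence directly, and verifies $P(A \,\Box\, B) \le P(A)P(B)$ under the uniform measure on configurations with exactly two 1's; the encoding of events and all names are illustrative.

        from itertools import combinations

        n, k = 4, 2
        configs = [frozenset(c) for r in range(n + 1) for c in combinations(range(n), r)]
        k_configs = [s for s in configs if len(s) == k]    # support of the measure

        def is_increasing(event):
            # up-closed: every superset of a member is also a member
            return all(t in event for s in event for t in configs if s <= t)

        # Enumerate all increasing events on {0,1}^4 (there are 168 of them).
        events = []
        for mask in range(1 << len(configs)):
            F = frozenset(c for i, c in enumerate(configs) if mask >> i & 1)
            if is_increasing(F):
                events.append(F)

        def prob(event):
            return sum(1 for s in k_configs if s in event) / len(k_configs)

        def box(A, B):
            # Disjoint occurrence for increasing events: the 1's of s split into
            # disjoint witnesses K and s \ K, with 1_K in A and 1_{s \ K} in B.
            def holds(s):
                return any(frozenset(K) in A and (s - frozenset(K)) in B
                           for r in range(len(s) + 1) for K in combinations(s, r))
            return frozenset(s for s in k_configs if holds(s))

        assert all(prob(box(A, B)) <= prob(A) * prob(B) + 1e-12
                   for A in events for B in events)
        print("BK inequality verified for all pairs of increasing events (n = 4, k = 2)")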

    Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

    Sufficient conditions are developed under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, {\em Stoch. Proc. Appl.}, 2007] used a semigroup approach to show that the Poisson has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-trivial extension of this semigroup approach that the natural analog of the Poisson maximum entropy property remains valid if the compound Poisson distributions under consideration are log-concave, but that it fails in general. A parallel maximum entropy result is established for the family of compound binomial measures. Sufficient conditions for compound distributions to be log-concave are discussed and applications to combinatorics are examined; new bounds are derived on the entropy of the cardinality of a random independent set in a claw-free graph, and a connection is drawn to Mason's conjecture for matroids. The present results are primarily motivated by the desire to provide an information-theoretic foundation for compound Poisson approximation and associated limit theorems, analogous to the corresponding developments for the central limit theorem and for Poisson approximation. Our results also demonstrate new links between some probabilistic methods and the combinatorial notions of log-concavity and ultra-log-concavity, and they add to the growing body of work exploring the applications of maximum entropy characterizations to problems in discrete mathematics. Comment: 30 pages. This submission supersedes arXiv:0805.4112v1. Changes in v2: Updated references, typos corrected
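
    To make the notions quoted above concrete (a rough numerical sketch, not from the paper), the snippet below checks log-concavity ($p_k^2 \ge p_{k-1}p_{k+1}$) and ultra-log-concavity ($k\,p_k^2 \ge (k+1)\,p_{k+1}p_{k-1}$) on truncated pmfs, and compares the entropy of an ultra-log-concave distribution with mean 2 against the Poisson(2), in line with the maximum entropy property cited above; all names and the chosen truncation are illustrative.

        from math import exp, factorial, comb, log

        def is_log_concave(p):
            return all(p[k]**2 >= p[k - 1] * p[k + 1] - 1e-15 for k in range(1, len(p) - 1))

        def is_ultra_log_concave(p):
            # log-concavity of p_k relative to the Poisson weights 1/k!
            return all(k * p[k]**2 >= (k + 1) * p[k + 1] * p[k - 1] - 1e-15
                       for k in range(1, len(p) - 1))

        def entropy(p):
            return -sum(q * log(q) for q in p if q > 0)

        lam, N = 2.0, 60                                   # mean and truncation point
        poisson = [exp(-lam) * lam**k / factorial(k) for k in range(N)]
        binomial = [comb(10, j) * 0.2**j * 0.8**(10 - j) for j in range(11)]   # also mean 2

        print(is_log_concave(poisson), is_ultra_log_concave(poisson))      # True True
        print(is_log_concave(binomial), is_ultra_log_concave(binomial))    # True True
        # Bin(10, 0.2) is ultra-log-concave with mean 2, so its entropy should not
        # exceed that of Poisson(2), consistent with the maximum entropy property above.
        print(entropy(binomial) <= entropy(poisson))                        # True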