
    Expressions for the entropy of binomial-type distributions

    We develop a general method for computing logarithmic and log-gamma expectations of distributions. As a result, we derive series expansions and integral representations of the entropy for several fundamental distributions, including the Poisson, binomial, beta-binomial, negative binomial, and hypergeometric distributions. Our results also establish connections between these entropy functions and the Riemann zeta function and its generalizations.
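
    A minimal numerical sketch of the quantity in question (not the paper's series method): the Shannon entropy of a Poisson distribution evaluated by direct summation over its support, compared against the classical large-lambda asymptotic expansion. The helper name poisson_entropy, the tolerance, and the truncation rule are illustrative assumptions.

        import math
        from scipy.stats import poisson

        def poisson_entropy(lam: float, tol: float = 1e-15) -> float:
            """Entropy in nats, H = -sum_k p(k) ln p(k), by truncated summation."""
            h, k = 0.0, 0
            while True:
                p = poisson.pmf(k, lam)
                if p > 0.0:
                    h -= p * math.log(p)
                if k > lam and poisson.sf(k, lam) < tol:  # remaining tail mass is negligible
                    break
                k += 1
            return h

        lam = 20.0
        direct = poisson_entropy(lam)
        # Classical expansion: H ~ (1/2) ln(2*pi*e*lam) - 1/(12*lam) + O(lam**-2)
        asymptotic = 0.5 * math.log(2 * math.pi * math.e * lam) - 1.0 / (12.0 * lam)
        print(f"direct: {direct:.6f}  asymptotic: {asymptotic:.6f}")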

    A simple derivation and classification of common probability distributions based on information symmetry and measurement scale

    Commonly observed patterns typically follow a few distinct families of probability distributions. Over one hundred years ago, Karl Pearson provided a systematic derivation and classification of the common continuous distributions. His approach was phenomenological: a differential equation that generated the common distributions, without any underlying conceptual basis for why common distributions take particular forms or what explains their familial relations. Pearson's system and its descendants remain the most popular systematic classification of probability distributions. Here, we unify the disparate forms of common distributions into a single system based on two meaningful and justifiable propositions. First, distributions follow maximum entropy subject to constraints, where maximum entropy is equivalent to minimum information. Second, different problems associate magnitude with information in different ways, an association we describe in terms of the relation between information invariance and measurement scale. Our framework relates the different continuous probability distributions through the variations in measurement scale that change each family of maximum entropy distributions into a distinct family.
    Comment: 17 pages, 0 figures
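
    As a hedged illustration of the first proposition (not the authors' derivation): maximizing entropy subject to a fixed mean on a discrete support yields the exponential (Boltzmann) form p_i proportional to exp(-beta*x_i), with the Lagrange multiplier beta fixed by the constraint. The support grid, target mean, and bracketing interval below are arbitrary assumptions.

        import numpy as np
        from scipy.optimize import brentq

        x = np.linspace(0.0, 10.0, 201)   # discretized support (assumed)
        target_mean = 2.0                 # the single constraint (assumed)

        def mean_at(beta: float) -> float:
            """Mean of the maximum entropy distribution p_i ~ exp(-beta*x_i)."""
            w = np.exp(-beta * x)
            p = w / w.sum()
            return float(p @ x)

        # Solve for the multiplier that satisfies the mean constraint.
        beta = brentq(lambda b: mean_at(b) - target_mean, 1e-6, 50.0)
        w = np.exp(-beta * x)
        p = w / w.sum()
        print(f"beta={beta:.4f}  mean={p @ x:.4f}  entropy={-(p * np.log(p)).sum():.4f}")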

    Statistical Models of Nuclear Fragmentation

    A method is presented that allows exact calculations of fragment multiplicity distributions for a canonical ensemble of non-interacting clusters. Fragmentation properties are shown to depend on only a few parameters. Fragments are shown to be copiously produced above the transition temperature. At this transition temperature, the calculated multiplicity distributions broaden and become strongly super-Poissonian. This behavior is compared to predictions from a percolation model. A corresponding microcanonical formalism is also presented.
    Comment: 12 pages, 5 figures
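
    A sketch in the spirit of the exact canonical method (the standard recursive partition-function approach for non-interacting clusters, in the style of Chase and Mekjian; whether it coincides with this paper's formalism in detail is an assumption). The single-fragment weights w[k] are toy values, not physical ones.

        # Z_A = (1/A) * sum_{k=1..A} k * w[k] * Z_{A-k}, with Z_0 = 1; the mean
        # multiplicity of size-k fragments is then <n_k> = w[k] * Z_{A-k} / Z_A.
        def canonical_Z(A: int, w: list) -> list:
            Z = [1.0] + [0.0] * A
            for a in range(1, A + 1):
                Z[a] = sum(k * w[k] * Z[a - k] for k in range(1, a + 1)) / a
            return Z

        A = 50
        w = [0.0] + [1.0 / k ** 1.5 for k in range(1, A + 1)]  # toy weights (assumed)
        Z = canonical_Z(A, w)
        n_mean = [w[k] * Z[A - k] / Z[A] for k in range(1, A + 1)]
        # Mass conservation check: sum_k k * <n_k> must equal A exactly.
        print(sum(k * n for k, n in zip(range(1, A + 1), n_mean)))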

    How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems, by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to non-extensive, non-ergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for non-ergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
    Comment: 6 pages, 1 figure. To appear in PNAS
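
    For orientation (an illustrative aside, not the paper's derivation): the (c,d)-classification contains familiar entropies as limiting cases, with Boltzmann-Gibbs-Shannon at (c,d)=(1,1) and, for example, Tsallis S_q in the (c,d)=(q,0) class. A minimal numerical check that S_q recovers the Shannon entropy as q approaches 1, on an arbitrary example distribution:

        import numpy as np

        def shannon(p: np.ndarray) -> float:
            p = p[p > 0]
            return float(-(p * np.log(p)).sum())

        def tsallis(p: np.ndarray, q: float) -> float:
            """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1)."""
            return float((1.0 - (p ** q).sum()) / (q - 1.0))

        p = np.array([0.5, 0.25, 0.15, 0.1])   # example distribution (assumed)
        for q in (2.0, 1.5, 1.1, 1.01, 1.001):
            print(f"q={q}: S_q={tsallis(p, q):.6f}")
        print(f"Shannon: {shannon(p):.6f}")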

    Generalized Transmuted Family of Distributions: Properties and Applications

    We introduce and study general mathematical properties of a new generator of continuous distributions with two extra parameters, called the Generalized Transmuted Family of Distributions. We investigate the shapes and present some special models. The new density function can be expressed as a linear combination of exponentiated densities based on the same baseline distribution. We obtain explicit expressions for the ordinary and incomplete moments, generating function, Bonferroni and Lorenz curves, asymptotic distribution of the extreme values, Shannon and Rényi entropies, and order statistics, which hold for any baseline model. Further, we introduce a bivariate extension of the new family. We discuss different methods of estimation of the model parameters and illustrate the potential application of the model via real data. A brief simulation study evaluating the maximum likelihood estimators is performed. Finally, certain characterizations of our model are presented.
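
    For context, a sketch of the classical quadratic rank-transmutation map on which transmuted families are built (the paper's generalized family adds further shape parameters; the exponential baseline, lambda value, and helper names below are illustrative assumptions): F(x) = (1 + lam)*G(x) - lam*G(x)**2 with |lam| <= 1, which reduces to the baseline CDF G at lam = 0.

        import numpy as np

        def transmuted_cdf(G, lam: float):
            """CDF of the (classical) transmuted family built from baseline CDF G."""
            return lambda x: (1.0 + lam) * G(x) - lam * G(x) ** 2

        def sample(G_inv, lam: float, size: int, seed: int = 0) -> np.ndarray:
            """Inverse-transform sampling: solve F(x) = u via the quadratic in G(x)."""
            u = np.random.default_rng(seed).uniform(size=size)
            if lam == 0.0:
                g = u
            else:
                # G(x) is the root in [0, 1] of lam*g**2 - (1 + lam)*g + u = 0.
                g = ((1.0 + lam) - np.sqrt((1.0 + lam) ** 2 - 4.0 * lam * u)) / (2.0 * lam)
            return G_inv(g)

        # Baseline: unit-rate exponential, an arbitrary choice for the demo.
        G = lambda x: 1.0 - np.exp(-x)
        G_inv = lambda g: -np.log1p(-g)
        print(sample(G_inv, lam=0.7, size=5))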