
    Relative entropy and variational properties of generalized Gibbsian measures

    We study the relative entropy density for generalized Gibbs measures. We first show its existence and obtain a familiar expression in terms of entropy and relative energy for a class of ``almost Gibbsian measures'' (measures whose conditional probabilities are almost surely continuous). For quasilocal measures, we obtain a full variational principle. For the joint measures of the random field Ising model, we show that the weak Gibbs property holds, with an almost surely rapidly decaying translation-invariant potential. For these measures we show that the variational principle fails as soon as the measures lose the almost Gibbs property. These examples suggest that the class of weakly Gibbsian measures is too broad from the perspective of a reasonable thermodynamic formalism.
    Comment: Published by the Institute of Mathematical Statistics (http://www.imstat.org) in the Annals of Probability (http://www.imstat.org/aop/) at http://dx.doi.org/10.1214/00911790400000034
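As a reminder of the standard setting (the notation below is assumed, not taken verbatim from the paper): for translation-invariant probability measures ν and μ on a lattice spin system, the relative entropy density and the "familiar expression" referred to above are usually written as

```latex
% Relative entropy density of \nu with respect to \mu (assumed notation),
% with \Lambda_n an increasing sequence of boxes:
h(\nu \mid \mu) \;=\; \lim_{n \to \infty} \frac{1}{|\Lambda_n|}\,
    H_{\Lambda_n}(\nu \mid \mu),
\qquad
H_{\Lambda}(\nu \mid \mu) \;=\; \int \log \frac{d\nu_{\Lambda}}{d\mu_{\Lambda}}\, d\nu .

% For \mu Gibbs with translation-invariant potential \Phi, pressure p(\Phi),
% energy density e_\Phi(\nu) and entropy density s(\nu):
h(\nu \mid \mu) \;=\; p(\Phi) + e_\Phi(\nu) - s(\nu).
```

The full variational principle then asserts that h(ν | μ) = 0 if and only if ν is consistent with the same specification as μ; it is this equivalence that can fail for weakly Gibbsian measures.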

    Parsimonious Description of Generalized Gibbs Measures : Decimation of the 2d-Ising Model

    In this paper, we detail and complete the existing characterizations of the decimation of the Ising model on Z^2 in the generalized Gibbs context. We first recall a few features of the Dobrushin program of restoration of Gibbsianness and present the construction of global specifications consistent with the extremal decimated measures. We use them to consider these renormalized measures as almost Gibbsian measures and to make precise their convex set of DLR measures. We also recall the weakly Gibbsian description and complete it using a potential that admits a quenched correlation decay, i.e. a well-defined configuration-dependent length beyond which the potential decays exponentially. We use these results to incorporate these decimated measures into the new framework of parsimonious random fields that has recently been developed to investigate probabilistic aspects related to neuroscience.
    Comment: 32 pages, preliminary version
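The decimation transformation itself is simple to state: sample a 2d Ising configuration and keep only the spins on a sublattice of spacing b. The following is a minimal illustrative sketch (a plain Metropolis sampler on a torus; all parameter values are assumptions for illustration, not taken from the paper):

```python
import numpy as np

def metropolis_ising(L=16, beta=0.4, sweeps=200, seed=0):
    """Sample an approximate 2d Ising configuration on an L x L torus
    via single-spin-flip Metropolis updates (ferromagnet, coupling J = 1)."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            # Sum of the four nearest neighbours (periodic boundary).
            nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] \
               + s[i, (j + 1) % L] + s[i, (j - 1) % L]
            dE = 2 * s[i, j] * nb  # energy change of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i, j] = -s[i, j]
    return s

def decimate(s, b=2):
    """Decimation: keep every b-th spin in each direction.
    The law of the returned configuration is the renormalized measure."""
    return s[::b, ::b]

config = metropolis_ising()
renorm = decimate(config)
```

The subtlety studied in the paper is of course not in this map but in the renormalized measure: the law of `renorm` need not be Gibbsian, which is what the almost/weakly Gibbsian descriptions address.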

    Comparison between Suitable Priors for Additive Bayesian Networks

    Additive Bayesian networks (ABNs) are graphical models that extend the usual Bayesian generalized linear model to multiple dependent variables through the factorisation of the joint probability distribution of the underlying variables. When fitting an ABN model, the choice of prior on the parameters is of crucial importance. If an inadequate prior is used - for instance one that is too weakly informative - data separation and data sparsity lead to issues in the model selection process. In this work we present a simulation study comparing two weakly informative priors with a strongly informative one. The first weakly informative prior is a zero-mean Gaussian prior with a large variance, as currently implemented in the R package abn. The second is a Student's t-prior specifically designed for logistic regression. Finally, the strongly informative prior is again Gaussian, with mean equal to the true parameter value and a small variance. We compare the impact of these priors on the accuracy of the learned additive Bayesian network as a function of different parameters. We construct a simulation study to illustrate Lindley's paradox based on the prior choice. We conclude by highlighting the good performance of the informative Student's t-prior and the limited impact of Lindley's paradox. Finally, suggestions for further developments are provided.
    Comment: 8 pages, 4 figures
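The contrast between the two weakly informative priors can be illustrated on a single logistic-regression node. The sketch below is not the abn package's implementation; it compares MAP estimates under a wide zero-mean Gaussian prior and a heavier-tailed Student's t-prior, with all hyperparameters (sigma = 10, nu = 7, scale = 2.5) chosen here for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_posterior(beta, X, y, log_prior):
    z = X @ beta
    # Logistic log-likelihood in a numerically stable form.
    ll = np.sum(y * z - np.logaddexp(0.0, z))
    return -(ll + log_prior(beta))

def gaussian_log_prior(beta, sigma=10.0):
    # Weakly informative zero-mean Gaussian prior with large variance.
    return -0.5 * np.sum(beta**2) / sigma**2

def student_t_log_prior(beta, nu=7.0, scale=2.5):
    # Heavier-tailed Student's t-prior (up to an additive constant),
    # a common weakly informative choice for logistic coefficients.
    return -0.5 * (nu + 1) * np.sum(np.log1p((beta / scale) ** 2 / nu))

# Simulated data with known coefficients (hypothetical values).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_beta = np.array([1.0, -2.0, 0.5])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)

fits = {}
for name, prior in [("gaussian", gaussian_log_prior),
                    ("student_t", student_t_log_prior)]:
    res = minimize(neg_log_posterior, np.zeros(3), args=(X, y, prior))
    fits[name] = res.x
    print(name, np.round(res.x, 2))
```

With ample, well-separated data the two priors give similar estimates; the differences the paper studies emerge under data separation and sparsity, where the Gaussian tail shrinks too little and the t-prior's heavier tails regularize more gracefully.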

    On the Variational Principle for Generalized Gibbs Measures

    We present a novel approach to establishing the variational principle for Gibbs and generalized (weak and almost) Gibbs states. Limitations of a thermodynamic formalism for generalized Gibbs states will be discussed. A new class of intuitively weak Gibbs measures is introduced, and a typical example is studied. Finally, we present a new example of a non-Gibbsian measure arising from an industrial application.
    Comment: To appear in Markov Processes and Related Fields, proceedings of the workshop Gibbs-nonGibbs
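For orientation, the generalized Gibbs classes recurring in these abstracts can be sketched as follows (standard definitions in assumed notation, not quoted from any of the papers above):

```latex
% Weak Gibbs: there exists a potential \Phi, absolutely convergent
% \mu-almost surely, such that for every finite volume \Lambda
\mu\bigl(\sigma_\Lambda \mid \sigma_{\Lambda^c}\bigr)
  \;=\; \frac{1}{Z_\Lambda(\sigma_{\Lambda^c})}\,
        \exp\!\bigl(-H_\Lambda^{\Phi}(\sigma_\Lambda\,\sigma_{\Lambda^c})\bigr)
  \qquad \text{for } \mu\text{-a.e.\ } \sigma .

% Almost Gibbs instead requires the conditional probabilities
% \sigma \mapsto \mu(\sigma_\Lambda \mid \sigma_{\Lambda^c})
% to be continuous at \mu-a.e.\ configuration \sigma.
```

Every quasilocal (Gibbs) measure is almost Gibbs, and almost Gibbs measures under mild conditions are weakly Gibbsian; the failures of the variational principle discussed above occur strictly inside the weakly Gibbsian class.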