6,254 research outputs found

    Bayesian Inference in Processing Experimental Data: Principles and Basic Applications

    This report introduces general ideas and some basic methods of Bayesian probability theory applied to physics measurements. Our aim is to make the reader familiar, through examples rather than rigorous formalism, with concepts such as: model comparison (including the automatic Ockham's Razor filter provided by the Bayesian approach); parametric inference; quantification of the uncertainty about the value of physical quantities, also taking into account systematic effects; the role of marginalization; posterior characterization; predictive distributions; hierarchical modelling and hyperparameters; Gaussian approximation of the posterior and recovery of conventional methods, especially maximum likelihood and chi-square fits under well-defined conditions; conjugate priors, transformation invariance, and maximum-entropy-motivated priors; and Monte Carlo estimates of expectations, including a short introduction to Markov Chain Monte Carlo methods.
    Comment: 40 pages, 2 figures, invited paper for Reports on Progress in Physics
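
    As a concrete illustration of the last item, here is a minimal sketch (not taken from the report) of a Metropolis random-walk sampler used to form a Monte Carlo estimate of a posterior expectation; the data, proposal width, and chain length are all illustrative.

        import math
        import random

        random.seed(0)

        # Toy data: repeated measurements of a quantity mu with unit Gaussian noise.
        data = [4.8, 5.1, 5.3, 4.9, 5.2]

        def log_posterior(mu):
            # Flat prior; Gaussian likelihood with known sigma = 1.
            return -0.5 * sum((x - mu) ** 2 for x in data)

        mu, samples = 0.0, []
        for _ in range(20000):
            proposal = mu + random.gauss(0.0, 0.5)  # symmetric random-walk proposal
            if math.log(random.random()) < log_posterior(proposal) - log_posterior(mu):
                mu = proposal                       # Metropolis accept step
            samples.append(mu)

        kept = samples[2000:]                       # discard burn-in
        print(sum(kept) / len(kept))                # Monte Carlo posterior mean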

    Gravity Dual for a Model of Perception

    One of the salient features of human perception is its invariance under dilatation in addition to the Euclidean group, but its non-invariance under special conformal transformations. We investigate a holographic approach to information processing in image discrimination with this feature. We claim that a strongly coupled analogue of the statistical model proposed by Bialek and Zee can be holographically realized in scale-invariant but non-conformal Euclidean geometries. We identify the Bayesian probability distribution of our generalized Bialek-Zee model with the GKPW partition function of the dual gravitational system. We provide a concrete example of the geometric configuration based on a vector condensation model coupled with the Euclidean Einstein-Hilbert action. From the proposed geometry, we study sample correlation functions to compute the Bayesian probability distribution.
    Comment: 21 pages; v2: condition on conformal invariance of a free vector model corrected
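
    For orientation, the GKPW dictionary invoked here is the standard holographic relation (general AdS/CFT background, not a result of this paper) between the gravitational partition function with boundary value \phi_0 and a generating functional of the boundary theory:

        Z_{\mathrm{grav}}\bigl[\phi \to \phi_0\bigr]
          = \Bigl\langle \exp\Bigl( \int d^d x \, \phi_0(x) \, \mathcal{O}(x) \Bigr) \Bigr\rangle_{\mathrm{QFT}}

    The abstract's identification reads the left-hand side, evaluated in a scale-invariant but non-conformal Euclidean geometry, as the Bayesian probability distribution of the generalized Bialek-Zee model.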

    Bayesian Dropout

    Dropout has recently emerged as a simple and powerful method for training neural networks, preventing co-adaptation by stochastically omitting neurons. Dropout is currently not grounded in explicit modelling assumptions, which has so far precluded its adoption in Bayesian modelling. Using Bayesian entropic reasoning, we show that dropout can be interpreted as optimal inference under constraints. We demonstrate this on an analytically tractable regression model, providing a Bayesian interpretation of its mechanism for regularizing and preventing co-adaptation, as well as of its connection to other Bayesian techniques. We also discuss two general approximate techniques for applying Bayesian dropout to general models, one based on an analytical approximation and the other on stochastic variational techniques. These techniques are then applied to a Bayesian logistic regression problem and are shown to improve performance as the model becomes more misspecified. Our framework roots dropout as a theoretically justified and practical tool for statistical modelling, allowing Bayesians to tap into the benefits of dropout training.
    Comment: 21 pages, 3 figures. Manuscript prepared in 2014 and awaiting submission
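
    For context, here is a minimal sketch of plain dropout training on a logistic regression, with a Monte Carlo average over dropout masks at prediction time. This is a generic illustration, not the paper's entropic derivation or its analytical/variational approximations, and all data and names are invented.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy binary classification data.
        X = rng.normal(size=(200, 10))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

        w, p_keep, lr = np.zeros(10), 0.8, 0.1

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        for _ in range(500):  # gradient descent with dropout on the inputs
            mask = rng.binomial(1, p_keep, size=X.shape) / p_keep  # inverted dropout
            p = sigmoid((X * mask) @ w)
            w -= lr * (X * mask).T @ (p - y) / len(y)

        # Predictive mean and spread over dropout masks at test time.
        x_new = rng.normal(size=10)
        preds = [sigmoid((x_new * rng.binomial(1, p_keep, 10) / p_keep) @ w)
                 for _ in range(100)]
        print(np.mean(preds), np.std(preds))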

    Robust Estimators under the Imprecise Dirichlet Model

    Walley's Imprecise Dirichlet Model (IDM) for categorical data overcomes several fundamental problems from which other approaches to uncertainty suffer. Yet, to be useful in practice, one needs efficient ways of computing the imprecise (robust) sets or intervals. The main objective of this work is to derive exact, conservative, and approximate robust and credible interval estimates under the IDM for a large class of statistical estimators, including the entropy and mutual information.
    Comment: 16 LaTeX pages
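
    For a sense of what IDM interval estimates look like in the simplest case, here is a minimal sketch of Walley's interval for a single category probability: with count n_j out of N observations and prior strength s, the lower and upper expectations are n_j/(N+s) and (n_j+s)/(N+s). The paper's actual subject, intervals for derived estimators such as entropy and mutual information, is not reproduced here.

        def idm_interval(counts, j, s=1.0):
            # Lower/upper posterior expectation of category j under the IDM
            # with prior strength s (Walley's standard formulas).
            N = sum(counts)
            return counts[j] / (N + s), (counts[j] + s) / (N + s)

        print(idm_interval([10, 3, 7], j=0))  # -> (0.476..., 0.523...)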

    Harold Jeffreys's Theory of Probability Revisited

    Published exactly seventy years ago, Jeffreys's Theory of Probability (1939) has had a unique impact on the Bayesian community and is now considered to be one of the main classics in Bayesian Statistics as well as the initiator of the objective Bayes school. In particular, its advances on the derivation of noninformative priors as well as on the scaling of Bayes factors have had a lasting impact on the field. However, the book reflects the characteristics of the time, especially in terms of mathematical rigor. In this paper we point out the fundamental aspects of this reference work, especially the thorough coverage of testing problems and the construction of both estimation and testing noninformative priors based on functional divergences. Our major aim here is to help modern readers navigate this difficult text and concentrate on passages that are still relevant today.
    Comment: This paper is commented on in [arXiv:1001.2967], [arXiv:1001.2968], [arXiv:1001.2970], [arXiv:1001.2975], [arXiv:1001.2985], and [arXiv:1001.3073]; rejoinder in [arXiv:0909.1008]. Published at http://dx.doi.org/10.1214/09-STS284 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
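
    For reference, the noninformative prior most closely associated with the book is the Jeffreys prior (standard material, not specific to this review), defined through the Fisher information:

        \pi_J(\theta) \propto \sqrt{\det I(\theta)}, \qquad
        I(\theta)_{ij} = \mathrm{E}_\theta\!\left[ -\frac{\partial^2}{\partial\theta_i \, \partial\theta_j} \log f(X \mid \theta) \right]

    For a Bernoulli(\theta) model this gives \pi_J(\theta) \propto \theta^{-1/2}(1-\theta)^{-1/2}, i.e. a Beta(1/2, 1/2) distribution.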

    Statistical Inference and the Plethora of Probability Paradigms: A Principled Pluralism

    The major competing statistical paradigms share a common remarkable but unremarked thread: in many of their inferential applications, different probability interpretations are combined. How this plays out in different theories of inference depends on the type of question asked. We distinguish four question types: confirmation, evidence, decision, and prediction. We show that Bayesian confirmation theory mixes what are intuitively “subjective” and “objective” interpretations of probability, whereas the likelihood-based account of evidence melds three conceptions of what constitutes an “objective” probability.