
    Yet Another Analysis of Dice Problems

    During the MaxEnt 2002 workshop in Moscow, Idaho, Tony Vignaux once again raised a few simple questions about using Maximum Entropy or Bayesian approaches for the famous dice problems, which have been analyzed many times at this workshop and elsewhere. This paper offers another analysis of these problems. I hope it will answer the questions of Tony and other workshop participants about the situations in which we can use Maximum Entropy or Bayesian approaches, and the cases in which we can actually use both. Keywords: Dice problems and probability theory, Maximum Likelihood, Bayesian inference, Maximum A Posteriori, Entropy, Maximum entropy, Maximum entropy in the mean. Comment: Presented at MaxEnt 2002, the 22nd International Workshop on Bayesian and Maximum Entropy Methods (Aug. 3-9, 2002, Moscow, Idaho, USA). To appear in Proceedings of the American Institute of Physics.
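    A minimal sketch of the kind of dice problem these analyses revisit, in the spirit of Jaynes' Brandeis example: given only that the average roll of a die is 4.5 (an illustrative constraint value, not one taken from the paper), the maximum-entropy distribution over the faces 1..6 has the exponential form p(k) proportional to exp(lam * k), with lam fixed by the mean constraint.

        # Hedged sketch: solve the classic MaxEnt dice problem by root finding.
        # The constraint value 4.5 is assumed for illustration.
        import numpy as np
        from scipy.optimize import brentq

        faces = np.arange(1, 7)

        def mean_given(lam):
            # Mean of the exponential-family distribution p(k) ~ exp(lam * k).
            w = np.exp(lam * faces)
            p = w / w.sum()
            return p @ faces

        target = 4.5
        # mean_given is monotone in lam, from ~1 to ~6, so a bracket suffices.
        lam = brentq(lambda l: mean_given(l) - target, -5.0, 5.0)
        p = np.exp(lam * faces)
        p /= p.sum()
        print("lambda =", round(lam, 4))
        print("MaxEnt p(k):", p.round(4))   # weights increase toward face 6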

    Entropic Inference

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes' rule, and therefore unifies the two themes of these workshops -- the Maximum Entropy and the Bayesian methods -- into a single general inference scheme. Comment: Presented at MaxEnt 2010, the 30th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering (July 4-9, 2010, Chamonix, France).
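    The update rule the tutorial builds toward can be summarized compactly. The following is a hedged aide-memoire in generic notation (q for the prior, p for the candidate posterior, f for a constraint function), not copied from the paper: maximizing the relative entropy subject to an expectation constraint yields the standard exponential-family update.

        % Relative entropy of a candidate posterior p with respect to the prior q:
        \[
          S[p\,|\,q] = -\int dx\; p(x)\,\ln\frac{p(x)}{q(x)}
        \]
        % Maximizing S subject to normalization and the constraint <f(x)> = F
        % gives the exponential update, with lambda fixed by the constraint:
        \[
          p(x) = \frac{q(x)\,e^{\lambda f(x)}}{Z(\lambda)},
          \qquad
          Z(\lambda) = \int dx\; q(x)\,e^{\lambda f(x)}
        \]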

    A Bayesian Alternative to Generalized Cross Entropy Solutions for Underdetermined Econometric Models

    This paper presents a Bayesian alternative to Generalized Maximum Entropy (GME) and Generalized Cross Entropy (GCE) methods for deriving solutions to econometric models represented by underdetermined systems of equations. For certain types of econometric model specifications, the Bayesian approach provides fully equivalent results to GME-GCE techniques. However, in its general form, the proposed Bayesian methodology allows a more direct and straightforwardly interpretable formulation of available prior information and can significantly reduce the computational effort involved in finding solutions. The technique can be adapted to provide solutions in situations characterized by either informative or uninformative prior information. Keywords: Underdetermined Equation Systems, Maximum Entropy, Bayesian Priors, Structural Estimation, Calibration, Research Methods/Statistical Methods. JEL codes: C11, C13, C51.
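    For orientation, a minimal sketch of the GME idea the paper contrasts with its Bayesian alternative: each unknown coefficient is written as a convex combination of user-chosen support points, and the Shannon entropy of the combination weights is maximized subject to the (underdetermined) data constraints. The dimensions, support points, and noiseless formulation below are illustrative assumptions, not the paper's specification.

        # Hedged GME sketch for an underdetermined linear system y = X beta.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        n, k = 3, 5                        # 3 equations, 5 unknowns: underdetermined
        X = rng.normal(size=(n, k))
        beta_true = np.array([1.0, -0.5, 0.0, 2.0, 0.5])
        y = X @ beta_true                  # noiseless data constraints (assumed)

        z = np.array([-5.0, 0.0, 5.0])     # assumed support points for every coefficient
        m = len(z)

        def neg_entropy(p_flat):
            # Minimize negative Shannon entropy of the weights.
            p = p_flat.reshape(k, m)
            return np.sum(p * np.log(p + 1e-12))

        def beta_of(p_flat):
            # beta_j = sum_i z_i * p_{ji}: coefficients as convex combinations.
            return p_flat.reshape(k, m) @ z

        constraints = [
            {"type": "eq", "fun": lambda p: X @ beta_of(p) - y},          # fit the data
            {"type": "eq", "fun": lambda p: p.reshape(k, m).sum(1) - 1},  # weights sum to 1
        ]
        p0 = np.full(k * m, 1.0 / m)       # start from uniform (maximum-entropy) weights
        res = minimize(neg_entropy, p0, bounds=[(1e-9, 1)] * (k * m),
                       constraints=constraints)
        # The estimate is the data-consistent solution of maximum weight entropy;
        # it need not equal beta_true, since the system is underdetermined.
        print("GME estimate:", beta_of(res.x).round(3))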

    Information and Entropy

    What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops -- the Maximum Entropy and the Bayesian methods -- into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples. Comment: Presented at MaxEnt 2007, the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods (July 8-13, 2007, Saratoga Springs, New York, USA).
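    The claim that Bayes' rule is a special case of ME admits a compact statement. The following is a hedged sketch in generic notation (x for data, theta for parameters), not the paper's own derivation: updating the joint prior q(x, theta) = q(theta) q(x|theta) by ME, under the constraint that the data are now known to be the observed value x', yields the familiar posterior for theta.

        % ME update of the joint distribution, constrained so that the marginal
        % over x collapses onto the observed x', reproduces Bayes' rule:
        \[
          p(\theta) = \frac{q(\theta)\, q(x'|\theta)}{q(x')}
        \]
        % Conversely, an expectation constraint applied to a uniform prior
        % recovers the ordinary MaxEnt assignment.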

    Entropic Priors

    The method of Maximum (relative) Entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiments that cannot be repeated the resulting "entropic prior" is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. As an example the entropic prior for a Gaussian likelihood is calculated. Comment: Presented at MaxEnt'03, the 23rd International Workshop on Bayesian Inference and Maximum Entropy Methods (August 3-8, 2003, Jackson Hole, WY, USA).
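    For readers unfamiliar with the comparison being drawn, Einstein's fluctuation formula is a standard result of statistical mechanics, stated here for reference rather than taken from the paper: the probability of a macroscopic fluctuation grows exponentially with the associated entropy change.

        % Probability of a fluctuation with entropy change Delta S
        % (k_B is Boltzmann's constant):
        \[
          W \propto \exp\!\left(\frac{\Delta S}{k_B}\right)
        \]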

    Maximum Entropy and Bayesian Data Analysis: Entropic Priors

    The problem of assigning probability distributions which objectively reflect the prior information available about experiments is one of the major stumbling blocks in the use of Bayesian methods of data analysis. In this paper the method of Maximum (relative) Entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is inspired and guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiments that cannot be repeated the resulting "entropic prior" is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. The important case of a Gaussian likelihood is treated in detail. Comment: 23 pages, 2 figures
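    As background for the Gaussian case, the differential entropy of a Gaussian likelihood has a standard closed form; it depends only on the width sigma, not the location mu. This is textbook material included for orientation, not a result specific to the paper.

        % Entropy of p(x | mu, sigma) = N(mu, sigma^2); note it is independent of mu:
        \[
          S(\mu,\sigma) = \tfrac{1}{2}\,\ln\!\left(2\pi e\,\sigma^{2}\right)
        \]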

    Maximum entropy, fluctuations and priors

    The method of maximum entropy (ME) is extended to address the following problem: Once one accepts that the ME distribution is to be preferred over all others, the question is to what extent are distributions with lower entropy supposed to be ruled out. Two applications are given. The first is to the theory of thermodynamic fluctuations. The formulation is exact, covariant under changes of coordinates, and allows fluctuations of both the extensive and the conjugate intensive variables. The second application is to the construction of an objective prior for Bayesian inference. The prior obtained by following the ME method to its inevitable conclusion turns out to be a special case of what are currently known under the name of entropic priors. Comment: Presented at MaxEnt 2000, the 20th International Workshop on Bayesian Inference and Maximum Entropy Methods (July 8-13, 2000, Gif-sur-Yvette, France).
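    One standard way to quantify "to what extent are lower-entropy distributions ruled out" is the large-deviations bound of Sanov's theorem, stated here as a hedged pointer in generic notation (the paper's own formulation may differ): the probability that N independent samples from the true distribution p* produce an empirical distribution near some other p decays exponentially in the relative entropy between them.

        % Sanov-type decay of the probability that the empirical distribution
        % \hat{p}_N of N samples lies near a distribution p different from p*:
        \[
          \Pr\big(\hat{p}_N \approx p\big) \sim \exp\!\big(-N\, D_{\mathrm{KL}}(p\,\|\,p^{*})\big)
        \]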