
Information and Entropy

Abstract

What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs, and therefore it is the force that induces us to change our minds. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops -- the Maximum Entropy and the Bayesian methods -- into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.

Comment: Presented at MaxEnt 2007, the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods (July 8-13, 2007, Saratoga Springs, New York, USA).
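The ME update described above can be illustrated with a minimal sketch (not taken from the paper): for a discrete prior q and a single expectation constraint, the distribution maximizing the relative entropy S[p|q] = -Σ p_i log(p_i/q_i) has the exponential form p_i ∝ q_i exp(λ f_i), with the multiplier λ fixed by the constraint. The function name `me_update` and the bisection search for λ are our own choices for this illustration.

```python
import math

def me_update(prior, f, F, lo=-50.0, hi=50.0, tol=1e-12):
    """Update `prior` (q_i) to the posterior p_i maximizing relative entropy
    subject to the constraint sum_i p_i f_i = F and normalization.
    The solution is p_i = q_i exp(lam * f_i) / Z; the expectation of f is
    monotone increasing in lam, so lam can be found by bisection."""
    def tilted(lam):
        w = [q * math.exp(lam * fi) for q, fi in zip(prior, f)]
        Z = sum(w)
        p = [wi / Z for wi in w]
        return sum(pi * fi for pi, fi in zip(p, f)), p

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        ev, _ = tilted(mid)
        if ev < F:
            lo = mid
        else:
            hi = mid
    return tilted(0.5 * (lo + hi))[1]

# Example: uniform prior over a six-sided die, constrained to mean 4.5
# (a standard MaxEnt illustration; a uniform prior reduces ME to MaxEnt).
prior = [1.0 / 6] * 6
faces = [1, 2, 3, 4, 5, 6]
posterior = me_update(prior, faces, 4.5)
```

With a uniform prior this reproduces the familiar MaxEnt "Brandeis dice" answer; with a non-uniform prior the same routine performs the more general ME update from that prior.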

