18,826 research outputs found

    Information as Distinctions: New Foundations for Information Theory

    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (joint entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle)--just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
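    As a concrete illustration of the definition above, here is a minimal Python sketch (my own, not from the paper) that computes logical entropy both by directly counting distinctions and by the equivalent closed form 1 - sum of squared block probabilities; the helper name logical_entropy and the example partition are assumptions for illustration.

        from itertools import product

        def logical_entropy(blocks):
            """Logical entropy h(pi) = |dit(pi)| / |U|^2 for a partition
            given as a list of disjoint blocks covering a finite set U."""
            U = [x for block in blocks for x in block]
            n = len(U)
            # A distinction (dit) is an ordered pair (u, v) whose elements
            # lie in different blocks of the partition.
            block_of = {x: i for i, block in enumerate(blocks) for x in block}
            dits = sum(1 for u, v in product(U, U) if block_of[u] != block_of[v])
            return dits / n**2

        # Example: partition {a, b, c, d} into blocks {a, b} and {c, d}.
        blocks = [["a", "b"], ["c", "d"]]
        print(logical_entropy(blocks))                     # 0.5 by dit counting
        print(1 - sum((len(b) / 4) ** 2 for b in blocks))  # 0.5 by the closed form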

    An Introduction to Logical Entropy and its Relation to Shannon Entropy

    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (joint entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle)--just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
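    Since this paper focuses on the relation to Shannon entropy, a short sketch of the two quantities side by side may help; this is my illustration (the distribution values are arbitrary), not code from the paper. Logical entropy is the probability that a random ordered pair is distinguished, while Shannon entropy is the average number of binary partitions (bits) needed to make all the distinctions.

        import math

        def logical_entropy(p):
            # h(p) = 1 - sum_i p_i^2: chance a random ordered pair is a dit
            return 1 - sum(pi ** 2 for pi in p)

        def shannon_entropy(p):
            # H(p) = -sum_i p_i log2 p_i: average binary splits to distinguish
            return -sum(pi * math.log2(pi) for pi in p if pi > 0)

        p = [0.5, 0.25, 0.25]
        print(logical_entropy(p))   # 0.625
        print(shannon_entropy(p))   # 1.5 bits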

    On the relation between plausibility logic and the maximum-entropy principle: a numerical study

    What is the relationship between plausibility logic and the principle of maximum entropy? When does the principle give unreasonable or wrong results? When is it appropriate to use the rule 'expectation = average'? Can plausibility logic give the same answers as the principle, and better answers if those of the principle are unreasonable? To try to answer these questions, this study offers a numerical collection of plausibility distributions given by the maximum-entropy principle and by plausibility logic for a set of fifteen simple problems: throwing dice. Comment: 24 pages of main text and references, 8 pages of tables, 7 pages of additional references.
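    For readers unfamiliar with the setup, the following Python sketch solves one instance of the kind of dice problem the study tabulates: given only a constrained mean face value m, the maximum-entropy distribution has the exponential form p_i proportional to exp(-lambda * i), with lambda fixed by the constraint. The specific numbers and the bisection solver are illustrative assumptions, not the paper's fifteen problems.

        import math

        def maxent_die(m, faces=6, tol=1e-12):
            """Max-entropy distribution over faces 1..faces with mean m."""
            def mean(lam):
                w = [math.exp(-lam * i) for i in range(1, faces + 1)]
                return sum(i * wi for i, wi in enumerate(w, start=1)) / sum(w)
            lo, hi = -50.0, 50.0            # bracket; mean(lam) is decreasing
            while hi - lo > tol:
                mid = (lo + hi) / 2
                if mean(mid) > m:
                    lo = mid
                else:
                    hi = mid
            lam = (lo + hi) / 2
            w = [math.exp(-lam * i) for i in range(1, faces + 1)]
            Z = sum(w)
            return [wi / Z for wi in w]

        print(maxent_die(4.5))  # skewed toward high faces; m = 3.5 gives uniform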

    Bayesian Updating, Model Class Selection and Robust Stochastic Predictions of Structural Response

    A fundamental issue when predicting structural response by using mathematical models is how to treat both modeling and excitation uncertainty. A general framework for this is presented which uses probability as a multi-valued conditional logic for quantitative plausible reasoning in the presence of uncertainty due to incomplete information. The fundamental probability models that represent the structure’s uncertain behavior are specified by the choice of a stochastic system model class: a set of input-output probability models for the structure and a prior probability distribution over this set that quantifies the relative plausibility of each model. A model class can be constructed from a parameterized deterministic structural model by stochastic embedding utilizing Jaynes’ Principle of Maximum Information Entropy. Robust predictive analyses use the entire model class with the probabilistic predictions of each model being weighted by its prior probability, or if structural response data is available, by its posterior probability from Bayes’ Theorem for the model class. Additional robustness to modeling uncertainty comes from combining the robust predictions of each model class in a set of competing candidates weighted by the prior or posterior probability of the model class, the latter being computed from Bayes’ Theorem. This higher-level application of Bayes’ Theorem automatically applies a quantitative Ockham's razor that penalizes the data-fit of more complex model classes that extract more information from the data. Robust predictive analyses involve integrals over high-dimensional spaces that usually must be evaluated numerically. Published applications have used Laplace's method of asymptotic approximation or Markov Chain Monte Carlo algorithms.
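    The two levels of Bayesian weighting described above reduce to a short computation once the evidences p(D|M_j) are in hand. The sketch below uses hypothetical log-evidences and class predictions (in practice these come from Laplace's asymptotic approximation or MCMC, as the abstract notes); all names and numbers are illustrative assumptions.

        import math

        # Hypothetical log-evidences log p(D | M_j) for three model classes,
        # with equal prior probabilities over the classes.
        log_evidence = {"M1": -120.3, "M2": -118.9, "M3": -125.0}
        prior = {m: 1 / 3 for m in log_evidence}

        # Posterior P(M_j | D) proportional to p(D | M_j) P(M_j); shift by the
        # max log-evidence before exponentiating for numerical stability.
        shift = max(log_evidence.values())
        unnorm = {m: math.exp(le - shift) * prior[m] for m, le in log_evidence.items()}
        Z = sum(unnorm.values())
        posterior = {m: u / Z for m, u in unnorm.items()}

        # Robust prediction: posterior-weighted mixture of each class's
        # predictive mean. The Ockham penalty is already in the evidences.
        class_prediction = {"M1": 2.10, "M2": 2.35, "M3": 1.95}  # hypothetical
        robust = sum(posterior[m] * class_prediction[m] for m in posterior)
        print(posterior, robust)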

    A quantum-mechanical Maxwell's demon

    A Maxwell's demon is a device that gets information and trades it in for thermodynamic advantage, in apparent (but not actual) contradiction to the second law of thermodynamics. Quantum-mechanical versions of Maxwell's demon exhibit features that classical versions do not: in particular, a device that gets information about a quantum system disturbs it in the process. In addition, the information produced by quantum measurement acts as an additional source of thermodynamic inefficiency. This paper investigates the properties of quantum-mechanical Maxwell's demons, and proposes experimentally realizable models of such devices. Comment: 13 pages, TeX.
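    The thermodynamic bookkeeping behind the "apparent (but not actual) contradiction" can be made explicit with the standard Szilard/Landauer accounting; the sketch below is that textbook balance, not the paper's specific demon models, and the temperature is an arbitrary choice.

        import math

        k_B = 1.380649e-23                       # Boltzmann constant, J/K
        T = 300.0                                # bath temperature, K (assumed)

        work_per_bit = k_B * T * math.log(2)     # Szilard-engine extraction per bit
        erasure_cost = k_B * T * math.log(2)     # Landauer bound for erasing that bit

        net = work_per_bit - erasure_cost        # <= 0: the second law survives
        print(f"extracted {work_per_bit:.3e} J, erased {erasure_cost:.3e} J, net {net:.3e} J")
        # In the quantum case, measurement back-action makes the balance worse.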