14 research outputs found

    Information Physics: The New Frontier

    Full text link
    At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several significant fundamental advances in our understanding of the process of inference, which make it clear that these are inferential theories. That is, rather than being descriptions of the behavior of the universe, these theories describe how observers can make optimal predictions about the universe. In such a picture, information plays a critical role. What is more, little clues, such as the fact that black holes have entropy, continue to suggest that information is fundamental to physics in general. In the last decade, our fundamental understanding of probability theory has led to a Bayesian revolution. In addition, we have come to recognize that the foundations go far deeper, and that Cox's approach of generalizing a Boolean algebra to a probability calculus is the first specific example of the more fundamental idea of assigning valuations to partially-ordered sets. By considering this as a natural way to introduce quantification to the more fundamental notion of ordering, one obtains an entirely new way of deriving physical laws. I will introduce this new way of thinking by demonstrating how one can quantify partially-ordered sets and, in the process, derive physical laws. The implication is that physical law does not reflect the order in the universe; instead, it is derived from the order imposed by our description of the universe. Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science. Comment: 17 pages, 6 figures. Knuth K.H. 2010. Information physics: The new frontier. J.-F. Bercher, P. Bessière, and A. Mohammad-Djafari (eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2010), Chamonix, France, July 2010.
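    A minimal sketch of the quantification idea described in this abstract (the notation here is mine, not the paper's): a valuation assigns a real number to each element of a partially-ordered set in a way that respects the ordering,

        \[
            v : \mathcal{P} \to \mathbb{R}, \qquad x \le y \;\Rightarrow\; v(x) \le v(y),
        \]

    and, on Knuth's account, requiring such assignments to remain consistent with the lattice structure is what forces the familiar calculi (probability among them) rather than leaving them as postulates.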

    Generalized probabilities in statistical theories

    Get PDF
    In this review article, we present different formal frameworks for the description of generalized probabilities in statistical theories. We discuss the particular cases of probabilities appearing in classical and quantum mechanics, possible generalizations of the approaches of A. N. Kolmogorov and R. T. Cox to non-commutative models, and the approach to generalized probabilities based on convex sets.
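    To fix the contrast the review surveys (standard formulations; the presentation below is mine): a classical Kolmogorov probability is a normalized, additive measure on a sigma-algebra of events, while its quantum counterpart is a state on the lattice of projections of a Hilbert space,

        \[
            \mu : \Sigma \to [0,1], \qquad \mu(\Omega) = 1, \qquad \mu\Big(\bigcup_i A_i\Big) = \sum_i \mu(A_i) \ \ \text{for pairwise disjoint } A_i,
        \]
        \[
            s : \mathcal{P}(\mathcal{H}) \to [0,1], \qquad s(\mathbf{1}) = 1, \qquad s\Big(\sum_i P_i\Big) = \sum_i s(P_i) \ \ \text{for mutually orthogonal } P_i,
        \]

    with the convex-set approach generalizing both by taking the space of states itself to be an abstract convex set.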

    Maximum Joint Entropy and Information-Based Collaboration of Automated Learning Machines

    Full text link
    We are working to develop automated intelligent agents, which can act and react as learning machines with minimal human intervention. To accomplish this, an intelligent agent is viewed as a question-asking machine, which is designed by coupling the processes of inference and inquiry to form a model-based learning unit. In order to select maximally-informative queries, the intelligent agent needs to be able to compute the relevance of a question. This is accomplished by employing the inquiry calculus, which is dual to the probability calculus, and extends information theory by explicitly requiring context. Here, we consider the interaction between two question-asking intelligent agents, and note that there is a potential information redundancy with respect to the two questions that the agents may choose to pose. We show that the information redundancy is minimized by maximizing the joint entropy of the questions, which simultaneously maximizes the relevance of each question while minimizing the mutual information between them. Maximum joint entropy is therefore an important principle of information-based collaboration, which enables intelligent agents to efficiently learn together. Comment: 8 pages, 1 figure, to appear in the proceedings of MaxEnt 2011 held in Waterloo, Canada.
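    A compact way to see the claim in this abstract, using standard information-theoretic identities (the labels Q1 and Q2 for the agents' questions are mine): the joint entropy decomposes as

        \[
            H(Q_1, Q_2) = H(Q_1) + H(Q_2) - I(Q_1 ; Q_2),
        \]

    so maximizing the joint entropy pushes each question's entropy (and, in the inquiry-calculus picture, its relevance) up while pushing the mutual information between the questions, i.e. their redundancy, down.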

    A discussion on the origin of quantum probabilities

    Get PDF
    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). Comment: Improved version.
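    To make the non-distributivity concrete (a standard example, not taken from the paper): in the lattice of projections of a two-dimensional Hilbert space, let P, Q, and R project onto three distinct one-dimensional subspaces. Then

        \[
            P \wedge (Q \vee R) = P \wedge \mathbf{1} = P, \qquad (P \wedge Q) \vee (P \wedge R) = 0 \vee 0 = 0,
        \]

    so the distributive law fails, and probability assignments on this lattice cannot rely on the Boolean assumptions behind Kolmogorov's and Cox's original constructions.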

    Measuring on Lattices

    Full text link
    Previous derivations of the sum and product rules of probability theory relied on the algebraic properties of Boolean logic. Here they are derived within a more general framework based on lattice theory. The result is a new foundation of probability theory that encompasses and generalizes both the Cox and Kolmogorov formulations. In this picture probability is a bi-valuation defined on a lattice of statements that quantifies the degree to which one statement implies another. The sum rule is a constraint equation that ensures that valuations are assigned so as to not violate associativity of the lattice join and meet. The product rule is much more interesting in that there are actually two product rules: one is a constraint equation that arises from associativity of the direct products of lattices, and the other is a constraint equation derived from associativity of changes of context. The generality of this formalism enables one to derive the traditionally assumed condition of additivity in measure theory, as well as to introduce a general notion of product. To illustrate the generic utility of this novel lattice-theoretic foundation of measure, the sum and product rules are applied to number theory. Further application of these concepts to understanding the foundation of quantum mechanics is described in a joint paper in these proceedings. Comment: 13 pages, 7 figures. Presented at the 29th International Workshop on Bayesian and Maximum Entropy Methods in Science and Engineering: MaxEnt 2009.
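    A brief sketch of the two kinds of rule the abstract refers to, in my own transcription of the bi-valuation notation (symbols may differ from the paper): the sum rule is the constraint that keeps a valuation consistent with the lattice join and meet,

        \[
            v(x \vee y) = v(x) + v(y) - v(x \wedge y),
        \]

    while one standard form of the context-change product rule for the bi-valuation p(x | y), which quantifies the degree to which y implies x, is the familiar chain rule

        \[
            p(x \wedge y \mid z) = p(x \mid y \wedge z)\, p(y \mid z).
        \]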

    Lattice conditional independence models and Hibi ideals

    Get PDF
    Lattice conditional independence models [Andersson and Perlman, Lattice models for conditional independence in a multivariate normal distribution, Ann. Statist. 21 (1993), 1318–1358] are a class of models developed first for the Gaussian case in which a distributive lattice classifies all the conditional independence statements. The main result is that these models can equivalently be described via a transitive directed acyclic graph (TDAG) in which, as is normal for causal models, the conditional independence is in terms of conditioning on ancestors in the graph. We demonstrate that a parallel stream of research in algebra, the theory of Hibi ideals, not only maps directly to the lattice conditional independence models but also gives a vehicle to generalise the theory from the linear Gaussian case. Given a distributive lattice, (i) each conditional independence statement is associated with a Hibi relation defined on the lattice, (ii) the directed graph is given by chains in the lattice which correspond to chains of conditional independence, (iii) the elimination ideal of product terms in the chains gives the Hibi ideal, and (iv) the TDAG can be recovered from a special bipartite graph constructed via the Alexander dual of the Hibi ideal. It is briefly demonstrated that there are natural applications to statistical log-linear models, time series and Shannon information flow.
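    For orientation on the algebraic side (the standard definition as I recall it, not a construction specific to this paper): for a finite distributive lattice L, the Hibi ideal is generated by the binomials recording the meet and join of each incomparable pair,

        \[
            I_L = \big(\, x_p\, x_q - x_{p \wedge q}\, x_{p \vee q} \;:\; p, q \in L \text{ incomparable} \,\big),
        \]

    and it is these Hibi relations that the abstract associates with the conditional independence statements classified by the lattice.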

    A Potential Foundation for Emergent Space-Time

    Get PDF
    We present a novel derivation of both the Minkowski metric and the Lorentz transformations from the consistent quantification of a causally ordered set of events with respect to an embedded observer. Unlike past derivations, which have relied on assumptions such as the existence of a 4-dimensional manifold, symmetries of space-time, or the constant speed of light, we demonstrate that this now-familiar mathematics can be derived as the unique means to consistently quantify a network of events. This suggests that space-time need not be physical; instead, the mathematics of space and time emerges as the unique way in which an observer can consistently quantify events and their relationships to one another. The result is a potential foundation for emergent space-time. Comment: The paper was originally titled "The Physics of Events: A Potential Foundation for Emergent Space-Time". We changed the title (and abstract) to be more direct when the paper was accepted for publication in the Journal of Mathematical Physics. 24 pages, 15 figures.
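    A compressed version of the construction summarized above, as I understand it from the published paper (the symbols are mine): each event is quantified by the pair of numbers (q_P, q_Q) assigned to it by two distinguished observer chains, coordinates are defined from that pair, and the interval is their product,

        \[
            \Delta t = \tfrac{1}{2}\,(\Delta q_P + \Delta q_Q), \qquad \Delta x = \tfrac{1}{2}\,(\Delta q_P - \Delta q_Q),
        \]
        \[
            \Delta s^2 = \Delta q_P\, \Delta q_Q = \Delta t^2 - \Delta x^2,
        \]

    which is the 1+1-dimensional Minkowski interval; the Lorentz transformations then appear as the consistent changes of quantification between different observer chains.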