
    Information Physics: The New Frontier

    At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several fundamental advances in our understanding of the process of inference, which make it clear that these are inferential theories. That is, rather than being a description of the behavior of the universe, these theories describe how observers can make optimal predictions about the universe. In such a picture, information plays a critical role. What is more, little clues, such as the fact that black holes have entropy, continue to suggest that information is fundamental to physics in general. In the last decade, our fundamental understanding of probability theory has led to a Bayesian revolution. In addition, we have come to recognize that the foundations go far deeper and that Cox's approach of generalizing a Boolean algebra to a probability calculus is the first specific example of the more fundamental idea of assigning valuations to partially-ordered sets. By considering this as a natural way to introduce quantification to the more fundamental notion of ordering, one obtains an entirely new way of deriving physical laws. I will introduce this new way of thinking by demonstrating how one can quantify partially-ordered sets and, in the process, derive physical laws. The implication is that physical law does not reflect the order in the universe; instead, it is derived from the order imposed by our description of the universe. Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science.
    Comment: 17 pages, 6 figures. Knuth K.H. 2010. Information physics: The new frontier. J.-F. Bercher, P. Bessière, and A. Mohammad-Djafari (eds.) Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2010), Chamonix, France, July 201
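    The idea of assigning valuations to partially-ordered sets can be illustrated concretely. A minimal sketch, using an example lattice of our own choosing (the divisors of 12 ordered by divisibility, not an example from the paper): any order-respecting valuation of the kind the abstract describes must obey the sum rule v(x ∨ y) + v(x ∧ y) = v(x) + v(y), and counting prime factors with multiplicity is one such valuation.

```python
# Hedged sketch (our illustrative example, not the paper's): a valuation v on
# the distributive lattice of divisors of 12 ordered by divisibility, where
# meet = gcd and join = lcm. We check the sum rule
#   v(join(x, y)) + v(meet(x, y)) == v(x) + v(y),
# the form of consistency constraint that quantification of a lattice imposes.
from math import gcd

def meet(x, y):
    """Lattice meet under divisibility: greatest common divisor."""
    return gcd(x, y)

def join(x, y):
    """Lattice join under divisibility: least common multiple."""
    return x * y // gcd(x, y)

def v(n):
    """Count prime factors with multiplicity; an order-respecting valuation."""
    total, d = 0, 2
    while d * d <= n:
        while n % d == 0:
            total += 1
            n //= d
        d += 1
    return total + (1 if n > 1 else 0)

divisors = [1, 2, 3, 4, 6, 12]
for x in divisors:
    for y in divisors:
        # For each prime p, max(a, b) + min(a, b) == a + b, so the rule holds.
        assert v(join(x, y)) + v(meet(x, y)) == v(x) + v(y)
```

    The check passes because, exponent by exponent, join takes the maximum and meet the minimum, and max + min equals the plain sum.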

    A Potential Foundation for Emergent Space-Time

    We present a novel derivation of both the Minkowski metric and Lorentz transformations from the consistent quantification of a causally ordered set of events with respect to an embedded observer. Unlike past derivations, which have relied on assumptions such as the existence of a 4-dimensional manifold, symmetries of space-time, or the constant speed of light, we demonstrate that this now-familiar mathematics can be derived as the unique means to consistently quantify a network of events. This suggests that space-time need not be physical; instead, the mathematics of space and time emerges as the unique way in which an observer can consistently quantify events and their relationships to one another. The result is a potential foundation for emergent space-time.
    Comment: The paper was originally titled "The Physics of Events: A Potential Foundation for Emergent Space-Time". We changed the title (and abstract) to be more direct when the paper was accepted for publication in the Journal of Mathematical Physics. 24 pages, 15 figures
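    The quantification scheme can be sketched numerically. In this hedged sketch (variable names and the helper functions are ours), an observer quantifies each event by a pair (p, q) of labels obtained by projecting it onto two observer chains; differences of pairs then combine into the 1+1-dimensional Minkowski interval via ds² = Δp·Δq = Δt² − Δx², with Δt = (Δp + Δq)/2 and Δx = (Δp − Δq)/2.

```python
# Hedged sketch of pair quantification (our illustrative code): events are
# quantified as (p, q) label pairs, and the squared interval between two
# events is the product of the pair differences.

def interval(event_a, event_b):
    """Squared interval ds^2 = dp * dq between two (p, q)-quantified events."""
    dp = event_b[0] - event_a[0]
    dq = event_b[1] - event_a[1]
    return dp * dq

def coords(event_a, event_b):
    """Recover the usual time/space differences from the pair differences."""
    dp = event_b[0] - event_a[0]
    dq = event_b[1] - event_a[1]
    return (dp + dq) / 2, (dp - dq) / 2  # (dt, dx)

a, b = (1, 1), (5, 2)
dt, dx = coords(a, b)
# The product form and the metric form of the interval agree identically:
assert interval(a, b) == dt * dt - dx * dx
```

    The identity dp·dq = ((dp+dq)/2)² − ((dp−dq)/2)² holds algebraically, which is why the pair labels reproduce the Minkowski signature.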

    Semantics out of context: nominal absolute denotations for first-order logic and computation

    Call a semantics for a language with variables absolute when variables map to fixed entities in the denotation. That is, a semantics is absolute when the denotation of a variable a is a copy of itself in the denotation. We give a trio of lattice-based, sets-based, and algebraic absolute semantics for first-order logic. Possibly open predicates are directly interpreted as lattice elements / sets / algebra elements, subject to suitable interpretations of the connectives and quantifiers. In particular, universal quantification "forall a.phi" is interpreted using a new notion of "fresh-finite" limit and a novel dual to substitution. The interest of this semantics lies partly in the non-trivial and beautiful technical details, which offer certain advantages over existing semantics, but also in the fact that such semantics exist at all, which suggests a new way of looking at variables and the foundations of logic and computation that may be well suited to the demands of modern computer science.
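    For contrast with the absolute semantics the abstract describes, here is a hedged toy version of the standard picture it refines (our construction, not the paper's): over a finite domain, a possibly open formula denotes the set of variable assignments satisfying it, conjunction becomes intersection, and "forall a" keeps exactly the assignments all of whose a-variants lie in the denotation.

```python
# Hedged toy illustration (ours, not the paper's nominal semantics): a
# sets-based semantics for open first-order formulas over a finite domain.
from itertools import product

DOMAIN = (0, 1, 2)
VARS = ("a", "b")

def all_assignments():
    """Every assignment of domain values to the variables."""
    return [dict(zip(VARS, vals)) for vals in product(DOMAIN, repeat=len(VARS))]

def key(s):
    """Hashable form of an assignment."""
    return tuple(sorted(s.items()))

def denote(pred):
    """Interpret a predicate (boolean function of an assignment) as a set."""
    return {key(s) for s in all_assignments() if pred(s)}

def conj(d1, d2):
    """Conjunction is intersection of denotations."""
    return d1 & d2

def forall(var, d):
    """Keep assignments whose every var-variant lies in the denotation d."""
    result = set()
    for s in all_assignments():
        variants = []
        for v in DOMAIN:
            t = dict(s)
            t[var] = v
            variants.append(key(t))
        if all(k in d for k in variants):
            result.add(key(s))
    return result

# Example: phi = "a <= b". Then "forall a. phi" holds iff b is the maximum.
phi = denote(lambda s: s["a"] <= s["b"])
univ = forall("a", phi)
assert all(dict(k)["b"] == max(DOMAIN) for k in univ)
```

    In this standard picture the denotation of a variable depends on the ambient assignment; the paper's point is that an absolute semantics removes that context-dependence.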

    Lattice initial segments of the hyperdegrees

    We affirm a conjecture of Sacks [1972] by showing that every countable distributive lattice is isomorphic to an initial segment of the hyperdegrees, $\mathcal{D}_h$. In fact, we prove that every sublattice of any hyperarithmetic lattice (and so, in particular, every countable locally finite lattice) is isomorphic to an initial segment of $\mathcal{D}_h$. Corollaries include the decidability of the two-quantifier theory of $\mathcal{D}_h$ and the undecidability of its three-quantifier theory. The key tool in the proof is a new lattice representation theorem that provides a notion of forcing for which we can prove a version of the fusion lemma in the hyperarithmetic setting and so the preservation of $\omega_1^{CK}$. Somewhat surprisingly, the set-theoretic analog of this forcing does not preserve $\omega_1$. On the other hand, we construct countable lattices that are not isomorphic to an initial segment of $\mathcal{D}_h$.
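    The theorem concerns countable distributive lattices; as a hedged finite illustration of the distributivity hypothesis (our example, not part of the proof), the lattice of subsets of a finite set under union and intersection satisfies the distributive law x ∧ (y ∨ z) = (x ∧ y) ∨ (x ∧ z).

```python
# Hedged illustration (ours): exhaustively verify the distributive law on the
# four-element lattice of subsets of {0, 1}, with meet = intersection and
# join = union.
from itertools import product

elements = [frozenset(), frozenset({0}), frozenset({1}), frozenset({0, 1})]

for x, y, z in product(elements, repeat=3):
    # x ∧ (y ∨ z) == (x ∧ y) ∨ (x ∧ z)
    assert x & (y | z) == (x & y) | (x & z)
```

    Sublattices of powerset lattices like this one are exactly the distributive lattices, which is the class the initial-segment result covers in the countable case.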