
    The Universe is not a Computer

    When we want to predict the future, we compute it from what we know about the present. Specifically, we take a mathematical representation of observed reality, plug it into some dynamical equations, and then map the time-evolved result back to real-world predictions. But while this computational process can tell us what we want to know, we have taken the procedure too literally, implicitly assuming that the universe must compute itself in the same manner. Physical theories that do not follow this computational framework are deemed illogical from the start. But this anthropocentric assumption has steered our physical models into an impossible corner, primarily because of quantum phenomena. Meanwhile, we have not been exploring other models in which the universe is not so limited. In fact, some of these alternate models already have a well-established importance, but are thought to be mathematical tricks without physical significance. This essay argues that only by dropping our assumption that the universe is a computer can we fully develop such models, explain quantum phenomena, and understand the workings of our universe. (This essay was awarded third prize in the 2012 FQXi essay contest; a new afterword compares and contrasts this essay with Robert Spekkens' first prize entry.) Comment: 10 pages with new afterword; matches published version

    Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality

    We survey diverse approaches to the notion of information, from Shannon entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov complexity are presented: randomness and classification. The survey is divided into two parts published in the same volume. Part II is dedicated to the relation between logic and information systems, within the scope of Kolmogorov algorithmic information theory. We present a recent application of Kolmogorov complexity: classification using compression, an idea with provocative implementations by authors such as Bennett, Vitanyi and Cilibrasi. This stresses how Kolmogorov complexity, besides being a foundation for randomness, is also related to classification. Another approach to classification is also considered: the so-called "Google classification". It uses another original and attractive idea which is connected to classification using compression and to Kolmogorov complexity from a conceptual point of view. We present and unify these different approaches to classification in terms of Bottom-Up versus Top-Down operational modes, pointing out their fundamental principles and the underlying duality. We look at the way these two dual modes are used in different approaches to information systems, particularly the relational model for databases introduced by Codd in the 1970s. This allows us to point out diverse forms of a fundamental duality. These operational modes are also reinterpreted in the context of the comprehension schema of the axiomatic set theory ZF. This leads us to develop how Kolmogorov complexity is linked to intensionality, abstraction, classification and information systems. Comment: 43 pages
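    To make the compression-based classification idea concrete, here is a minimal sketch of the Normalized Compression Distance of Cilibrasi and Vitanyi, with zlib standing in for the ideal (uncomputable) Kolmogorov compressor; the function names and test strings are illustrative, not taken from the paper.

```python
# Classification by compression: objects whose concatenation compresses well
# are judged similar. zlib is a crude but computable stand-in for K(x).
import zlib

def clen(data: bytes) -> int:
    """Compressed length of data, a computable proxy for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

x = b"the cat sat on the mat " * 20
y = b"the cat sat on the hat " * 20
z = bytes(range(256)) * 4
print(ncd(x, y))  # small: the two texts share most of their structure
print(ncd(x, z))  # closer to 1: the inputs share little compressible structure
```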

    An analysis of total correctness refinement models for partial relation semantics I

    This is the first of a series of papers devoted to a thorough investigation of (total correctness) refinement based on an underlying partial relational model. In this paper we restrict attention to operation refinement. We explore four theories of refinement based on an underlying partial relation model for specifications, and we show that they are all equivalent. This, in particular, sheds some light on the relational completion operator (lifted-totalisation) due to Woodcock, which underlies data refinement in, for example, the specification language Z. It further leads to two simple alternative models which are also equivalent to the others.
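    As a rough illustration of the setting (not the paper's own formalism), the sketch below models an operation as a finite partial relation and checks one standard total-correctness refinement condition: the concrete operation must be defined wherever the abstract one is, and within that domain must only produce outputs the abstract operation allows. All names are illustrative.

```python
# Operations as finite partial relations: sets of (input, output) pairs.
from typing import Set, Tuple

Rel = Set[Tuple[object, object]]

def dom(r: Rel) -> set:
    return {x for (x, _) in r}

def refines(abstract: Rel, concrete: Rel) -> bool:
    """True if `concrete` refines `abstract` (applicability + correctness)."""
    applicability = dom(abstract) <= dom(concrete)
    correctness = all((x, y) in abstract
                      for (x, y) in concrete if x in dom(abstract))
    return applicability and correctness

# Widening the domain and reducing nondeterminism are both legitimate refinements.
abs_op: Rel = {(1, "a"), (1, "b")}    # defined only at 1, nondeterministic
conc_op: Rel = {(1, "a"), (2, "z")}   # resolves the choice, also defined at 2
print(refines(abs_op, conc_op))       # True
print(refines(abs_op, {(1, "c")}))    # False: new output inside the abstract domain
```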

    Institutionalising Ontology-Based Semantic Integration

    We address what is still a scarcity of general mathematical foundations for the ontology-based semantic integration underlying current knowledge engineering methodologies in decentralised and distributed environments. After recalling the first-order ontology-based approach to semantic integration and a formalisation of ontological commitment, we propose a general theory that uses a syntax- and interpretation-independent formulation of language, ontology, and ontological commitment in terms of institutions. We claim that our formalisation generalises the intuitive notion of ontology-based semantic integration while retaining its basic insight, and we apply it to elicit and compare various increasingly complex notions of semantic integration and ontological commitment based on differing understandings of semantics.
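    For readers unfamiliar with institutions, the toy sketch below shows the shape of the machinery (it is not the paper's construction): a signature is a bare set of atoms, a sentence is an atom, a model is a truth assignment, and the satisfaction condition relates sentence translation along a signature morphism to model reduct against it. All names are illustrative.

```python
# A toy "propositional" institution: signatures, sentences, models, satisfaction.
from typing import Dict

Model = Dict[str, bool]      # model: truth assignment over a signature's atoms
Morphism = Dict[str, str]    # signature morphism: renaming of atoms

def translate(sigma: Morphism, sentence: str) -> str:
    """Sentence translation along a signature morphism (Sen on morphisms)."""
    return sigma[sentence]

def reduct(sigma: Morphism, target_model: Model) -> Model:
    """Model reduct against the morphism (Mod on morphisms, contravariant)."""
    return {p: target_model[q] for p, q in sigma.items()}

def satisfies(model: Model, sentence: str) -> bool:
    return model[sentence]

# Satisfaction condition: M' |= sigma(phi)  iff  reduct(sigma, M') |= phi.
sigma: Morphism = {"bird": "penguin"}
m_target: Model = {"penguin": True, "fish": False}
phi = "bird"
print(satisfies(m_target, translate(sigma, phi)) ==
      satisfies(reduct(sigma, m_target), phi))   # True
```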

    Non-Classical Knowledge

    The Knower paradox purports to place surprising a priori limitations on what we can know. According to orthodoxy, it shows that we need to abandon one of three plausible and widely held ideas: that knowledge is factive, that we can know that knowledge is factive, and that we can use logical/mathematical reasoning to extend our knowledge via very weak single-premise closure principles. I argue that classical logic, not any of these epistemic principles, is the culprit. I develop a consistent theory validating all these principles by combining Hartry Field's theory of truth with a modal enrichment developed for a different purpose by Michael Caie. The only casualty is classical logic: the theory avoids paradox by using a weaker-than-classical K3 logic. I then assess the philosophical merits of this approach. I argue that, unlike the traditional semantic paradoxes involving extensional notions like truth, its plausibility depends on the way in which sentences are referred to: whether in natural languages via direct sentential reference, or in mathematical theories via indirect sentential reference by Gödel coding. In particular, I argue that from the perspective of natural language, my non-classical treatment of knowledge as a predicate is plausible, while from the perspective of mathematical theories, its plausibility depends on unresolved questions about the limits of our idealized deductive capacities.
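    For orientation, the strong Kleene (K3) logic the paper adopts can be summarised by three truth values ordered F < N < T, with negation as order-flip and conjunction/disjunction as min/max. The sketch below is a minimal illustration of those tables, not part of the paper.

```python
# K3 (strong Kleene) connectives over three values: F (false), N (gap), T (true).
T, N, F = 1.0, 0.5, 0.0

def neg(a): return 1.0 - a               # order-reversing negation
def conj(a, b): return min(a, b)         # conjunction: minimum
def disj(a, b): return max(a, b)         # disjunction: maximum
def impl(a, b): return max(1.0 - a, b)   # material conditional from neg and disj

p = N
print(disj(p, neg(p)))   # 0.5: excluded middle fails for gappy sentences
print(conj(T, neg(F)))   # 1.0: classical verdicts are preserved on classical inputs
```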

    Functorial Data Migration

    In this paper we present a simple database definition language: that of categories and functors. A database schema is a small category and an instance is a set-valued functor on it. We show that morphisms of schemas induce three "data migration functors", which translate instances from one schema to the other in canonical ways. These functors parameterize projections, unions, and joins over all tables simultaneously and can be used in place of conjunctive and disjunctive queries. We also show how to connect a database and a functional programming language by introducing a functorial connection between the schema and the category of types for that language. We begin the paper with a multitude of examples to motivate the definitions, and near the end we provide a dictionary whereby one can translate database concepts into category-theoretic concepts and vice versa. Comment: 30 pages
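    As a toy illustration (a drastic simplification of the paper's category-theoretic setup, with all names invented here), a schema is reduced to a bare set of table names, a schema morphism to a mapping between them, and an instance to an assignment of row sets; the pullback-style migration then re-indexes a target instance along the morphism, in the spirit of Delta_F(I) = I . F.

```python
# Pullback-style migration: a source table borrows the rows of the target table it maps to.
from typing import Dict, Set

Instance = Dict[str, Set[tuple]]   # table name -> set of rows
SchemaMap = Dict[str, str]         # source table -> target table

def delta(f: SchemaMap, instance: Instance) -> Instance:
    """Migrate a target-schema instance back along the schema mapping f."""
    return {src: set(instance[tgt]) for src, tgt in f.items()}

# The target schema has one Employee table; a source schema views it twice.
f: SchemaMap = {"Manager": "Employee", "Report": "Employee"}
target_instance: Instance = {"Employee": {("alice",), ("bob",)}}
migrated = delta(f, target_instance)
print(migrated["Manager"] == migrated["Report"] == target_instance["Employee"])  # True
```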