Monoidal computer III: A coalgebraic view of computability and complexity
Monoidal computer is a categorical model of intensional computation, where
many different programs correspond to the same input-output behavior. The
upshot of yet another model of computation is that the categorical formalism
provides a much-needed high-level language for the theory of computation,
flexible enough to abstract away low-level implementation details when they
are irrelevant, or to take them into account when they are genuinely
needed. A salient feature of the approach through monoidal categories is the
formal graphical language of string diagrams, which supports visual reasoning
about programs and computations.
In the present paper, we provide a coalgebraic characterization of monoidal
computer. It turns out that the availability of the interpreters and
specializers that make a monoidal category into a monoidal computer is
equivalent to the existence of a *universal state space*, which carries a
weakly final state
machine for any pair of input and output types. Being able to program state
machines in monoidal computers allows us to represent Turing machines, to
capture their execution, count their steps, and track, e.g., the memory cells
that they use. The coalgebraic view of monoidal computer thus provides a
convenient diagrammatic language for studying computability and complexity.
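The interpreter/specializer pair that the abstract refers to can be illustrated concretely. The following is a hedged sketch, not the paper's categorical construction: Python source strings stand in for "programs", `eval` plays the universal interpreter, and the specializer is an s-m-n-style partial application on program texts. All names are illustrative.

```python
def interpret(program: str, x):
    """Universal interpreter: run a program (code for a one-argument
    function) on input x."""
    return eval(program)(x)

def specialize(program: str, a) -> str:
    """Specializer (s-m-n style): fix the first argument of a curried
    two-argument program, returning code for the residual program."""
    return f"lambda y: ({program})({a!r})(y)"

# A two-argument program, curried, represented as source text.
add = "lambda x: lambda y: x + y"

# Specializing at 3 yields code for the residual one-argument program.
add3 = specialize(add, 3)
print(interpret(add3, 4))   # 7
```

The point mirrored here is intensional: `add3` is a *program*, not a function, so many syntactically different specializations denote the same input-output behavior.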
The Broadest Necessity
In this paper the logic of broad necessity is explored. Definitions of what
it means for one modality to be broader than another are formulated, and it
is proven, in the context of higher-order logic, that there is a broadest
necessity, settling one of the central questions of this investigation. It is
shown, moreover, that it is possible to give a reductive analysis of this
necessity in extensional language. This relates more generally to a
conjecture that it is not possible to define intensional connectives from
extensional notions. This conjecture is formulated precisely in higher-order
logic, and concrete cases in which it fails are examined. The paper ends with
a discussion of the logic of broad necessity. It is shown that the logic of
broad necessity is a normal modal logic between S4 and Triv, and that it is
consistent with a natural axiomatic system of higher-order logic that it is
exactly S4. Some philosophical reasons to think that the logic of broad
necessity does not include the S5 principle are given.
Intensional and Extensional Semantics of Bounded and Unbounded Nondeterminism
We give extensional and intensional characterizations of nondeterministic
functional programs: as structure preserving functions between biorders, and as
nondeterministic sequential algorithms on ordered concrete data structures
which compute them. A fundamental result establishes that the extensional and
intensional representations of nondeterministic programs are equivalent, by
showing how to construct a unique sequential algorithm which computes a given
monotone and stable function, and describing the conditions on sequential
algorithms which correspond to continuity with respect to each order.
We illustrate by defining may and must-testing denotational semantics for a
sequential functional language with bounded and unbounded choice operators. We
prove that these are computationally adequate, despite the non-continuity of
the must-testing semantics of unbounded nondeterminism. In the bounded case, we
prove that our continuous models are fully abstract with respect to may and
must-testing by identifying a simple universal type, which may also form the
basis for models of the untyped lambda-calculus. In the unbounded case we
observe that our model contains computable functions which are not denoted by
terms, by identifying a further "weak continuity" property of the definable
elements, and use this to establish that it is not fully abstract.
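The distinction between may- and must-testing that the abstract relies on can be sketched in a few lines. This is a minimal illustration, not the paper's biorder model: a bounded nondeterministic program is modelled as its finite set of outcomes, with `None` standing for divergence, and the two testing notions quantify existentially versus universally over outcomes. All names here are illustrative.

```python
BOT = None  # stands for divergence

def ret(v):
    """A deterministic program: a single outcome."""
    return {v}

def choice(*branches):
    """Bounded nondeterministic choice: the union of outcome sets."""
    out = set()
    for b in branches:
        out |= b
    return out

def may(outcomes, test):
    """May-pass: some run converges and passes the test."""
    return any(o is not BOT and test(o) for o in outcomes)

def must(outcomes, test):
    """Must-pass: every run converges and passes the test."""
    return all(o is not BOT and test(o) for o in outcomes)

p = choice(ret(1), ret(2), {BOT})   # returns 1, returns 2, or diverges
print(may(p, lambda v: v == 2))     # True: some run yields 2
print(must(p, lambda v: v > 0))     # False: one run diverges
```

Unbounded choice breaks this picture precisely because the outcome set need no longer be finite, which is the source of the non-continuity of the must-testing semantics mentioned above.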
Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality
We survey diverse approaches to the notion of information: from Shannon
entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov
complexity are presented: randomness and classification. The survey is divided
into two parts published in the same volume. Part II is dedicated to the
relation between logic and information systems, within the scope of Kolmogorov's
algorithmic information theory. We present a recent application of Kolmogorov
complexity: classification using compression, an idea given provocative
implementations by authors such as Bennett, Vitanyi, and Cilibrasi. This stresses
how Kolmogorov complexity, besides being a foundation to randomness, is also
related to classification. Another approach to classification is also
considered: the so-called "Google classification". It uses another original and
attractive idea which is connected to the classification using compression and
to Kolmogorov complexity from a conceptual point of view. We present and unify
these different approaches to classification in terms of Bottom-Up versus
Top-Down operational modes, whose fundamental principles and underlying
duality we point out. We look at the way these two dual modes are used in
different approaches to information systems, particularly the relational
model for databases introduced by Codd in the 1970s. This allows us to point
out diverse forms of a fundamental duality. These operational modes are also
reinterpreted in the context of the comprehension schema of axiomatic set
theory ZF. This leads us to show how Kolmogorov complexity is linked to
intensionality,
abstraction, classification, and information systems.
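Classification by compression can be made concrete with the normalized compression distance (NCD) of Cilibrasi and Vitanyi, where a real compressor approximates the (uncomputable) Kolmogorov complexity. A minimal sketch using `zlib`:

```python
import zlib

def C(s: bytes) -> int:
    """Approximate Kolmogorov complexity K(s) by compressed length."""
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    Close to 0 for similar objects, close to 1 for unrelated ones."""
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Two texts sharing most of their content, and one unrelated byte pattern.
a = b"the quick brown fox jumps over the lazy dog " * 20
b_ = b"the quick brown fox jumps over the lazy dog " * 19 \
     + b"colorless green ideas sleep furiously"
c = bytes(range(256)) * 10

# Similar objects compress well together, so their NCD is smaller.
print(ncd(a, b_) < ncd(a, c))
```

The "Google classification" mentioned above follows the same template, with page-hit counts from a search engine playing the role of the compressor.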
Relational Graph Models at Work
We study the relational graph models that constitute a natural subclass of
relational models of lambda-calculus. We prove that among the lambda-theories
induced by such models there exists a minimal one, and that the corresponding
relational graph model is very natural and easy to construct. We then study
relational graph models that are fully abstract, in the sense that they capture
some observational equivalence between lambda-terms. We focus on the two main
observational equivalences in the lambda-calculus, the theory H+ generated by
taking as observables the beta-normal forms, and H* generated by considering as
observables the head normal forms. On the one hand we introduce a notion of
lambda-König model and prove that a relational graph model is fully abstract
for H+ if and only if it is extensional and lambda-König. On the other hand
we show that the dual notion of hyperimmune model, together with
extensionality, captures full abstraction for H*.
Staged Compilation with Two-Level Type Theory
The aim of staged compilation is to enable metaprogramming in a way such that
we have guarantees about the well-formedness of code output, and we can also
mix together object-level and meta-level code in a concise and convenient
manner. In this work, we observe that two-level type theory (2LTT), a system
originally devised for the purpose of developing synthetic homotopy theory,
also serves as a system for staged compilation with dependent types. 2LTT has
numerous good properties for this use case: it has a concise specification,
well-behaved model theory, and it supports a wide range of language features
both at the object and the meta level. First, we give an overview of 2LTT's
features and applications in staging. Then, we present a staging algorithm and
prove its correctness. Our algorithm is "staging-by-evaluation", analogously to
the technique of normalization-by-evaluation, in that staging is given by the
evaluation of 2LTT syntax in a semantic domain. The staging algorithm together
with its correctness constitutes a proof of strong conservativity of 2LTT over
the object theory. To our knowledge, this is the first description of staged
compilation which supports full dependent types and unrestricted staging for
types.
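The flavor of staging can be conveyed with the textbook two-stage power function, here as a hedged Python sketch rather than anything from 2LTT itself: meta-level code runs at staging time and emits specialized object-level code, so the generated program contains no recursion and no reference to `n`. All names are illustrative.

```python
def power_code(n: int, x: str = "x") -> str:
    """Meta-level: unfold x**n into an explicit product at staging time."""
    if n == 0:
        return "1"
    return x if n == 1 else f"{x} * ({power_code(n - 1, x)})"

def stage_power(n: int):
    """Emit object-level code for x -> x**n, then compile it."""
    src = f"lambda x: {power_code(n)}"
    return src, eval(src)

src, pow5 = stage_power(5)
print(src)        # lambda x: x * (x * (x * (x * (x))))
print(pow5(2))    # 32
```

What 2LTT adds over such ad-hoc string splicing is a typed discipline: object-level and meta-level code are separated by the two levels of the type theory, so the output is well-formed by construction.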
Towards an embedding of Graph Transformation in Intuitionistic Linear Logic
Linear logics have been shown to be able to embed both rewriting-based
approaches and process calculi in a single, declarative framework. In this
paper we explore the embedding of double-pushout graph transformations
into quantified linear logic, leading to a Curry-Howard style isomorphism
between graphs and transformations on one hand, formulas and proof terms on the
other. With linear implication representing rules and reachability of graphs,
and the tensor modelling parallel composition of graphs and transformations, we
obtain a language able to encode graph transformation systems and their
computations, as well as to reason about their properties.
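A double-pushout rewrite step can be illustrated on plain graphs. The sketch below hard-wires one classic rule, edge subdivision, and ignores the matching and gluing conditions that the categorical treatment makes precise; the function name and node encoding are hypothetical.

```python
def subdivide_edge(graph, edge):
    """One DPO-style step for the edge-subdivision rule:
    L = a single edge (a, b), K = its two endpoints {a, b},
    R = a path a -> m -> b through a fresh node m.
    Deletes L \\ K, then glues in R \\ K along the preserved interface K."""
    nodes, edges = set(graph[0]), set(graph[1])
    a, b = edge
    assert edge in edges, "the match must find the left-hand side in the graph"
    edges.remove(edge)             # delete L \ K (the matched edge)
    m = ("mid", a, b)              # fresh node, the image of R \ K
    nodes.add(m)
    edges |= {(a, m), (m, b)}      # glue R along K = {a, b}
    return nodes, edges

# A path 1 -> 2 -> 3; rewriting the first edge inserts a node between 1 and 2.
g = ({1, 2, 3}, {(1, 2), (2, 3)})
g2 = subdivide_edge(g, (1, 2))
```

In the proposed embedding, a rule like this would correspond to a linear implication consuming the formula for the edge `(a, b)` and producing the tensor of the two new edges.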