
    Matching typed and untyped realizability

    Realizability interpretations of logics are given by saying what it means for computational objects of some kind to realize logical formulae. The computational objects in question might be drawn from an untyped universe of computation, such as a partial combinatory algebra, or they might be typed objects such as terms of a PCF-style programming language. In some instances, one can show that a particular untyped realizability interpretation matches a particular typed one, in the sense that they give the same set of realizable formulae. In this case, we have a very good fit indeed between the typed language and the untyped realizability model; we refer to this condition as (constructive) logical full abstraction. We give some examples of this situation for a variety of extensions of PCF. Of particular interest are some models that are logically fully abstract for typed languages including non-functional features. Our results establish connections between what is computable in various programming languages and what is true inside various realizability toposes. We consider some examples of logical formulae to illustrate these ideas, in particular their application to exact real-number computability.
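    As a concrete picture of the "untyped universe of computation" the abstract mentions, here is a minimal partial combinatory algebra sketch over closed SK combinator terms in Haskell; the datatype and function names are our own illustration, not the paper's constructions.

    ```haskell
    -- A toy untyped universe of computation: closed SK terms with
    -- weak head reduction. Evaluation may diverge -- exactly the
    -- partiality a partial combinatory algebra allows.
    data Term = K | S | App Term Term
      deriving Show

    -- One step of weak head reduction; Nothing means head normal form.
    step :: Term -> Maybe Term
    step (App (App K x) _)         = Just x
    step (App (App (App S f) g) x) = Just (App (App f x) (App g x))
    step (App f x)                 = (`App` x) <$> step f
    step _                         = Nothing

    -- Evaluate to weak head normal form (may loop, as in any PCA).
    whnf :: Term -> Term
    whnf t = maybe t whnf (step t)
    ```

    In such a universe, a realizer of an implication is any term that, applied to a realizer of the antecedent, reduces to a realizer of the consequent; the typed interpretations the paper matches against draw their realizers from PCF-style terms instead.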

    A type system for Continuation Calculus

    Continuation Calculus (CC), introduced by Geron and Geuvers, is a simple foundational model for functional computation. It is closely related to lambda calculus and term rewriting, but it has no variable binding and no pattern matching. It is Turing complete and evaluation is deterministic. Notions like "call-by-value" and "call-by-name" computation are available by choosing appropriate function definitions: e.g. there is a call-by-value and a call-by-name addition function. In the present paper we extend CC with types, to be able to define data types in a canonical way, and functions over these data types, defined by iteration. Data type definitions follow the so-called "Scott encoding" of data, as opposed to the more familiar "Church encoding". The iteration scheme comes in two flavors: a call-by-value and a call-by-name iteration scheme. The call-by-value variant is a double negation variant of call-by-name iteration. The double negation translation allows one to move between call-by-name and call-by-value. Comment: In Proceedings CL&C 2014, arXiv:1409.259
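    The Scott encoding the paper adopts can be sketched in Haskell (rank-2 types stand in for CC's function definitions; this is a hedged illustration, not CC syntax): a datum is its own case analysis, and each constructor hands its arguments straight to the matching branch.

    ```haskell
    {-# LANGUAGE RankNTypes #-}

    -- Scott-encoded naturals: a number is a function that selects the
    -- zero branch or the successor branch, passing the predecessor on.
    newtype Nat = Nat { caseNat :: forall r. r -> (Nat -> r) -> r }

    zero :: Nat
    zero = Nat (\z _ -> z)

    suc :: Nat -> Nat
    suc n = Nat (\_ s -> s n)

    -- Unlike the Church encoding, the predecessor is available directly
    -- and in constant time; iteration must be supplied separately,
    -- which is exactly what CC's typed iteration schemes provide.
    toInt :: Nat -> Int
    toInt n = caseNat n 0 (\m -> 1 + toInt m)
    ```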

    PML2: Integrated Program Verification in ML

    We present the PML_2 language, which provides a uniform environment for programming, and for proving properties of programs, in an ML-like setting. The language is Curry-style and call-by-value; it provides a control operator (interpreted in terms of classical logic), supports general recursion, and admits a very general form of (implicit, non-coercive) subtyping. In the system, equational properties of programs are expressed using two new type formers, and they are proved by constructing terminating programs. Although proofs rely heavily on equational reasoning, equalities are exclusively managed by the type-checker. This means that the user only has to choose which equality to use, and not where to use it, as is usually done in mathematical proofs. In the system, writing proofs mostly amounts to applying lemmas (possibly recursive function calls) and performing case analyses (pattern matching).
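    The idea that an equational lemma is just a terminating program can be imitated in plain Haskell (the GADTs below are our stand-in; PML_2's actual type formers and syntax differ):

    ```haskell
    {-# LANGUAGE DataKinds, GADTs, TypeFamilies, TypeOperators #-}

    data Nat = Z | S Nat

    -- Singletons let a runtime value index a type-level number.
    data SNat n where
      SZ :: SNat 'Z
      SS :: SNat n -> SNat ('S n)

    type family Add n m where
      Add 'Z     m = m
      Add ('S n) m = 'S (Add n m)

    -- Propositional equality: the only proof is reflexivity.
    data a :~: b where
      Refl :: a :~: a

    -- A terminating (structurally recursive) program whose type reads
    -- "n + 0 = n"; the recursive call plays the role of invoking the
    -- induction hypothesis, as in PML_2's recursive-function proofs.
    addZero :: SNat n -> Add n 'Z :~: n
    addZero SZ     = Refl
    addZero (SS n) = case addZero n of Refl -> Refl
    ```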

    Strong normalisation for applied lambda calculi

    We consider the untyped lambda calculus with constructors and recursively defined constants. We construct a domain-theoretic model such that any term not denoting bottom is strongly normalising provided all its `stratified approximations' are. From this we derive a general normalisation theorem for applied typed lambda-calculi: if all constants have a total value, then all typeable terms are strongly normalising. We apply this result to extensions of Gödel's system T and system F extended by various forms of bar recursion for which strong normalisation was hitherto unknown. Comment: 14 pages, paper accepted at electronic journal LMC
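    For orientation, the kind of "constant with a total value" the theorem covers includes the recursor of Gödel's system T; a Haskell rendering (our illustration, with Int standing in for the naturals):

    ```haskell
    -- Goedel's T recursor over naturals: total because the numeric
    -- argument strictly decreases. The theorem says that once every
    -- constant denotes a total value like this one, all typeable
    -- terms of the applied calculus are strongly normalising.
    recNat :: a -> (Int -> a -> a) -> Int -> a
    recNat z _ 0 = z
    recNat z s n = s (n - 1) (recNat z s (n - 1))

    -- Example: addition by recursion on the first argument.
    add :: Int -> Int -> Int
    add m n = recNat n (\_ r -> r + 1) m
    ```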

    Quantitative Models and Implicit Complexity

    We give new proofs of soundness (all representable functions on base types lie in certain complexity classes) for Elementary Affine Logic, LFPL (a language for polytime computation close to realistic functional programming introduced by one of us), Light Affine Logic and Soft Affine Logic. The proofs are based on a common semantical framework which is merely instantiated in four different ways. The framework consists of an innovative modification of realizability which allows us to use resource-bounded computations as realisers, as opposed to including all Turing computable functions as is usually the case in realizability constructions. For example, all realisers in the model for LFPL are polynomially bounded computations, whence soundness holds by construction of the model. The work then lies in being able to interpret all the required constructs in the model. While this is the first entirely semantical proof of polytime soundness for light logics, our proof also provides a notable simplification of the original, already semantical, proof of polytime soundness for LFPL. A new result made possible by the semantic framework is the addition of polymorphism and a modality to LFPL, thus allowing for an internal definition of inductive datatypes. Comment: 29 pages
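    The core move, admitting only resource-bounded computations as realisers, can be caricatured in Haskell; CostBounded, bound and run are hypothetical names for this sketch, not the paper's constructions:

    ```haskell
    -- A realiser that carries its own cost bound: soundness then
    -- holds "by construction", since nothing without a bound ever
    -- enters the model in the first place.
    data CostBounded a b = CostBounded
      { bound :: Int -> Int   -- cost bound as a monotone function of input size
      , run   :: a -> b       -- the underlying computation
      }

    -- Composing realisers composes the bounds: assuming f costs at
    -- most bound f n on inputs of size n and its output also has size
    -- at most bound f n, then g . f stays within
    -- bound f n + bound g (bound f n).
    compose :: CostBounded b c -> CostBounded a b -> CostBounded a c
    compose g f = CostBounded (\n -> bound f n + bound g (bound f n))
                              (run g . run f)
    ```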

    Proving Properties of Typed Lambda-Terms Using Realizability, Covers, and Sheaves (Preliminary Version)

    We present a general method for proving properties of typed λ-terms. This method is obtained by introducing a semantic notion of realizability which uses the notion of a cover algebra (as in abstract sheaf theory). For this, we introduce a new class of semantic structures equipped with preorders, called pre-applicative structures. These structures need not be extensional. In this framework, a general realizability theorem can be shown. Kleene's recursive realizability and a variant of Kreisel's modified realizability both fit into this framework. Applying this theorem to the special case of the term model yields a general theorem for proving properties of typed λ-terms, in particular, strong normalization and confluence. This approach clarifies the reducibility method by showing that the closure conditions on candidates of reducibility can be viewed as sheaf conditions. The above approach is applied to the simply-typed λ-calculus (with types →, ×, +, and ⊥), and to the second-order (polymorphic) λ-calculus (with types → and ∀2), for which it yields a new theorem.
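    For reference, the closure conditions in question generalise the usual candidates-of-reducibility clauses, which in their standard form (restated here, not the paper's notation) read:

    ```latex
    \mathrm{Red}_{\bot} = \{ t \mid t \text{ strongly normalises} \},
    \qquad
    \mathrm{Red}_{A \to B} = \{ t \mid \forall u \in \mathrm{Red}_{A}.\; t\,u \in \mathrm{Red}_{B} \}
    ```

    The paper's observation is that the conditions a set of terms must satisfy to count as a candidate can be read as sheaf conditions over a cover algebra.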

    Constructions, inductive types and strong normalization

    This thesis contains an investigation of Coquand's Calculus of Constructions, a basic impredicative Type Theory. We review syntactic properties of the calculus, in particular decidability of equality and type-checking, based on the equality-as-judgement presentation. We present a set-theoretic notion of model, CC-structures, and use this to give a new strong normalization proof based on a modification of the realizability interpretation. An extension of the core calculus by inductive types is investigated and we show, using the example of infinite trees, how the realizability semantics and the strong normalization argument can be extended to non-algebraic inductive types. We emphasize that our interpretation is sound for large eliminations, e.g. it allows the definition of sets by recursion. Finally we apply the extended calculus to a non-trivial problem: the formalization of the strong normalization argument for Girard's System F. This formal proof has been developed and checked using the..
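    "Sound for large eliminations" means types may themselves be computed by recursion on data; Haskell's closed type families give a rough analogue (our illustration, not the thesis's calculus):

    ```haskell
    {-# LANGUAGE DataKinds, TypeFamilies #-}

    data Nat = Z | S Nat

    -- A set defined by recursion on a number: NAry n is the type of
    -- n-ary integer functions, the shape of definition that large
    -- elimination licenses inside the extended calculus.
    type family NAry (n :: Nat) where
      NAry 'Z     = Int
      NAry ('S n) = Int -> NAry n

    -- NAry ('S ('S 'Z)) unfolds to Int -> Int -> Int.
    plus :: NAry ('S ('S 'Z))
    plus x y = x + y
    ```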

    A Quick Overview on the Quantum Control Approach to the Lambda Calculus

    In this short overview, we start with the basics of quantum computing, explaining the difference between the quantum and the classical control paradigms. We give an overview of the quantum control line of research within the lambda calculus, ranging from untyped calculi up to categorical and realisability models. This is a summary of the last 10+ years of research in this area, starting from Arrighi and Dowek's seminal work until today. Comment: In Proceedings LSFA 2021, arXiv:2204.0341
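    The distinguishing feature of quantum control is that programs themselves may stand in superposition, as in the Arrighi and Dowek line of calculi; a toy syntax conveying the idea (our own sketch, not any particular calculus from the survey):

    ```haskell
    import Data.Complex (Complex)

    -- Classical control: the program is a single term. Quantum
    -- control: a term may be a complex-amplitude-weighted sum of
    -- terms, and reduction is extended linearly over such sums.
    data QTerm
      = Var String
      | Lam String QTerm
      | App QTerm QTerm
      | Sum [(Complex Double, QTerm)]   -- superposition of programs
    ```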

    Normalization by Evaluation in the Delay Monad: A Case Study for Coinduction via Copatterns and Sized Types

    In this paper, we present an Agda formalization of a normalizer for simply-typed lambda terms. The normalizer consists of two coinductively defined functions in the delay monad: one is a standard evaluator of lambda terms to closures, the other a type-directed reifier from values to eta-long beta-normal forms. Their composition, normalization-by-evaluation, is shown to be a total function a posteriori, using a standard logical-relations argument. The successful formalization serves as a proof-of-concept for coinductive programming and reasoning using sized types and copatterns, a new and presently experimental feature of Agda. Comment: In Proceedings MSFP 2014, arXiv:1406.153
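    The delay monad at the heart of the formalization translates directly to Haskell, with laziness standing in for Agda's coinduction (a sketch under that assumption, not the paper's Agda code):

    ```haskell
    -- A value now, or one more observable step of computation.
    data Delay a = Now a | Later (Delay a)

    instance Functor Delay where
      fmap f (Now a)   = Now (f a)
      fmap f (Later d) = Later (fmap f d)

    instance Applicative Delay where
      pure = Now
      Now f   <*> d = fmap f d
      Later f <*> d = Later (f <*> d)

    instance Monad Delay where
      Now a   >>= k = k a
      Later d >>= k = Later (d >>= k)

    data Tm  = Ix Int | Lam Tm | App Tm Tm
    data Val = Clo [Val] Tm

    -- The evaluator into closures: partial as a function, but every
    -- recursive call under an application is guarded by Later, so it
    -- is productive; the paper then proves totality a posteriori.
    eval :: [Val] -> Tm -> Delay Val
    eval env (Ix i)    = Now (env !! i)
    eval env (Lam b)   = Now (Clo env b)
    eval env (App f a) = do
      Clo env' b <- eval env f
      v          <- eval env a
      Later (eval (v : env') b)

    -- Observe with a step budget: totality of the normalizer is the
    -- claim that some finite budget always suffices.
    runFor :: Int -> Delay a -> Maybe a
    runFor _ (Now a)           = Just a
    runFor n (Later d) | n > 0 = runFor (n - 1) d
    runFor _ _                 = Nothing
    ```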