10 research outputs found
Realizability with constants
We extend Krivine's classical realizability to a simply typed calculus with constants, primitives, and a call-with-current-continuation construct similar to Felleisen's C operator. We show how the theory extends smoothly by associating an appropriate truth value with each concrete type. As a consequence, results and methods from realizability still hold in this new context; this is especially interesting in the case of specification theorems, because they carry significant meaning from the programmer's point of view.
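The operational behaviour of such a control construct can be sketched in Python, where a one-shot escaping continuation is simulated with an exception (the helper `call_cc` below is a hypothetical illustration; Felleisen's C and full call/cc also allow a captured continuation to be re-invoked later, which this sketch cannot express):

```python
def call_cc(f):
    """One-shot, escape-only sketch of call-with-current-continuation.

    f receives a function k; calling k(v) aborts the rest of f's
    computation and makes v the result of the whole call_cc expression.
    """
    class Escape(Exception):
        def __init__(self, value):
            self.value = value

    def k(value):
        raise Escape(value)

    try:
        return f(k)
    except Escape as e:
        return e.value

# Invoking the continuation discards the pending "1 +" context:
print(call_cc(lambda k: 1 + k(41)))  # prints 41
# If the continuation is never used, call_cc is transparent:
print(call_cc(lambda k: 3))          # prints 3
```

In realizability terms, it is exactly this ability to capture and restore the evaluation context that lets such a construct realize classically valid formulas.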
Filter Models: Non-idempotent Intersection Types, Orthogonality and Polymorphism
This paper revisits models of typed lambda calculus based on filters of intersection types:
By using non-idempotent intersections, we simplify a methodology that produces modular proofs of strong normalisation based on filter models. Building such a model for some type theory shows that typed terms can be typed with intersections only, and are therefore strongly normalising. Non-idempotent intersections provide a decreasing measure proving a key termination property, simpler than the reducibility techniques used with idempotent intersections.
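The contrast between idempotent and non-idempotent intersections can be illustrated with a toy Python sketch (a loose analogy, not the paper's actual measure): sets identify A ∧ A with A, while multisets count copies, so each use of the argument strictly shrinks the multiset.

```python
from collections import Counter

# Idempotent intersection: behaves like a set, A ∧ A = A,
# so consuming one "use" of A cannot be observed in the size.
idem = {"A", "B"}

# Non-idempotent intersection: behaves like a multiset, A ∧ A != A.
# A term typed with Counter({"A": 2, "B": 1}) uses its argument
# twice at type A and once at type B.
uses = Counter({"A": 2, "B": 1})

def consume(uses, ty):
    """One beta-step consumes exactly one copy of the argument's
    type, so the total number of copies strictly decreases."""
    assert uses[ty] > 0
    return uses - Counter({ty: 1})

after = consume(uses, "A")
print(sum(uses.values()), "->", sum(after.values()))  # prints 3 -> 2
```

The strictly decreasing count plays the role of the termination measure: since it cannot decrease forever, reduction must halt.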
Such filter models are shown to be captured by orthogonality techniques: we formalise an abstract notion of orthogonality model inspired by classical realisability, and express a filter model as one of its instances, along with two term-models (one of which captures a now common technique for strong normalisation).
Applying the above range of model constructions to Curry-style System F describes at different levels of detail how the infinite polymorphism of System F can systematically be reduced to the finite polymorphism of intersection types.
A Computational Interpretation of the Axiom of Determinacy in Arithmetic
We investigate the computational content of the axiom of determinacy (AD) in the setting of classical arithmetic in all finite types with the principle of dependent choices (DC). By employing the notion of realizability interpretation for arithmetic given by Berardi, Bezem and Coquand (1998), we interpret the negative translation of AD. Consequently, the combination of the negative translation with this realizability semantics can be seen as a model of DC, AD and the negation of the axiom of choice at higher types. In order to understand the computational content of AD, we explain, employing Coquand's game-theoretical semantics, how our realizer behaves.
A Rewrite System for Proof Constructivization
Proof constructivization is the problem of automatically extracting constructive proofs out of classical proofs. This process is required when classical theorem provers are integrated in intuitionistic proof assistants. We use the ability of rewrite systems to represent partial functions to implement heuristics for proof constructivization in Dedukti, a logical framework based on rewriting in which proofs are first-class objects which can be the subject of computation. We benchmark these heuristics on the proofs output by the automated theorem prover Zenon on the TPTP library of problems.
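As a toy illustration of the idea (not Dedukti syntax, with hypothetical proof-term constructors), a single rewrite rule can cancel a classical detour when double-negation elimination meets a proof that was only classically wrapped:

```python
# Toy proof terms as nested tuples:
#   ("dne", p) -- double-negation elimination applied to proof p
#   ("dni", t) -- double-negation introduction wrapped around proof t
def constructivize(term):
    """Apply the single rewrite rule  dne (dni t) -> t  bottom-up.
    The rewrite system is partial: terms whose classical step is
    essential match no rule and are simply left untouched."""
    if isinstance(term, tuple):
        term = tuple(constructivize(sub) for sub in term)
        if term[0] == "dne" and isinstance(term[1], tuple) and term[1][0] == "dni":
            return term[1][1]
    return term

print(constructivize(("dne", ("dni", "axiom"))))  # prints axiom
print(constructivize(("dne", "classical_proof"))) # left unchanged
```

Partiality is the point: a heuristic need not succeed on every proof, only rewrite away the classical detours it recognizes.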
Classical realizability as a classifier for nondeterminism
We show how the language of Krivine's classical realizability may be used to specify various forms of nondeterminism and relate them with properties of realizability models. More specifically, we introduce an abstract notion of multi-evaluation relation which allows us to finely describe various nondeterministic behaviours. This defines a hierarchy of computational models, ordered by their degree of nondeterminism, similar to Sazonov's degrees of parallelism. What we show is a duality between the structure of the characteristic Boolean algebra of a realizability model and the degree of nondeterminism in its underlying computational model.
ACM Reference Format: Guillaume Geoffroy. 2018. Classical realizability as a classifier for nondeterminism.
Filter models: non-idempotent intersection types, orthogonality and polymorphism - long version
This paper revisits models of typed lambda-calculus based on filters of intersection types: By using non-idempotent intersections, we simplify a methodology that produces modular proofs of strong normalisation based on filter models. Non-idempotent intersections provide a decreasing measure proving a key termination property, simpler than the reducibility techniques used with idempotent intersections. Such filter models are shown to be captured by orthogonality techniques: we formalise an abstract notion of orthogonality model inspired by classical realisability, and express a filter model as one of its instances, along with two term-models (one of which captures a now common technique for strong normalisation). Applying the above range of model constructions to Curry-style System F describes at different levels of detail how the infinite polymorphism of System F can systematically be reduced to the finite polymorphism of intersection types.
The stack calculus
We introduce a functional calculus with simple syntax and operational semantics in which the calculi introduced so far in the Curry-Howard correspondence for Classical Logic can be faithfully encoded. Our calculus enjoys confluence without any restriction. Its type system enforces strong normalization of expressions and it is a sound and complete system for full implicational Classical Logic. We give a very simple denotational semantics which allows easy calculations of the interpretation of expressions.
Comment: In Proceedings LSFA 2012, arXiv:1303.713
Computability and analysis: the legacy of Alan Turing
We discuss the legacy of Alan Turing and his impact on computability and analysis.
Comment: 49 pages