Permutability of proofs in intuitionistic sequent calculi
We prove a folklore theorem: two derivations in a cut-free
sequent calculus for
intuitionistic propositional logic (based on Kleene's {\bf G3}) are
inter-permutable (using a set of
basic ``permutation reduction rules'' derived from Kleene's work of 1952) iff
they determine
the same natural deduction. The basic rules form a confluent and weakly
normalising
rewriting system. We refer to Schwichtenberg's proof elsewhere
that a modification of this system is strongly normalising.
União Europeia (UE) - Programa ESPRIT BRA 7232 GENTZEN. Centro de Matemática da Universidade do Minho (CMAT)
Permutative conversions in intuitionistic multiary sequent calculi with cuts
This work presents an extension with cuts of Schwichtenberg's multiary sequent calculus. We identify a set of permutative conversions on it, prove their termination and confluence, and establish the permutability theorem. We present our sequent calculus as the typing system of the {\em generalised multiary lambda-calculus} lambda-Jm, a new calculus introduced in this work. Lambda-Jm corresponds to an extension of the lambda-calculus with a notion of {\em generalised multiary application}, which may be seen as a function applied to a list of arguments and then explicitly substituted in another term. Proof-theoretically, the corresponding typing rule encompasses, in a modular way, von Plato's generalised eliminations and Herbelin's head cuts.
Fundação para a Ciência e a Tecnologia (FCT)
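As a rough illustration of the idea (not the authors' formal definition), a generalised multiary application t(u1,...,uk,(x)v) can be read as: apply t to the list of arguments, then substitute the result for x in v. The following sketch uses an illustrative representation (the names GMApp, flatten, subst are hypothetical) and flattens such a term into ordinary binary applications plus a substitution:

```python
# A sketch of generalised multiary application in the spirit of lambda-Jm.
# The representation and names are illustrative, not taken from the paper;
# capture-avoidance is ignored (all bound variables assumed distinct).
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:            # ordinary binary application
    fun: object
    arg: object

@dataclass(frozen=True)
class GMApp:          # t(u1, ..., uk, (x)v): apply t to a list of
    head: object      # arguments, then substitute the result for x in v
    args: tuple
    binder: str
    cont: object

def subst(term, name, repl):
    """Naive substitution (no capture-avoidance)."""
    if isinstance(term, Var):
        return repl if term.name == name else term
    if isinstance(term, Lam):
        return Lam(term.param, subst(term.body, name, repl))
    if isinstance(term, App):
        return App(subst(term.fun, name, repl), subst(term.arg, name, repl))
    return term

def flatten(term):
    """Translate t(u1,...,uk,(x)v) into v[x := ((t u1) ... uk)],
    i.e. into plain lambda-syntax with binary application."""
    if isinstance(term, GMApp):
        applied = flatten(term.head)
        for a in term.args:
            applied = App(applied, flatten(a))
        return subst(flatten(term.cont), term.binder, applied)
    if isinstance(term, Lam):
        return Lam(term.param, flatten(term.body))
    if isinstance(term, App):
        return App(flatten(term.fun), flatten(term.arg))
    return term
```

For instance, `flatten(GMApp(Var("f"), (Var("a"), Var("b")), "x", Var("x")))` yields `App(App(Var("f"), Var("a")), Var("b"))`: the trivial continuation collapses to ordinary iterated application.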
Proof search in constructive logics
We present an overview of some sequent calculi organised not for
"theorem-proving" but for proof search, where the proofs themselves
(and the avoidance of known proofs on backtracking) are objects of
interest. The main calculus discussed is that of Herbelin [1994] for
intuitionistic logic, which extends methods used in hereditary
Harrop logic programming; we give a brief discussion of similar
calculi for other logics. We also point to some related work on
permutations in intuitionistic Gentzen sequent calculi that
clarifies the relationship between such calculi and natural
deduction.
Centro de Matemática da Universidade do Minho (CMAT). União Europeia (UE) - Programa ESPRIT - BRA 7232 Gentzen
A calculus of multiary sequent terms
Multiary sequent terms were originally introduced as a tool for
proving termination of permutative conversions in cut-free sequent
calculus. This work develops the language of multiary sequent terms
into a term calculus for the computational (Curry-Howard)
interpretation of a fragment of sequent calculus with cuts and
cut-elimination rules. The system, named generalised multiary
lambda-calculus, is a rich extension of the lambda-calculus
where the computational content of the sequent calculus format is
explained through an enlarged form of the application constructor.
This constructor exhibits the features of multiarity (the ability to
form lists of arguments) and generality (the ability to
prescribe a kind of continuation). The system integrates in a
modular way the multiary lambda-calculus and an isomorphic copy
of the lambda-calculus with generalised application LambdaJ
(in particular, natural deduction is captured internally up to
isomorphism). In addition, the system: (i) comes with permutative
conversion rules, whose role is to eliminate the new features of
application;
(ii) is equipped with reduction rules --- either the mu-rule,
typical of the multiary setting, or rules for cut-elimination,
which enlarge the ordinary beta-rule.
This paper establishes the meta-theory of the system, with emphasis
on the role of the mu-rule, and including a study of the
interaction of reduction and permutative conversions.
Fundação para a Ciência e a Tecnologia (FCT)
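The mu-rule typical of the multiary setting can be pictured concretely: it fuses two nested multiary applications by concatenating their argument lists. The sketch below is a schematic rendering, not the paper's formal rule; it uses a hypothetical tuple encoding where ("app", t, us, x, v) stands for t(us, (x)v), and the side condition on the bound variable x is assumed rather than checked.

```python
# Schematic mu-rule on a tuple representation of multiary applications:
# ("app", head, args, x, cont) stands for head(args, (x)cont).
def mu(term):
    """Fuse t(us,(x) x(vs,(y)w)) into t(us+vs,(y)w); the side condition
    that x occurs only as the inner head is assumed, not checked."""
    _tag, head, args, x, cont = term
    if isinstance(cont, tuple) and cont[0] == "app" and cont[1] == ("var", x):
        _, _, args2, y, w = cont
        return ("app", head, args + args2, y, w)
    return term

# t(u, (x) x(v, (y) y)) fuses into t(u, v, (y) y):
t = ("app", ("var", "t"), [("var", "u")], "x",
     ("app", ("var", "x"), [("var", "v")], "y", ("var", "y")))
```

Here `mu(t)` returns `("app", ("var", "t"), [("var", "u"), ("var", "v")], "y", ("var", "y"))`: one multiary application with the two argument lists appended.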
Refocusing generalised normalisation
When defined with general elimination/application rules, natural
deduction and the lambda-calculus become closer to sequent
calculus. In order to get a real isomorphism, normalisation has to
be defined in a ``multiary'' variant, in which reduction rules are
necessarily non-local (reason: normalisation, like cut-elimination,
acts at the \emph{head} of applicative terms, but natural
deduction focuses at the \emph{tail} of such terms). Non-local
rules are bad, for instance, for the mechanisation of the system.
A solution is to extend natural deduction even further to a
\emph{unified calculus} based on the unification of cut and
general elimination. In the unified calculus, a sequent term
behaves like in the sequent calculus, whereas the reduction steps
of a natural deduction term are interleaved with explicit steps
for bringing heads to focus. A variant of the calculus has the
symmetric role of improving sequent calculus in dealing with
tail-active permutative conversions.
Permutability in proof terms for intuitionistic sequent calculus with cuts
This paper gives a comprehensive and coherent view on permutability in the intuitionistic sequent calculus with cuts. Specifically, we show that, once permutability is packaged into appropriate global reduction procedures, it organises the internal structure of the system and determines fragments with computational interest, both for the computation-as-proof-normalisation and the computation-as-proof-search paradigms. The vehicle of the study is a lambda-calculus of multiary proof terms with generalised application, previously developed by the authors (the paper argues this system represents the simplest fragment of ordinary sequent calculus that does not fall into mere natural deduction). We start by adapting to our setting the concept of normal proof, developed by Mints, Dyckhoff, and Pinto, and by defining natural proofs, so that a proof is normal iff it is natural and cut-free. Natural proofs form a subsystem with a transparent Curry-Howard interpretation (a kind of formal vector notation for lambda-terms, with vectors consisting of lists of lists of arguments), while searching for normal proofs corresponds to a slight relaxation of focusing (in the sense of LJT). Next, we define a process of permutative conversion to natural form, and show that its combination with cut elimination gives a concept of normalisation for the sequent calculus. We derive a systematic picture of the full system, comprehending a rich set of reduction procedures (cut elimination, flattening, permutative conversion, normalisation, focalisation), organising the relevant subsystems and the important subclasses of cut-free, normal, and focused proofs.
Partially financed by FCT through project UID/MAT/00013/2013, and by COST action CA15123 EUTYPES.
Characterising strongly normalising intuitionistic sequent terms
This paper gives a characterisation, via intersection types, of the strongly normalising terms of an intuitionistic sequent calculus (where LJ easily embeds). The soundness of the typing
system is reduced to that of a well-known typing system with intersection types for the ordinary lambda-calculus. The completeness of the typing system is obtained from subject expansion at root position. This paper's sequent term calculus smoothly integrates the lambda-terms with generalised application or explicit substitution. Strong normalisability of these terms as
sequent terms characterises their typeability in certain ``natural'' typing systems with intersection types. The latter are in the natural deduction format, like systems previously studied by Matthes and by Lengrand et al., except that they do not contain any extra, exceptional rules for typing generalised applications or substitutions.
Continuation-Passing Style and Strong Normalisation for Intuitionistic Sequent Calculi
The intuitionistic fragment of the call-by-name version of Curien and
Herbelin's λμμ̃-calculus is isolated and proved strongly
normalising by means of an embedding into the simply-typed λ-calculus. Our
embedding is a continuation-and-garbage-passing style translation, the
inspiring idea coming from Ikeda and Nakazawa's translation of Parigot's
λμ-calculus. The embedding strictly simulates reductions, whereas usual
continuation-passing-style transformations erase permutative reduction steps.
For our intuitionistic sequent calculus, we even need only ``units of garbage''
to be passed. We apply the same method to other calculi, namely successive
extensions of the simply-typed λ-calculus leading to our intuitionistic
system; already for the simplest extension we consider (λ-calculus
with generalised application), this yields the first proof of strong
normalisation through a reduction-preserving embedding. The results obtained
extend to second- and higher-order calculi.
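For context, here is the standard call-by-name CPS translation (Plotkin-style) of untyped λ-terms, the kind of translation the paper refines. The garbage-passing component that makes the authors' translation reduction-preserving is deliberately omitted from this sketch, and the tuple term representation is an illustrative choice, not the paper's.

```python
# Call-by-name CPS translation of untyped lambda terms:
#   [[x]]      = x
#   [[lam x.M]] = lam k. k (lam x. [[M]])
#   [[M N]]    = lam k. [[M]] (lam m. (m [[N]]) k)
# Terms: ("var", x) | ("lam", x, body) | ("app", f, a).
import itertools

fresh = (f"k{i}" for i in itertools.count())  # supply of fresh names

def cps(term):
    tag = term[0]
    if tag == "var":
        return term
    if tag == "lam":
        _, x, body = term
        k = next(fresh)
        return ("lam", k, ("app", ("var", k), ("lam", x, cps(body))))
    if tag == "app":
        _, f, a = term
        k, m = next(fresh), next(fresh)
        return ("lam", k,
                ("app", cps(f),
                 ("lam", m, ("app", ("app", ("var", m), cps(a)), ("var", k)))))
```

Such a translation maps a permutative conversion step to zero administrative steps, which is exactly the deficiency the garbage component is designed to repair.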
Decidability for Non-Standard Conversions in Typed Lambda-Calculi
This thesis studies the decidability of conversions in typed lambda-calculi, along with the algorithms that provide this decidability. Our study considers conversions going beyond the traditional beta, eta, or permutative conversions (also called commutative conversions). To decide these conversions, two classes of algorithms compete: algorithms based on rewriting, whose goal is to decompose and orient the conversion so as to obtain a convergent system, and which then boil down to rewriting terms until they reach an irreducible form; and ``reduction-free'' algorithms, where the conversion is decided recursively by a detour via a meta-language. Throughout this thesis, we strive to explain the latter by means of the former.
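A "reduction-free" decision procedure of the second kind can be sketched via normalisation by evaluation: terms are evaluated into the meta-language (here, Python closures) and normal forms are read back, so beta-convertibility is decided without ever rewriting a term. This is an illustrative toy for closed, normalising terms, not an algorithm taken from the thesis.

```python
# Normalisation by evaluation for untyped lambda terms.
# Terms: ("var", x) | ("lam", x, body) | ("app", f, a).
def evaluate(term, env):
    """Evaluate a term into the meta-language: lambdas become closures."""
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    if tag == "lam":
        _, x, body = term
        return lambda v: evaluate(body, {**env, x: v})
    if tag == "app":
        f = evaluate(term[1], env)
        a = evaluate(term[2], env)
        return f(a) if callable(f) else ("app", f, a)  # neutral if f is stuck

def readback(value, depth=0):
    """Read a semantic value back into a term, inventing canonical names."""
    if callable(value):
        x = f"v{depth}"
        return ("lam", x, readback(value(("var", x)), depth + 1))
    if isinstance(value, tuple) and value[0] == "app":
        return ("app", readback(value[1], depth), readback(value[2], depth))
    return value  # a neutral variable

def convertible(s, t):
    """Closed terms are beta-convertible iff their read-back normal
    forms coincide (up to the canonical naming used here)."""
    return readback(evaluate(s, {})) == readback(evaluate(t, {}))
```

For example, `convertible(("app", ("lam", "x", ("var", "x")), I), I)` holds for `I = ("lam", "y", ("var", "y"))`: both sides evaluate to the same closure and read back to the same canonical normal form, with no rewriting performed.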