Revisiting the correspondence between cut-elimination and normalisation
Cut-free proofs in Herbelin's sequent calculus are in 1-1 correspondence with normal natural deduction proofs. For this reason, Herbelin's sequent calculus has been considered a privileged middle-point between L-systems and natural deduction. However, this bijection does not extend to proofs containing cuts, and Herbelin observed that his cut-elimination procedure is not isomorphic to beta-reduction.
In this paper we equip Herbelin's system with rewrite rules which, at the same time: (1) complete, in a sense, the cut-elimination procedure first proposed by Herbelin; and (2) perform the intuitionistic "fragment" of the tq-protocol, a cut-elimination procedure for classical logic defined by Danos, Joinet and Schellinx. Moreover, we identify the subcalculus of our system which is isomorphic to natural deduction, the isomorphism being with respect not only to proofs but also to normalisation.
Our results show, for the implicational fragment of intuitionistic logic, how to embed natural deduction in the much wider world of sequent calculus, and which particular cut-elimination procedure normalisation is.
Fundação para a Ciência e a Tecnologia (FCT)
Characterising strongly normalising intuitionistic sequent terms
This paper gives a characterisation, via intersection types, of the strongly normalising terms of an intuitionistic sequent calculus (in which LJ easily embeds). The soundness of the typing system is reduced to that of a well-known typing system with intersection types for the ordinary lambda-calculus. The completeness of the typing system is obtained from subject expansion at root position. This paper's sequent term calculus integrates smoothly the lambda-terms with generalised application or explicit substitution. Strong normalisability of these terms as sequent terms characterises their typeability in certain "natural" typing systems with intersection types. The latter are in the natural deduction format, like systems previously studied by Matthes and Lengrand et al., except that they do not contain any extra, exceptional rules for typing generalised applications or substitution.
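For orientation, the typing rule for a generalised application t(u, x.v), in the von Plato style of generalised elimination mentioned above, can be written as follows; this is the standard presentation of the rule, not a quotation from the paper.

```latex
\frac{\Gamma \vdash t : A \to B \qquad
      \Gamma \vdash u : A \qquad
      \Gamma,\, x : B \vdash v : C}
     {\Gamma \vdash t(u,\, x.v) : C}
```

Ordinary application is recovered as the special case t(u, x.x).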
Towards a canonical classical natural deduction system
This paper studies a new classical natural deduction system, presented as a typed calculus named lambda-mu-let. It is designed to be isomorphic to Curien and Herbelin's lambda-mu-mu~-calculus, both at the level of proofs and of reduction, and the isomorphism is based on the correct correspondence between cut (resp. left-introduction) in sequent calculus and substitution (resp. elimination) in natural deduction. It is a combination of Parigot's lambda-mu-calculus with the idea of "coercion calculus" due to Cervesato and Pfenning, accommodating let-expressions in a surprising way: they expand Parigot's syntactic class of named terms.
This calculus aims to be the simultaneous answer to three problems. The first problem is the lack of a canonical natural deduction system for classical logic. Lambda-mu-let is not yet another classical calculus, but rather a canonical reflection in natural deduction of the impeccable treatment of classical logic by sequent calculus. The second problem is the lack of a formalization of the usual semantics of Curien and Herbelin's calculus, which explains co-terms and cuts as, respectively, contexts and hole-filling instructions. The mentioned isomorphism is the required formalization, based on the precise notions of context and hole-expression offered by lambda-mu-let. The third problem is the lack of a robust process of "read-back" into natural deduction syntax of calculi in the sequent calculus format, which mainly affects the recent proof-theoretic efforts to derive lambda-calculi for call-by-value. An isomorphic counterpart to the call-by-value subsystem of Curien and Herbelin's calculus is derived, obtaining a new lambda-calculus for call-by-value, combining control and let-expressions.
Fundação para a Ciência e a Tecnologia (FCT)
Permutative conversions in intuitionistic multiary sequent calculi with cuts
This work presents an extension with cuts of Schwichtenberg's multiary sequent calculus. We identify a set of permutative conversions on it, prove their termination and confluence, and establish the permutability theorem. We present our sequent calculus as the typing system of the generalised multiary lambda-calculus lambda-Jm, a new calculus introduced in this work. Lambda-Jm corresponds to an extension of the lambda-calculus with a notion of generalised multiary application, which may be seen as a function applied to a list of arguments and then explicitly substituted in another term. Proof-theoretically, the corresponding typing rule encompasses, in a modular way, generalised eliminations of von Plato and Herbelin's head cuts.
Fundação para a Ciência e a Tecnologia (FCT)
Towards a canonical classical natural deduction system
Preprint submitted to Elsevier, 6 July 2012.
This paper studies a new classical natural deduction system, presented as a typed calculus named lambda-mu-let. It is designed to be isomorphic to Curien and Herbelin's lambda-mu-mu~-calculus, both at the level of proofs and of reduction, and the isomorphism is based on the correct correspondence between cut (resp. left-introduction) in sequent calculus and substitution (resp. elimination) in natural deduction. It is a combination of Parigot's lambda-mu-calculus with the idea of "coercion calculus" due to Cervesato and Pfenning, accommodating let-expressions in a surprising way: they expand Parigot's syntactic class of named terms.
This calculus and the mentioned isomorphism Theta offer three missing components of the proof theory of classical logic: a canonical natural deduction system; a robust process of "read-back" of calculi in the sequent calculus format into natural deduction syntax; and a formalization of the usual semantics of the lambda-mu-mu~-calculus, which explains co-terms and cuts as, respectively, contexts and hole-filling instructions. Lambda-mu-let is not yet another classical calculus, but rather a canonical reflection in natural deduction of the impeccable treatment of classical logic by sequent calculus; and it provides the "read-back" map and the formalized semantics, based on the precise notions of context and "hole-expression" provided by lambda-mu-let.
We use "read-back" to achieve a precise connection with Parigot's lambda-mu, and to derive lambda-calculi for call-by-value combining control and let-expressions in a logically founded way. Finally, the semantics, when fully developed, can be inverted at each syntactic category. This development gives us license to see sequent calculus as the semantics of natural deduction; and it uncovers a new syntactic concept in lambda-mu-mu~ ("co-context"), with which one can give a new definition of eta-reduction.
What is the meaning of proofs? A Fregean distinction in proof-theoretic semantics
The origins of proof-theoretic semantics lie in the question of what
constitutes the meaning of the logical connectives and its response: the rules
of inference that govern the use of the connective. However, what if we go a
step further and ask about the meaning of a proof as a whole? In this paper we
address this question and lay out a framework to distinguish sense and
denotation of proofs. Two questions are central here. First of all, if we have
two (syntactically) different derivations, does this always lead to a
difference, firstly, in sense, and secondly, in denotation? The other question
is about the relation between different kinds of proof systems (here: natural
deduction vs. sequent calculi) with respect to this distinction. Do the
different forms of representing a proof necessarily correspond to a difference
in how the inferential steps are given? In our framework it will be possible to
identify denotation as well as sense of proofs not only within one proof system
but also between different kinds of proof systems. Thus, we give an account to
distinguish a mere syntactic divergence from a divergence in meaning and a
divergence in meaning from a divergence of proof objects analogous to Frege's
distinction for singular terms and sentences.
Comment: Post-peer-review, pre-copyedit version of the article; the published version is available open access under DOI: 10.1007/s10992-020-09577-
Conservative extensions of the λ-calculus for the computational interpretation of sequent calculus
This thesis offers a study of the Curry-Howard correspondence for a certain fragment (the canonical fragment) of sequent calculus, based on an investigation of the relationship between cut elimination in that fragment and normalisation. The output of this study may be summarised in a new assignment θ, to proofs in the canonical fragment, of terms from certain conservative extensions of the λ-calculus. This assignment is, in a sense, an optimal improvement over the traditional assignment φ, in that it is an isomorphism both in the sense of a sound bijection of proofs and of an isomorphism of normalisation procedures.
First, a systematic definition of calculi of cut-elimination for the canonical fragment is carried out. We study various right protocols, i.e. cut-elimination procedures which give priority to right permutation. We pay particular attention to the issue of which parts of the procedure are to be implicit, that is, performed by meta-operators in the style of natural deduction. Next, a comprehensive study of the relationship between normalisation and these calculi of cut-elimination is carried out, producing several new insights of independent interest, particularly concerning a generalisation of Prawitz's mapping of normal natural deduction proofs into sequent calculus.
This study suggests the definition of conservative extensions of natural deduction (and of the λ-calculus) based on the idea of a built-in distinction between applicative term and application, and also between head and tail application. These extensions offer perfect counterparts to the calculi in the canonical fragment, as established by the mentioned mapping θ.
Conceptual rearrangements in proof theory deriving from these extensions of natural deduction are discussed.
Finally, we argue that, computationally, both the canonical fragment and natural deduction (in the extended sense introduced here) correspond to extensions of the λ-calculus with applicative terms, and that what distinguishes them is the way applicative terms are structured. In the canonical fragment, the head application of an applicative term is "focused". This, in turn, explains the following observation: some reduction rules of calculi in the canonical fragment may be interpreted as transition rules for abstract call-by-name machines.
Characterization of strong normalizability for a sequent lambda calculus with co-control
We study strong normalization in a lambda calculus of proof-terms with co-control for the intuitionistic sequent calculus. In this sequent lambda calculus, the management of formulas on the left-hand side of typing judgements is "dual" to the management of formulas on the right-hand side of typing judgements in Parigot's lambda-mu calculus; that is why our system has first-class "co-control". The characterization of strong normalization is by means of intersection types, and is obtained by analyzing the relationship with another sequent lambda calculus, without co-control, for which a characterization of strong normalizability has been obtained before. The comparison of the two formulations of the sequent calculus, with or without co-control, is of independent interest. Finally, since it is known how to obtain bidirectional natural deduction systems isomorphic to these sequent calculi, characterizations are obtained of the strongly normalizing proof-terms of such natural deduction systems.
The authors would like to thank the anonymous referees for their valuable comments and helpful suggestions. This work was partly supported by FCT (Fundação para a Ciência e a Tecnologia), within the project UID-MAT-00013/2013; by COST Action CA15123, The European research network on types for programming and verification (EUTypes), via STSM; and by the Ministry of Education, Science and Technological Development, Serbia, under the projects ON174026 and III44006.