783 research outputs found
Refocusing generalised normalisation
When defined with general elimination/application rules, natural
deduction and lambda-calculus become closer to sequent
calculus. In order to get a real isomorphism, normalisation has to
be defined in a ``multiary'' variant, in which reduction rules are
necessarily non-local (reason: normalisation, like cut-elimination,
acts at the \emph{head} of applicative terms, but natural
deduction focuses at the \emph{tail} of such terms). Non-local
rules are bad, for instance, for the mechanization of the system.
A solution is to extend natural deduction even further to a
\emph{unified calculus} based on the unification of cut and
general elimination. In the unified calculus, a sequent term
behaves like in the sequent calculus, whereas the reduction steps
of a natural deduction term are interleaved with explicit steps
for bringing heads to focus. A variant of the calculus has the
symmetric role of improving sequent calculus in dealing with
tail-active permutative conversions.
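The head/tail contrast above can be made concrete with a toy AST (a sketch only; `Var`, `App`, and `spine` are illustrative names, not the paper's syntax). Natural deduction's binary application buries the head of (((u u1) u2) u3) at the bottom of the term, while the multiary, sequent-style view exposes it directly:

```python
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class App:               # natural-deduction style binary application
    fun: object
    arg: object

def spine(t):
    """Refocus a nested application onto its head, returning the
    multiary, sequent-style view (head, [args])."""
    args = []
    while isinstance(t, App):    # walk from the tail down to the head
        args.append(t.arg)
        t = t.fun
    return t, args[::-1]

# (((u u1) u2) u3): the head u sits under three App nodes ...
t = App(App(App(Var("u"), Var("u1")), Var("u2")), Var("u3"))
head, args = spine(t)
# ... but the multiary view is (u, [u1, u2, u3])
```

Normalisation acting "at the head" is then a local operation on the pair returned by `spine`, whereas on the nested form it must first traverse the whole applicative term.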
Towards a canonical classical natural deduction system
Preprint submitted to Elsevier, 6 July 2012
This paper studies a new classical natural deduction system, presented as a typed
calculus named lambda-mu-let. It is designed to be isomorphic to Curien and Herbelin's lambda-mu-mu~-calculus, both at the level of proofs and reduction, and the isomorphism is based on the correct correspondence between cut (resp. left-introduction) in sequent calculus, and
substitution (resp. elimination) in natural deduction. It is a combination of Parigot's lambda-mu-calculus with the idea of "coercion calculus" due to Cervesato and Pfenning, accommodating
let-expressions in a surprising way: they expand Parigot's syntactic class of named terms.
This calculus and the mentioned isomorphism Theta offer three missing components of
the proof theory of classical logic: a canonical natural deduction system; a robust process
of "read-back" of calculi in the sequent calculus format into natural deduction syntax;
a formalization of the usual semantics of the lambda-mu-mu~-calculus, which explains co-terms and cuts as, respectively, contexts and hole-filling instructions. lambda-mu-let is not yet another
classical calculus, but rather a canonical reflection in natural deduction of the impeccable
treatment of classical logic by sequent calculus; and provides the "read-back" map and
the formalized semantics, based on the precise notions of context and "hole-expression"
provided by lambda-mu-let.
We use "read-back" to achieve a precise connection with Parigot's lambda-mu, and to derive
lambda-calculi for call-by-value combining control and let-expressions in a logically founded
way. Finally, the semantics, when fully developed, can be inverted at each syntactic
category. This development gives us license to see sequent calculus as the semantics of
natural deduction; and uncovers a new syntactic concept in lambda-mu-mu~ ("co-context"), with
which one can give a new definition of eta-reduction.
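The reading of co-terms as contexts and of cuts as hole-filling instructions can be sketched with a toy tuple-encoded term language (`HOLE` and `fill` are illustrative names, not the paper's syntax, and variable capture is ignored):

```python
HOLE = ("hole",)                 # [] : the unique hole of a context

def fill(ctx, t):
    """Hole-filling instruction: plug term t into the hole of ctx.
    (Bound-variable capture is ignored in this toy version.)"""
    if ctx == HOLE:
        return t
    if ctx[0] == "app":
        return ("app", fill(ctx[1], t), fill(ctx[2], t))
    return ctx                   # variables and other leaves

# the context  [] u1  filled with head u gives the application  u u1
ctx = ("app", HOLE, ("var", "u1"))
filled = fill(ctx, ("var", "u"))
```

Under this reading, a cut pairs a term with a context, and reducing the cut executes the hole-filling.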
A calculus of multiary sequent terms
Multiary sequent terms were originally introduced as a tool for
proving termination of permutative conversions in cut-free sequent
calculus. This work develops the language of multiary sequent terms
into a term calculus for the computational (Curry-Howard)
interpretation of a fragment of sequent calculus with cuts and
cut-elimination rules. The system, named generalised multiary
lambda-calculus, is a rich extension of the lambda-calculus
where the computational content of the sequent calculus format is
explained through an enlarged form of the application constructor.
This constructor exhibits the features of multiarity (the ability of
forming lists of arguments) and generality (the ability of
prescribing a kind of continuation). The system integrates in a
modular way the multiary lambda-calculus and an isomorphic copy
of the lambda-calculus with generalised application LambdaJ
(in particular, natural deduction is captured internally up to
isomorphism). In addition, the system: (i) comes with permutative
conversion rules, whose role is to eliminate the new features of
application;
(ii) is equipped with reduction rules --- either the mu-rule,
typical of the multiary setting, or rules for cut-elimination,
which enlarge the ordinary beta-rule.
This paper establishes the meta-theory of the system, with emphasis
on the role of the mu-rule, and including a study of the
interaction of reduction and permutative conversions.
Fundação para a CiĂȘncia e a Tecnologia (FCT)
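The enlarged application constructor can be sketched as follows (a sketch only; `GApp` and `app` are illustrative names): multiarity is the list of arguments, generality is the bound continuation, and ordinary application t u is recovered as the special case t(u, (x)x).

```python
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class GApp:
    """Generalised multiary application  t(u1,...,un, (x)v):
    - multiarity: the list of arguments u1..un
    - generality: the continuation (x)v that receives the result."""
    head: object
    args: list
    bound: str     # the variable x bound in the continuation
    body: object   # the continuation body v

def app(t, u):
    # ordinary application  t u  is the special case  t(u, (x)x)
    return GApp(t, [u], "x", Var("x"))

g = app(Var("t"), Var("u"))
```

The permutative conversions mentioned in the abstract are then exactly the rules that eliminate non-trivial argument lists and continuations, collapsing `GApp` back to ordinary application.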
Towards a canonical classical natural deduction system
This paper studies a new classical natural deduction system, presented as a typed calculus named \lml. It is designed to be
isomorphic to Curien-Herbelin's lambda-mu-mu~-calculus, both at the level of proofs and reduction, and the isomorphism is based on the correct correspondence between cut (resp. left-introduction) in sequent calculus, and substitution (resp. elimination) in natural deduction. It is a combination of Parigot's lambda-mu-calculus with the idea
of ``coercion calculus'' due to Cervesato-Pfenning, accommodating let-expressions in a surprising way: they expand Parigot's syntactic class of named terms.
This calculus aims to be the simultaneous answer to three problems. The first problem is the lack of a canonical natural deduction
system for classical logic. \lml is not yet another classical calculus, but rather a canonical reflection in natural deduction of
the impeccable treatment of classical logic by sequent calculus. The second problem is the lack of a formalization of the usual semantics
of Curien-Herbelin's lambda-mu-mu~-calculus, which explains co-terms and cuts as, respectively, contexts and hole-filling instructions. The mentioned
isomorphism is the required formalization, based on the precise notions of context and hole-expression offered by \lml. The third
problem is the lack of a robust process of ``read-back'' into natural deduction syntax of calculi in the sequent calculus format,
that affects mainly the recent proof-theoretic efforts of derivation of lambda-calculi for call-by-value. An isomorphic counterpart
to the call-by-value subsystem of Curien-Herbelin's lambda-mu-mu~-calculus is derived, obtaining a new
lambda-calculus for call-by-value, combining control and let-expressions.
Fundação para a CiĂȘncia e a Tecnologia (FCT)
On computational interpretations of the modal logic S4. I. Cut elimination
A language of constructions for minimal logic is the
lambda-calculus, where cut-elimination is encoded as
beta-reduction. We examine corresponding languages for the
minimal version of the modal logic S4, with notions of reduction
that encode cut-elimination for the corresponding sequent system.
It turns out that a natural interpretation of the latter
constructions is a lambda-calculus extended by an idealized
version of Lisp's \verb/eval/ and \verb/quote/ constructs.
In this first part, we analyze how cut-elimination works in the
standard sequent system for minimal S4, and where problems arise.
Bierman and De Paiva's proposal is a natural language of constructions
for this logic, but their calculus lacks a few rules that are
essential to eliminate all cuts. The lambda-S4-calculus,
namely Bierman and De Paiva's proposal extended with all needed rules,
is confluent. There is a polynomial-time algorithm to compute
principal typings of given terms, or answer that the given terms are
not typable. The typed lambda-S4-calculus terminates, and
normal forms are exactly constructions for cut-free proofs. Finally,
modulo some notion \sqeq of equivalence, there is a natural
Curry-Howard style isomorphism between typed
lambda-S4-terms and natural deduction proofs in minimal S4.
However, the lambda-S4-calculus has a non-operational
flavor, in that the extra rules include explicit garbage collection,
contraction and exchange rules. We shall propose another language of
constructions to repair this in Part II.
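The "idealized eval/quote" reading can be caricatured with thunks (a sketch under the Curry-Howard reading of a boxed type as "code of that type"; `quote` and `unquote` are illustrative names, not the paper's constructs):

```python
def quote(thunk):
    """Modal introduction: freeze a computation as code (a box)."""
    return ("boxed", thunk)

def unquote(code):
    """Idealized eval (modal elimination): run frozen code."""
    tag, thunk = code
    assert tag == "boxed"
    return thunk()

code = quote(lambda: 6 * 7)   # build boxed code: delayed, not yet run
result = unquote(code)        # eval forces the delayed computation
```

The point of the caricature is only that quote delays and eval forces; the paper's calculus additionally tracks, in its typing and reduction rules, which assumptions the frozen code may mention.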
Permutability in proof terms for intuitionistic sequent calculus with cuts
This paper gives a comprehensive and coherent view on permutability in the intuitionistic sequent calculus with cuts. Specifically we show that, once permutability is packaged into appropriate global reduction procedures, it organizes the internal structure of the system and determines fragments with computational interest, both for the computation-as-proof-normalization and the computation-as-proof-search paradigms. The vehicle of the study is a lambda-calculus of multiary proof terms with generalized application, previously developed by the authors (the paper argues this system represents the simplest fragment of ordinary sequent calculus that does not fall into mere natural deduction). We start by adapting to our setting the concept of normal proof, developed by Mints, Dyckhoff, and Pinto, and by defining natural proofs, so that a proof is normal iff it is natural and cut-free. Natural proofs form a subsystem with a transparent Curry-Howard interpretation (a kind of formal vector notation for lambda-terms with vectors consisting of lists of lists of arguments), while searching for normal proofs corresponds to a slight relaxation of focusing (in the sense of LJT). Next, we define a process of permutative conversion to natural form, and show that its combination with cut elimination gives a concept of normalization for the sequent calculus. We derive a systematic picture of the full system comprehending a rich set of reduction procedures (cut elimination, flattening, permutative conversion, normalization, focalization), organizing the relevant subsystems and the important subclasses of cut-free, normal, and focused proofs.
Partially financed by FCT through project UID/MAT/00013/2013, and by COST action CA15123 EUTYPES. The first and the last authors were partially financed by Fundação para a CiĂȘncia e a Tecnologia (FCT) through project UID/MAT/00013/2013. The first author received financial support from the COST action CA15123 EUTYPES.
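The "formal vector notation" mentioned above can be sketched as a head applied to a vector whose entries are themselves lists of arguments; flattening the vector recovers an ordinary application spine (a sketch only; `flatten` and the tuple encoding are illustrative, not the paper's syntax):

```python
def flatten(head, vector):
    """Collapse  head [l1, l2, ...]  (each li a list of arguments)
    into the ordinary nested application spine."""
    for block in vector:          # one block per multiary application
        for arg in block:
            head = ("app", head, arg)
    return head

# u applied to the vector [[a, b], [c]] flattens to (((u a) b) c)
t = flatten("u", [["a", "b"], ["c"]])
```

Distinct vectors can flatten to the same lambda-term, which is what makes the vector notation a strictly finer-grained proof representation than plain natural deduction.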
A Lambda-calculus Structure Isomorphic to Gentzen-style Sequent Calculus Structure
We consider a lambda-calculus for which applicative terms no longer have the form (...((u u_1) u_2) ... u_n) but the form (u [u_1 ; ... ; u_n]), where [u_1 ; ... ; u_n] is a list of terms. While the structure of the usual lambda-calculus is isomorphic to the structure of natural deduction, this new structure is isomorphic to the structure of Gentzen-style sequent calculus. To express the basis of the isomorphism, we consider intuitionistic logic with implication as sole connective. However, we do not consider Gentzen's calculus LJ, but a calculus LJT which leads to a restriction of the notion of cut-free proofs in LJ. We also need to explicitly consider, in a simply typed version of this lambda-calculus, a substitution operator and a list concatenation operator. In this way, each elementary step of cut-elimination exactly matches a beta-reduction, a substitution propagation step or a concatenation computation step. Though it is possible to extend the isomorphism to classical logic and to other connectives, we do not treat it in this paper.
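In list-form terms (u [u_1 ; ... ; u_n]), applying a term to further arguments is exactly list concatenation, which is the shape of the concatenation computation step mentioned above (a sketch with tuple-encoded terms; `app_list` is an illustrative name):

```python
def app_list(term, more_args):
    """(u l1) applied to l2 becomes u (l1 ++ l2): applying a
    list-form term to more arguments is list concatenation."""
    head, args = term
    return (head, args + more_args)

# (u [u1; u2]) applied to [u3] yields (u [u1; u2; u3])
t = app_list(("u", ["u1", "u2"]), ["u3"])
```

This is why the isomorphism needs the concatenation operator in the syntax: without it, the corresponding cut-elimination step has no matching term-level computation.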
Deduction modulo theory
This paper is a survey on Deduction modulo theory.
Dual-Context Calculi for Modal Logic
We present natural deduction systems and associated modal lambda calculi for
the necessity fragments of the normal modal logics K, T, K4, GL and S4. These
systems are in the dual-context style: they feature two distinct zones of
assumptions, one of which can be thought of as modal, and the other as
intuitionistic. We show that these calculi have their roots in sequent
calculi. We then investigate their metatheory, equip them with a confluent and
strongly normalizing notion of reduction, and show that they coincide with the
usual Hilbert systems up to provability. Finally, we investigate a categorical
semantics which interprets the modality as a product-preserving functor.
Comment: Full version of article previously presented at LICS 2017 (see arXiv:1602.04860v4 or doi: 10.1109/LICS.2017.8005089).
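The two zones can be pictured as two environments in a toy type checker (an illustrative encoding only, not the paper's rules; the box rule below is one plausible K-style rule, in which the body is checked with the modal zone promoted to the ordinary zone):

```python
def check(delta, gamma, term, ty):
    """Dual-context judgement  Delta ; Gamma |- term : ty,
    with Delta the modal zone and Gamma the intuitionistic zone."""
    kind = term[0]
    if kind == "var":    # ordinary variable, looked up in Gamma
        return gamma.get(term[1]) == ty
    if kind == "mvar":   # modal variable, looked up in Delta
        return delta.get(term[1]) == ty
    if kind == "box":    # K-style rule: the body is checked with the
        # modal zone moved to the ordinary zone and no modal zone left
        return ty[0] == "Box" and check({}, dict(delta), term[1], ty[1])
    return False

# Delta = {u : A} ; Gamma = {}  |-  box u : Box A
ok = check({"u": "A"}, {}, ("box", ("var", "u")), ("Box", "A"))
```

The variants for T, K4, GL and S4 differ precisely in how the box rule redistributes the two zones, which this sketch does not attempt to capture.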