
    Dual-Context Calculi for Modal Logic

    We present natural deduction systems and associated modal lambda calculi for the necessity fragments of the normal modal logics K, T, K4, GL and S4. These systems are in the dual-context style: they feature two distinct zones of assumptions, one of which can be thought of as modal and the other as intuitionistic. We show that these calculi have their roots in sequent calculi. We then investigate their metatheory, equip them with a confluent and strongly normalizing notion of reduction, and show that they coincide with the usual Hilbert systems up to provability. Finally, we investigate a categorical semantics which interprets the modality as a product-preserving functor. Comment: Full version of an article previously presented at LICS 2017 (see arXiv:1602.04860v4 or doi: 10.1109/LICS.2017.8005089).
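
    To make the dual-context idea concrete, here is a minimal Haskell sketch of a type checker for a judgment delta ; gamma |- M : A in the Davies/Pfenning S4 style, where the body of a box may use only the modal zone. The datatypes and rules below are illustrative assumptions of this sketch, not the paper's calculi (which also cover K, T, K4 and GL, with different box rules).

        -- Minimal sketch of an S4-style dual-context type checker (assumed rules).
        module DualContext where

        data Ty = Base String | Arr Ty Ty | Box Ty
          deriving (Eq, Show)

        data Tm
          = Var String            -- ordinary (intuitionistic) variable, looked up in gamma
          | MVar String           -- modal variable, looked up in delta
          | Lam String Ty Tm
          | App Tm Tm
          | BoxI Tm               -- box M
          | LetBox String Tm Tm   -- let box u = M in N
          deriving Show

        type Ctx = [(String, Ty)]

        -- typeOf delta gamma m  returns  Just a  exactly when  delta ; gamma |- m : a.
        typeOf :: Ctx -> Ctx -> Tm -> Maybe Ty
        typeOf _     gamma (Var x)  = lookup x gamma
        typeOf delta _     (MVar u) = lookup u delta
        typeOf delta gamma (Lam x a m) = Arr a <$> typeOf delta ((x, a) : gamma) m
        typeOf delta gamma (App m n) = do
          Arr a b <- typeOf delta gamma m
          a'      <- typeOf delta gamma n
          if a == a' then Just b else Nothing
        typeOf delta _     (BoxI m) = Box <$> typeOf delta [] m   -- body sees only the modal zone
        typeOf delta gamma (LetBox u m n) = do
          Box a <- typeOf delta gamma m
          typeOf ((u, a) : delta) gamma n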

    Semantic A-translation and Super-consistency entail Classical Cut Elimination

    We show that if a theory R defined by a rewrite system is super-consistent, the classical sequent calculus modulo R enjoys the cut elimination property, which was an open question. For such theories it was already known that proofs strongly normalize in natural deduction modulo R, and that cut elimination holds in the intuitionistic sequent calculus modulo R. We first define a syntactic and a semantic version of Friedman's A-translation, showing that it preserves the structure of pseudo-Heyting algebras, our semantic framework. Then we relate the interpretation of a theory in the A-translated algebra and its A-translation in the original algebra. This allows us to show the stability of the super-consistency criterion and the cut elimination theorem.
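
    For orientation, Friedman's syntactic A-translation replaces each atomic formula (including falsity) by its disjunction with the fixed formula A and commutes with the other connectives; the following LaTeX summary is a standard formulation, not the paper's exact definition, whose semantic counterpart acts on pseudo-Heyting algebras:

        \[
        \begin{aligned}
          \bot^{A} &= A, & P^{A} &= P \lor A \quad (P \text{ atomic}),\\
          (B \land C)^{A} &= B^{A} \land C^{A}, & (B \lor C)^{A} &= B^{A} \lor C^{A},\\
          (B \Rightarrow C)^{A} &= B^{A} \Rightarrow C^{A}, & &\\
          (\forall x\, B)^{A} &= \forall x\, B^{A}, & (\exists x\, B)^{A} &= \exists x\, B^{A}.
        \end{aligned}
        \]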

    An inverse of the evaluation functional for typed Lambda-calculus

    In any model of typed λ-calculus containing some basic arithmetic, a functional p→e (procedure → expression) will be defined which inverts the evaluation functional for typed λ-terms. Combined with the evaluation functional, p→e yields an efficient normalization algorithm. The method is extended to λ-calculi with constants and is used to normalize (the λ-representations of) natural deduction proofs of (higher order) arithmetic. A consequence of theoretical interest is a strong completeness theorem for βη-reduction, generalizing results of Friedman [1] and Statman [3]: if two λ-terms have the same value in some model containing representations of the primitive recursive functions (of level 1), then they are provably equal in the βη-calculus.
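
    The algorithm described here is what is now called normalization by evaluation: evaluate a term into a semantic model and then invert evaluation by reading back a β-normal, η-long term. Below is a minimal Haskell sketch of such an inverse for the pure simply typed fragment; the representation (de Bruijn indices, a two-constructor value domain) is my own simplification and omits the constants and arithmetic treated in the paper.

        module Nbe where

        -- Simply typed lambda terms with de Bruijn indices.
        data Tm = Var Int | Lam Tm | App Tm Tm
          deriving Show

        data Ty = Iota | Arr Ty Ty

        -- Semantic values: arrows become Haskell functions; base-type values are
        -- term families indexed by the number of binders in scope.
        data Val = VFun (Val -> Val) | VNe (Int -> Tm)

        -- The evaluation functional.
        eval :: [Val] -> Tm -> Val
        eval env (Var i)   = env !! i
        eval env (Lam b)   = VFun (\v -> eval (v : env) b)
        eval env (App f a) = case eval env f of
          VFun g -> g (eval env a)
          _      -> error "eval: ill-typed application"

        -- reflect ty t  embeds a neutral term family into the model, eta-expanding at arrow types.
        reflect :: Ty -> (Int -> Tm) -> Val
        reflect Iota      t = VNe t
        reflect (Arr a b) t = VFun (\v -> reflect b (\k -> App (t k) (reify a v k)))

        -- reify ty v k  is the inverse of evaluation: it reads back a normal form (k counts binders).
        reify :: Ty -> Val -> Int -> Tm
        reify Iota      (VNe t)  k = t k
        reify (Arr a b) (VFun f) k =
          Lam (reify b (f (reflect a (\k' -> Var (k' - k - 1)))) (k + 1))
        reify _ _ _ = error "reify: value does not match type"

        -- Normalize a closed term at a given type.
        nf :: Ty -> Tm -> Tm
        nf ty t = reify ty (eval [] t) 0

        -- Example: nf (Arr (Arr Iota Iota) (Arr Iota Iota)) (Lam (Var 0))
        -- yields Lam (Lam (App (Var 1) (Var 0))), the eta-long form of \f. f.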

    Predicativism about Classes


    Relating Church-Style and Curry-Style Subtyping

    Type theories with higher-order subtyping or singleton types are examples of systems where computation rules for variables are affected by type information in the context. A complication for these systems is that bounds declared in the context do not interact well with the logical relation proof of completeness or termination. This paper proposes a natural modification to the type syntax for F-Omega-Sub, adding the variable's bound to the variable type constructor, thereby separating the computational behavior of the variable from the context. The algorithm for subtyping in F-Omega-Sub can then be given on types without context or kind information. As a consequence, the metatheory follows the general approach for type systems without computational information in the context, including a simple logical relation definition without Kripke-style indexing by context. This new presentation of the system is shown to be equivalent to the traditional presentation without bounds on the variable type constructor. Comment: In Proceedings ITRS 2010, arXiv:1101.410
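
    The following Haskell sketch illustrates the idea of carrying the bound on the variable type constructor, so that a fragment of the subtyping algorithm needs no context argument. The datatypes and the kernel-style rule for bounded quantifiers are assumptions of this sketch, which also omits type-level abstraction and application; it is not the paper's definition of F-Omega-Sub.

        module BoundedVars where

        data Kind = Star | KArr Kind Kind
          deriving (Eq, Show)

        data Ty
          = TVar String Ty      -- a variable carries its declared bound
          | Top Kind
          | Arr Ty Ty
          | All String Ty Ty    -- forall X <= bound . body
          deriving (Eq, Show)

        -- A fragment of algorithmic subtyping on types; no context is consulted.
        sub :: Ty -> Ty -> Bool
        sub _ (Top _)               = True
        sub (TVar x a) t
          | TVar x a == t           = True      -- reflexivity on variables
          | otherwise               = sub a t   -- promote the variable to its bound
        sub (Arr s1 s2) (Arr t1 t2) = sub t1 s1 && sub s2 t2
        sub (All _ a s) (All _ b t) = a == b && sub s t   -- kernel rule; alpha-conversion glossed over
        sub _ _                     = False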

    Cut Elimination for a Logic with Induction and Co-induction

    Proof search has been used to specify a wide range of computation systems. In order to build a framework for reasoning about such specifications, we make use of a sequent calculus involving induction and co-induction. These proof principles are based on a proof-theoretic (rather than set-theoretic) notion of definition. Definitions are akin to logic programs, where the left and right rules for defined atoms allow one to view theories as "closed" or as defining fixed points. The use of definitions and free equality makes it possible to reason intensionally about syntax. We add, in a consistent way, rules for pre- and post-fixed points, thus allowing the user to reason inductively and co-inductively about properties of computational systems making full use of higher-order abstract syntax. Consistency is guaranteed via cut elimination, where we give the first, to our knowledge, cut-elimination procedure in the presence of general inductive and co-inductive definitions. Comment: 42 pages, submitted to the Journal of Applied Logic.
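
    Schematically, for a defined predicate p with monotone body B (so p stands for a least or greatest fixed point of B), the induction-left and co-induction-right rules take the following shape, where S is an invariant chosen as a pre-fixed point (for induction) or a post-fixed point (for co-induction); this LaTeX rendering is my summary, not the paper's exact sequent rules:

        \[
        \frac{B\,S\,\bar{x} \vdash S\,\bar{x} \qquad \Gamma,\; S\,\bar{t} \vdash C}
             {\Gamma,\; p\,\bar{t} \vdash C}\;\mathrm{IL}
        \qquad\qquad
        \frac{\Gamma \vdash S\,\bar{t} \qquad S\,\bar{x} \vdash B\,S\,\bar{x}}
             {\Gamma \vdash p\,\bar{t}}\;\mathrm{CIR}
        \]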