    A mechanization of sorted higher-order logic based on the resolution principle

    The usage of sorts in first-order automated deduction has brought greater conciseness of representation and a considerable gain in efficiency by reducing the search spaces involved. This suggests that sort information can be employed in higher-order theorem proving with similar results. This thesis develops a sorted higher-order logic ΣHOL suitable for automatic theorem proving applications. ΣHOL is based on a sorted λ-calculus ΣΛ→, obtained by extending Church's simply typed λ-calculus with a higher-order sort concept that includes term declarations and functional base sorts. The term declaration mechanism studied here is powerful enough to allow convenient formalization of a large body of mathematics, since it offers natural primitives for domains and codomains of functions and allows the treatment of function restriction. Furthermore, it subsumes most other mechanisms for the declaration of sort information known from the literature, and can thus serve as a general framework for the study of sorted higher-order logics. For instance, the term declaration mechanism of ΣHOL subsumes the subsorting mechanism as a derived notion, and hence justifies our special form of subsort inference. We present sets of transformations for sorted higher-order unification and pre-unification, and prove the nondeterministic completeness of the algorithm induced by these transformations. The main technical difficulty of unification in this sorted setting is that the analysis of general bindings is much more involved than in the unsorted case, since in the presence of term declarations well-sortedness is not a structural property. This difficulty is overcome by a structure theorem that links the structure of a formula to the structure of its sorting derivation. We develop two notions of set-theoretic semantics for ΣHOL. General Σ-models are a direct generalization of Henkin's general models to the sorted setting. Since no known machine-oriented calculus can adequately mechanize full extensionality, we generalize general Σ-models further to Σ-model structures, which allow full extensionality to fail. We prove model existence theorems for both Σ-model structures and general Σ-models. These model-theoretic variants of Andrews' unifying principle for type theory can be used as a powerful tool in completeness proofs of higher-order calculi. Finally, we use our pre-unification algorithms as a central inference procedure for a sorted higher-order resolution calculus in the spirit of Huet's Constrained Resolution. This calculus is proven sound and complete with respect to our semantics. It differs from Huet's calculus by allowing early unification strategies and using variable dependencies. For the completeness proof we make use of our model existence theorem and prove a strong lifting lemma.
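
    To make the term declaration mechanism concrete, here is a minimal Haskell sketch; it is far simpler than the thesis's calculus, and every name in it is invented for illustration. It shows how a subsort declaration A ≤ B arises as the term declaration stating that a variable of sort A also has sort B, and why well-sortedness stops being a purely structural property once declarations can assign extra sorts to terms.

        module SortedTerms where

        data Sort = Base String | Arrow Sort Sort
          deriving (Eq, Show)

        data Term = Var String Sort | Const String Sort
                  | App Term Term | Lam String Sort Term
          deriving (Eq, Show)

        -- A term declaration asserts that a term (pattern) carries an
        -- additional sort beyond the ones read off its syntax.
        data TermDecl = TermDecl Term Sort

        -- Subsorting as a derived notion: "A <= B" is the declaration
        -- that a variable of sort A also has sort B.
        subsortDecl :: Sort -> Sort -> TermDecl
        subsortDecl a b = TermDecl (Var "x" a) b

        -- Sorts computed from the syntax tree alone.
        structuralSorts :: Term -> [Sort]
        structuralSorts (Var _ s)   = [s]
        structuralSorts (Const _ s) = [s]
        structuralSorts (Lam _ s b) = [Arrow s s' | s' <- structuralSorts b]
        structuralSorts (App f _)   = [s' | Arrow _ s' <- structuralSorts f]

        -- All sorts of a term: structural ones plus one round of term
        -- declarations (a real calculus closes this under derivation).
        -- Well-sortedness is thus no longer a structural property.
        sortsOf :: [TermDecl] -> Term -> [Sort]
        sortsOf decls t = structural ++ [s | TermDecl p s <- decls, matches p]
          where
            structural = structuralSorts t
            matches (Var _ s) = s `elem` structural  -- variable pattern: sort test
            matches p         = p == t               -- otherwise: exact match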

    Multi-level Contextual Type Theory

    Contextual type theory distinguishes between bound variables and meta-variables in order to write potentially incomplete terms in the presence of binders. It has found good use as a framework for concisely explaining higher-order unification, for characterizing holes in proofs, and for developing a foundation for programming with higher-order abstract syntax, as embodied by the programming and reasoning environment Beluga. However, to reason about these applications, we need to introduce meta^2-variables to characterize the dependency on meta-variables and bound variables. In other words, we must go beyond a two-level system offering only bound variables and meta-variables. In this paper we generalize contextual type theory to n levels for arbitrary n, so as to obtain a formal system offering bound variables, meta-variables, and so on all the way to meta^n-variables. We obtain a uniform account by collapsing all these different kinds of variables into a single notion of variable indexed by some level k. We give a decidable bi-directional type system which characterizes beta-eta-normal forms, together with a generalized substitution operation. (In Proceedings LFMTP 2011, arXiv:1110.668)
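
    The collapse of bound variables, meta-variables, meta^2-variables, and so on into one level-indexed notion can be sketched as follows. This is an illustrative Haskell toy, not Beluga's actual data structures, and it elides capture avoidance; its point is that substitution is written once for every level k.

        module MultiLevel where

        type Level = Int
        type Name  = String

        -- One variable constructor for every level: V 0 is an ordinary
        -- bound variable, V 1 a meta-variable, V 2 a meta^2-variable, ...
        data Term
          = V Level Name
          | Lam Level Name Term   -- binder for a level-k variable
          | App Term Term
          deriving (Eq, Show)

        -- Substitution for a level-k variable, written once for all
        -- levels; capture avoidance is elided, so the substituted term
        -- is assumed closed.
        subst :: Level -> Name -> Term -> Term -> Term
        subst k x s t = case t of
          V k' y     | k' == k && y == x -> s
                     | otherwise         -> t
          Lam k' y b | k' == k && y == x -> Lam k' y b   -- shadowed
                     | otherwise         -> Lam k' y (subst k x s b)
          App f a    -> App (subst k x s f) (subst k x s a)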

    Polymorphic Rewriting Conserves Algebraic Strong Normalization and Confluence

    We study combinations of many-sorted algebraic term rewriting systems and polymorphic lambda term rewriting. Algebraic and lambda terms are mixed by adding the symbols of the algebraic signature to the polymorphic lambda calculus as higher-order constants. We show that if a many-sorted algebraic rewrite system R is strongly normalizing (terminating, noetherian), then R + β + η + type-β + type-η rewriting of mixed terms is also strongly normalizing. We obtain this result using a technique which generalizes Girard's candidats de réductibilité, introduced in the original proof of strong normalization for the polymorphic lambda calculus. We also show that if a many-sorted algebraic rewrite system R has the Church-Rosser property (is confluent), then R + β + type-β + type-η rewriting of mixed terms has the Church-Rosser property too. Combining the two results, we conclude that if R is canonical (complete) on algebraic terms, then R + β + type-β + type-η is canonical on mixed terms. η-reduction does not commute with algebraic reduction in general. However, using long β-normal forms, we show that if R is canonical then R + β + η + type-β + type-η convertibility is still decidable.
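
    The shape of the mixed-term systems studied here can be sketched in a few lines of Haskell. This toy restricts algebraic rules to ground left-hand sides and is not the paper's formal system, but it shows algebraic constants living inside lambda terms, with a single step function performing either a β step or an algebraic R step.

        module Mixed where

        -- Lambda terms with algebraic constants from a signature mixed in.
        data Term = Var String | Con String | App Term Term | Lam String Term
          deriving (Eq, Show)

        -- An algebraic rule: ground left-hand side ~> right-hand side.
        type Rule = (Term, Term)

        -- Substitution; the substituted term is assumed closed, so
        -- capture avoidance is not needed in this toy.
        subst :: String -> Term -> Term -> Term
        subst x s (Var y) | x == y    = s
                          | otherwise = Var y
        subst _ _ (Con c) = Con c
        subst x s (App f a) = App (subst x s f) (subst x s a)
        subst x s (Lam y b) | x == y    = Lam y b
                            | otherwise = Lam y (subst x s b)

        -- One mixed rewrite step: beta at the root, an algebraic R step
        -- at the root, or a step inside a subterm (leftmost-outermost).
        step :: [Rule] -> Term -> Maybe Term
        step rs t = case t of
          App (Lam x b) a -> Just (subst x a b)   -- beta
          _ | Just r <- lookup t rs -> Just r     -- algebraic R step
          App f a -> case step rs f of
                       Just f' -> Just (App f' a)
                       Nothing -> App f <$> step rs a
          Lam x b -> Lam x <$> step rs b
          _       -> Nothing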

    A Bi-Directional Refinement Algorithm for the Calculus of (Co)Inductive Constructions

    The paper describes the refinement algorithm for the Calculus of (Co)Inductive Constructions (CIC) implemented in the interactive theorem prover Matita. The refinement algorithm is in charge of giving a meaning to the terms, types, and proof terms directly written by the user or generated by using tactics, decision procedures, or general automation. The terms are written in an "external syntax" meant to be user-friendly, which allows omission of information, untyped binders, and a fairly liberal use of user-defined sub-typing. The refiner modifies the terms to obtain related well-typed terms in the internal syntax understood by the kernel of the ITP. In particular, it acts as a type inference algorithm when all the binders are untyped. The proposed algorithm is bi-directional: given a term in external syntax and a type expected for the term, it propagates as much typing information as possible towards the leaves of the term. Traditional mono-directional algorithms, instead, proceed in a bottom-up way, inferring the type of a sub-term and comparing (unifying) it with the type expected by its context only at the end. We propose some novel bi-directional rules for CIC that are particularly effective. Among the benefits of bi-directionality are better error message reporting and better inference of dependent types. Moreover, thanks to bi-directionality, the coercion system for sub-typing is more effective, and type inference generates simpler unification problems that are more likely to be solved by the inherently incomplete higher-order unification algorithms implemented. Finally, we introduce in the external syntax the notion of a vector of placeholders, which makes it possible to omit an arbitrary number of arguments at once. Vectors of placeholders allow a trivial implementation of implicit arguments and greatly simplify the implementation of primitive and simple tactics.
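
    A minimal bi-directional checker for the simply typed fragment illustrates the payoff the paper builds on; this Haskell sketch is far from full CIC, and all names in it are illustrative. In checking mode the expected arrow type is propagated toward the leaves, so an untyped binder needs no annotation, whereas a purely bottom-up algorithm would have to reject it.

        module Bidir where

        data Ty = Base String | Arr Ty Ty deriving (Eq, Show)
        data Tm = Var String | App Tm Tm | Lam String Tm | Ann Tm Ty
          deriving Show

        type Ctx = [(String, Ty)]

        -- Synthesis: the type comes out of the term.
        infer :: Ctx -> Tm -> Maybe Ty
        infer g (Var x)   = lookup x g
        infer g (Ann t a) = check g t a >> Just a
        infer g (App f a) = do
          Arr dom cod <- infer g f   -- synthesize the function type...
          check g a dom              -- ...and push its domain into the argument
          Just cod
        infer _ (Lam _ _) = Nothing  -- an unannotated binder cannot synthesize

        -- Checking: the expected type flows toward the leaves.
        check :: Ctx -> Tm -> Ty -> Maybe ()
        check g (Lam x b) (Arr dom cod) = check ((x, dom) : g) b cod
        check g t a = do
          a' <- infer g t
          if a == a' then Just () else Nothing

    For instance, check [] (Lam "x" (Var "x")) (Arr (Base "nat") (Base "nat")) succeeds even though the binder carries no annotation, while infer [] (Lam "x" (Var "x")) must fail.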

    Type Directed Specification Refinement

    Specification languages serve a fundamentally different purpose than general-purpose programming languages, and their type systems reflect these needs. Specification type systems must record and track more information for us to reason about a system adequately, and this added expressiveness may lead to an undecidable typing analysis. System-level design begins with a high-level specification that is continually refined and expanded with implementation details, constraints, and typing information, down to a concrete specification. During this refinement process the system is underspecified, and many static analyses are not applicable until the system is fully specified. However, partial specifications contain valuable information that can inform refinement: we can locally inspect parts of the specification from a typing perspective, looking early for inferable information or inconsistencies. This work defines a typing analysis that gathers constraints and typing information to inform the specification refinement process. It explores localized techniques, such as local type inference and the tracking of values, as a means of influencing that process.
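
    One way to picture such a localized analysis is the following Haskell toy, with hypothetical names throughout and no connection to the paper's actual specification language: it inspects the typed ports of a partially specified design, reports a conflict only when two connected ports have different known types, and treats a connection with one unknown side as inferable information rather than an error.

        module SpecCheck where

        import Data.Maybe (mapMaybe)

        data Ty = Bit | BitVec Int | Unknown deriving (Eq, Show)

        type Port = (String, Ty)       -- port name, possibly Unknown type
        type Conn = (String, String)   -- a connection between two ports

        -- A connection between two known but different types is an
        -- inconsistency; an Unknown side is inferable information,
        -- not an error.
        checkConn :: [Port] -> Conn -> Maybe String
        checkConn ports (a, b) =
          case (lookup a ports, lookup b ports) of
            (Just ta, Just tb)
              | ta /= Unknown && tb /= Unknown && ta /= tb ->
                  Just (a ++ " : " ++ show ta ++ " conflicts with "
                          ++ b ++ " : " ++ show tb)
            _ -> Nothing

        -- All conflicts detectable in the partial specification.
        earlyErrors :: [Port] -> [Conn] -> [String]
        earlyErrors ports = mapMaybe (checkConn ports)

    For example, earlyErrors [("a", BitVec 8), ("b", Bit), ("c", Unknown)] [("a", "b"), ("a", "c")] reports only the a/b conflict, leaving the a/c connection as information for later refinement.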

    Categorical programming language


    Progress Report: 1991-1994
