
    Principal Typings in a Restricted Intersection Type System for Beta Normal Forms with De Bruijn Indices

    The lambda-calculus with de Bruijn indices represents each alpha-equivalence class of lambda-terms by a unique term, using indices instead of variable names. Intersection types provide finitary type polymorphism and characterise the normalisable lambda-terms through the property that a term is normalisable if and only if it is typeable. To stay closer to computations and to simplify the formalisation of the atomic operations involved in beta-contraction, several calculi of explicit substitutions were developed, mostly with de Bruijn indices. Explicit substitution calculi without types or with simple type systems are well investigated, in contrast to versions with more elaborate type systems such as intersection types. In previous work, we introduced a de Bruijn version of the lambda-calculus with an intersection type system and proved that it satisfies subject reduction, a basic property of type systems. In this paper we present a de Bruijn version of an intersection type system originally introduced to characterise principal typings of beta-normal forms. We present this characterisation in the new system, together with the corresponding versions of the type inference algorithm and of the algorithm that reconstructs normal forms from principal typings. We briefly discuss the failure of the subject reduction property and some possible solutions for it.
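
    As a concrete illustration of the indexing discipline (a minimal Haskell sketch of our own, not the paper's formal system), de Bruijn terms replace variable names by the number of binders between an occurrence and its binder, so all members of an alpha-class share one representation:

        data Term
          = Var Int        -- de Bruijn index: 0 refers to the nearest enclosing binder
          | Lam Term       -- binders carry no variable names
          | App Term Term
          deriving (Eq, Show)

        -- \x. x and \y. y are literally the same term:
        identity :: Term
        identity = Lam (Var 0)

        -- \x. \y. x: the body reaches past one binder, hence index 1
        constK :: Term
        constK = Lam (Lam (Var 1))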

    SCC: A Service Centered Calculus

    We seek a small set of primitives that might serve as a basis for formalising and programming service-oriented applications over global computers. As an outcome of this study we introduce SCC, a process calculus that features explicit notions of service definition, service invocation, and session handling. Our proposal has been influenced by Orc, a programming model for structured orchestration of services, but SCC's session-handling mechanism allows for the definition of structured interaction protocols that are more complex than the basic request-response provided by Orc. We present the syntax and operational semantics of SCC together with a number of simple but nontrivial programming examples that demonstrate the flexibility of the chosen set of primitives. A few encodings are also provided to relate our proposal to existing ones.
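
    Purely as an illustration of the three notions named above (the constructors below are our own invention; SCC's actual grammar and semantics are defined in the paper), the calculus's primitives could be modelled as an abstract syntax along these lines:

        type Name = String

        -- Hypothetical process syntax: service definition, service
        -- invocation, and session handling, plus parallel composition.
        data Proc
          = Nil                 -- inert process
          | Define Name Proc    -- declare a service with the given body
          | Invoke Name Proc    -- invoke a service, opening a fresh session
          | Return Name Proc    -- return a value inside the current session
          | Par Proc Proc       -- parallel composition
          deriving Show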

    Efficient Data Structures for Automated Theorem Proving in Expressive Higher-Order Logics

    Church's Simple Theory of Types (STT), also referred to as classical higher-order logic, is an elegant and expressive formal system built on top of the simply typed λ-calculus. Its mechanisms of explicit binding and quantification over arbitrary sets and functions allow the representation of complex mathematical concepts and formulae in a concise and unambiguous manner. Higher-order automated theorem proving (ATP) has recently made major progress, and several sophisticated ATP systems for higher-order logic have been developed, including Satallax, Isabelle/HOL, and LEO-II. Still, higher-order theorem proving is not as mature as its first-order counterpart, and robust implementation techniques for efficient data structures are scarce. In this thesis, a higher-order term representation based upon the polymorphically typed λ-calculus is presented. This term representation employs spine notation, explicit substitutions, and perfect term sharing, for efficient term traversal, fast β-normalization, and reuse of already constructed terms, respectively. An evaluation of the term representation is performed on the basis of a heterogeneous benchmark set. It shows that while the presented term data structure performs quite well in general, the normalization results indicate that a context-dependent choice of reduction strategies is beneficial. A term indexing data structure for fast term retrieval based on various low-level criteria is presented and discussed. It supports symbol-based term retrieval, indexing of terms via structural properties, and subterm indexing.
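
    As a hedged sketch of the spine idea (the thesis's actual data structures also cover explicit substitutions, term sharing, and lambdas in head position, and will differ in detail), an application chain is stored as a head together with the list of all its arguments, so the head of a term is reachable in constant time:

        data Head
          = Const String      -- constant symbol
          | Bound Int         -- de Bruijn index of a bound variable
          deriving (Eq, Show)

        data Term
          = Lam Term          -- lambda abstraction
          | App Head [Term]   -- head applied to its entire spine of arguments
          deriving (Eq, Show)

        -- f a b c is a single node carrying the full spine, rather than
        -- the nested binary applications (((f a) b) c):
        example :: Term
        example = App (Const "f")
                      [App (Const "a") [], App (Const "b") [], App (Const "c") []]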

    A feasible algorithm for typing in Elementary Affine Logic

    We give a new type inference algorithm for typing lambda-terms in Elementary Affine Logic (EAL), which is motivated by applications to complexity and optimal reduction. Following previous work on this topic, the variant of the EAL type system we consider (denoted EAL*) is a variant without sharing and without polymorphism. Our algorithm improves over the previously known ones in that it offers a better complexity bound: given a simple type derivation for a term t, our algorithm performs EAL* type inference in polynomial time.
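
    For orientation, EAL types are usually presented as simple types enriched with a modality "!" that controls duplication, and type inference amounts to decorating a given simple type derivation with "!"-boxes; the grammar below is a standard rendering of that idea in Haskell, not necessarily the paper's exact EAL* definition:

        -- Sketch of an EAL-style type grammar (hedged; EAL* may differ):
        data EALType
          = TVar String               -- base type variable
          | Arrow EALType EALType     -- linear implication A -o B
          | Bang EALType              -- !A: A may be duplicated, inside a box
          deriving (Eq, Show)

        -- \x. x needs no boxes and can be typed as a -o a:
        idTy :: EALType
        idTy = Arrow (TVar "a") (TVar "a")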

    A Type Checker for a Logical Framework with Union and Intersection Types

    We present the syntax, semantics, and typing rules of Bull, a prototype theorem prover based on the Delta-Framework, i.e. a fully-typed lambda-calculus decorated with union and intersection types, as described in previous papers by the authors. Bull also implements a subtyping algorithm for the Type Theory Xi of Barbanera-Dezani-de'Liguoro. Bull has a command-line interface where the user can declare axioms and terms and perform computations, with some basic terminal-style features such as error pretty-printing, subexpression highlighting, and file loading. Moreover, it can typecheck a proof or normalize it. Terms can be incomplete, so the typechecking algorithm uses unification to try to construct the missing subterms. Bull uses the syntax of Berardi's Pure Type Systems to improve the compactness and modularity of the kernel. Abstract and concrete syntax are mostly aligned and similar to the concrete syntax of Coq. Bull uses a higher-order unification algorithm for terms, while typechecking and partial type inference are done by a bidirectional refinement algorithm similar to the one found in Matita and Beluga. The refinement can be split into two parts: the essence refinement and the typing refinement. Binders are implemented using de Bruijn indices. We have defined a concrete language syntax that allows the user to write Delta-terms, defined the reduction rules and an evaluator, and implemented from scratch a refiner that performs partial typechecking and type reconstruction. We have tested Bull on classical examples from the intersection and union type literature, such as the ones formalized by Pfenning with his Refinement Types in LF. We hope that this line of research will be useful for experimenting, in a proof-theoretical setting, with forms of polymorphism alternative to Girard's parametric one.
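
    To illustrate one ingredient of such a system (a skeleton of our own in Haskell, not Bull's code), the checking direction of a bidirectional algorithm treats intersection and union types structurally: an intersection is checked by checking both components, a union by checking one of them, and everything else is delegated to the synthesis direction:

        data Ty
          = TConst String
          | Arr Ty Ty     -- A -> B
          | Inter Ty Ty   -- A /\ B: the term must have both types
          | Union Ty Ty   -- A \/ B: the term must have one of the two types
          deriving (Eq, Show)

        data Term = Var String | Lam String Term | App Term Term
          deriving (Eq, Show)

        -- Synthesis direction, deliberately elided in this skeleton:
        infer :: Term -> Maybe Ty
        infer _ = Nothing

        -- Checking direction of a bidirectional typechecker (sketch):
        check :: Term -> Ty -> Bool
        check tm (Inter a b) = check tm a && check tm b
        check tm (Union a b) = check tm a || check tm b
        check tm ty          = infer tm == Just ty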

    Elaboration in Dependent Type Theory

    To be usable in practice, interactive theorem provers need to provide convenient and efficient means of writing expressions, definitions, and proofs. This involves inferring information that is often left implicit in an ordinary mathematical text and resolving ambiguities in mathematical expressions. We refer to the process of passing from a quasi-formal, partially specified expression to a completely precise formal one as elaboration. We describe an elaboration algorithm for dependent type theory that has been implemented in the Lean theorem prover. Lean's elaborator supports higher-order unification, type class inference, ad hoc overloading, insertion of coercions, the use of tactics, and the computational reduction of terms. The interactions between these components are subtle and complex, and the elaboration algorithm has been carefully designed to balance efficiency and usability. We describe the central design goals and the means by which they are achieved.
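
    As a deliberately simplified picture of the first elaboration step (a Haskell sketch of our own, unrelated to Lean's actual implementation), a partially specified expression contains holes, each hole is replaced by a fresh metavariable, and the metavariables are later solved by unification against the expected types:

        -- Surface expressions may contain holes ("_"); elaborated core
        -- terms contain metavariables instead, to be solved by unification.
        data Surface = SVar String | SApp Surface Surface | SHole
        data Core    = CVar String | CApp Core Core | CMeta Int
          deriving Show

        -- Replace every hole by a fresh metavariable, threading a counter.
        elaborate :: Int -> Surface -> (Core, Int)
        elaborate n (SVar x)   = (CVar x, n)
        elaborate n (SApp f a) =
          let (f', n')  = elaborate n  f
              (a', n'') = elaborate n' a
          in (CApp f' a', n'')
        elaborate n SHole      = (CMeta n, n + 1)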

    Functions-as-Constructors Higher-Order Unification

    Unification is a central operation in the construction of a range of computational logic systems based on first-order and higher-order logics. First-order unification has a number of properties that dominate the way it is incorporated within such systems. In particular, first-order unification is decidable, unary, and can be performed on untyped term structures. None of these three properties holds for full higher-order unification: unification is undecidable, unifiers can be incomparable, and term-level typing can dominate the search for unifiers. The so-called pattern subset of higher-order unification was designed as a small extension to first-order unification that respects the basic laws governing lambda-binding (the equalities of alpha-, beta-, and eta-conversion) while still satisfying those three properties. While the pattern fragment of higher-order unification has been popular in various implemented systems and in various theoretical considerations, it is too weak for a number of applications. In this paper, we define an extension of pattern unification that is motivated by some existing applications and that satisfies these three properties. The main idea behind this extension is that the arguments of a higher-order free variable can be more than just distinct bound variables: they can also be terms constructed from (sufficient numbers of) such variables using term constructors, where no argument is a subterm of any other argument. We show that this extension of pattern unification satisfies the three properties mentioned above.
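
    The two argument disciplines can be phrased concretely; the Haskell sketch below (our own encoding, hedged) checks the argument list of a free variable under the classic pattern condition and under the functions-as-constructors relaxation described above:

        import Data.List (nub)

        -- Bound variables, metavariables applied to arguments, constructors:
        data Tm = B Int | M String [Tm] | C String [Tm]
          deriving (Eq, Show)

        -- Classic (Miller) pattern condition: the arguments of a free
        -- variable are distinct bound variables.
        isMillerPattern :: [Tm] -> Bool
        isMillerPattern args = all isBound args && args == nub args
          where
            isBound (B _) = True
            isBound _     = False

        -- FC relaxation (sketch): arguments are constructor terms built
        -- from bound variables, pairwise distinct, and no argument occurs
        -- as a subterm of another argument.
        isFCPattern :: [Tm] -> Bool
        isFCPattern args =
          all overBound args
            && args == nub args
            && and [ not (occursIn a b) | a <- args, b <- args, a /= b ]
          where
            overBound (B _)    = True
            overBound (C _ ts) = all overBound ts
            overBound (M _ _)  = False
            occursIn a b = a == b || case b of
              C _ ts -> any (occursIn a) ts
              M _ ts -> any (occursIn a) ts
              B _    -> False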