176 research outputs found

    Compositional Explanation of Types and Algorithmic Debugging of Type Errors

    The type systems of most typed functional programming languages are based on the Hindley-Milner type system. A practical problem with these type systems is that it is often hard to understand why a program is not type correct or why a function does not have the intended type. We suggest that at the core of this problem lies the difficulty of explaining why a given expression has a certain type: the type system is not defined compositionally. We propose to explain types using a variant of the Hindley-Milner type system that defines a compositional type explanation graph of principal typings. We describe how the programmer understands types by interactively navigating the explanation graph. Furthermore, the explanation graph can serve as the foundation for algorithmic debugging of type errors, that is, semi-automatic localisation of the source of a type error without the programmer having to understand the type inference steps. We have implemented a prototype tool to explore the usefulness of the proposed methods.
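
    To make the compositional idea concrete, here is a small Haskell sketch of our own (not the paper's notation or tool): each subexpression is annotated with its principal typing, which is the per-node information an explanation graph would record, and a conflict between two such typings is what algorithmic debugging would localise.

        -- Hypothetical illustration: principal typings of the parts of a
        -- small expression, combined compositionally.
        example :: [Int] -> Int
        example = foldr (+) 0
        -- Per-subexpression principal typings (foldr shown list-specialised):
        --   foldr       :: (a -> b -> b) -> b -> [a] -> b
        --   (+)         :: Num c => c -> c -> c
        --   0           :: Num d => d
        --   foldr (+)   :: Num c => c -> [c] -> c    -- from unifying a ~ c, b ~ c
        --   foldr (+) 0 :: Num c => [c] -> c         -- from unifying d ~ c
        -- If the literal 0 were replaced by a String, the clash between the
        -- typing of foldr (+) and that of the literal is the node an
        -- algorithmic debugger would ask about.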

    Definitions by Rewriting in the Calculus of Constructions

    The main novelty of this paper is to consider an extension of the Calculus of Constructions where predicates can be defined with a general form of rewrite rules. We prove the strong normalization of the reduction relation generated by the beta-rule and the user-defined rules under some general syntactic conditions, including confluence. As examples, we show that two important systems satisfy these conditions: a sub-system of the Calculus of Inductive Constructions, which is the basis of the proof assistant Coq, and Natural Deduction Modulo a large class of equational theories. Comment: Best student paper (Kleene Award).
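
    The calculus itself goes well beyond plain functional code, but the flavour of definitions by rewriting can be hinted at with a rough Haskell analogue (our illustration, not the paper's system): each equation below plays the role of one user-defined rewrite rule over an inductive type, and the paper's syntactic conditions (including confluence) are what guarantee that such rules, added on top of beta-reduction, preserve strong normalization.

        data Nat = Zero | Succ Nat

        -- Addition defined by rewrite rules; each equation is one rule.
        plus :: Nat -> Nat -> Nat
        plus Zero     y = y                 -- plus 0 y     --> y
        plus (Succ x) y = Succ (plus x y)   -- plus (S x) y --> S (plus x y)

        -- A Boolean-valued "predicate" defined by rewriting.
        even' :: Nat -> Bool
        even' Zero            = True
        even' (Succ Zero)     = False
        even' (Succ (Succ n)) = even' n     -- recursion on a structurally smaller term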

    Light types for polynomial time computation in lambda-calculus

    We propose a new type system for the lambda-calculus ensuring that well-typed programs can be executed in polynomial time: Dual Light Affine Logic (DLAL). DLAL has a simple type language with a linear and an intuitionistic type arrow, and one modality. It corresponds to a fragment of Light Affine Logic (LAL). We show that, contrary to LAL, DLAL ensures good properties on lambda-terms: subject reduction is satisfied, and a well-typed term admits a polynomial bound on reduction by any strategy. We establish that, like LAL, DLAL can represent all polytime functions. Finally, we give a type inference procedure for propositional DLAL. Comment: 20 pages (including 10 pages of appendix); revised version, in particular section 5 has been modified. A short version is to appear in the proceedings of the conference LICS 2004 (IEEE Computer Society Press).
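
    The complexity phenomenon at stake can be pictured in ordinary Haskell (our illustration of why duplication must be controlled, not DLAL syntax and not a claim about which terms DLAL accepts): freely iterating a function that duplicates its argument produces exponential growth, and it is precisely this interplay of duplication and iteration that the linear arrow and the modality keep in check.

        -- A function that duplicates its argument ...
        dbl :: [()] -> [()]
        dbl xs = xs ++ xs

        -- ... iterated n times: the result has length 2^n.
        expo :: Int -> [()]
        expo n = iterate dbl [()] !! n

        -- By contrast, a consumer that touches each element once
        -- (the kind of use a linear arrow permits) stays polynomial.
        tally :: [()] -> Int
        tally = length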

    Putting type annotations to work


    Type-Inference Based Short Cut Deforestation (nearly) without Inlining

    Deforestation optimises a functional program by transforming it into another one that does not create certain intermediate data structures. In [ICFP'99] we presented a type-inference based deforestation algorithm which performs extensive inlining. However, across module boundaries only limited inlining is practically feasible. Furthermore, inlining is a non-trivial transformation which is therefore best implemented as a separate optimisation pass. To perform short cut deforestation (nearly) without inlining, Gill suggested splitting definitions into workers and wrappers and inlining only the small wrappers, which transfer the information needed for deforestation. We show that Gill's use of the function build limits deforestation and note that his reasons for using build do not apply to our approach. Hence we develop a more general worker/wrapper scheme without build. We give a type-inference based algorithm which splits definitions into workers and wrappers. Finally, we show that we can deforest more expressions with the worker/wrapper scheme than with the algorithm that relies on inlining.
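
    The worker/wrapper idea can be sketched in a few lines of Haskell (our reconstruction of the general scheme, not the paper's type-inference based algorithm): the worker produces its result through abstract list constructors, the tiny wrapper reinstantiates them and is the only definition that needs to be inlined across module boundaries, and a foldr-style consumer can then fuse with the worker so the intermediate list is never built.

        -- Worker: produces its "list" through abstract cons/nil parameters.
        upToWorker :: Int -> (Int -> b -> b) -> b -> b
        upToWorker n cons nil = go 1
          where
            go i | i > n     = nil
                 | otherwise = cons i (go (i + 1))

        -- Wrapper: tiny, inlinable definition recovering the ordinary function.
        upTo :: Int -> [Int]
        upTo n = upToWorker n (:) []

        -- A consumer expressed with foldr.
        sumList :: [Int] -> Int
        sumList = foldr (+) 0

        -- After inlining the wrapper, sumList (upTo n) can be rewritten to
        -- upToWorker n (+) 0, which never allocates the intermediate list.
        fusedSumUpTo :: Int -> Int
        fusedSumUpTo n = upToWorker n (+) 0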

    Practical Subtyping for System F with Sized (Co-)Induction

    We present a rich type system with subtyping for an extension of System F. Our type constructors include sum and product types, universal and existential quantifiers, and inductive and coinductive types. The latter two carry size annotations allowing the preservation of size invariants. For example, it is possible to derive the termination of quicksort by showing that partitioning a list does not increase its size. The system deals with complex programs involving mixed induction and coinduction, or even mixed (co-)induction and polymorphism (as for Scott-encoded datatypes). One of the key ideas is to completely separate the induction on sizes from the notion of recursive program: we use the size change principle to check that the proof is well-founded, not that the program terminates, and termination is then obtained from a strong normalization proof. Another key idea is the use of symbolic witnesses to handle quantifiers of all sorts. To demonstrate the practicality of our system, we provide an implementation that accepts all the examples discussed in the paper and much more.
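
    The quicksort example reads naturally in Haskell (a plain rendering for illustration; the size annotations themselves are not expressible here): the recursive calls are made on the two lists returned by partition, and the invariant that neither is longer than the input, which a size-annotated type for partition records, is exactly what justifies termination.

        import Data.List (partition)

        quicksort :: Ord a => [a] -> [a]
        quicksort []       = []
        quicksort (p : xs) =
          let (smaller, larger) = partition (< p) xs   -- |smaller|, |larger| <= |xs|
          in  quicksort smaller ++ [p] ++ quicksort larger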

    Investigations in intersection types: confluence, and semantics of expansion in the λ-calculus, and a type error slicing method

    Type systems were invented in the early 1900s to provide foundations for mathematics, where types were used to avoid paradoxes. Type systems have since been developed and extended throughout the years to serve different purposes such as efficiency or expressiveness. The λ-calculus is used in programming languages, logic, mathematics, and linguistics. Intersection types are a kind of types used for building semantic models of the λ-calculus and for static analysis of computer programs. The confluence property was used to prove the λ-calculus’ consistency and the uniqueness of normal forms. Confluence is useful to show that logics are sensibly designed, and to make equality decision procedures for use in theorem provers. Some proofs of the λ-calculus’ confluence are based on syntactic concepts (reduction relations and λ-term sets) and some on semantic concepts (type interpretations). Part I of this thesis presents an original syntactic proof that is a simplification of a semantic proof based on a sound type interpretation w.r.t. an intersection type system. Our proof can be seen as bridging some semantic and syntactic proofs.

    Expansion is an operation on typings (pairs of type environments and result types) in type systems for the λ-calculus. It was introduced to prove that the principal typing property (i.e., that every typable term has a strongest typing) holds in intersection type systems. Expansion variables were introduced to simplify the expansion mechanism. Part II of this thesis presents a complete realisability semantics w.r.t. an intersection type system with infinitely many expansion variables. This represents the first study on the semantics of expansion. Providing sound (and complete) realisability semantics allows one to study the algorithmic behaviour of typed λ-terms through their types w.r.t. a type system. We believe such semantics will cast some light on the not yet well understood expansion operation.

    Intersection types were also used in a type error slicer for the SML programming language. Existing compilers for many languages have confusing type error messages. Type error slicing (TES) helps the programmer by isolating the part of a program contributing to a type error (a slice). TES was initially done for a tiny toy language (the λ-calculus with polymorphic let-expressions). Extending TES to a full language is extremely challenging, and for SML we needed a number of innovations. Some issues would be faced for any language, and some are SML-specific but representative of the complexity of language-specific issues likely to be faced for other languages. Part III of this thesis solves both kinds of issues and presents an original, simple, and general constraint system for providing type error slices for ill-typed programs. We believe TES helps demystify language features known to confuse users.
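
    The slicing idea can be pictured with a deliberately ill-typed snippet (our Haskell rendering of the concept; the thesis's slicer targets SML and its own constraint system): the error is caused jointly by several program points, and a slice keeps exactly those points while eliding everything irrelevant.

        -- Deliberately ill-typed, to show what a slice would retain.
        greet :: String -> String
        greet s = "hello, " ++ s

        main :: IO ()
        main = do
          let name = True         -- slice point: name is bound to a Bool here ...
          putStrLn (greet name)   -- ... and used here, where greet needs a String
        -- A slice keeps the binding, the use, and greet's String -> String
        -- signature; the body of greet and the rest of the program are elided.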