
    A type-directed, dictionary-passing translation of method overloading and structural subtyping in Featherweight Generic Go

    Featherweight Generic Go (FGG) is a minimal core calculus modeling the essential features of the programming language Go. It includes support for overloaded methods, interface types, structural subtyping, and generics. The most straightforward semantic description of the dynamic behavior of FGG programs is to resolve method calls based on runtime type information of the receiver. This article takes a different approach by defining a type-directed translation from FGG− to an untyped lambda calculus. FGG− includes all features of FGG except type assertions. The translation of an FGG− program provides evidence for the availability of methods as additional dictionary parameters, similar to the dictionary-passing approach known from Haskell type classes. Method calls can then be resolved by a simple lookup of the method definition in the dictionary. Every program in the image of the translation has the same dynamic semantics as its source FGG− program. The proof of this result is based on a syntactic, step-indexed logical relation. The step index ensures a well-founded definition of the relation in the presence of recursive interface types and recursive methods. Although non-deterministic, the translation is coherent.
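    The paper's target is an untyped lambda calculus, but the dictionary idea it borrows from Haskell type classes can be sketched directly in Haskell. The sketch below is illustrative only (Shape, ShapeDict, and the other names are invented for this example, not taken from the paper): an interface with an overloaded method becomes a record of functions, each implementing type supplies a concrete dictionary value, and a method call becomes a field lookup in a dictionary passed as an extra argument.

    -- Hand-written illustration of dictionary passing; not code from the paper.
    -- A "structural interface" with one overloaded method, written as a type class.
    class Shape a where
      area :: a -> Double

    data Circle = Circle Double
    data Square = Square Double

    instance Shape Circle where
      area (Circle r) = pi * r * r

    instance Shape Square where
      area (Square s) = s * s

    -- The same program after a dictionary-passing translation: the class becomes
    -- an ordinary record of methods, and each instance becomes a dictionary value.
    newtype ShapeDict a = ShapeDict { areaD :: a -> Double }

    circleDict :: ShapeDict Circle
    circleDict = ShapeDict (\(Circle r) -> pi * r * r)

    squareDict :: ShapeDict Square
    squareDict = ShapeDict (\(Square s) -> s * s)

    -- A generic caller receives the dictionary as an additional parameter;
    -- the "method call" is just a field lookup in that dictionary.
    describe :: ShapeDict a -> a -> String
    describe dict x = "area = " ++ show (areaD dict x)

    main :: IO ()
    main = do
      putStrLn (describe circleDict (Circle 1.0))
      putStrLn (describe squareDict (Square 2.0))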

    A dependently typed programming language with dynamic equality

    Dependent types offer a uniform foundation for both proof systems and programming languages. While proof systems built with dependent types have become relatively popular, dependently typed programming languages are far from mainstream. One key issue with existing dependently typed languages is the overly conservative definitional equality that programmers are forced to use. When combined with a traditional typing workflow, these systems can be quite challenging and require a large amount of expertise to master. This thesis explores an alternative workflow and a more liberal handling of equality. Programmers are given warnings that contain the same information as the type errors an existing system would report. Programmers can run these programs optimistically, and they will behave appropriately unless a direct contradiction confirming the warning is found. This is achieved by localizing equality constraints using a new form of elaboration based on bidirectional type inference. These local checks, or casts, are given a runtime behavior (similar to that of contracts and monitors). The elaborated terms have a weakened form of type soundness: they will not get stuck without an explicit counterexample. The language explored in this thesis is a Calculus of Constructions-like language with recursion, type-in-type, data types with dependent indexing, and pattern matching. Several meta-theoretic results are presented. The key result is that the core language, called the cast system, "will not get stuck without a counterexample"; a result called cast soundness. A proof of cast soundness is fully worked out for the fragment of the system without user-defined data, and a Coq proof is available. Several other properties based on the gradual guarantees of gradual typing are also presented. In the presence of user-defined data and pattern matching these properties are conjectured to hold. A prototype implementation of this work is available.
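    A hypothetical sketch of the runtime behaviour such localized casts could have (a toy model in Haskell, not the elaboration or cast calculus defined in the thesis): an equality the checker could not confirm statically is recorded together with its source location, and at runtime it either passes silently or fails with the concrete counterexample instead of the program getting stuck.

    -- Toy model only: a deferred equality check with contract-style blame.
    data EqCast a = EqCast
      { expectedVal :: a      -- what the type annotation demands
      , actualVal   :: a      -- what the program actually computed
      , location    :: String -- where the elaborator issued the warning
      }

    runCast :: (Eq a, Show a) => EqCast a -> Either String a
    runCast (EqCast e a loc)
      | e == a    = Right a
      | otherwise = Left ("cast at " ++ loc ++ " failed: expected " ++ show e
                          ++ " but found " ++ show a)

    main :: IO ()
    main = do
      print (runCast (EqCast (4 :: Int) (2 + 2) "vector length"))  -- passes: Right 4
      print (runCast (EqCast (4 :: Int) (2 + 3) "vector length"))  -- fails with counterexample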

    A Two-Level Linear Dependent Type Theory

    We present a type theory combining both linearity and dependency by stratifying typing rules into a level for logics and a level for programs. The distinction between logics and programs decouples their semantics, allowing the type system to assume tight resource bounds. A natural notion of irrelevancy is established where all proofs and types occurring inside programs are fully erasable without compromising their operational behavior. Through a heap-based operational semantics, we show that extracted programs always make computational progress and run memory clean. Additionally, programs can be freely reflected into the logical level for conducting deep proofs in the style of standard dependent type theories. This enables one to write resource-safe programs and verify their correctness using a unified language.

    Martin-Löf à la Coq

    We present an extensive mechanization of the meta-theory of Martin-Löf Type Theory (MLTT) in the Coq proof assistant. Our development builds on pre-existing work in Agda to show not only the decidability of conversion, but also the decidability of type checking, using an approach guided by bidirectional type checking. From our proof of decidability, we obtain a certified and executable type checker for a full-fledged version of MLTT with support for Π, Σ, ℕ, and identity types, and one universe. Furthermore, our development does not rely on impredicativity, induction-recursion, or any axiom beyond MLTT with a schema for indexed inductive types and a handful of predicative universes, narrowing the gap between the object theory and the meta-theory to a mere difference in universes. Finally, we explain our formalization choices, geared towards a modular development relying on Coq's features, e.g. meta-programming facilities provided by tactics and universe polymorphism.
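    Bidirectional type checking, the discipline guiding the certified checker, splits typing into a checking mode and an inference mode so that annotations are only needed where the two meet. The fragment below is a deliberately tiny Haskell illustration of that discipline over simple types with natural numbers; it is not extracted from the Coq development and covers none of MLTT's dependent features.

    -- Minimal bidirectional checker for a simply typed fragment (illustration only).
    data Ty = TNat | TArr Ty Ty deriving (Eq, Show)

    data Tm
      = Var Int     -- de Bruijn index
      | Lam Tm      -- \x. body, only checkable against an arrow type
      | App Tm Tm
      | Zero
      | Ann Tm Ty   -- annotation: switches from checking to inference
      deriving Show

    type Ctx = [Ty]

    -- Inference mode: the type is synthesised from the term.
    infer :: Ctx -> Tm -> Maybe Ty
    infer ctx (Var i)    = if i >= 0 && i < length ctx then Just (ctx !! i) else Nothing
    infer ctx (App f a)  = do
      TArr dom cod <- infer ctx f
      check ctx a dom
      pure cod
    infer _   Zero       = Just TNat
    infer ctx (Ann t ty) = do check ctx t ty; pure ty
    infer _   (Lam _)    = Nothing  -- lambdas carry no annotation, so they only check

    -- Checking mode: the type is pushed into the term.
    check :: Ctx -> Tm -> Ty -> Maybe ()
    check ctx (Lam b) (TArr dom cod) = check (dom : ctx) b cod
    check ctx t ty = do
      ty' <- infer ctx t
      if ty == ty' then pure () else Nothing

    -- Example: an annotated identity function applied to Zero infers TNat.
    main :: IO ()
    main = print (infer [] (App (Ann (Lam (Var 0)) (TArr TNat TNat)) Zero))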

    Focusing on Refinement Typing

    We present a logically principled foundation for systematizing, in a way that works with any computational effect and evaluation order, the SMT constraint generation seen in refinement type systems for functional programming languages. By carefully combining a focalized variant of call-by-push-value, bidirectional typing, and our novel technique of value-determined indexes, our system generates solvable SMT constraints without existential (unification) variables. We design a polarized subtyping relation allowing us to prove our logically focused typing algorithm is sound, complete, and decidable. We prove type soundness of our declarative system with respect to an elementary domain-theoretic denotational semantics. Type soundness implies, relatively simply, the total correctness and logical consistency of our system. The relative ease with which we obtain both algorithmic and semantic results ultimately stems from the proof-theoretic technique of focalization.
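    As a hedged illustration of the kind of constraint meant here (a toy example, not one from the paper): checking the body x + 1 against the refinement type x:{v:int | v >= 0} -> {v:int | v >= 1} should yield a closed verification condition of roughly the form

        \[ \forall x.\ x \ge 0 \Rightarrow x + 1 \ge 1 \]

    which contains no existential (unification) variables and can therefore be handed to an SMT solver as-is.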

    Efficient Implementation of Parametric Polymorphism using Reified Types

    Parametric polymorphism is a language feature that lets programmers define code that behaves independently of the types of values it operates on. Using parametric polymorphism enables code reuse and improves the maintainability of software projects. The approach that a language implementation uses to support parametric polymorphism can have important performance implications. One such approach, erasure, converts generic code to non-generic code that uses a uniform representation for generic data. Erasure is notorious for introducing primitive boxing and other indirections that harm the performance of generic code. More generally, erasure destroys type information that could be used by the language implementation to optimize generic code. This thesis presents TASTyTruffle, a new interpreter for the Scala language. Whereas the standard Scala implementation executes erased Java Virtual Machine (JVM) bytecode, TASTyTruffle interprets TASTy, a different representation that has precise type information. This thesis explores how the type information present in TASTy empowers TASTyTruffle to implement generic code more effectively. In particular, TASTy's type information allows TASTyTruffle to reify types as objects that can be passed around the interpreter. These reified types are used to support heterogeneous box-free representations of generic values. Reified types also enable TASTyTruffle to create specialized, monomorphic copies of generic code that can be easily and reliably optimized by its just-in-time (JIT) compiler. Empirically, TASTyTruffle is competitive with the standard JVM implementation. Both implementations perform similarly on monomorphic workloads, but when generic code is used with multiple types, TASTyTruffle consistently outperforms the JVM. TASTy's type information enables TASTyTruffle to find additional optimization opportunities that could not be uncovered with erased JVM bytecode alone.
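    TASTyTruffle itself is a Truffle interpreter targeting the JVM, but the core mechanism described above, passing a reified type alongside a value so execution can branch to a monomorphic, box-free path, can be caricatured in a few lines of Haskell. The names below (Rep, sumWith) are invented for this sketch and merely stand in for the interpreter's reified type objects and its specialized copies of generic code.

    {-# LANGUAGE GADTs #-}

    -- A reified runtime type: a first-class object describing the element type.
    data Rep a where
      IntRep    :: Rep Int
      DoubleRep :: Rep Double

    -- A "generic" operation that dispatches on the reified type instead of on a
    -- uniform boxed representation; each branch is a separate monomorphic loop.
    sumWith :: Rep a -> [a] -> a
    sumWith IntRep    xs = foldl (+) 0 xs   -- stands in for a specialised Int loop
    sumWith DoubleRep xs = foldl (+) 0 xs   -- a separate, specialised Double loop

    main :: IO ()
    main = do
      print (sumWith IntRep    [1, 2, 3])
      print (sumWith DoubleRep [1.5, 2.5])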

    Cubical modal type theories


    Mechanised metamathematics: an investigation of first-order logic and set theory in constructive type theory

    In this thesis, we investigate several key results in the canon of metamathematics, applying the contemporary perspective of formalisation in constructive type theory and mechanisation in the Coq proof assistant. Concretely, we consider the central completeness, undecidability, and incompleteness theorems of first-order logic as well as properties of the axiom of choice and the continuum hypothesis in axiomatic set theory. Due to their fundamental role in the foundations of mathematics and their technical intricacies, these results have a long tradition of codification as standard literature and, in more recent investigations, increasingly serve as a benchmark for computer mechanisation. With the present thesis, we continue this tradition by uniformly analysing the aforementioned cornerstones of metamathematics in the formal framework of constructive type theory. This programme offers novel insights into the constructive content of completeness, a synthetic approach to undecidability and incompleteness that largely eliminates the notorious tedium obscuring the essence of their proofs, as well as natural representations of set theory in the form of a second-order axiomatisation and of a fully type-theoretic account. The mechanisation concerning first-order logic is organised as a comprehensive Coq library open to usage and contribution by external users.

    Foundations for programming and implementing effect handlers

    First-class control operators provide programmers with an expressive and efficient means for manipulating control through reification of the current control state as a first-class object, enabling programmers to implement their own computational effects and control idioms as shareable libraries. Effect handlers provide a particularly structured approach to programming with first-class control by naming control-reifying operations and separating them from their handling. This thesis is composed of three strands of work in which I develop operational foundations for programming and implementing effect handlers and explore the expressive power of effect handlers.

    The first strand develops a fine-grain call-by-value core calculus of a statically typed programming language with a structural notion of effect types, as opposed to the nominal notion of effect types that dominates the literature. With the structural approach, effects need not be declared before use. The usual safety properties of statically typed programming are retained by making crucial use of row polymorphism to build and track effect signatures. The calculus features three forms of handlers: deep, shallow, and parameterised. They each offer a different approach to manipulating the control state of programs. Traditional deep handlers are defined by folds over computation trees, and are the original construct proposed by Plotkin and Pretnar. Shallow handlers are defined by case splits (rather than folds) over computation trees. Parameterised handlers are deep handlers extended with a state value that is threaded through the folds over computation trees. To demonstrate the usefulness of effects and handlers as a practical programming abstraction, I implement the essence of a small UNIX-style operating system complete with multi-user environment, time-sharing, and file I/O.

    The second strand studies continuation-passing style (CPS) and abstract machine semantics, which are foundational techniques that admit a unified basis for implementing deep, shallow, and parameterised effect handlers in the same environment. The CPS translation is obtained through a series of refinements of a basic first-order CPS translation for a fine-grain call-by-value language into an untyped language. Each refinement moves toward a more intensional representation of continuations, eventually arriving at the notion of generalised continuations, which admit simultaneous support for deep, shallow, and parameterised handlers. The initial refinement adds support for deep handlers by representing stacks of continuations and handlers as a curried sequence of arguments. The image of the resulting translation is not properly tail-recursive, meaning some function application terms do not appear in tail position. To rectify this, the CPS translation is refined once more to obtain an uncurried representation of stacks of continuations and handlers. Finally, the translation is made higher-order in order to contract administrative redexes at translation time. The generalised continuation representation is used to construct an abstract machine that provides simultaneous support for deep, shallow, and parameterised effect handlers.

    The third strand explores the expressiveness of effect handlers. First, I show that the deep, shallow, and parameterised notions of handlers are interdefinable by way of typed macro-expressiveness, which provides a syntactic notion of expressiveness that affirms the existence of encodings between handlers but gives no information about the computational content of the encodings. Second, using a semantic notion of expressiveness, I show that for a class of programs a programming language with first-class control (e.g. effect handlers) admits asymptotically faster implementations than are possible in a language without first-class control.
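    The three forms of handlers described in the first strand can be caricatured in a few lines of Haskell over a tiny computation-tree type. This is only an illustration of the fold versus case-split versus state-threading distinction; the thesis itself works with a typed fine-grain call-by-value calculus and CPS/abstract-machine semantics, not Haskell.

    -- A computation tree with a single operation, Get, asking its handler for an Int.
    data Comp a
      = Return a
      | Get (Int -> Comp a)

    -- Deep handler: a fold over the computation tree; the handler is
    -- reapplied to the continuation of each operation.
    deepConst :: Int -> Comp a -> a
    deepConst _ (Return x) = x
    deepConst n (Get k)    = deepConst n (k n)

    -- Shallow handler: a single case split; the continuation is returned
    -- still unhandled, so the caller decides how to handle the remainder.
    shallowStep :: Int -> Comp a -> Either a (Comp a)
    shallowStep _ (Return x) = Left x
    shallowStep n (Get k)    = Right (k n)

    -- Parameterised handler: a deep handler threading a state value
    -- (here, a counter that also serves as the answer) through the fold.
    deepCount :: Int -> Comp a -> (a, Int)
    deepCount s (Return x) = (x, s)
    deepCount s (Get k)    = deepCount (s + 1) (k s)

    example :: Comp Int
    example = Get (\x -> Get (\y -> Return (x + y)))

    main :: IO ()
    main = do
      print (deepConst 21 example)               -- 42: the fold answers both Gets
      case shallowStep 1 example of              -- answers only the first Get
        Left v     -> print v
        Right rest -> print (deepConst 41 rest)  -- 42: the remainder is handled deeply
      print (deepCount 0 example)                -- (1,2): answers 0 and 1, two Gets counted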