
    A Family Of Syntactic Logical Relations For The Semantics Of Haskell-Like Languages

    Logical relations are a fundamental and powerful tool for reasoning about programs in languages with parametric polymorphism. Logical relations suitable for reasoning about observational behavior in polymorphic calculi supporting various programming language features have been introduced in recent years. Unfortunately, the calculi studied are typically idealized, and the results obtained for them offer only partial insight into the impact of such features on observational behavior in implemented languages. In this paper we show how to bring reasoning via logical relations to bear more directly on real languages, by deriving results that are pertinent to an intermediate language like GHC Core for the (mostly) lazy functional language Haskell. To provide a more fine-grained analysis of program behavior than is possible by reasoning about program equivalence alone, we work with an abstract notion of relating observational behavior of computations which has among its specializations both observational equivalence and observational approximation. We take selective strictness into account, and we consider the impact of different kinds of computational failure, e.g., divergence versus failed pattern matching, because such distinctions are significant in practice. Once distinguished, the relative definedness of different failure causes needs to be considered, because different orders here induce different observational relations on programs (including the choice between equivalence and approximation). Our main contribution is the construction of an entire family of logical relations, parameterized over a definedness order on failure causes, each member of which characterizes the corresponding observational relation. Although we deal with properties very much tied to types, we base our results on a type-erasing semantics, since this is more faithful to actual implementations.
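    To make the distinctions concrete, here is a small Haskell sketch (illustrative only, not taken from the paper): two computations that fail for different causes, and seq-forced strictness making a failure observable that laziness would otherwise hide.

```haskell
import Control.Exception (evaluate, try, SomeException)

-- Two different failure causes that a definedness order may or may not identify:
diverge :: Int
diverge = diverge                      -- non-termination

matchFail :: Int
matchFail = case ([] :: [Int]) of      -- failed pattern matching
              (x:_) -> x

-- Selective strictness: seq forces its first argument, so a failure
-- hidden under a lazy binding becomes observable.
observable :: Int -> Int
observable x = x `seq` 0

main :: IO ()
main = do
  r <- try (evaluate (observable matchFail)) :: IO (Either SomeException Int)
  print r   -- Left <pattern-match failure>; 'observable diverge' would instead hang
```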

    Improvements for Free

    "Theorems for Free!" (Wadler, FPCA 1989) is a slogan for a technique that allows to derive statements about functions just from their types. So far, the statements considered have always had a purely extensional flavor: statements relating the value semantics of program expressions, but not statements relating their runtime (or other) cost. Here we study an extension of the technique that allows precisely statements of the latter flavor, by deriving quantitative theorems for free. After developing the theory, we walk through a number of example derivations. Probably none of the statements derived in those simple examples will be particularly surprising to most readers, but what is maybe surprising, and at the very least novel, is that there is a general technique for obtaining such results on a quantitative level in a principled way. Moreover, there is good potential to bring that technique to bear on more complex examples as well. We turn our attention to short-cut fusion (Gill et al., FPCA 1993) in particular.Comment: In Proceedings QAPL 2011, arXiv:1107.074

    Free Theorems in Languages with Real-World Programming Features

    Free theorems, type-based assertions about functions, have become a prominent reasoning tool in functional programming languages. But their correct application requires a lot of care. Restrictions arise due to features present in implementations of such languages, but not in the language free theorems were originally investigated in. This thesis advances the formal theory behind free theorems with respect to the application of such theorems in non-strict functional languages such as Haskell. In particular, the impact of general recursion and forced strict evaluation is investigated. As formal ground, we employ different lambda calculi equipped with a denotational semantics. For a language with general recursion, we develop and implement a counterexample generator that tells if and why restrictions on a certain free theorem arise due to general recursion. If a restriction is necessary, the generator provides a counterexample to the unrestricted free theorem. If not, the generator terminates without returning a counterexample. Thus, we may on the one hand enhance the understanding of restrictions and on the other hand point to cases where restrictions are superfluous. For a language with a strictness primitive, we develop a refined type system that localizes the impact of forced strict evaluation. Refined typing results in stronger free theorems and therefore increases the value of the theorems. Moreover, we provide a generator for such stronger theorems. Lastly, we broaden the view on the kind of assertions free theorems provide. For a very simple, strictly evaluated calculus, we enrich free theorems by (runtime) efficiency assertions. We apply the theory to several toy examples. Finally, we investigate the performance gain of the foldr/build program transformation. The latter investigation exemplifies the main application of our theory: free theorems may not only ensure semantic correctness of program transformations, they may also ensure that a program transformation speeds up a program.
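    To illustrate the kind of restriction at stake, here is a standard counterexample in the spirit of those such a generator produces (not taken from the thesis): the free theorem for f :: [a] -> [a], namely map g . f = f . map g, can fail once f uses the strictness primitive seq and g is not strict.

```haskell
import qualified Control.Exception as E

-- A function of type [a] -> [a] that uses seq on a list element:
f :: [a] -> [a]
f []       = []
f (x : xs) = x `seq` (x : xs)

-- A non-strict g: it ignores its argument, so g undefined = 1 /= undefined.
g :: Int -> Int
g _ = 1

-- The unrestricted free theorem claims  map g (f xs) == f (map g xs).
lhs, rhs :: [Int]
lhs = map g (f [undefined])   -- f forces undefined, so forcing lhs fails
rhs = f (map g [undefined])   -- = f [1] = [1]

main :: IO ()
main = do
  print rhs                                -- [1]
  r <- E.try (E.evaluate (length lhs))
  print (r :: Either E.SomeException Int)  -- Left ...: the theorem is violated
```

    With seq available, the restricted theorem holds only under a side condition on g (here: strictness); g above is total but not strict, which is exactly why the unrestricted version breaks.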

    Securing the Foundations of Practical Information Flow Control

    Language-based information flow control (IFC) promises to secure computer programs against malicious or incompetent programmers by addressing key shortcomings of modern programming languages. In spite of showing great promise, the field remains under-utilised in practice. This thesis makes contributions to the theoretical foundations of IFC aimed at making the techniques practically applicable. It addresses two primary topics: IFC as a library, and IFC without false alarms. The contributions range from foundational observations about soundness and completeness to practical considerations of efficiency and expressiveness.
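    As a rough illustration of the "IFC as a library" idea (a minimal sketch in the style of Haskell IFC libraries such as LIO, not the thesis's own development; all names are illustrative): security labels live in the types, and the exported API only permits flows from Low to High, never the reverse.

```haskell
{-# LANGUAGE DataKinds, KindSignatures #-}

data Level = Low | High

-- A value tagged with a security level (the constructor would be
-- hidden behind a module boundary in a real library).
newtype Labeled (l :: Level) a = Labeled a

label :: a -> Labeled l a
label = Labeled

-- Pure computation may not change the label:
lmap :: (a -> b) -> Labeled l a -> Labeled l b
lmap f (Labeled x) = Labeled (f x)

-- Upgrading Low to High is fine; there is deliberately no function of
-- type Labeled 'High a -> Labeled 'Low a, so secrets cannot leak.
raise :: Labeled 'Low a -> Labeled 'High a
raise (Labeled x) = Labeled x

-- Only Low data may be written to a public channel:
publishLow :: Show a => Labeled 'Low a -> IO ()
publishLow (Labeled x) = print x

main :: IO ()
main = do
  let secret = label 42 :: Labeled 'High Int
      public = label 7  :: Labeled 'Low Int
  publishLow public              -- fine
  -- publishLow secret           -- rejected by the type checker
  publishLow (lmap (+1) public)  -- still Low: fine
```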

    Programmiersprachen und Rechenkonzepte

    Since 1984, the GI special interest group "Programmiersprachen und Rechenkonzepte" (Programming Languages and Computing Concepts), which emerged from the former groups 2.1.3 "Implementierung von Programmiersprachen" (Implementation of Programming Languages) and 2.1.4 "Alternative Konzepte für Sprachen und Rechner" (Alternative Concepts for Languages and Machines), has regularly held a spring workshop at the Physikzentrum Bad Honnef. The meeting serves first and foremost as an opportunity to get to know one another, to exchange experiences, to discuss, and to deepen mutual contacts.

    A Reasonably Gradual Type Theory

    Gradualizing the Calculus of Inductive Constructions (CIC) involves dealing with subtle tensions between normalization, graduality, and conservativity with respect to CIC. Recently, GCIC has been proposed as a parametrized gradual type theory that admits three variants, each sacrificing one of these properties. For devising a gradual proof assistant based on CIC, normalization and conservativity with respect to CIC are key, but the tension with graduality needs to be addressed. Additionally, several challenges remain: (1) the presence of two wildcard terms at any type, the error and unknown terms, enables trivial proofs of any theorem, jeopardizing the use of a gradual type theory in a proof assistant; (2) supporting general indexed inductive families, most prominently equality, is an open problem; (3) theoretical accounts of gradual typing and graduality so far do not support handling type mismatches detected during reduction; (4) precision and graduality are external notions not amenable to reasoning within a gradual type theory. All these issues manifest primarily in CastCIC, the cast calculus used to define GCIC. In this work, we present an extension of CastCIC called GRIP. GRIP is a reasonably gradual type theory that addresses the issues above, featuring internal precision and general exception handling. GRIP features an impure (gradual) sort of types inhabited by errors and unknown terms, and a pure (non-gradual) sort of strict propositions for consistent reasoning about gradual terms. Internal precision supports reasoning about graduality within GRIP itself, for instance to characterize gradual exception-handling terms, and supports gradual subset types. We develop the metatheory of GRIP using a model formalized in Coq, and provide a prototype implementation of GRIP in Agda.
    Comment: 27 pages + 2 pages bibliography
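    Challenge (1) can be pictured with a toy Haskell model (emphatically not GCIC or GRIP itself): if every type is extended with error and unknown inhabitants, every "theorem" acquires a trivial "proof", and type mismatches only surface as failures during reduction.

```haskell
-- Gradual values: ordinary values plus the two wildcard terms.
data Gradual a = Value a | Unknown | TypeError
  deriving Show

-- In a gradual dependent theory a proposition is "proved" by any
-- inhabitant, so the unknown term proves everything; this is the
-- problem a usable gradual proof assistant has to rule out.
trivialProof :: Gradual p
trivialProof = Unknown

-- Casts detect mismatches during reduction (challenge (3)):
castIntToBool :: Gradual Int -> Gradual Bool
castIntToBool Unknown = Unknown     -- optimistically allowed to flow
castIntToBool _       = TypeError   -- an actual Int is not a Bool: fail

main :: IO ()
main = do
  print (castIntToBool Unknown)     -- Unknown
  print (castIntToBool (Value 3))   -- TypeError
```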

    Approximate Normalization for Gradual Dependent Types

    Dependent types help programmers write highly reliable code. However, this reliability comes at a cost: it can be challenging to write new prototypes in (or migrate old code to) dependently-typed programming languages. Gradual typing makes static type disciplines more flexible, so an appropriate notion of gradual dependent types could fruitfully lower this cost. However, dependent types raise unique challenges for gradual typing. Dependent typechecking involves the execution of program code, but gradually-typed code can signal runtime type errors or diverge. These runtime errors threaten the soundness guarantees that make dependent types so attractive, while divergence spoils the type-driven programming experience. This paper presents GDTL, a gradual dependently-typed language that emphasizes pragmatic dependently-typed programming. GDTL fully embeds both an untyped and dependently-typed language, and allows for smooth transitions between the two. In addition to gradual types we introduce gradual terms, which allow the user to be imprecise in type indices and to omit proof terms; runtime checks ensure type safety. To account for nontermination and failure, we distinguish between compile-time normalization and run-time execution: compile-time normalization is approximate but total, while runtime execution is exact, but may fail or diverge. We prove that GDTL has decidable typechecking and satisfies all the expected properties of gradual languages. In particular, GDTL satisfies the static and dynamic gradual guarantees: reducing type precision preserves typedness, and altering type precision does not change program behavior outside of dynamic type failures. To prove these properties, we were led to establish a novel normalization gradual guarantee that captures the monotonicity of approximate normalization with respect to imprecision.
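    The compile-time/run-time split can be pictured with a small interpreter sketch (illustrative only, using a fuel bound; GDTL's actual approximation mechanism differs): approximate normalization gives up after a bound and answers "unknown", so it is total, while exact execution has no bound, so it is precise but may diverge.

```haskell
-- A tiny expression language with a diverging loop.
data Expr = Lit Int | Add Expr Expr | Loop
  deriving Show

-- Results of approximate normalization: a number, or "unknown".
data Approx = Known Int | Unknown
  deriving Show

-- Compile-time: approximate but total. Fuel guarantees termination;
-- when it runs out we answer Unknown instead of diverging.
normalize :: Int -> Expr -> Approx
normalize fuel _ | fuel <= 0 = Unknown
normalize _    (Lit n)       = Known n
normalize fuel (Add a b)     =
  case (normalize (fuel - 1) a, normalize (fuel - 1) b) of
    (Known m, Known n) -> Known (m + n)
    _                  -> Unknown
normalize fuel Loop          = normalize (fuel - 1) Loop  -- burns fuel, then Unknown

-- Run-time: exact, but partial. 'Loop' genuinely diverges here.
execute :: Expr -> Int
execute (Lit n)   = n
execute (Add a b) = execute a + execute b
execute Loop      = execute Loop

main :: IO ()
main = do
  print (normalize 100 (Add (Lit 1) (Lit 2)))  -- Known 3
  print (normalize 100 Loop)                   -- Unknown (total!)
  print (execute (Add (Lit 1) (Lit 2)))        -- 3; 'execute Loop' would hang
```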

    Executable Refinement Types

    This dissertation introduces executable refinement types, which refine structural types by semi-decidable predicates, and establishes their metatheory and accompanying implementation techniques. These results are useful for undecidable type systems in general. Particular contributions include: (1) type soundness and a logical relation for extensional equivalence for executable refinement types (though type checking is undecidable); (2) hybrid type checking for executable refinement types, which blends static and dynamic checks in a novel way, in some sense performing better statically than any decidable approximation; (3) a type reconstruction algorithm: reconstruction is decidable even though type checking is not, when suitably redefined to apply to undecidable type systems; (4) a novel use of existential types with dependent types to ensure that the language of logical formulae is closed under type checking; (5) a prototype implementation, Sage, of executable refinement types such that all dynamic errors are communicated back to the compiler and are thenceforth static errors.
    Comment: Ph.D. dissertation. Accepted by the University of California, Santa Cruz, in March 2014. 278 pages (295 including frontmatter)
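    To convey the flavor of hybrid checking, here is a minimal Haskell sketch under assumed names (not the dissertation's Sage implementation): when a refinement implication can be neither proved nor refuted statically, the compiler inserts a dynamic cast that checks the predicate at run time.

```haskell
-- A refinement type {x:Int | p x} modeled as a predicate plus a name
-- used in error messages. All names here are illustrative.
data Refinement = Refinement { rname :: String, holds :: Int -> Bool }

pos, nonNeg :: Refinement
pos    = Refinement "{x | x > 0}"  (> 0)
nonNeg = Refinement "{x | x >= 0}" (>= 0)

-- What a (hypothetical) static prover can answer about an implication:
data Verdict = Proved | Refuted | DontKnow

-- Hybrid checking: accept statically, reject statically, or fall back
-- to a run-time cast when the prover cannot decide.
hybridCast :: Verdict -> Refinement -> (Int -> Int)
hybridCast Proved   _ = id                    -- no run-time cost
hybridCast Refuted  r = \_ -> error ("static type error: " ++ rname r)
hybridCast DontKnow r = \x ->
  if holds r x then x
  else error ("run-time refinement violation: " ++ rname r)

main :: IO ()
main = do
  -- pos implies nonNeg: a prover could verify this, so no check is inserted:
  print (hybridCast Proved nonNeg 5)     -- 5
  -- undecided case: the inserted cast checks the value dynamically:
  print (hybridCast DontKnow pos 5)      -- 5
  print (hybridCast DontKnow pos (-1))   -- aborts: run-time refinement violation
```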