
    Strictness and Totality Analysis

    We define a novel inference system for strictness and totality analysis for the simply-typed lazy lambda-calculus with constants and fixpoints. Strictness information identifies those terms that definitely denote bottom (i.e. do not evaluate to WHNF), whereas totality information identifies those terms that definitely do not denote bottom (i.e. do evaluate to WHNF). The analysis is presented as an annotated type system allowing conjunctions only at "top level". We give examples of its use and prove its correctness with respect to a natural-style operational semantics.
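
    To make the two properties concrete, here is a minimal Haskell illustration (our example; the names and terms are not drawn from the paper):

        -- 'loop' definitely denotes bottom: it never reaches WHNF.
        loop :: Int
        loop = loop

        -- 'ones' definitely does not denote bottom: it reaches WHNF at
        -- once by exposing a cons cell, although the list is infinite.
        ones :: [Int]
        ones = 1 : ones

        -- 'f' is strict in its argument: the case expression forces it
        -- to WHNF, so f applied to bottom is bottom.
        f :: Bool -> Int
        f b = case b of
                True  -> 1
                False -> 0

        main :: IO ()
        main = print (f True, head ones)  -- evaluating 'loop' would diverge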

    Annotated Type Systems for Program Analysis

    In this Ph.D. thesis, we study four program analyses. Three of them are specified by annotated type systems and the last one by abstract interpretation.

    We present a combined strictness and totality analysis, specified as an annotated type system. The type system allows conjunctions of annotated types, but only at the top level. The analysis is somewhat more powerful than the strictness analysis by Kuo and Mishra, both because of the conjunctions and because we also consider totality. The analysis is shown sound with respect to a natural-style operational semantics. It is not immediately extendable to full conjunction.

    The second analysis is also a combined strictness and totality analysis, this time with "full" conjunction. Soundness of the analysis is shown with respect to a denotational semantics. The analysis is more powerful than the strictness analyses by Jensen and Benton in that, in addition to strictness, it considers totality. So far we have only specified the analyses; for them to be practically useful we need an algorithm for inferring the annotated types. We construct an algorithm for the second analysis using the lazy type approach by Hankin and Le Métayer. The reason for choosing the second analysis from the thesis is that the approach is not applicable to the first analysis.

    The third analysis we study is a binding-time analysis. We take the analysis specified by Nielson and Nielson and construct a more efficient algorithm than the one they propose. The algorithm collects constraints in a structural manner, like the type inference algorithm by Damas, and afterwards the minimal solution to the set of constraints is found.

    The last analysis in the thesis is specified by abstract interpretation. Hunt shows that projection-based analyses are subsumed by PER (partial equivalence relation) based analyses using abstract interpretation. The PERs used by Hunt are strict, i.e. bottom is related to bottom. Here we lift this restriction by requiring the PERs to be uniform, in the sense that they treat all the integers equally. By allowing non-strict PERs we get three properties on the integers, corresponding to the three annotations used in the first and second analyses in the thesis.
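
    The restriction to top-level conjunction can be pictured with a small, hypothetical Haskell encoding (representation and names are ours, not the thesis's):

        -- Annotations: B = definitely bottom, T = definitely reaches WHNF.
        data Ann = B | T
          deriving (Eq, Show)

        infixr 5 :->
        data ATy = Base Ann      -- annotated base type
                 | ATy :-> ATy   -- annotated function type
          deriving (Eq, Show)

        -- Top-level conjunction only: a property is a list of annotated
        -- types read conjunctively; no conjunction may occur under ':->'.
        type Prop = [ATy]

        -- The identity function maps bottom to bottom and WHNF to WHNF:
        idProp :: Prop
        idProp = [Base B :-> Base B, Base T :-> Base T]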

    Deciding subset relationship of co-inductively defined set constants

    Static analysis of different non-strict functional programming languages makes use of set constants like Top, Inf, and Bot, denoting all expressions, all lists without a last Nil as tail, and all non-terminating programs, respectively. We use a set language that permits union, constructors, and recursive definition of set constants with a greatest fixpoint semantics. This paper proves decidability, in particular EXPTIME-completeness, of the subset relationship of co-inductively defined sets, using algorithms and results from tree automata. This shows decidability of the test for set inclusion, which is required by certain strictness analysis algorithms in lazy functional programming languages.
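
    A sketch of such a set language in Haskell (the representation is invented for illustration):

        -- Set expressions: union, constructor application, and references
        -- to recursively defined constants, read with a greatest-fixpoint
        -- semantics.
        data SetExp = Top                  -- all expressions
                    | Bot                  -- all non-terminating programs
                    | Ref String           -- a defined set constant
                    | Union SetExp SetExp
                    | Con String [SetExp]  -- constructor applied to sets
          deriving Show

        -- Inf, the lists without a final Nil, as a recursive equation:
        defs :: [(String, SetExp)]
        defs = [("Inf", Con "Cons" [Top, Ref "Inf"])]

    Deciding whether one such constant is contained in another is then handled with algorithms and results from tree automata, which is where the EXPTIME bound comes from.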

    Extensional Collapse Situations I: non-termination and unrecoverable errors

    We consider a simple model of higher-order, functional computation over the booleans. We then enrich the model in order to encompass non-termination and unrecoverable errors, taken separately or jointly. We show that the models so defined form a lattice when ordered by the extensional collapse situation relation, introduced in order to compare models with respect to the amount of "intensional information" that they provide on computation. The proofs are carried out by exhibiting suitable applied lambda-calculi, and by exploiting the fundamental lemma of logical relations.
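
    A toy value domain in the spirit of this enrichment (our encoding, not the paper's models), in Haskell, where non-termination is modelled by the language's own bottom:

        -- Booleans enriched with an unrecoverable error.
        data BoolE = TrueV | FalseV | Err
          deriving Show

        -- A term that denotes bottom (non-termination).
        diverge :: BoolE
        diverge = diverge

        -- The error propagates through case analysis like bottom does,
        -- but unlike bottom it is an observable result.
        cond :: BoolE -> BoolE -> BoolE -> BoolE
        cond TrueV  t _ = t
        cond FalseV _ e = e
        cond Err    _ _ = Err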

    Contracts for Interacting Two-Party Systems

    This article deals with the interrelation of deontic operators in contracts, an aspect often neglected when considering only one of the involved parties. On top of an automata-based semantics, we formalise the onuses that obligations, permissions and prohibitions on one party impose on the other. Such a formalisation allows for a clean notion of contract strictness and a derived notion of contract conflict that is enriched with issues arising from party interdependence.

    Comment: In Proceedings FLACOS 2012, arXiv:1209.169
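
    One reading of conflict under party interdependence, as a hypothetical Haskell sketch (the three operators follow the abstract; the conflict condition is our guess at a simple instance):

        data Party = A | B deriving (Eq, Show)

        -- Deontic clauses: obligation, permission and prohibition,
        -- each attached to one of the two parties.
        data Clause act = Obliged   Party act
                        | Permitted Party act
                        | Forbidden Party act
          deriving Show

        -- If one party is obliged to perform a synchronising action that
        -- the other party is forbidden to take part in, the contract is
        -- in conflict.
        conflicts :: Eq act => [Clause act] -> Bool
        conflicts cs =
          or [ a == b && p /= q | Obliged p a <- cs, Forbidden q b <- cs ]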

    Partial Horn logic and cartesian categories

    A logic is developed in which function symbols are allowed to represent partial functions. It has the usual rules of logic (in the form of a sequent calculus), except that the substitution rule has to be modified. It is developed here in its minimal form, with equality and conjunction, as “partial Horn logic”. Various kinds of logical theory are equivalent: partial Horn theories, “quasi-equational” theories (partial Horn theories without predicate symbols), cartesian theories and essentially algebraic theories. The logic is sound and complete with respect to models in Set, and sound with respect to models in any cartesian (finite limit) category. The simplicity of the quasi-equational form allows an easy predicative constructive proof of the free partial model theorem for cartesian theories: that if a theory morphism is given from one cartesian theory to another, then the forgetful (reduct) functor from one model category to the other has a left adjoint. Various examples of quasi-equational theory are studied, including those of cartesian categories and of other classes of categories. For each quasi-equational theory T, another quasi-equational theory is constructed whose models are cartesian categories equipped with models of T. Its initial model, the “classifying category” for T, has properties similar to those of the syntactic category, but more precise with respect to strict cartesian functors.
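
    To make the quasi-equational style concrete, here is how the partiality of composition in the theory of categories is typically axiomatised (our rendering; t↓ abbreviates t = t and reads "t is defined"):

        \[
          \mathrm{dom}(g) = \mathrm{cod}(f) \;\vdash\; (g \circ f){\downarrow},
          \qquad
          (g \circ f){\downarrow} \;\vdash\;
            \mathrm{dom}(g \circ f) = \mathrm{dom}(f) \,\wedge\,
            \mathrm{cod}(g \circ f) = \mathrm{cod}(g)
        \]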

    Truth-value semantics and functional extensions for classical logic of partial terms based on equality

    We develop a bottom-up approach to truth-value semantics for classical logic of partial terms based on equality, and apply it to prove the conservativity of the addition of partial description and partial selection functions, independently of any strictness assumption.

    Comment: 15 pages, to appear in the Notre Dame Journal of Formal Logic
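
    Schematically, writing t↓ for "t is defined", the partial description function ι and the partial selection function ε can be axiomatised as follows (our formulation, not necessarily the paper's exact axioms):

        \[
          \exists! x\,\varphi(x) \;\rightarrow\;
            (\iota x\,\varphi(x)){\downarrow} \wedge \varphi(\iota x\,\varphi(x)),
          \qquad
          \exists x\,\varphi(x) \;\rightarrow\;
            (\varepsilon x\,\varphi(x)){\downarrow} \wedge \varphi(\varepsilon x\,\varphi(x))
        \]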

    A complete proof of the safety of Nöcker's strictness analysis

    This paper proves the correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective way of performing strictness analysis in lazy functional languages, based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules. Our method fully considers the cycle detection rules, which are the main strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a higher-order call-by-need lambda-calculus with case, constructors, letrec, and seq, extended by set constants like Top or Inf, denoting sets of expressions. It is also possible to define new set constants by recursive equations with a greatest fixpoint semantics. The operational semantics is a small-step semantics. Equality of expressions is defined by a contextual semantics that observes termination of expressions. Basically, SAL is a non-termination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of normal order reduction sequences, the main measure being the number of 'essential' reductions in a normal order reduction sequence. Our tools and results provide new insights into call-by-need lambda-calculi, into the role of sharing in functional programming languages, and into strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
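
    The kind of fact such an analysis certifies, as a small Haskell example (ours, not taken from the paper):

        -- Strictness analysis can show that 'sumAcc' is strict in its
        -- accumulator, so forcing it with seq is safe and avoids
        -- building a chain of suspended additions.
        sumAcc :: Int -> [Int] -> Int
        sumAcc acc []       = acc
        sumAcc acc (x : xs) = let acc' = acc + x
                              in acc' `seq` sumAcc acc' xs

    Read as a non-termination check: if the accumulator is drawn from the set of non-terminating expressions, 'sumAcc' itself never reaches WHNF, which is exactly the strictness claim.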