
    FliPpr: A Prettier Invertible Printing System

    When implementing a programming language, we often write a parser and a pretty-printer. However, manually writing both programs is not only tedious but also error-prone; it may happen that a pretty-printed result is not parsed back correctly. In this paper, we propose FliPpr, a program transformation system that uses program inversion to produce a CFG parser from a pretty-printer. This novel approach has the advantages of fine-grained control over pretty-printing and easy reuse of existing efficient pretty-printer and parser implementations.
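
    FliPpr's own combinators and API are not reproduced here. Purely as a rough, standalone illustration of the kind of pretty-printer such a system starts from, the following Haskell sketch (with an invented Expr type) prints a tiny expression language in an unambiguous form, so that a derived parser could in principle recover the original term from its printed text:

        -- Hypothetical example, not FliPpr code: a pretty-printer written so
        -- that every construct has an unambiguous concrete form, which is what
        -- makes deriving a parser from it feasible.
        data Expr = Num Int | Add Expr Expr
          deriving (Eq, Show)

        pretty :: Expr -> String
        pretty (Num n)   = show n
        pretty (Add l r) = "(" ++ pretty l ++ " + " ++ pretty r ++ ")"

        main :: IO ()
        main = putStrLn (pretty (Add (Num 1) (Add (Num 2) (Num 3))))
        -- prints: (1 + (2 + 3))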

    Adequacy of compositional translations for observational semantics

    We investigate methods and tools for analysing translations between programming languages with respect to observational semantics. The behaviour of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
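
    As a loose illustration of what "compositional" means here (this example is not taken from the paper), the following Haskell sketch translates an invented source language with let into a plain lambda calculus; each construct is translated using only the translations of its immediate subterms:

        -- Hypothetical compositional translation: 'let x = e1 in e2' is mapped
        -- to '(\x. e2) e1'; every other construct is translated structurally.
        data Src = SVar String | SLam String Src | SApp Src Src | SLet String Src Src

        data Tgt = TVar String | TLam String Tgt | TApp Tgt Tgt

        translate :: Src -> Tgt
        translate (SVar x)       = TVar x
        translate (SLam x e)     = TLam x (translate e)
        translate (SApp f a)     = TApp (translate f) (translate a)
        translate (SLet x e1 e2) = TApp (TLam x (translate e2)) (translate e1)

    Adequacy would additionally require that whenever the images of two source terms are observationally equivalent in the target language, the source terms are already equivalent; the paper's convergence-equivalence technique is a way to establish this without reasoning explicitly about all contexts.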

    A multi-paradigm language for reactive synthesis

    This paper proposes a language for describing reactive synthesis problems that integrates imperative and declarative elements. The semantics is defined in terms of two-player turn-based infinite games with full information. Currently, synthesis tools accept linear temporal logic (LTL) as input, but this description is less structured and does not facilitate the expression of sequential constraints. This motivates the use of a structured programming language to specify synthesis problems. Transition systems and guarded commands serve as imperative constructs, expressed in a syntax based on that of the modeling language Promela. The syntax allows defining which player controls data and control flow, and separating a program into assumptions and guarantees. These notions are necessary for input to game solvers. The integration of imperative and declarative paradigms allows using the paradigm that is most appropriate for expressing each requirement. The declarative part is expressed in the LTL fragment of generalized reactivity(1), which admits efficient synthesis algorithms, extended with past LTL. The implementation translates Promela to input for the Slugs synthesizer and is written in Python. The AMBA AHB bus case study is revisited and synthesized efficiently, identifying the need to reorder binary decision diagrams during strategy construction, in order to prevent the exponential blowup observed in previous work.
    Comment: In Proceedings SYNT 2015, arXiv:1602.0078
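
    The paper's Promela-based syntax and the Python translation to Slugs are not shown here. Purely as an illustration of the underlying game view (all names below are invented for this sketch), guarded commands owned by two players can be modeled in Haskell as follows, with the environment's commands playing the role of assumptions and the system's commands the role of guarantees:

        type State = [(String, Int)]            -- a variable valuation

        data Player = Env | Sys
          deriving (Eq, Show)

        data Command = Command
          { owner  :: Player                    -- which player controls this command
          , guard  :: State -> Bool             -- when the command is enabled
          , update :: State -> State            -- its effect on the state
          }

        -- All states one player can reach from s in a single turn.
        successors :: Player -> [Command] -> State -> [State]
        successors p cmds s = [ update c s | c <- cmds, owner c == p, guard c s ]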

    A Simply Numbered Lambda Calculus

    While programming languages traditionally lean towards functions, query languages are often relational in character. Taking the relations language of Harkes and Visser as a starting point, I explore how the functional paradigm, represented by the lambda calculus, can be extended to form the basis of a relational language. It turns out that a straightforward extension with strings of terms not only supports surprisingly many features of the relations language, but also opens it up to higher-order relations, a prominent feature that the relations language does not offer.
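
    The calculus itself is not reproduced here. Purely as an illustrative sketch of the idea (the constructor names are invented and not the paper's), a lambda-calculus AST extended with a construct for a "string of terms" might look as follows in Haskell, letting a single term stand for a sequence of alternative results:

        -- Hypothetical sketch, not the paper's calculus.
        data Term
          = Var String
          | Lam String Term
          | App Term Term
          | Str [Term]        -- a string of terms: zero or more alternatives
          deriving Show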

    Singular and Plural Functions for Functional Logic Programming

    Functional logic programming (FLP) languages use non-terminating and non-confluent constructor systems (CS's) as programs in order to define non-strict non-deterministic functions. Two semantic alternatives have usually been considered for parameter passing with this kind of function: call-time choice and run-time choice. While the former is the standard choice of modern FLP languages, the latter lacks some properties---mainly compositionality---that have prevented its use in practical FLP systems. Traditionally it has been considered that call-time choice induces a singular denotational semantics, while run-time choice induces a plural semantics. We have discovered that this latter identification is wrong when pattern matching is involved, and thus we propose two novel compositional plural semantics for CS's that are different from run-time choice. We study the basic properties of our plural semantics---compositionality, polarity, monotonicity for substitutions, and a restricted form of the bubbling property for constructor systems---and their relation to each other and to previous proposals, concluding that these semantics form a hierarchy in the sense of set inclusion of the sets of computed values. We have also identified a class of programs, characterized by a syntactic criterion, for which the proposed plural semantics behave the same, and a program transformation that can be used to simulate one of them by term rewriting. At the practical level, we study how to use the expressive capabilities of these semantics to improve the declarative flavour of programs. We also propose a language which combines call-time choice and our plural semantics, and which we have implemented in Maude. The resulting interpreter is employed to test several significant examples showing the capabilities of the combined semantics. To appear in Theory and Practice of Logic Programming (TPLP).
    Comment: 53 pages, 5 figures
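
    The difference between the two regimes can be seen on the classic coin/double example (a standard illustration, not taken from the paper), modeled here in Haskell with the list monad standing in for non-determinism:

        import Control.Monad (liftM2)

        -- 'coin' non-deterministically yields 0 or 1.
        coin :: [Int]
        coin = [0, 1]

        -- Call-time choice: the argument is chosen once and then shared,
        -- so doubling a coin can only yield 0 or 2.
        doubleCallTime :: [Int]
        doubleCallTime = do
          x <- coin
          return (x + x)

        -- Run-time choice: each occurrence of the argument may be resolved
        -- independently, so 1 becomes a possible result as well.
        doubleRunTime :: [Int]
        doubleRunTime = liftM2 (+) coin coin

        main :: IO ()
        main = print (doubleCallTime, doubleRunTime)   -- ([0,2],[0,1,1,2])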

    A call-by-need lambda-calculus with locally bottom-avoiding choice: context lemma and correctness of transformations

    We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation and a non-deterministic operator amb, which is locally bottom-avoiding. We use a small-step operational semantics in the form of a normal-order reduction. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into an arbitrary program context, their termination behaviour is the same. We use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We develop several proof tools for proving correctness of program transformations. We provide a context lemma for may- as well as must-convergence which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. In contrast to other approaches, neither our syntax nor our semantics makes use of a heap for sharing expressions. Instead, we represent these expressions explicitly via letrec-bindings.
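
    Only the calculus' syntax is sketched below, as a Haskell AST with invented constructor names; the normal-order reduction rules, the context lemma and the diagram-based proofs from the paper are not reproduced:

        -- Hypothetical sketch of the term syntax only.
        data Term
          = Var String
          | Lam String Term
          | App Term Term
          | Con String [Term]                       -- constructor application
          | Case Term [(String, [String], Term)]    -- case over flat constructor patterns
          | Letrec [(String, Term)] Term            -- recursive bindings; sharing is expressed here
          | Seq Term Term                           -- evaluate the first argument, then the second
          | Amb Term Term                           -- locally bottom-avoiding non-deterministic choice
          deriving Show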