
    Improving Efficiency and Safety of Program Generation

    Program Generation (PG) is about writing programs that write programs. A program generator composes various pieces of code to construct a new program. When employed at runtime, PG can produce an efficient version of a program by specializing it according to inputs that become available at runtime. PG has been used in a wide range of applications to improve program efficiency and modularity as well as programmer productivity. There are two major problems associated with PG: (1) program generation has its own cost, which may cause a performance loss even though PG is intended for performance gain; this is especially important for runtime program generation. (2) Compilability guarantees about the generated program are poor; the generator may produce a type-incorrect program. In this dissertation we focus on these two problems. We provide three techniques that address the first problem. First, we show that just-in-time generation can successfully reduce the cost of generation by avoiding unnecessary program generation. We do this by means of an experiment in the context of marshalling in Java, where we generate specialized object marshallers based on object types. Just-in-time generation improved the speedup from 1.22 to 3.16. Second, we apply source-level transformations to optimize the execution of program generators; up to 15% speedup has been achieved in runtime generation time for Jumbo, a PG system for Java. Third, we provide a technique to stage the analysis of generated programs so that a portion of the analysis is performed at compile time rather than the entire analysis being completed at runtime. We also give experimental evidence, via several examples, that this technique reduces runtime generation cost. To address the second problem of PG, we first show that the operational semantics of record calculus and of program generation are equivalent, and that a record type system can be used to type-check program generators. We also show that this holds in the presence of expressions with side effects. We then make use of an existing record-calculus feature, subtyping, to extend the program generation type system with subtyping constraints. As a result, we obtain a very expressive type system that statically guarantees that a generator will produce type-safe code. We state and prove the theorems for an ML-like language with program generation constructs.
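
    As a rough illustration of the runtime-specialization idea described above (not the dissertation's actual Jumbo-based implementation), the hypothetical Java sketch below generates the source of a marshaller specialized to one class: reflection is paid for once, at generation time, and the generated code then reads fields directly. All class and method names here are invented for the example.

    import java.lang.reflect.Field;

    // Hypothetical sketch: compose code fragments into the text of a marshaller
    // specialized to a given class, so per-object marshalling avoids reflection.
    public class MarshallerGenerator {
        static String generate(Class<?> cls) {
            StringBuilder body = new StringBuilder();
            for (Field f : cls.getDeclaredFields()) {
                // One fragment per field; in a quote/antiquote system these would be
                // quoted code values with holes, here plain strings for illustration.
                body.append("    out.append(\"").append(f.getName())
                    .append("=\").append(o.").append(f.getName()).append(").append(';');\n");
            }
            return "public final class " + cls.getSimpleName() + "Marshaller {\n"
                 + "  public static String marshal(" + cls.getCanonicalName() + " o) {\n"
                 + "    StringBuilder out = new StringBuilder();\n"
                 + body
                 + "    return out.toString();\n"
                 + "  }\n"
                 + "}\n";
        }

        public static class Point { public int x; public int y; }

        public static void main(String[] args) {
            // Print the generated source; at run time it could be compiled (e.g. with
            // the javax.tools compiler API) and used instead of a reflective marshaller.
            System.out.println(generate(Point.class));
        }
    }

    Just-in-time generation, as described in the abstract, would simply defer this generation step until a marshaller for the class is actually requested.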

    An Executable Semantic Definition of the Beta Language using Rewriting Logic

    In this paper, we present an overview of our method of specifying the semantics of programming languages using rewriting logic. This method, which we refer to as the "continuation-based style", relies on an explicit representation of a program's control context, giving flexibility in defining complex, control-intensive language features while still keeping the definitions of simple constructs simple. To illustrate this technique, we present a definition of a significant subset of the object-oriented language Beta running in the Maude rewriting engine. This specification gives us an executable platform for running Beta programs and for experimenting with new language features; we illustrate this by extending the language with super calls. We also touch upon some features of the underlying framework, including the ability to model check, with rewriting-based tools, Beta programs running on our framework.
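
    For readers unfamiliar with the continuation-based style, the following sketch conveys the core idea in plain Java rather than in rewriting logic/Maude (which is what the paper actually uses): the control context is kept as an explicit stack of pending frames, and evaluation repeatedly rewrites the pair of current item and context. The toy expression language and all names are invented for illustration (Java 17 syntax).

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Hypothetical sketch of an evaluator with an explicit control context.
    public class ContinuationStyleSketch {
        interface Exp {}
        record Num(int value) implements Exp {}
        record Add(Exp left, Exp right) implements Exp {}

        // Context frames: "evaluate the right operand next" or "add the saved left value".
        record EvalRight(Exp right) {}
        record AddLeft(int left) {}

        static int eval(Exp start) {
            Deque<Object> context = new ArrayDeque<>();   // explicit control context
            Object current = start;
            while (true) {
                if (current instanceof Add a) {            // rewrite: focus on the left operand
                    context.push(new EvalRight(a.right()));
                    current = a.left();
                } else if (current instanceof Num n) {
                    if (context.isEmpty()) return n.value();
                    Object frame = context.pop();
                    if (frame instanceof EvalRight er) {   // left done: remember it, do the right
                        context.push(new AddLeft(n.value()));
                        current = er.right();
                    } else if (frame instanceof AddLeft al) { // both done: combine the results
                        current = new Num(al.left() + n.value());
                    }
                }
            }
        }

        public static void main(String[] args) {
            System.out.println(eval(new Add(new Num(1), new Add(new Num(2), new Num(3))))); // 6
        }
    }

    Because the context is a first-class value rather than the host language's call stack, control-intensive features (exceptions, non-local exits, concurrency) can be defined by manipulating it directly, which is the flexibility the paper's Maude definitions exploit.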

    Staging Static Analyses for Program Generation (Extended Version)

    Program generators are most naturally specified using a quote/antiquote facility; the programmer writes programs with holes which are filled in, at program generation time, by other program fragments. If the programs are generated at compile time, analysis and compilation follow generation, and no changes in the compiler are needed. However, if program generation is done at run time, compilation and analysis need to be optimized so that they do not overwhelm overall execution time. In this paper, we give a compositional framework for defining program analyses which leads directly to a method of staging these analyses. The staging allows the analysis of incomplete programs to be started at compile time; the residual work to be done at run time may be much less costly than the full analysis. We give frameworks for forward and backward analyses, present several examples of specific analyses, and give timing results showing significant speed-ups for the run-time portion of the analysis relative to the full analysis. Our framework is defined on abstract syntax trees (ASTs), because program fragments appear as ASTs. We give a translation from source-level code to an intermediate representation (IR) and show that our staging methodology is applicable at the IR level, too.
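
    To make the staging idea concrete, here is a small, hypothetical Java sketch (not the paper's framework): each program fragment contributes a transfer function over an abstract state, the fragments known at compile time are composed then, and only the compositions at the hole's boundaries remain once the hole is filled at run time. The analysis shown (tracking definitely-assigned variables) and all names are invented for the example.

    import java.util.HashSet;
    import java.util.Set;
    import java.util.function.UnaryOperator;

    // Hypothetical sketch of staging a forward analysis over program fragments.
    public class StagedAnalysisSketch {
        // Transfer function for an assignment "x = ...": x becomes definitely assigned.
        static UnaryOperator<Set<String>> assign(String var) {
            return state -> { Set<String> s = new HashSet<>(state); s.add(var); return s; };
        }

        // Sequential composition of two fragments' transfer functions.
        static UnaryOperator<Set<String>> seq(UnaryOperator<Set<String>> first,
                                              UnaryOperator<Set<String>> second) {
            return state -> second.apply(first.apply(state));
        }

        public static void main(String[] args) {
            // "Compile time": the incomplete program  a = 1; x = a; HOLE; b = x;
            // has its known parts analysed up front, their transfer functions pre-composed.
            UnaryOperator<Set<String>> beforeHole = seq(assign("a"), assign("x"));
            UnaryOperator<Set<String>> afterHole  = assign("b");

            // "Run time": the hole is filled (say with  c = 2; ), so only the boundary
            // compositions and one application remain, not a re-analysis of the whole text.
            UnaryOperator<Set<String>> hole = assign("c");
            UnaryOperator<Set<String>> whole = seq(seq(beforeHole, hole), afterHole);

            System.out.println("definitely assigned at the end: " + whole.apply(new HashSet<>()));
        }
    }

    The residual run-time work here is a couple of function compositions and one application, which mirrors, in miniature, the speed-ups the abstract reports for the run-time portion of the staged analyses.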

    Poster session 2: Thursday 4 December 2014, 08:30-12:30. Location: Poster area.
