    Speculative Staging for Interpreter Optimization

    Interpreters have a bad reputation for having lower performance than just-in-time compilers. We present a new way of building high-performance interpreters that is particularly effective for executing dynamically typed programming languages. The key idea is to combine speculative staging of optimized interpreter instructions with a novel technique of incrementally and iteratively concerting them at run-time. This paper introduces the concepts behind deriving optimized instructions from existing interpreter instructions---incrementally peeling off layers of complexity. When the interpreter is compiled, these optimized derivatives are compiled along with the original interpreter instructions; our technique is therefore portable by construction, since it leverages the existing compiler's backend. At run-time, we substitute the interpreter's original, expensive instructions with their optimized derivatives to speed up execution. Our technique unites high performance with the simplicity and portability of interpreters---we report that our optimization makes the CPython interpreter up to more than four times faster, closing the gap with, and sometimes even outperforming, PyPy's just-in-time compiler.
    Comment: 16 pages, 4 figures, 3 tables. Uses CPython 3.2.3 and PyPy 1.
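
    As an illustration of the instruction-substitution idea (a minimal sketch, not the paper's CPython implementation), the toy bytecode interpreter below lets a generic add instruction rewrite itself into a cheaper, type-specialized derivative once it has observed integer operands; all opcode names are illustrative.

        # Toy interpreter sketch: a generic instruction substitutes itself with
        # an optimized derivative at run-time ("quickening"). Illustrative only.
        LOAD_CONST, ADD_GENERIC, ADD_INT, PRINT = range(4)

        def run(code, consts):
            stack, pc = [], 0
            while pc < len(code):
                op, arg = code[pc]
                if op == LOAD_CONST:
                    stack.append(consts[arg])
                elif op == ADD_GENERIC:
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)            # fully dynamic add
                    if type(a) is int and type(b) is int:
                        code[pc] = (ADD_INT, arg)  # substitute optimized derivative
                elif op == ADD_INT:
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)            # stands in for an unboxed int add
                elif op == PRINT:
                    print(stack.pop())
                pc += 1

        code = [(LOAD_CONST, 0), (LOAD_CONST, 1), (ADD_GENERIC, 0), (PRINT, 0)]
        run(code, [2, 3])               # first run observes ints and quickens the add
        assert code[2][0] == ADD_INT    # later runs dispatch to the derivative

    After the first execution the generic instruction has been replaced in place, so subsequent runs of the same bytecode take the fast path without re-checking operand types.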

    Lambda-Dropping: Transforming Recursive Equations into Programs with Block Structure

    Lambda-lifting a functional program transforms it into a set of recursive equations. We present the symmetric transformation: lambda-dropping. Lambda-dropping a set of recursive equations restores block structure and lexical scope. For lack of scope, recursive equations must carry around all the parameters that any of their callees might possibly need. Both lambda-lifting and lambda-dropping thus require one to compute a transitive closure over the call graph:
    - for lambda-lifting: to establish the Def/Use path of each free variable (these free variables are then added as parameters to each of the functions in the call path);
    - for lambda-dropping: to establish the Def/Use path of each parameter (parameters whose use occurs in the same scope as their definition do not need to be passed along in the call path).
    Without free variables, a program is scope-insensitive. Its blocks are then free to float (for lambda-lifting) or to sink (for lambda-dropping) along the vertices of the scope tree. We believe lambda-lifting and lambda-dropping are interesting per se, both in principle and in practice, but our prime application is partial evaluation: except for Malmkjær and Ørbæk's case study presented at PEPM'95, most polyvariant specializers for procedural programs operate on recursive equations. To this end, in a pre-processing phase, they lambda-lift source programs into recursive equations. As a result, residual programs are also expressed as recursive equations, often with dozens of parameters, which most compilers do not handle efficiently. Lambda-dropping in a post-processing phase restores their block structure and lexical scope, thereby significantly reducing both the compile time and the run time of residual programs.
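
    A small Python analogue of the two forms (not from the paper, which works on Scheme-like recursive equations): in the block-structured version the inner function uses step as a free variable, while the lambda-lifted version threads step through every recursive call as an extra parameter; lambda-dropping is the move from the second form back to the first.

        # Block-structured / lambda-dropped form: `walk` refers to the free
        # variable `step` from the enclosing scope.
        def count_down(n, step):
            def walk(i):
                return 0 if i <= 0 else 1 + walk(i - step)
            return walk(n)

        # Lambda-lifted form: flat recursive equations with no free variables;
        # `step` must be passed along the whole call path.
        def walk_lifted(i, step):
            return 0 if i <= 0 else 1 + walk_lifted(i - step, step)

        def count_down_lifted(n, step):
            return walk_lifted(n, step)

        assert count_down(10, 2) == count_down_lifted(10, 2) == 5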

    Towards Unifying Inheritance and Automatic Program Specialization

    Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs and specialization of classes (inheritance) are considered different abstractions. We present a new programming language, Lapis, that unifies inheritance and program specialization at the conceptual, syntactic, and semantic levels. This paper presents the initial development of Lapis, which uses inheritance with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented and a simple yet powerful partial evaluator for an object-oriented language.
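
    The contrast can be sketched in plain Python (Lapis itself is not shown in the abstract, so this is only a hypothetical analogue): a subclass specializes an implementation by hand-written overriding, whereas a partial evaluator could derive an equivalent residual method automatically from a declared invariant.

        class Filter:
            def __init__(self, coeffs):
                self.coeffs = coeffs
            def apply(self, xs):
                # generic implementation: loops over whatever coefficients
                # the instance happens to carry
                return [sum(c * x for c in self.coeffs) for x in xs]

        class BoxFilter3(Filter):
            # manual specialization by overriding: coeffs fixed to (1, 1, 1)
            def __init__(self):
                super().__init__((1, 1, 1))
            def apply(self, xs):
                return [3 * x for x in xs]

        def specialize_apply(coeffs):
            # toy stand-in for automatic specialization: fold the invariant
            # `coeffs` into a residual function with the inner loop removed
            s = sum(coeffs)
            return lambda xs: [s * x for x in xs]

        auto = specialize_apply((1, 1, 1))
        assert BoxFilter3().apply([1, 2]) == auto([1, 2]) == [3, 6]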

    Eelco Visser: The Oregon Connection

    This paper shares some memories of Eelco gathered over the past 25 years as a colleague and friend, and reflects on the nature of modern international collaborations.

    An Analytical Approach to Programs as Data Objects

    This essay accompanies a selection of 32 articles (referred to in bold face in the text and marginally marked in the bibliographic references) submitted to Aarhus University towards a Doctor Scientiarum degree in Computer Science. The author's previous academic degree, beyond a doctoral degree in June 1986, is an "Habilitation à diriger les recherches" from the Université Pierre et Marie Curie (Paris VI) in France; the corresponding material was submitted in September 1992 and the degree was obtained in January 1993. The present 32 articles have all been written since 1993 and while at DAIMI. Except for one other PhD student, all co-authors are or have been the author's students here in Aarhus.

    Partial Evaluation for Constraint-Based Program Analyses

    We report on a case study in the application of partial evaluation, initiated by the desire to speed up a constraint-based algorithm for control-flow analysis. We designed and implemented a dedicated partial evaluator, able to specialize the analysis wrt. a given constraint graph and thus remove the interpretive overhead, and measured it with Feeley's Scheme benchmarks. Even though the gain turned out to be rather limited, our investigation yielded valuable feedback in that it provided a better understanding of the analysis, leading us to (re)invent an incremental version. We believe this phenomenon to be a quite frequent spinoff from using partial evaluation, since the removal of interpretive overhead makes the flow of control more explicit and hence pinpoints sources of inefficiency. Finally, we observed that partial evaluation in our case yields such regular, low-level specialized programs that it begs for run-time code generation.
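
    The flavour of the case study can be sketched as follows (a toy subset-constraint solver, not the actual control-flow analysis): the generic solver re-interprets the constraint list on every iteration, and specializing it with respect to one fixed constraint graph unrolls that interpretation into straight-line update code, here generated at run-time.

        def solve(seeds, subset_constraints):
            # seeds: var -> initial set; constraints: (lhs, rhs) meaning lhs ⊆ rhs
            sol = {v: set(s) for v, s in seeds.items()}
            changed = True
            while changed:                      # interpretive overhead: every
                changed = False                 # iteration re-scans the whole list
                for lhs, rhs in subset_constraints:
                    if not sol[lhs] <= sol[rhs]:
                        sol[rhs] |= sol[lhs]
                        changed = True
            return sol

        def specialize(subset_constraints):
            # residual solver for one fixed constraint graph: the loop over the
            # constraint list is compiled away into explicit update statements
            lines = ["def solve_residual(sol):",
                     "    changed = True",
                     "    while changed:",
                     "        changed = False"]
            for lhs, rhs in subset_constraints:
                lines += [f"        if not sol['{lhs}'] <= sol['{rhs}']:",
                          f"            sol['{rhs}'] |= sol['{lhs}']",
                          "            changed = True"]
            lines.append("    return sol")
            env = {}
            exec("\n".join(lines), env)      # run-time code generation
            return env["solve_residual"]

        constraints = [("a", "b"), ("b", "c")]
        generic = solve({"a": {1}, "b": set(), "c": set()}, constraints)
        residual = specialize(constraints)({"a": {1}, "b": set(), "c": set()})
        assert generic == residual == {"a": {1}, "b": {1}, "c": {1}}

    The generated residual code is exactly the kind of regular, low-level program the abstract mentions, which is why run-time code generation is a natural next step.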

    Sparcl: A Language for Partially-Invertible Computation

    Homeomorphic Embedding for Online Termination of Symbolic Methods

    Well-quasi orders in general, and homeomorphic embedding in particular, have gained popularity to ensure the termination of techniques for program analysis, specialisation, transformation, and verification. In this paper we survey and discuss this use of homeomorphic embedding and clarify the advantages of such an approach over one using well-founded orders. We also discuss various extensions of the homeomorphic embedding relation. We conclude with a study of homeomorphic embedding in the context of metaprogramming, presenting some new (positive and negative) results and open problems.
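
    For concreteness, here is a minimal Python sketch of the homeomorphic embedding test on ground first-order terms, encoded as ('functor', arg1, ...) tuples with atoms as leaves; the full relation used for online termination also handles variables and the richer extensions discussed in the paper.

        def embedded(t1, t2):
            # Diving: t1 is embedded in one of t2's arguments.
            if isinstance(t2, tuple) and any(embedded(t1, u) for u in t2[1:]):
                return True
            # Coupling: same functor and arity, and arguments embed pairwise.
            if isinstance(t1, tuple) and isinstance(t2, tuple):
                return (t1[0] == t2[0] and len(t1) == len(t2)
                        and all(embedded(s, u) for s, u in zip(t1[1:], t2[1:])))
            return t1 == t2    # atoms / constants

        # f(a) is embedded in f(g(a)): the kind of growth a "whistle" flags.
        assert embedded(('f', 'a'), ('f', ('g', 'a')))
        assert not embedded(('f', 'a'), ('g', 'b'))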

    Towards Bridging the Gap Between Programming Languages and Partial Evaluation

    Partial evaluation is a program-transformation technique that automatically specializes a program with respect to user-supplied invariants. Despite successful applications in areas such as graphics, operating systems, and software engineering, partial evaluators have yet to achieve widespread use. One reason is the difficulty of adequately describing specialization opportunities. Indeed, under-specialization or over-specialization often occurs, without any direct feedback to the user as to the source of the problem. We have developed a high-level, module-based language allowing the programmer to guide the choice of both the code to specialize and the invariants to exploit during the specialization process. To ease the use of partial evaluation, the syntax of this language is similar to the declaration syntax of the target language of the partial evaluator. To provide feedback to the programmer, declarations are checked throughout the analyses performed by partial evaluation. The language has been successfully used by a signal-processing expert in the design of a specializable Forward Error Correction component.
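
    As a purely hypothetical Python analogue of such declarations (the paper's module-based language mirrors the declaration syntax of its target language and is not reproduced here), the decorator below lets the programmer state which parameters are invariant; a real partial evaluator would propagate those invariants through the body and report mismatches during its analyses.

        from functools import partial

        def specialize_wrt(**invariants):
            # declare invariants for a function and return a residual version
            # with those parameters fixed; a stand-in for a real specializer
            def declare(fn):
                return partial(fn, **invariants)
            return declare

        def fec_encode(block, rate, interleave):
            # generic stand-in for a Forward Error Correction encoder
            repeated = [b for b in block for _ in range(rate)]
            return repeated[::interleave]

        encode_fixed = specialize_wrt(rate=3, interleave=1)(fec_encode)
        assert encode_fixed(block=[1, 2]) == [1, 1, 1, 2, 2, 2]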