6 research outputs found

    Adapton: Composable, Demand-Driven Incremental Computation

    Get PDF
    Many researchers have proposed programming languages that support incremental computation (IC), which allows programs to be efficiently re-executed after a small change to the input. However, existing implementations of such languages have two important drawbacks. First, recomputation is oblivious to specific demands on the program output; that is, if a program input changes, all dependencies will be recomputed, even if an observer no longer requires certain outputs. Second, programs are made incremental as a unit, with little or no support for reusing results outside of their original context, e.g., when reordered. To address these problems, we present λ^cdd_ic, a core calculus that applies a demand-driven semantics to incremental computation, tracking changes in a hierarchical fashion in a novel demanded computation graph. λ^cdd_ic also formalizes an explicit separation between inner, incremental computations and outer observers. This combination ensures λ^cdd_ic programs recompute only as demanded by observers, and allows inner computations to be composed more freely. We describe an algorithm for implementing λ^cdd_ic efficiently, and we present Adapton, a library for writing λ^cdd_ic-style programs in OCaml. We evaluated Adapton on a range of benchmarks and found that it provides reliable speedups, and in many cases dramatically outperforms prior state-of-the-art IC approaches.
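    The demand-driven behavior described in the abstract can be sketched in miniature. The Python model below is an illustrative assumption, not Adapton's actual OCaml API: `Cell` and `Thunk` stand in for input references and cached computations, a write only marks dependents dirty, and recomputation happens solely when an observer demands a value. (Unlike the paper's demanded computation graph, this toy never trims stale dependency edges.)

    ```python
    class Cell:
        """A mutable input; writing it dirties dependent thunks (hypothetical name)."""
        def __init__(self, value):
            self.value = value
            self.observers = set()  # thunks that have read this cell

        def get(self, reader=None):
            if reader is not None:
                self.observers.add(reader)
            return self.value

        def set(self, value):
            if value != self.value:
                self.value = value
                for t in self.observers:
                    t.dirty()

    class Thunk:
        """A cached computation, re-run only when demanded after being dirtied."""
        def __init__(self, fn):
            self.fn = fn
            self.cached = None
            self.is_dirty = True
            self.observers = set()
            self.runs = 0  # counts actual recomputations

        def dirty(self):
            if not self.is_dirty:
                self.is_dirty = True
                for t in self.observers:  # propagate dirtiness, compute nothing
                    t.dirty()

        def get(self, reader=None):
            if reader is not None:
                self.observers.add(reader)
            if self.is_dirty:
                self.cached = self.fn(self)
                self.is_dirty = False
                self.runs += 1
            return self.cached

    a, b = Cell(1), Cell(2)
    total = Thunk(lambda t: a.get(t) + b.get(t))
    doubled = Thunk(lambda t: 2 * total.get(t))

    assert doubled.get() == 6   # first demand: computes both thunks
    assert doubled.get() == 6   # clean: served from cache
    a.set(10)                   # dirties total and doubled, runs nothing
    assert doubled.get() == 24  # recomputed only now, on demand
    assert total.runs == 2      # one from-scratch run plus one update
    ```

    The key point matching the abstract: `a.set(10)` triggers no work by itself, so outputs nobody demands are never recomputed.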

    A Library for Self-Adjusting Computation

    Get PDF
    We present a Standard ML library for writing programs that automatically adjust to changes to their data. The library combines modifiable references and memoization to achieve efficient updates. We describe an implementation of the library and apply it to the problem of maintaining the convex hull of a dynamically changing set of points. Our experiments show that the overhead of the library is small, and that self-adjusting programs can adjust to small changes three orders of magnitude faster than recomputing from scratch. The implementation relies on invariants that could be enforced by a modal type system. We show, using an existing language, abstract interfaces for modifiable references and for memoization that ensure the same safety properties without the use of modal types. The interface for memoization, however, does not scale well, suggesting a language-based approach to be preferable after all.
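    The combination of modifiable references and memoization can be illustrated with a small sketch. This Python model is a simplified assumption, not the paper's SML interface: `Mod` is a modifiable reference that records which memoized calls read it (directly or via the call stack), and a write eagerly invalidates exactly those cached results, so re-running a list sum after one change re-executes only the cells up to the change and reuses the memoized suffix. (The real library reuses a computation trace via change propagation rather than clearing a memo table.)

    ```python
    memo_table = {}
    active = []  # stack of memo keys for calls currently executing

    class Mod:
        """A modifiable reference that tracks its readers (hypothetical name)."""
        def __init__(self, value):
            self.value = value
            self.readers = set()

        def read(self):
            self.readers.update(active)  # every active call depends on us
            return self.value

        def write(self, value):
            if value != self.value:
                self.value = value
                for key in self.readers:
                    memo_table.pop(key, None)  # invalidate dependent results
                self.readers.clear()

    calls = 0  # counts actual executions of sum_list bodies

    def sum_list(cell):
        """Memoized sum over a cons list of (Mod, next) pairs."""
        global calls
        if cell is None:
            return 0
        key = id(cell)
        if key in memo_table:
            return memo_table[key]  # reuse an unaffected suffix
        calls += 1
        active.append(key)
        head, tail = cell
        result = head.read() + sum_list(tail)
        active.pop()
        memo_table[key] = result
        return result

    # Build the list [1, 2, 3, 4] as cons cells of modifiables.
    cells = None
    for v in [4, 3, 2, 1]:
        cells = (Mod(v), cells)

    assert sum_list(cells) == 10 and calls == 4  # from-scratch run
    third = cells[1][1]                          # the cell holding 3
    third[0].write(30)                           # invalidates the first 3 sums
    assert sum_list(cells) == 37                 # 1 + 2 + 30 + 4
    assert calls == 7                            # only 3 cells re-executed; the suffix was reused
    ```

    As in the paper, the update cost tracks the size of the affected region rather than the whole input, though this sketch's eager invalidation is far cruder than the library's change propagation.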

    ML 2005 Preliminary Version A Library for Self-Adjusting Computation Abstract

    No full text
    We present a Standard ML library for writing programs that automatically adjust to changes to their data. The library combines modifiable references and memoization to achieve efficient updates. We describe an implementation of the library and apply it to the problem of maintaining the convex hull of a dynamically changing set of points. Our experiments show that the overhead of the library is small, and that self-adjusting programs can adjust to small changes three orders of magnitude faster than recomputing from scratch. The implementation relies on invariants that could be enforced by a modal type system. We show, using an existing language, abstract interfaces for modifiable references and for memoization that ensure the same safety properties without the use of modal types. The interface for memoization, however, does not scale well, suggesting a language-based approach to be preferable after all.
    Key words: incremental computation, selective memoization, change propagation, computational geometry, convex hulls, quickhull
    In many application domains, such as simulation systems, robotics, and language environments, it is important for computations to adjust to external changes to their …