
    Performance of Optimistic Make

    Optimistic make is a version of make that executes the commands necessary to bring targets up-to-date prior to the time the user types a make request. Side effects of these optimistic computations (such as file or screen updates) are concealed until the make request is issued. If the inputs read by the optimistic computations are identical to the inputs the computations would read at the time the make request is issued, the results of the optimistic computations are used immediately, resulting in improved response time. Otherwise, the necessary computations are re-executed. We have implemented optimistic make in the V-System on a collection of SUN-3 workstations. Statistics collected from this implementation are used to synthesize a workload for a discrete-event simulation and to validate its results. The simulation shows a speedup distribution over pessimistic make with a median of 1.72 and a mean of 8.28. The speedup distribution is strongly dependent on the ratio between the target out-of-date times and the command execution times. In particular, with faster machines the median of the speedup distribution grows to 5.1, and then decreases again. The extra machine resources used by optimistic make are well within the limit of available resources, given the large idle times observed in many workstation environments.
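    The core mechanism the abstract describes, running a build command early and reusing its result only if its inputs are unchanged at request time, can be sketched as below. This is an illustrative Python sketch, not the paper's V-System implementation; the function names are invented, and the paper's side-effect encapsulation is omitted, keeping only the input-validity check.

    ```python
    import hashlib
    import os
    import subprocess

    def fingerprint(paths):
        """Hash the contents of each input file the command reads."""
        digests = {}
        for p in paths:
            if os.path.exists(p):
                with open(p, "rb") as f:
                    digests[p] = hashlib.sha256(f.read()).hexdigest()
        return digests

    def run_optimistically(command, inputs):
        """Execute a build command before the user asks for it,
        remembering the inputs it read so the guess can be verified later."""
        return {
            "command": command,
            "inputs": fingerprint(inputs),
            "result": subprocess.run(command, shell=True, capture_output=True),
        }

    def commit_or_reexecute(speculation):
        """At make-request time: reuse the precomputed result iff every
        input is byte-for-byte identical to what the speculation read."""
        if fingerprint(speculation["inputs"]) == speculation["inputs"]:
            return speculation["result"]          # guess verified: instant response
        return subprocess.run(speculation["command"], shell=True,
                              capture_output=True)  # inputs changed: re-execute
    ```

    When the guess is verified, the user sees the buffered result immediately, which is the source of the measured response-time improvement; a wrong guess costs only the wasted speculative run, done on otherwise idle resources.
    
    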

    Purely top-down software rebuilding

    Software rebuilding is the process of deriving a deployable software system from its primitive source objects. A build tool helps maintain consistency between the derived objects and source objects by ensuring that all necessary build steps are re-executed in the correct order after a set of changes is made to the source objects. It is imperative that derived objects accurately represent the source objects from which they were supposedly constructed; otherwise, subsequent testing and quality assurance is invalidated. This thesis aims to advance the state-of-the-art in tool support for automated software rebuilding. It surveys the body of background work, lays out a set of design considerations for build tools, and examines areas where current tools are limited. It examines the properties of a next-generation tool concept, redo, conceived by D. J. Bernstein; redo is novel because it employs a purely top-down approach to software rebuilding that promises to be simpler, more flexible, and more reliable than current approaches. The details of a redo prototype written by the author of this thesis are explained, including the central algorithms and data structures. Lastly, the redo prototype is evaluated on some sample software systems with respect to migration effort between build tools as well as size, complexity, and performance aspects of the resulting build systems.
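    The distinguishing feature of the top-down approach the abstract attributes to redo is that no global dependency graph is declared up front: building starts at the requested target, and each target's build script discovers and records its own dependencies as it runs (in redo, via `redo-ifchange`). A minimal in-memory sketch of that control flow, with invented names and build "scripts" modeled as Python callables rather than redo's shell scripts:

    ```python
    def build(target, scripts, state, built=None):
        """Top-down rebuild sketch in the spirit of redo.

        `scripts` maps a target name to a function that, given a
        `depend` callback, produces the target's content. Calling
        `depend(t)` both records the dependency and ensures `t` is
        built first -- dependencies are discovered during the build,
        not declared beforehand.
        """
        if built is None:
            built = set()
        if target in built:
            return state[target]          # already brought up to date
        def depend(t):
            if t in scripts:
                build(t, scripts, state, built)   # recurse top-down
            return state[t]               # sources are read as-is
        state[target] = scripts[target](depend)
        built.add(target)
        return state[target]
    ```

    A toy usage, compiling two "sources" into "objects" and linking them, shows the build order falling out of the recursion rather than a precomputed graph:

    ```python
    state = {"a.c": "int main(){}", "b.c": "void f(){}"}
    scripts = {
        "a.o": lambda dep: "obj(" + dep("a.c") + ")",
        "b.o": lambda dep: "obj(" + dep("b.c") + ")",
        "prog": lambda dep: dep("a.o") + "+" + dep("b.o"),
    }
    build("prog", scripts, state)   # builds a.o and b.o on demand, then prog
    ```
    
    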

    Optimistic computation

    An optimistic computation is a computation that makes guesses about its future behavior, then proceeds with execution based on these guesses before they can be verified. Optimistic computations guess data values before they are produced and guess the control flow of computations before it is known. The performance of optimistic computations is determined by the number of idle resources available for optimistic execution, the percentage of guesses that are correct, the bookkeeping costs of managing optimistic execution, and the overhead of preventing optimistic computations from interfering with their execution environments until the guesses that they are based on are verified. We model computations by their program dependence graphs, then present a series of application-independent transformations on the dependence graphs that convert pessimistic computations into semantically equivalent optimistic computations. We demonstrate the viability of this approach by applying our transformations to the make program, fault tolerance based on message logging and checkpointing, distributed simulation, database concurrency control, and bulk data transfer. Our derived optimistic computations are similar to those presented in the literature, but typically require additional application-dependent transformations to produce viable implementations. We investigate, in detail, the implementation and performance of optimistic make, an optimistic variant of conventional pessimistic make. Optimistic make executes the commands necessary to bring makefile targets up-to-date prior to the time the user types a make request. The side effects produced by these optimistically executed commands are masked using encapsulations until the commands are known to be needed, at which time the side effects are committed to the external world. The side effects of unneeded commands and commands executed with inputs that have changed are discarded. 
Measured and simulated results from our implementation of optimistic make show a median response time improvement of 1.72 and a mean improvement of 8.28 over pessimistic make.
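    The guess / encapsulate / verify / commit-or-discard cycle this abstract describes can be sketched generically. The class below is an illustrative Python sketch with invented names, not an interface from the paper: the speculative result is computed from a guessed input and its side effects are held back (here, buffered) until the guess is verified against the actual input.

    ```python
    class OptimisticStep:
        """One optimistic computation: guess an input, compute early,
        and buffer the output until the guess can be verified."""

        def __init__(self, compute, guessed_input):
            self.guessed_input = guessed_input
            # Side effects are encapsulated: nothing is emitted yet.
            self.buffered_output = compute(guessed_input)

        def resolve(self, actual_input, compute, emit):
            """Once the real input is known, commit or discard."""
            if actual_input == self.guessed_input:
                emit(self.buffered_output)    # guess verified: commit buffered effects
            else:
                emit(compute(actual_input))   # guess wrong: discard, re-execute
    ```

    The performance factors the abstract lists map directly onto this sketch: idle resources bound how many steps can run early, the guess hit rate decides how often `resolve` takes the fast path, and the buffering in `__init__` is the encapsulation overhead.
    
    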