
    A compiler approach to scalable concurrent program design

    The programmer's most powerful tool for controlling complexity in program design is abstraction. We seek to use abstraction in the design of concurrent programs, so as to separate design decisions concerned with decomposition, communication, synchronization, mapping, granularity, and load balancing. This paper describes programming and compiler techniques intended to facilitate this design strategy. The programming techniques are based on a core programming notation with two important properties: the ability to separate concurrent programming concerns, and extensibility with reusable programmer-defined abstractions. The compiler techniques are based on a simple transformation system together with a set of compilation transformations and portable run-time support. The transformation system allows programmer-defined abstractions to be defined as source-to-source transformations that convert abstractions into the core notation. The same transformation system is used to apply compilation transformations that incrementally transform the core notation toward an abstract concurrent machine. This machine can be implemented on a variety of concurrent architectures using simple run-time support. The transformation, compilation, and run-time system techniques have been implemented and are incorporated in a public-domain program development toolkit. This toolkit operates on a wide variety of networked workstations, multicomputers, and shared-memory multiprocessors. It includes a program transformer, concurrent compiler, syntax checker, debugger, performance analyzer, and execution animator. A variety of substantial applications have been developed using the toolkit, in areas such as climate modeling and fluid dynamics.
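    The transformation approach can be pictured as a small term rewriter: programmer-defined abstractions are registered as rules that rewrite abstract syntax nodes into core-notation primitives, and the compiler applies such rules until only core forms remain. The sketch below is only illustrative; the node type, the par_map abstraction, and the "spawn_each"/"join" core primitives are hypothetical and are not the paper's actual notation or toolkit API.

```python
# Illustrative sketch of source-to-source transformation rules; all node and
# rule names here are hypothetical, not the paper's notation.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Node:
    op: str                       # operator name, e.g. "par_map" or "join"
    args: List["Node"]            # child nodes

RULES: Dict[str, Callable[[Node], Node]] = {}

def rule(op: str):
    """Register a programmer-defined abstraction as a rewrite rule for `op`."""
    def register(fn):
        RULES[op] = fn
        return fn
    return register

@rule("par_map")                  # hypothetical abstraction: apply f to xs concurrently
def expand_par_map(node: Node) -> Node:
    f, xs = node.args
    # Rewrite into (hypothetical) core primitives: spawn one task per element, then join.
    return Node("join", [Node("spawn_each", [f, xs])])

def transform(node: Node) -> Node:
    """Rewrite bottom-up until the tree contains only core-notation nodes."""
    node = Node(node.op, [transform(a) for a in node.args])
    while node.op in RULES:
        node = RULES[node.op](node)
    return node
```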

    Optimization as a design strategy. Considerations based on building simulation-assisted experiments about problem decomposition

    In this article, the most fundamental decomposition-based optimization method - block coordinate search, based on the sequential decomposition of problems into subproblems - and building performance simulation programs are used to reason about a building design process at the micro-urban scale, and strategies are defined to make the search more efficient. Cyclic overlapping block coordinate search is considered here in its double nature of optimization method and surrogate model (and metaphor) of a sequential design process. Heuristic indicators suited to supporting the design of search structures for that method are developed from building-simulation-assisted computational experiments aimed at choosing the form and position of a small building in a plot. Those indicators link the sharing of structure between subspaces ("commonality") to recursive recombination, measured as freshness of the search wake and novelty of the search moves. The aim of these indicators is to measure the relative effectiveness of decomposition-based design moves and to create efficient block searches. Implications of a possible use of these indicators in genetic algorithms are also highlighted. Comment: 48 pages, 12 figures, 3 tables.
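    As a rough illustration of the method named in the abstract, the sketch below implements plain cyclic block coordinate search on a toy objective. The blocks, the random candidate sampling, and the quadratic objective are invented for illustration; the paper's overlapping-block variant, its building-simulation objectives, and its heuristic indicators are not reproduced here.

```python
# Minimal sketch of cyclic block coordinate search on a toy objective.
import random

def cyclic_block_coordinate_search(objective, x0, blocks, candidates, sweeps=10):
    """Minimize objective(x) by repeatedly optimizing one block of variables at a time.

    objective : callable mapping a list of floats to a scalar cost
    x0        : initial design vector
    blocks    : list of index lists, e.g. [[0, 1], [2, 3]]
    candidates: random trial perturbations evaluated per block visit
    """
    x, best = list(x0), objective(x0)
    for _ in range(sweeps):                       # full cycles over all blocks
        for block in blocks:                      # optimize one block, others frozen
            for _ in range(candidates):
                trial = list(x)
                for i in block:
                    trial[i] += random.uniform(-0.5, 0.5)
                cost = objective(trial)
                if cost < best:                   # keep improving moves only
                    x, best = trial, cost
    return x, best

# Toy usage: a quadratic stand-in for a simulation-based objective.
obj = lambda v: sum((vi - 1.0) ** 2 for vi in v)
x_opt, f_opt = cyclic_block_coordinate_search(obj, [0.0] * 4, [[0, 1], [2, 3]], candidates=20)
```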

    Facticity as the amount of self-descriptive information in a data set

    Using the theory of Kolmogorov complexity, the notion of facticity \phi(x) of a string is defined as the amount of self-descriptive information it contains. It is proved that (under reasonable assumptions: the existence of an empty machine and the availability of a faithful index) facticity is definite, i.e. random strings have facticity 0 and compressible strings satisfy 0 < \phi(x) < 1/2 |x| + O(1). Consequently, facticity objectively measures the tension in a data set between structural and ad-hoc information. For binary strings there is a so-called facticity threshold that depends on their entropy. Strings with facticity above this threshold have no optimal stochastic model and are essentially computational. The shape of the facticity-versus-entropy plot coincides with the well-known sawtooth curves observed in complex systems. The notion of factic processes is discussed. This approach overcomes problems with earlier proposals to use two-part code to define the meaningfulness or usefulness of a data set. Comment: 10 pages, 2 figures.
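    For reference, the bounds quoted in the abstract can be written out as follows; the exact two-part-code definition of \phi is given in the paper and is not reproduced here.

```latex
% Facticity bounds as stated in the abstract:
%   random (incompressible) strings carry no self-descriptive information,
%   compressible strings carry a bounded amount of it.
\phi(x) = 0 \quad \text{for random } x,
\qquad
0 < \phi(x) < \tfrac{1}{2}\,|x| + O(1) \quad \text{for compressible } x.
```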

    Fifty years of Hoare's Logic

    We present a history of Hoare's logic. Comment: 79 pages. To appear in Formal Aspects of Computing.

    Software Engineering and Complexity in Effective Algebraic Geometry

    We introduce the notion of a robust parameterized arithmetic circuit for the evaluation of algebraic families of multivariate polynomials. Based on this notion, we present a computation model, adapted to Scientific Computing, which captures all known branching parsimonious symbolic algorithms in effective Algebraic Geometry. We justify this model by arguments from Software Engineering. Finally, we exhibit a class of simple elimination problems of effective Algebraic Geometry which require exponential time to be solved by branching parsimonious algorithms of our computation model. Comment: 70 pages. arXiv admin note: substantial text overlap with arXiv:1201.434
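    As a generic illustration of the circuit-evaluation setting (not of the paper's robustness conditions or its branching parsimonious model), the sketch below evaluates a multivariate polynomial given as a DAG of addition and multiplication gates; the circuit and gate encoding are invented for this example.

```python
# Generic sketch: evaluate an arithmetic circuit (a DAG of +, * and input gates).
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass(frozen=True)
class Gate:
    op: str                  # "var", "const", "add", or "mul"
    payload: Tuple = ()      # variable name, constant value, or input gate ids

def evaluate(circuit: Dict[str, Gate], out: str, env: Dict[str, float]) -> float:
    """Evaluate gate `out` on the assignment `env`, memoizing shared subcircuits."""
    cache: Dict[str, float] = {}

    def val(g: str) -> float:
        if g not in cache:
            gate = circuit[g]
            if gate.op == "var":
                cache[g] = env[gate.payload[0]]
            elif gate.op == "const":
                cache[g] = gate.payload[0]
            elif gate.op == "add":
                cache[g] = val(gate.payload[0]) + val(gate.payload[1])
            else:  # "mul"
                cache[g] = val(gate.payload[0]) * val(gate.payload[1])
        return cache[g]

    return val(out)

# Example circuit for the polynomial (x + y) * x, with the gate for x shared.
circuit = {
    "x": Gate("var", ("x",)),
    "y": Gate("var", ("y",)),
    "s": Gate("add", ("x", "y")),
    "p": Gate("mul", ("s", "x")),
}
print(evaluate(circuit, "p", {"x": 2.0, "y": 3.0}))   # prints 10.0
```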