
    Systems of logic and belief : an investigation of some conceptual art strategies

    No full text
    Conceptual art came to prominence in the 1960s under the rubric of dematerialising the object of art, foregrounding art as idea. It was in part a reaction to the preoccupation with medium-specificity that had prevailed in previous decades. This exegesis describes the programme and outcomes of practice-led research into how some of the strategies that were deployed by conceptual artists may or may not translate into present-day practices and technologies. The main focus is on strategies of systematic process; on word and sign; and on matters of authorship and appropriation. The outcomes are represented by two main bodies of work. The first consists in a series of advertising signs, with an emphasis on the nature of photography and its semiotic interpretations, the process of appropriation, and the place of the everyday in conceptual art. The second body of work is the outcome of an exploration of the systematic drawings and prints of Sol LeWitt, how computer technology may intervene, and how that intervention may affect the work's reception. Issues arise around matters of authorship, control, art versus craft, and the normalising influence of the art market.

    Constructed Product Result Analysis for Haskell

    No full text
    Compilers for ML and Haskell typically go to a good deal of trouble to arrange that multiple arguments can be passed efficiently to a procedure. For some reason, less effort seems to be invested in ensuring that multiple results can also be returned efficiently. In the context of the lazy functional language Haskell, we describe an analysis, Constructed Product Result (CPR) analysis, that determines when a function can profitably return multiple results in registers. The analysis is based only on a function's definition, and not on its uses (so separate compilation is easily supported), and the results of the analysis can be expressed by a transformation of the function definition alone. We discuss a variety of design issues that were addressed in our implementation, and give measurements of the effectiveness of our approach across a substantial benchmark set. Overall, the price/performance ratio is good: the benefits are modest in general (though occasionally dramatic), but the costs, in both complexity and compile time, are low.
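    The kind of function CPR analysis targets can be sketched in a few lines of Haskell (the function name and numbers below are illustrative, not from the paper). A function whose body always returns an explicitly constructed pair is exactly the shape the analysis detects: the compiler can then split it into a worker that returns the two components unboxed (for example, in registers) and a thin wrapper that re-boxes them only when a caller actually demands the heap-allocated pair.

```haskell
-- Illustrative example (an assumption, not the paper's code): the
-- result is always an explicitly constructed pair, so CPR analysis
-- can mark it. A worker may then return both Ints unboxed, with a
-- wrapper rebuilding the (,) only for callers that need the box.
quotRemPlus :: Int -> Int -> (Int, Int)
quotRemPlus x y = (x `quot` y, x `rem` y)

main :: IO ()
main = print (quotRemPlus 17 5)  -- prints (3,2)
```

    Because the analysis looks only at the definition, a module exporting `quotRemPlus` can be compiled separately; callers in other modules still benefit via the exposed worker/wrapper split.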

    An Operational Semantics for Parallel Lazy Evaluation

    No full text
    Parallel (lazy) functional programs must describe both computation and coordination, i.e., what to compute and how to arrange the computation in parallel. The formal manipulation of the behaviours of such programs requires a semantics which accurately captures lazy evaluation, and the dependence of execution on the availability of (physical) resources. In this paper we present a lockstep semantics as a first step towards this goal which, we hope, will allow us to reason about coordination in a lazy setting. Version history: v0.44 (110998): from a comment by David Crowe, changed normalisation so that use of the new closure is correctly noted in seq and par. v0.43: included Clem's relationship-to-other-work section; otherwise, just a minor polish for IFL'98. v0.42: considered, but left unimplemented, Clem's tri-partite heap system; random, speculative evaluation of dead threads remains a problem of the system; partitioned the rules into sequential and parallel rules. Hop..
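    The coordination shape such a semantics must model can be sketched with sequential stand-ins for the combinators named in the version notes (an assumption for illustration: GHC's real `par`, from Control.Parallel in the `parallel` package, sparks its first argument for possible parallel evaluation, but denotationally it is just `par a b = b`, which is all we define here).

```haskell
-- Sequential stand-ins for the coordination combinators (assumed
-- definitions for illustration, not the paper's rules). The real
-- `par` sparks its first argument; denotationally it returns b.
par :: a -> b -> b
par _ b = b

pseq :: a -> b -> b
pseq a b = a `seq` b

fib :: Int -> Int
fib n = if n < 2 then n else fib (n - 1) + fib (n - 2)

-- The shape a lockstep semantics describes: spark one summand for
-- parallel evaluation, force the other, then combine the two.
parSum :: Int -> Int -> Int
parSum x y = px `par` (py `pseq` (px + py))
  where
    px = fib x
    py = fib y

main :: IO ()
main = print (parSum 10 10)  -- prints 110 (fib 10 = 55)
```

    A semantics for this program must capture both the value (which the stand-ins preserve) and the coordination: whether the spark for `px` is picked up depends on the availability of physical resources, which is exactly what the lockstep rules track.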

    Towards an Operational Semantics for a Parallel Non-strict Functional Language

    No full text
    Parallel programs must describe both computation and coordination, i.e., what to compute and how to organize the computation. In functional languages, equational reasoning is often used to reason about computation. In contrast, there have been many different coordination constructs for functional languages, and far less work on reasoning about coordination. We present an initial semantics for GpH, a small extension of the Haskell language, that allows us to reason about coordination. In particular, we can reason about work, average parallelism and runtime. The semantics captures the notions of limited (physical) resources, the preservation of sharing, and speculative evaluation. We show a consistency result with Launchbury's well-known lazy semantics.
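    The preservation of sharing that the consistency result hinges on can be observed directly in Haskell (an illustrative snippet, not from the paper; Debug.Trace is used only to make evaluation visible):

```haskell
import Debug.Trace (trace)

-- Sharing: the let-bound thunk is evaluated at most once and then
-- updated in the heap, so the trace message fires a single time on
-- stderr even though x is demanded twice. This is the behaviour
-- Launchbury's lazy semantics models with heap update, and which a
-- consistent parallel semantics must also preserve.
shared :: Int
shared = let x = trace "evaluating x" (21 :: Int)
         in x + x

main :: IO ()
main = print shared  -- prints 42; "evaluating x" appears once
```

    In a parallel setting the same heap discipline matters more, not less: if two threads demand `x`, the semantics must ensure the work of evaluating it is not duplicated, which is part of what reasoning about total work requires.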