2,514 research outputs found

    Coarse homology theories

    In this paper we develop an axiomatic approach to coarse homology theories. We prove a uniqueness result concerning coarse homology theories on the category of `coarse CW-complexes'. This uniqueness result is used to prove a version of the coarse Baum-Connes conjecture for such spaces.
    Comment: Published by Algebraic and Geometric Topology at http://www.maths.warwick.ac.uk/agt/AGTVol1/agt-1-13.abs.htm
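
    For orientation, an axiomatic approach of this kind typically asks that the theory be invariant under coarse equivalence and satisfy an excision (Mayer-Vietoris) axiom. The LaTeX sketch below is illustrative only; the paper's own axioms may differ in detail.

        % Illustrative sketch of a typical axiomatisation of a coarse homology
        % theory (not necessarily the exact axioms of this paper):
        %   (i)  coarse invariance: coarsely equivalent spaces X and Y satisfy
        %        HX_*(X) \cong HX_*(Y);
        %   (ii) coarse excision: a coarsely excisive decomposition X = A \cup B
        %        yields a Mayer--Vietoris exact sequence
        \cdots \to HX_n(A \cap B) \to HX_n(A) \oplus HX_n(B)
               \to HX_n(X) \to HX_{n-1}(A \cap B) \to \cdots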

    Automated Feedback for 'Fill in the Gap' Programming Exercises

    Timely feedback is a vital component in the learning process. It is especially important for beginner students in Information Technology since many have not yet formed an effective internal model of a computer that they can use to construct viable knowledge. Research has shown that learning efficiency is increased if immediate feedback is provided for students. Automatic analysis of student programs has the potential to provide immediate feedback for students and to assist teaching staff in the marking process. This paper describes a “fill in the gap” programming analysis framework which tests students’ solutions, gives feedback on their correctness, detects logic errors and provides hints on how to fix these errors. Currently, the framework is being used with the Environment for Learning to Programming (ELP) system at Queensland University of Technology (QUT); however, the framework can be integrated into any existing online learning environment or programming Integrated Development Environment (IDE).
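
    As a rough illustration of the testing step such a framework performs (a minimal Python sketch, not the ELP framework itself; the exercise template, gap names, test cases and hints are hypothetical), a checker can splice the student's gap answers into a solution template, run instructor-defined tests, and attach hints when a test fails:

        # Minimal sketch of "fill in the gap" checking (hypothetical exercise,
        # not ELP's own): splice the student's answers into a solution template,
        # run test cases, and report hints when a test fails.
        import textwrap

        TEMPLATE = """
        def maximum(xs):
            best = {gap1}
            for x in xs:
                if {gap2}:
                    best = x
            return best
        """

        TESTS = [(([3, 1, 4],), 4), (([-5, -2, -9],), -2)]
        HINTS = {
            "gap1": "initialise 'best' from the list itself, not from a constant",
            "gap2": "compare each element against the best value seen so far",
        }

        def check(answers):
            """Fill the gaps, run the tests and return feedback messages."""
            source = textwrap.dedent(TEMPLATE).format(**answers)
            namespace = {}
            try:
                exec(source, namespace)            # compile and load the attempt
            except SyntaxError as err:
                return ["Your answer does not compile: %s" % err]
            feedback = []
            for args, expected in TESTS:
                got = namespace["maximum"](*args)
                if got != expected:
                    feedback.append("Input %s: expected %s, got %s." % (args, expected, got))
            if feedback:                           # a test failed: add the hints
                feedback += ["Hint (%s): %s." % (gap, hint) for gap, hint in HINTS.items()]
            return feedback or ["All tests passed."]

        print(check({"gap1": "xs[0]", "gap2": "x > best"}))   # all tests pass
        print(check({"gap1": "0", "gap2": "x > best"}))       # logic error detected, hints shown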

    A Programming Language for Web Service Development

    There is now widespread acceptance of Web services and service-oriented architectures. But despite the agreement on key Web services standards, there remain many challenges. Programming environments based on WSDL support go some way to facilitating Web service development. However, Web services fundamentally rely on XML and Schema, not on contemporary programming language type systems such as those of Java or .NET. Moreover, Web services are based on a messaging paradigm and hence bring forward the traditional problems of messaging systems, including concurrency control and message correlation. It is easy to write simple synchronous Web services using traditional programming languages; however, more realistic scenarios are surprisingly difficult to implement. To alleviate these issues we propose a programming language which directly supports Web service development. The language leverages XQuery for native XML processing, supports implicit message correlation and has high-level join-calculus-style concurrency control. We illustrate the features of the language through a motivating example.
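
    The message-correlation problem mentioned above can be made concrete with a small illustration (a Python sketch under assumed message shapes, not the proposed language itself): the service must pair two asynchronous messages that share a correlation key before it can proceed, which is the kind of join such a language would make implicit.

        # Illustrative sketch of message correlation: the service must pair an
        # "order" message with its matching "payment" message (same order_id)
        # before it can proceed.  Message kinds and payloads are hypothetical.
        from collections import defaultdict

        class OrderService:
            def __init__(self):
                self.pending = defaultdict(dict)   # order_id -> partial messages

            def receive(self, kind, order_id, payload):
                """Store one message; fire the join when both halves are present."""
                slot = self.pending[order_id]
                slot[kind] = payload
                if "order" in slot and "payment" in slot:   # join condition met
                    del self.pending[order_id]
                    self.fulfil(order_id, slot["order"], slot["payment"])

            def fulfil(self, order_id, order, payment):
                print("fulfilling", order_id, order["item"], payment["amount"])

        svc = OrderService()
        svc.receive("order",   "A-17", {"item": "book"})
        svc.receive("payment", "A-99", {"amount": 12.50})   # different key: no join yet
        svc.receive("payment", "A-17", {"amount": 20.00})   # matches A-17: join fires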

    Analyzing financial sectors in transition : with special reference to the Former Soviet Union

    This paper proposes a framework for analyzing the evolution of financial sectors in economies transitioning from command to market structures. Most commentators have tended to regard this "Transition" as an undifferentiated period to be traversed as rapidly as possible. In doing so they ignore the increasing evidence that the Transition can be an extended, even enduring, state of the world, resulting from a complex interaction of economic, political, cultural and psychological factors. As such, it can and does generate incentives which fundamentally distort the behavior of economic agents in unpredictable ways. The authors argue that one result of ignoring the true messiness of the Transition is that the analysis and policy recommendations offered to governments can be flawed and often provide conclusions which are at odds with the reality on the ground. The paper discusses this concern and proposes a simple analytical framework both for focusing on the Transition itself and for use in defining and evaluating possible public policy interventions for the banking sector. This paper is organized as follows. Section II sketches the main shortcomings of the traditional analyses of the financial sector in transition, particularly the limitations of current thinking on the sequencing of financial sector reforms. Section III introduces and elaborates on the basic framework proposed. Section IV uses the framework to distill some simple but important propositions about transition banking. Section V provides an initial qualitative test of the framework, by using it as a lens through which to view some of the characteristics of transition banks which we have observed. It shows that behavior patterns which, at first glance, seem curious and counterproductive, do indeed have a rational explanation when viewed in relation to the analytical framework we propose. Section VI applies the proposed analytical approach to assess the appropriateness of various interventions, especially those which have proved popular in the World Bank, to support the financial sector transition. Two Annexes then elaborate particular aspects of the analysis. Annex 1 assesses how our analysis might recondition thinking about the appropriate regulatory and supervisory structures for transition banking. Annex 2 provides some initial empirical results based on our proposed framework of analysis.

    Parallel programming using functional languages

    It has been argued for many years that functional programs are well suited to parallel evaluation. This thesis investigates this claim from a programming perspective; that is, it investigates parallel programming using functional languages. The approach taken has been to determine the minimum programming which is necessary in order to write efficient parallel programs. This has been attempted without the aid of clever compile-time analyses. It is argued that parallel evaluation should be explicitly expressed, by the programmer, in programs. To achieve this, a lazy functional language is extended with parallel and sequential combinators. The mathematical nature of functional languages means that programs can be formally derived by program transformation. To date, most work on program derivation has concerned sequential programs. In this thesis Squigol has been used to derive three parallel algorithms. Squigol is a functional calculus for program derivation, which is becoming increasingly popular. It is shown that some aspects of Squigol are suitable for parallel program derivation, while other aspects are specifically orientated towards sequential algorithm derivation. In order to write efficient parallel programs, parallelism must be controlled, so as to limit storage usage, to limit the number of tasks and to ensure tasks are not too small. In particular, over-eager evaluation or generating excessive numbers of tasks can consume too much storage. Also, tasks can be too small to be worth evaluating in parallel. Several programming techniques for parallelism control were tried. These were compared with a run-time system heuristic for parallelism control. It was discovered that the best control was effected by a combination of run-time system and programmer control of parallelism. One of the problems with parallel programming using functional languages is that non-deterministic algorithms cannot be expressed. A bag (multiset) data type is proposed to allow a limited form of non-determinism to be expressed. Bags can be given a non-deterministic parallel implementation. However, provided the operations used to combine bag elements are associative and commutative, the result of bag operations will be deterministic. The onus is on the programmer to prove this, but usually this is not difficult. Also, bags' insensitivity to ordering means that more transformations are directly applicable than if, say, lists were used instead. It is necessary to be able to reason about and measure the performance of parallel programs. For example, sometimes algorithms which seem intuitively to be good parallel ones are not. For some higher-order functions it is possible to devise parameterised formulae describing their performance. This is done for divide-and-conquer functions, which enables constraints to be formulated which guarantee that they have a good performance. Pipelined parallelism is difficult to analyse. Therefore a formal semantics for calculating the performance of pipelined programs is devised. This is used to analyse the performance of a pipelined Quicksort. By treating the performance semantics as a set of transformation rules, the simulation of parallel programs may be achieved by transforming programs. Some parallel programs perform poorly due to programming errors. A pragmatic method of debugging such programming errors is illustrated by some examples.
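
    The granularity-control problem described above can be illustrated with a small sketch (Python with a process pool rather than the thesis's lazy functional language and combinators; the problem, chunk size and threshold are assumptions): work is only split while tasks remain large enough to be worthwhile, and partial results are combined with an associative, commutative operator, mirroring the requirement placed on bag operations, so the combining order cannot affect the answer.

        # Illustrative sketch of programmer-controlled parallelism: work is split
        # into chunks no smaller than THRESHOLD so tasks stay worth running in
        # parallel, and partial results are combined with an associative,
        # commutative operator.  The problem and the threshold are just examples.
        from concurrent.futures import ProcessPoolExecutor

        THRESHOLD = 50_000            # below this size, a parallel task is not worthwhile

        def sum_squares(chunk):
            return sum(x * x for x in chunk)

        def parallel_sum_squares(xs):
            if len(xs) <= THRESHOLD:                      # too small: run sequentially
                return sum_squares(xs)
            size = max(THRESHOLD, len(xs) // 8)           # also caps the number of tasks
            chunks = [xs[i:i + size] for i in range(0, len(xs), size)]
            with ProcessPoolExecutor() as pool:
                partials = pool.map(sum_squares, chunks)  # evaluate chunks in parallel
                return sum(partials)                      # associative, commutative combine

        if __name__ == "__main__":
            data = list(range(1_000_000))
            print(parallel_sum_squares(data))             # 333332833333500000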

    Ira-Paul Schwarz. Impressions of a Cloud and Romantic Mementos: Duets with Piano.


    The Art of Collaboration - an interdisciplinary practice-led research initiative


    A Phenomenology of Collaboration in Contemporary Composition and Performance

    This thesis considers how collaboration between composer and performer affects the practice of these musicians. The established paradigm for the creation of new work in the context of contemporary classical music promotes separation between composers and performers. Typically, the composer is seen as ‘creator’, the performer as ‘interpreter’, and the audience as the ‘recipient’ of the music. This inherent hegemony divides these musicians, creating expressive barriers in the development and the dissemination of new work. In this research, the creative processes of both composition and performance are assessed in the context of collaborative practice, in a continuum where both composers and performers are seen as integrated elements within music making. In order to evaluate collaborative practice between composer and performer, I commissioned five Irish composers to write solo bass clarinet pieces for me to perform. These five individual cases provided an opportunity to examine collaboration in a practical framework. An integral part of each commission was the examination of collaboration through the careful documentation of the creative processes of interactive practice. Over the course of a year I worked collaboratively with the composers concerned in a series of practical sessions where the new works were discussed and tried out. A key part of these meetings was the investigation of various elements relating to collaboration, including notation, improvisation and transmission. A significant amount of data was collected in the course of this examination, including audio recordings and transcripts of meetings. The findings from this research indicate that collaboration between composers and performers can have significant beneficial effects on musicians’ practice. These benefits include increased motivation, creative stimulation, multiple communication modes and notational clarification. These represent some of the practical findings from this investigation of the effect collaboration has on the practice of composers and performers.

    A toolbox for animal call recognition

    Monitoring the natural environment is increasingly important as habitat degradation and climate change reduce the world’s biodiversity. We have developed software tools and applications to assist ecologists with the collection and analysis of acoustic data at large spatial and temporal scales. One of our key objectives is automated animal call recognition, and our approach has three novel attributes. First, we work with raw environmental audio, contaminated by noise and artefacts and containing calls that vary greatly in volume depending on the animal’s proximity to the microphone. Second, initial experimentation suggested that no single recognizer could deal with the enormous variety of calls. Therefore, we developed a toolbox of generic recognizers to extract invariant features for each call type. Third, many species are cryptic and offer little data with which to train a recognizer. Many popular machine learning methods require large volumes of training and validation data and considerable time and expertise to prepare. Consequently, we adopt bootstrap techniques that can be initiated with little data and refined subsequently. In this paper, we describe our recognition tools and present results for real ecological problems.
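
    As a much simplified illustration of one generic recognizer of the kind such a toolbox might contain (not the authors' actual algorithms; the file name, frequency band and threshold below are assumptions), the Python sketch normalises a spectrogram to reduce sensitivity to volume, scans it with a fixed time-frequency template and reports where the match score stands out from the noise:

        # Minimal sketch of a template-based call detector: scan a noisy
        # recording's spectrogram with a small binary time-frequency template
        # and flag frames where the match score crosses a threshold.
        import numpy as np
        from scipy.io import wavfile
        from scipy.signal import spectrogram

        rate, audio = wavfile.read("recording.wav")          # hypothetical input file
        if audio.ndim > 1:
            audio = audio[:, 0]                              # keep a single channel
        freqs, times, spec = spectrogram(audio.astype(float), fs=rate, nperseg=512)
        spec = np.log1p(spec)                                # compress dynamic range
        spec = (spec - spec.mean()) / (spec.std() + 1e-9)    # normalise away overall volume

        # Hypothetical call signature: sustained energy in the 2-4 kHz band (~10 frames).
        band = (freqs >= 2000) & (freqs <= 4000)
        template = np.ones((int(band.sum()), 10))
        template /= np.linalg.norm(template)

        scores = []
        for t in range(spec.shape[1] - template.shape[1]):
            window = spec[band, t:t + template.shape[1]]
            scores.append(float((window * template).sum()))  # crude correlation score

        threshold = np.mean(scores) + 3 * np.std(scores)     # noise-relative detection cut
        hits = [float(times[t]) for t, s in enumerate(scores) if s > threshold]
        print("candidate call times (s):", hits)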