
    Interactive High Performance Computing for Music

    The origins of computer music are closely tied to the development of the first high-performance computers at major academic and research institutions. These institutions have continued to build extremely powerful machines, now containing thousands of CPUs. Their precursors were typically designed to operate in non-real-time "batch" mode, and that tradition has remained the dominant paradigm in high-performance computing. We describe experimental research on the interactive use of a modern high-performance machine, the Abe supercomputer at the National Center for Supercomputing Applications on the University of Illinois at Urbana-Champaign campus, for real-time musical and artistic purposes. We describe the requirements, development, problems, and observations from this project. (National Science Foundation, TG-DDM09000)

    Interactive Spaces. Models and Algorithms for Reality-based Music Applications

    Reality-based interfaces link the user's physical space with the computer's digital content, bringing in intuition, plasticity, and expressiveness. Moreover, applications built on motion- and gesture-tracking technologies involve many psychological factors, such as spatial cognition and implicit knowledge. These elements form the background of the three music applications presented, which employ the characteristics of three different interactive spaces: a user-centred three-dimensional space, a two-dimensional floor camera space, and a small sensor-centred three-dimensional space. The basic idea is to exploit each application's spatial properties to convey musical knowledge, allowing users to act inside the designed space and to learn through it in an enactive way.
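As a minimal illustration of the idea, a user-centred three-dimensional space could map a tracked position directly onto musical parameters. The sketch below is hypothetical (the coordinate conventions and MIDI ranges are illustrative assumptions, not taken from the paper): it maps a normalized (x, y, z) position to a MIDI-style pitch, velocity, and pan value.

```python
def position_to_music(x, y, z):
    """Map a tracked position in a user-centred space (coordinates
    normalized to 0.0-1.0) to musical parameters.
    All ranges are illustrative assumptions, not from the paper."""
    # Horizontal position selects a pitch within a two-octave range.
    pitch = 48 + round(x * 24)      # MIDI note 48-72
    # Height controls loudness.
    velocity = round(z * 127)       # MIDI velocity 0-127
    # Depth controls stereo placement.
    pan = round(y * 127)            # 0 = hard left, 127 = hard right

    return pitch, velocity, pan

# A user standing centred, hand at mid-height:
print(position_to_music(0.5, 0.5, 0.5))  # (60, 64, 64)
```

A real application would feed such a function from a tracker at interactive rates and send the result to a synthesizer, but the mapping itself stays this simple.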

    Sensing and mapping for interactive performance

    This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to facilitate an intuitive, non-intrusive interactive multimedia performance interface that offers users or performers real-time control of multimedia events through their physical movements. It is intended as a highly dynamic real-time performance tool that senses and tracks activities and changes in order to drive interactive multimedia performances. Starting from a straightforward definition of the TDM framework, the paper reports several implementations and multi-disciplinary collaborative projects built on it, including a motion- and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses mapping strategies in each context. Plausible future directions for the framework, including stage augmentation and virtual and augmented reality, which involve sensing and mapping physical and non-physical changes onto multimedia control events, are also discussed.
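The core of such a mapping pipeline can be sketched in a few lines. The code below is an illustrative reduction, not the paper's TDM implementation: the class name, sensor range, and trigger rule are all assumptions. It normalizes a raw sensor reading into an activity level and emits a note event when that level crosses a threshold, the same shape of mapping a sensor-based musical trigger would use.

```python
class TransDomainMapper:
    """Illustrative trans-domain mapping: raw sensor activity -> note events.
    Names, ranges, and the trigger rule are hypothetical sketches,
    not the paper's TDM implementation."""

    def __init__(self, sensor_min, sensor_max, threshold=0.6):
        self.sensor_min = sensor_min
        self.sensor_max = sensor_max
        self.threshold = threshold

    def normalize(self, raw):
        """Clamp and scale a raw reading into the 0.0-1.0 activity range."""
        span = self.sensor_max - self.sensor_min
        value = (raw - self.sensor_min) / span
        return min(1.0, max(0.0, value))

    def map_event(self, raw):
        """Return a note event when activity exceeds the threshold, else None."""
        activity = self.normalize(raw)
        if activity < self.threshold:
            return None
        # Stronger movement maps to a louder note; pitch fixed for simplicity.
        return {"note": 60, "velocity": round(activity * 127)}


mapper = TransDomainMapper(sensor_min=0, sensor_max=1023)
print(mapper.map_event(900))   # {'note': 60, 'velocity': 112}
print(mapper.map_event(100))   # None
```

Swapping the event dictionary for an OSC or MIDI message, and the threshold rule for a gesture classifier, recovers the richer mappings the paper describes without changing this basic structure.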