Interactive High Performance Computing for Music
The origins of computer music are closely tied to the development of the first high-performance computers at major academic and research institutions. These institutions have continued to build extremely powerful computers, now containing thousands of CPUs with enormous processing power. Their precursors were typically designed to operate in non-real-time, “batch” mode, and that tradition has remained a dominant paradigm in high-performance computing. We describe experimental research on the interactive use of a modern high-performance machine, the Abe supercomputer at the National Center for Supercomputing Applications on the University of Illinois at Urbana-Champaign campus, for real-time musical and artistic purposes. We describe the requirements, development, problems, and observations from this project.
National Science Foundation, TG-DDM09000
Interactive Spaces. Models and Algorithms for Reality-based Music Applications
Reality-based interfaces link the user's physical space with the computer's digital content, bringing intuition, plasticity and expressiveness to the interaction.
Moreover, applications built on motion- and gesture-tracking technologies draw on many psychological factors, such as spatial cognition and implicit knowledge.
These elements form the background of the three music applications presented here, each employing the characteristics of a different interactive space: a user-centered three-dimensional space, a bi-dimensional floor camera space, and a small sensor-centered three-dimensional space.
The basic idea is to exploit each application's spatial properties to convey musical knowledge, allowing users to act inside the designed space and to learn through it in an enactive way.
Design Implications for Technology-Mediated Audience Participation in Live Music
Mobile and sensor-based technologies have created new interaction design possibilities for technology-mediated audience participation in live music performance. However, there is little if any work in the literature that systematically identifies and characterises design issues emerging from this novel class of multi-dimensional interactive performance systems. As an early contribution towards addressing this gap in knowledge, we present the analysis of a detailed survey of technology-mediated audience participation in live music, from the perspective of two key stakeholder groups: musicians and audiences. Results from the survey of over two hundred spectators and musicians are presented, along with descriptive analysis and discussion. These results are used to identify emerging design issues, such as expressiveness, communication and appropriateness. Implications for interaction design are considered. While this study focuses on musicians and audiences, lessons are noted for diverse stakeholders, including composers, performers, interaction designers, media artists and engineers.
Sensing and mapping for interactive performance
This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to facilitate an intuitive and non-intrusive interactive multimedia performance interface that offers the users or performers real-time control of multimedia events using their physical movements. It is intended to be a highly dynamic real-time performance tool, sensing and tracking activities and changes, in order to provide interactive multimedia performances.
From a straightforward definition of the TDM framework, this paper reports several implementations and multi-disciplinary collaborative projects using the proposed framework, including a motion- and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses different aspects of mapping strategies in their context.
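The sensor-based triggering of musical events described above can be illustrated with a minimal sketch. The function name, the C-major scale, and the velocity formula below are illustrative assumptions, not the paper's actual API; the sketch only shows the general shape of a trans-domain mapping from a scalar motion reading onto a musical event.

```python
# Hypothetical sketch of one trans-domain mapping (TDM) step: a normalised
# motion intensity from a tracker is translated onto a MIDI pitch and velocity.
# All names and constants here are illustrative assumptions.

def map_motion_to_note(intensity, scale=(60, 62, 64, 65, 67, 69, 71, 72)):
    """Map a normalised motion intensity (0.0-1.0) onto a (pitch, velocity) pair.

    The physical domain (a scalar activity level) is translated onto the
    musical domain (a note from a C-major scale, played louder as the
    motion grows stronger).
    """
    intensity = max(0.0, min(1.0, intensity))        # clamp out-of-range sensor noise
    index = min(int(intensity * len(scale)), len(scale) - 1)
    velocity = int(40 + intensity * 87)              # velocity rises from 40 to 127
    return scale[index], velocity
```

For example, `map_motion_to_note(0.0)` yields the lowest, quietest note `(60, 40)`, while `map_motion_to_note(1.0)` yields `(72, 127)`; richer mappings would replace the scalar input with multi-dimensional gesture features.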
Plausible future directions and developments for the proposed framework, including stage augmentation and virtual and augmented reality, which involve sensing and mapping physical and non-physical changes onto multimedia control events, are also discussed.