    Screen-based musical instruments as semiotic machines

    The ixi software project started in 2000 with the intention to explore new interactive patterns and virtual interfaces in computer music software. The aim of this paper is not to describe these programs, as they have been described elsewhere, but rather to explicate the theoretical background that underlies the design of these screen-based instruments. After an analysis of the similarities and differences in the design of acoustic and screen-based instruments, the paper describes how the creation of an interface is essentially the creation of a semiotic system that affects and influences the musician and the composer. Finally, the terminology of this semiotics is explained as an interaction model.

    Interfacing the Network: An Embedded Approach to Network Instrument Creation

    This paper discusses the design, construction, and development of a multi-site collaborative instrument, The Loop, developed by the JacksOn4 collective during 2009-10 and formally presented in Oslo at the arts.on.wires and NIME conferences in 2011. The development of this instrument is primarily a reaction to historical network performance that either attempts to present traditional acoustic practice in a distributed format or utilises the network as a conduit to shuttle acoustic and performance data amongst participant nodes. In both scenarios the network is an integral and indispensable part of the performance; however, the network is not perceived as an instrument per se. The Loop is an attempt to create a single, distributed hybrid instrument retaining traditionally acoustic interfaces and resonant bodies that are mediated by the network. The embedding of the network into the body of the instrument raises many practical and theoretical discussions, which are explored in this paper through a reflection upon the notion of the distributed instrument and the way in which its design impacts the behaviour of the participants (performers and audiences); the mediation of musical expression across networks; the bi-directional relationship between instrument and design; as well as how the instrument assists in the realisation of the creators’ compositional and artistic goals.

    Hydraulophone design considerations: absement, displacement, and velocity-sensitive music keyboard in which each key is a water jet

    We present a musical keyboard that is not only velocity-sensitive, but in fact responds to absement (presement), displacement (placement), velocity, acceleration, jerk, jounce, etc. (i.e. to all the derivatives, as well as the integral, of displacement). Moreover, unlike a piano keyboard in which the keys reach a point of maximal displacement, our keys are essentially infinite in length, and thus never reach an end to their key travel. Our infinite length keys are achieved by using water jet streams that continue to flow past the fingers of a person playing the instrument. The instrument takes the form of a pipe with a row of holes, in which water flows out of each hole, while a user is invited to play the instrument by interfering with the flow of water coming out of the holes. The instrument resembles a large flute, but, unlike a flute, there is no complicated fingering pattern. Instead, each hole (each water jet) corresponds to one note (as with a piano or pipe organ). Therefore, unlike a flute, chords can be played by blocking more than one water jet hole at the same time. Because each note corresponds to only one hole, different fingers of the musician can be inserted into, onto, around, or near several of the instrument’s many water jet holes, in a variety of different ways, resulting in an ability to independently control the way in which each note in a chord sounds. Thus the hydraulophone combines the intricate embouchure control of woodwind instruments with the polyphony of keyboard instruments. Various forms of our instrument include totally acoustic, totally electronic, as well as hybrid instruments that are acoustic but also include an interface to a multimedia computer to produce a mixture of sounds that are produced by the acoustic properties of water screeching through orifice plates, as well as synthesized sounds.
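    To make the less familiar quantity concrete: absement is the time integral of displacement, just as velocity and acceleration are its successive derivatives. The sketch below is not from the paper; the sampling rate, signal names, and note mapping are illustrative assumptions. It shows how absement, velocity, and acceleration might be estimated from a sampled displacement signal such as the degree of obstruction of one water jet.

```python
import numpy as np

def displacement_kinematics(x, fs):
    """Given a sampled displacement signal x (e.g. how far a finger intrudes
    into a water jet) at sampling rate fs, return its time integral (absement)
    and its first two derivatives (velocity, acceleration).
    Illustrative sketch only; units and scaling are assumptions."""
    dt = 1.0 / fs
    absement = np.cumsum(x) * dt               # running integral of displacement
    velocity = np.gradient(x, dt)              # first derivative
    acceleration = np.gradient(velocity, dt)   # second derivative
    return absement, velocity, acceleration

# Example: a finger gradually covering a jet over one second at 1 kHz.
fs = 1000
t = np.linspace(0.0, 1.0, fs, endpoint=False)
x = np.clip(t, 0.0, 0.5)                        # displacement ramps up, then holds
absement, velocity, acceleration = displacement_kinematics(x, fs)
# A hydraulophone-like mapping could, for instance, tie note loudness to velocity
# and a slow timbral change to accumulated absement (an assumption, not the paper's mapping).
```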

    Surfing the Waves: Live Audio Mosaicing of an Electric Bass Performance as a Corpus Browsing Interface

    In this paper, the authors describe how they use an electric bass as a subtle, expressive and intuitive interface to browse the rich sample bank available to most laptop owners. This is achieved by audio mosaicing of the live bass performance audio, through corpus-based concatenative synthesis (CBCS) techniques, allowing a mapping of the multi-dimensional expressivity of the performance onto foreign audio material, thus recycling the virtuosity acquired on the electric instrument with a trivial learning curve. This design hypothesis is contextualised and assessed within the Sandbox#n series of bass+laptop meta-instruments, and the authors describe the technical means of the implementation through the use of the open-source CataRT CBCS system adapted for live mosaicing. They also discuss their encouraging early results and provide a list of further explorations to be made with this rich new interface.
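    The CataRT system used by the authors is not reproduced here; the following sketch only illustrates the core loop of corpus-based concatenative synthesis under assumed, much-simplified descriptors (RMS loudness and spectral centroid) and an assumed fixed frame size: each live input frame is replaced by the corpus unit whose descriptors lie nearest to it.

```python
import numpy as np

def describe(frame, fs):
    """Crude per-frame descriptors: RMS loudness and spectral centroid.
    Real CBCS systems such as CataRT use much richer descriptor sets."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / fs)
    rms = np.sqrt(np.mean(frame ** 2))
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return np.array([rms, centroid])

def build_corpus(samples, fs, frame_len=2048):
    """Slice corpus audio into units and store each unit with its descriptors."""
    units, features = [], []
    for start in range(0, len(samples) - frame_len, frame_len):
        unit = samples[start:start + frame_len]
        units.append(unit)
        features.append(describe(unit, fs))
    return units, np.array(features)

def mosaic_frame(live_frame, fs, units, features):
    """Replace a live input frame (e.g. from the bass) with the corpus unit
    whose descriptors are nearest to it -- the essence of audio mosaicing."""
    target = describe(live_frame, fs)
    nearest = np.argmin(np.linalg.norm(features - target, axis=1))
    return units[nearest]
```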

    Music Information Retrieval in Live Coding: A Theoretical Framework

    The work presented in this article was conducted in part while the first author was at Georgia Tech from 2015–2017, with the support of the School of Music, the Center for Music Technology and Women in Music Tech at Georgia Tech, and in part while the first author was at Queen Mary University of London from 2017–2019, with the support of the AudioCommons project, funded by the European Commission through the Horizon 2020 programme, research and innovation grant 688382.

    Music information retrieval (MIR) has great potential in musical live coding because it can help the musician–programmer make musical decisions based on audio content analysis and explore new sonorities by means of MIR techniques. The use of real-time MIR techniques can be computationally demanding, and thus they have rarely been used in live coding; when they have been used, it has been with a focus on low-level feature extraction. This article surveys and discusses the potential of MIR applied to live coding at a higher musical level. We propose a conceptual framework of three categories: (1) audio repurposing, (2) audio rewiring, and (3) audio remixing. We explored the three categories in live performance through an application programming interface library written in SuperCollider, MIRLC. We found that it is still a technical challenge to use high-level features in real time, yet using rhythmic and tonal properties (mid-level features) in combination with text-based information (e.g., tags) helps to achieve a closer perceptual level centered on pitch and rhythm when using MIR in live coding. We discuss challenges and future directions of utilizing MIR approaches in the computer music field.
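    The MIRLC library itself is written in SuperCollider and is not reproduced here. The sketch below only illustrates, with a hypothetical Sound record and a hypothetical repurpose() helper, what the audio-repurposing category means in practice: retrieving material from a pre-analysed collection by combining a text tag with mid-level features such as estimated tempo or key.

```python
from dataclasses import dataclass

@dataclass
class Sound:
    """Hypothetical record for a pre-analysed sound in a local collection."""
    path: str
    tags: set
    bpm: float   # estimated tempo (mid-level rhythmic feature)
    key: str     # estimated key (mid-level tonal feature)

def repurpose(collection, tag, bpm_range=None, key=None):
    """Audio repurposing, roughly: pick sounds whose text tags and
    mid-level descriptors match the live coder's query."""
    hits = [s for s in collection if tag in s.tags]
    if bpm_range is not None:
        lo, hi = bpm_range
        hits = [s for s in hits if lo <= s.bpm <= hi]
    if key is not None:
        hits = [s for s in hits if s.key == key]
    return hits

# Example of the kind of query a live coder might issue mid-performance (illustrative only):
collection = [
    Sound("kick_120.wav", {"drums", "kick"}, 120.0, "C"),
    Sound("pad_cmin.wav", {"pad", "ambient"}, 60.0, "Cm"),
]
print(repurpose(collection, "drums", bpm_range=(110, 130)))
```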

    Acoustic localization of tactile interactions for the development of novel tangible interfaces

    In this paper we propose different acoustic array processing methods for the localization of tactile interactions with planar surfaces. The aim is to create a new class of tangible interfaces for musical performance that can be obtained by simply applying sensors to existing surfaces. The solutions considered in this paper are mainly based on the measurement and analysis of the Time-Delay-Of-Arrival (TDOA) over a set of contact sensors, placed around the area of potential contact, and allow us to rapidly localize tactile interactions with reasonable accuracy.
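    As a minimal sketch of the underlying idea (not the authors' implementation; the sensor geometry, sampling rate, and propagation speed are assumptions), the TDOA between two contact sensors can be estimated from the peak of their cross-correlation and, for two sensors on a line, converted into a tap position along the axis joining them.

```python
import numpy as np

def estimate_tdoa(sig_a, sig_b, fs):
    """Estimate the arrival-time difference t_A - t_B (seconds) from the peak
    of the full cross-correlation; negative means the tap reached sensor A first."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)
    return lag / fs

def tap_position_1d(sig_a, sig_b, fs, sensor_distance, wave_speed):
    """Locate a tap on the line between two contact sensors.
    sensor_distance: metres between sensors; wave_speed: metres/second for
    waves in the surface (an assumed, material-dependent value).
    Returns the distance of the tap from sensor A."""
    tdoa = estimate_tdoa(sig_a, sig_b, fs)
    # t_A - t_B = (2x - L) / wave_speed  =>  x = L/2 + wave_speed * tdoa / 2
    return sensor_distance / 2.0 + wave_speed * tdoa / 2.0
```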

    Designing and Composing for Interdependent Collaborative Performance with Physics-Based Virtual Instruments

    Interdependent collaboration is a system of live musical performance in which performers can directly manipulate each other’s musical outcomes. While most collaborative musical systems implement electronic communication channels between players that allow for parameter mappings, remote transmissions of actions and intentions, or exchanges of musical fragments, they interrupt the energy continuum between gesture and sound, breaking our cognitive representation of gesture-to-sound dynamics. Physics-based virtual instruments allow for acoustically and physically plausible behaviors that are related to (and can be extended beyond) our experience of the physical world. They inherently maintain and respect a representation of the gesture-to-sound energy continuum. This research explores the design and implementation of custom physics-based virtual instruments for real-time interdependent collaborative performance. It leverages the inherently physically plausible behaviors of physics-based models to create dynamic, nuanced, and expressive interconnections between performers. Design considerations, criteria, and frameworks are distilled from the literature in order to develop three new physics-based virtual instruments and associated compositions intended for dissemination and live performance by the electronic music and instrumental music communities. Conceptual, technical, and artistic details and challenges are described, and reflections and evaluations by the composer-designer and performers are documented.
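    As a minimal sketch of the kind of interdependence such instruments afford (not one of the three instruments described in the research; the masses, stiffnesses, and excitation scheme are assumptions), two mass-spring oscillators, each "owned" by one performer, can be joined by a coupling spring so that energy injected by one player physically perturbs the other's sound.

```python
import numpy as np

def coupled_oscillators(force_a, force_b, fs=48000, f0=220.0, coupling=0.3, damping=0.002):
    """Two damped mass-spring oscillators (one per performer) joined by a
    coupling spring, integrated with semi-implicit Euler steps.
    force_a and force_b are per-sample excitation signals; the returned
    displacement signals are usable as audio. Parameter values are assumptions."""
    n = len(force_a)
    k = (2 * np.pi * f0 / fs) ** 2            # per-sample spring constant (unit mass)
    xa = xb = va = vb = 0.0
    out_a, out_b = np.zeros(n), np.zeros(n)
    for i in range(n):
        # Each mass feels its own spring, damping, its performer's force,
        # and the coupling spring stretched between the two masses.
        acc_a = -k * xa - damping * va + force_a[i] + coupling * k * (xb - xa)
        acc_b = -k * xb - damping * vb + force_b[i] + coupling * k * (xa - xb)
        va += acc_a
        vb += acc_b
        xa += va
        xb += vb
        out_a[i], out_b[i] = xa, xb
    return out_a, out_b

# Example: performer A "plucks" at t=0; performer B stays silent but is still
# heard ringing because energy leaks across the coupling spring.
fs = 48000
force_a = np.zeros(fs); force_a[0] = 1.0
force_b = np.zeros(fs)
audio_a, audio_b = coupled_oscillators(force_a, force_b, fs)
```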