2 research outputs found

    Rich Contacts: Corpus-Based Convolution of Audio Contact Gestures for Enhanced Musical Expression

    We propose ways of enriching the timbral potential of gestural sonic material captured via piezo or contact microphones, through latency-free convolution of the microphone signal with grains from a sound corpus. This creates a new way to combine the sonic richness of large sound corpora, easily accessible via navigation through a timbral descriptor space, with intuitive gestural interaction on a surface, captured by any contact microphone. We use convolution to excite grains from the corpus via the microphone input, which captures the contact-interaction sounds and thus allows the corpus to be articulated by hitting, scratching, or strumming a surface with various parts of the hands or with objects. We also show why grain changes have to be handled carefully, how one can smoothly interpolate between neighbouring grains, and finally evaluate the system against previous attempts.
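    The convolution excitation and grain crossfading the abstract describes can be sketched offline as follows. This is a minimal illustration, not the paper's implementation: a real-time system would need zero-latency partitioned convolution, and all function names here are hypothetical.

    ```python
    import numpy as np

    def convolve_block(mic_block, grain):
        # The grain acts as an impulse response: transients in the
        # contact-mic input (hits, scratches, strums) excite and
        # articulate the grain's timbre.
        return np.convolve(mic_block, grain)

    def interpolate_grains(mic_block, grain_a, grain_b, mix):
        # Smooth transition between neighbouring grains in the
        # descriptor space: convolve with both and crossfade the
        # outputs, rather than switching impulse responses abruptly
        # (which would click). mix = 0 gives grain_a, 1 gives grain_b.
        out_a = convolve_block(mic_block, grain_a)
        out_b = convolve_block(mic_block, grain_b)
        n = max(len(out_a), len(out_b))
        out_a = np.pad(out_a, (0, n - len(out_a)))
        out_b = np.pad(out_b, (0, n - len(out_b)))
        return (1.0 - mix) * out_a + mix * out_b
    ```

    Crossfading the two convolution outputs, rather than the impulse responses themselves, is one plausible way to realise the smooth interpolation between neighbouring grains the abstract mentions.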

    Studies on customisation-driven digital music instruments

    From John Cage’s Prepared Piano to the turntable, the history of musical instruments is scattered with examples of musicians who deeply customised their instruments to fit personal artistic objectives, objectives that differed from those the instruments were designed for. Their digital counterparts, however, are often presented as closed, finalised systems with a priori symbolic rules set by their designers, leaving very little room for artists to customise the technologies for their unique art practices; in these cases, the only way to change the mode of interaction with a digital instrument is to reprogram it, a possibility available to programmers but not to musicians. This thesis presents two digital music instruments designed with the explicit goal of being highly customisable by musicians and of providing different modes of interaction, whilst retaining simplicity and immediacy of use. The first leverages real-time gesture recognition to provide continuous feedback that guides users in defining the behaviour of the system and the gestures it recognises. The second is a novel tangible user interface that allows everyday objects to be transformed into expressive digital music instruments, and whose generated sound depends strongly on the particular nature of the physical object selected.