    D'Groove: A haptic turntable for digital audio control

    Proceedings of the 9th International Conference on Auditory Display (ICAD), Boston, MA, July 7-9, 2003.
    In this paper, we discuss the design and implementation of D'Groove, an intelligent Disc Jockey (DJ) system that uses a haptic turntable for controlling the playback of digital audio. We begin by describing the tasks of a DJ and defining some of the challenges associated with the traditional DJ process. We then introduce our new system, discussing how it improves auditory navigation for DJs and introduces new performance possibilities. We also discuss the role of haptics in an auditory display.

    Ambiguous Devices: improvisation, agency, touch and feedthrough in distributed music performance

    This article documents the processes behind our distributed musical instrument, Ambiguous Devices. The project is motivated by our mutual desire to explore disruptive forms of networked musical interaction in an attempt to challenge and extend our practices as improvisers and instrument makers. We begin by describing the early design stage of our performance ecosystem, followed by a technical description of how the system functions, with examples from our public performances and installations. We then situate our work within a genealogy of human-machine improvisation, while highlighting specific values that continue to motivate our artistic approach. These practical accounts inform our discussion of tactility, proximity, effort, friction, and other attributes that have shaped our strategies for designing musical interactions. The positive role of ambiguity is elaborated in relation to distributed agency. Finally, we employ the concept of 'feedthrough' as a way of understanding the co-constitutive behaviour of communication networks, assemblages, and performers.

    Instruments for Spatial Sound Control in Real Time Music Performances. A Review

    The systematic arrangement of sound in space is widely considered an important compositional design category of Western art music and acoustic media art in the 20th century. Much attention has been paid to the artistic concepts of sound in space and its reproduction through loudspeaker systems; far less has been paid to live-interactive practices and tools for spatialisation as performance practice. As a contribution to this topic, the present study conducted an inventory of controllers for the real-time spatialisation of sound as part of musical performances, and classified them both along different interface paradigms and according to their scope of spatial control. By means of a literature study, we identified 31 different spatialisation interfaces presented to the public in the context of artistic performances or at relevant conferences on the subject. Considering that only a small proportion of these interfaces combines spatialisation and sound production, it seems that in most cases the projection of sound in space is not delegated to a musical performer but regarded as a compositional problem or as a separate performative dimension. With the exception of the mixing desk and its fader-board paradigm, as used for the performance of acousmatic music with loudspeaker orchestras, all devices are individual design solutions developed for a specific artistic context. We conclude that if controllers for sound spatialisation are to be perceived as musical instruments in a narrow sense, meeting certain aspects of instrumentality, immediacy, liveness, and learnability, new design strategies will be required.

    Using wrist vibrations to guide hand movement and whole body navigation

    In the absence of vision, mobility and orientation are challenging. Audio and tactile feedback can be used to guide visually impaired people. In this paper, we present two complementary studies on the use of vibrational cues for hand guidance during the exploration of itineraries on a map, and for whole-body guidance in a virtual environment. Concretely, we designed wearable Arduino bracelets integrating a vibratory motor that produces multiple patterns of pulses. In the first study, this bracelet was used to guide the hand along unknown routes on an interactive tactile map. A Wizard-of-Oz study with six blindfolded participants showed that tactons (vibrational patterns) may be more efficient than audio cues for indicating directions. In the second study, the bracelet was used by blindfolded participants to navigate in a virtual environment. The results presented here show that it is possible to significantly decrease travel distance with vibrational cues. In sum, these preliminary but complementary studies suggest the potential of vibrational feedback in assistive technology for the mobility and orientation of blind people.
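    The idea of tactons as reusable pulse patterns can be illustrated with a minimal sketch. The pattern names and durations below are illustrative assumptions, not the parameters used in the study; a real bracelet would drive its motor from the computed intervals.

```python
# Hypothetical tacton vocabulary: each direction cue is a list of
# (vibrate_ms, pause_ms) pulses. Durations are assumed, not from the paper.
TACTONS = {
    "left":    [(100, 100)] * 2,          # two short pulses
    "right":   [(300, 100)],              # one long pulse
    "forward": [(100, 100), (300, 100)],  # short pulse, then long pulse
}

def total_duration_ms(pattern):
    """Total time the tacton occupies, vibration plus gaps."""
    return sum(on + off for on, off in pattern)

def to_timeline(pattern):
    """Expand a pattern into (start_ms, stop_ms) vibration intervals,
    the schedule a motor-driver loop would follow."""
    t, intervals = 0, []
    for on, off in pattern:
        intervals.append((t, t + on))
        t += on + off
    return intervals
```

    For example, `to_timeline(TACTONS["forward"])` yields `[(0, 100), (200, 500)]`: a short buzz, a gap, then a long buzz.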

    Haptic Wave

    We present the Haptic Wave, a device that allows cross-modal mapping of digital audio to the haptic domain, intended for use by audio producers/engineers with visual impairments. We describe a series of participatory design activities adapted to non-sighted users, where the act of prototyping facilitates dialogue. A series of workshops scoping user needs and testing a technology mock-up and low-fidelity prototype fed into the design of a final high-spec prototype. The Haptic Wave was tested in the laboratory, then deployed in real-world settings in recording studios and audio production facilities. The cross-modal mapping is kinesthetic and allows the direct manipulation of sound without the translation of an existing visual interface. The research gleans insight into working with users with visual impairments, and transforms our perspective to regard them as experts in non-visual interfaces for all users. This work received the Best Paper Award at CHI 2016, the most prestigious human-computer interaction conference and one of the top-ranked conferences in computer science.

    Computed fingertip touch for the instrumental control of musical sound with an excursion on the computed retinal afterimage

    In this thesis, we present an articulated, empirical view on what human music making is, and on how this fundamentally relates to computation. The experimental evidence we obtained indicates that this view can be used as a tool to systematically generate models, hypotheses, and new technologies that enable an ever more complete answer to the fundamental question of which forms of instrumental control of musical sound are possible to implement. This also entails the development of two novel transducer technologies for computed fingertip touch: the cyclotactor (CT) system, which provides fingerpad-orthogonal force output while tracking surface-orthogonal fingertip movement; and the kinetic surface friction transducer (KSFT) system, which provides fingerpad-parallel force output while tracking surface-parallel fingertip movement. In addition to the main research, the thesis contains two research excursions, which are due to the nature of the Ph.D. position. The first excursion shows how repeated and varying pressing movements on the already held-down key of a computer keyboard can be used both to simplify existing user interactions and to implement new ones that allow the rapid yet detailed navigation of multiple possible interaction outcomes. The second excursion shows that automated computational techniques can display shape specifically in the retinal afterimage, a well-known effect in the human visual system.

    Iris: A Circular Polyrhythmic Music Sequencer

    Iris is a conceptual circular music sequencer, with a working touchscreen interface prototype developed as a proof of concept. It seeks a contemporary and explorative, yet intuitive, way of visualizing and manipulating musical rhythms. The Iris interface consists of several concentric rings representing step patterns. A playhead constantly rotates around the rings, playing turned-on steps as it comes across them. Each ring is assigned a MIDI note, and each active step plays a note-on message. The number of steps on each ring can range from 2 to 64. The user can change the resolution of each ring individually: as a result, during one playhead cycle, the playhead can pass 16 steps on one ring and 12 steps on another, for example. This allows the user to create complex polyrhythms. Each ring can be separately rotated to an arbitrary position with a two-finger swipe gesture. This changes the orientation of a single rhythm layer in relation to the other layers as well as to the playhead. With minor shifts, phenomena like swing or shuffle can also be produced. This thesis describes the Iris concept in detail, including the design decisions on the interface and architecture. It provides a contextual framework for the production: I explore methods of creating rhythms in different music genres. I also look at the history of sequencing and looping music, and benchmark existing products and productions related to the concept. The outcomes of the project are reviewed from my personal perspective as a musician and designer. Initial user feedback received from the prototype is also presented. The source code for the implementations in the iPad and Max 4 Live environments is released as part of this thesis.
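    The ring mechanism described above can be sketched in a few lines. This is a minimal model of the idea, not code from the thesis: each ring's step boundaries are mapped onto a shared playhead cycle, so rings with different resolutions fire against each other, and a rotation offset shifts one layer relative to the rest. The function name, tick grid, and parameters are assumptions for illustration.

```python
def trigger_ticks(resolution, active_steps, rotation=0, cycle_ticks=48):
    """Return the playhead tick positions (within one cycle) at which a
    ring's active steps fire. `rotation` shifts the whole ring relative
    to the playhead, as Iris's two-finger swipe gesture does."""
    step_len = cycle_ticks / resolution  # duration of one step in ticks
    return sorted(round(((s + rotation) % resolution) * step_len)
                  for s in active_steps)

# Two rings over one shared 12-tick cycle: 4 steps against 3 steps
# produces a classic 4:3 polyrhythm.
ring_a = trigger_ticks(4, [0, 1, 2, 3], cycle_ticks=12)  # [0, 3, 6, 9]
ring_b = trigger_ticks(3, [0, 1, 2], cycle_ticks=12)     # [0, 4, 8]
```

    Rotating a ring by one step, e.g. `trigger_ticks(4, [0], rotation=1, cycle_ticks=12)`, moves its hit from tick 0 to tick 3, which is the kind of layer shift the abstract uses to produce swing- or shuffle-like feels.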