
    Rethinking live electronic music: a DJ perspective

    The author critiques the conventional understanding of live electronic music through empirical research on his own DJ practice and an investigation of others working in the field. In reviewing the opinions of theorists and practitioners in both the live electronic music genre and DJing, he argues against the body/machine dialectic that has determined much of the thinking in the former. The author forms a notion of the DJ as a real-time composer who works beyond traditional binary distinctions and brings the human body and machine into a mutual relationship. Through practice-led research he charts an investigation beginning in physical human gesture and culminating in digital machine repetition. He concludes that mechanical and digital repetition do not obscure human agency in the production of live works, and that this concern is imaginary.

    Multiple Media Interfaces for Music Therapy

    This article describes interfaces (and the supporting technological infrastructure) to create audiovisual instruments for use in music therapy. In considering how the multidimensional nature of sound requires multidimensional input control, we propose a model to help designers manage the complex mapping between input devices and multiple media software. We also itemize a research agenda.
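The many-to-many mapping the abstract describes can be sketched as a thin routing layer between normalized input dimensions and media parameters. This is an illustrative sketch only; the class and method names (`Mapper`, `add_route`) and the parameter ranges are invented here, not taken from the article.

```python
# Hypothetical sketch of a many-to-many mapping layer between input
# dimensions and media parameters. One input dimension may drive
# several parameters (one-to-many), which is the kind of complexity
# the proposed design model is meant to help manage.

class Mapper:
    """Routes normalized input dimensions (0.0-1.0) to media parameters."""

    def __init__(self):
        self.routes = []  # list of (input_dim, target_param, scale_fn)

    def add_route(self, input_dim, target_param, scale_fn=lambda v: v):
        self.routes.append((input_dim, target_param, scale_fn))

    def process(self, frame):
        """frame: dict of input_dim -> normalized value.
        Returns a dict of media-parameter updates."""
        out = {}
        for dim, param, fn in self.routes:
            if dim in frame:
                out[param] = fn(frame[dim])
        return out

m = Mapper()
m.add_route("pressure", "filter_cutoff", lambda v: 200 + v * 7800)  # Hz
m.add_route("pressure", "brightness", lambda v: v)   # one-to-many example
m.add_route("x", "pan", lambda v: v * 2 - 1)         # -1 .. 1
updates = m.process({"x": 0.5, "pressure": 0.25})
```

A designer would populate the route table from the model's recommendations rather than hard-coding it as here.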

    Advanced Media Control Through Drawing: Using a graphics tablet to control complex audio and video data in a live context

    This paper demonstrates the results of the authors’ Wacom tablet MIDI user interface. This application enables users’ drawing actions on a graphics tablet to control audio and video parameters in real-time. The programming affords five degrees (x, y, pressure, x tilt, y tilt) of concurrent control for use in any audio or video software capable of receiving and processing MIDI data. Drawing gesture can therefore form the basis of dynamic control simultaneously in the auditory and visual realms. This creates a play of connections between parameters in both media, and illustrates a direct correspondence between drawing action and media transformation that is immediately apparent to viewers. The paper considers the connection between drawing technique and media control both generally and specifically, postulating that dynamic drawing in a live context creates a performance mode not dissimilar to performing on a musical instrument or conducting with a baton. The use of a dynamic and physical real-time media interface re-inserts body actions into live media performance in a compelling manner. Performers can learn to “draw/play” the graphics tablet as a musical and visual “instrument”, creating a new and uniquely idiomatic form of electronic drawing. The paper also discusses how to practically program the application and presents examples of its use as a media manipulation tool.
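The five-degree tablet-to-MIDI conversion the abstract describes can be sketched as follows. The specific CC numbers (20–24), the assumed tilt range in degrees, and the pre-normalized x/y/pressure inputs are assumptions for illustration, not the authors' actual assignments.

```python
# Illustrative sketch: converting five tablet degrees of freedom
# (x, y, pressure, x tilt, y tilt) into MIDI Control Change values.

def to_cc(value, lo, hi):
    """Clamp value into [lo, hi] and scale to the 7-bit MIDI range 0-127."""
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * 127)

# Assumed CC assignments for the five concurrent control dimensions.
CC_MAP = {"x": 20, "y": 21, "pressure": 22, "x_tilt": 23, "y_tilt": 24}

def tablet_to_midi(x, y, pressure, x_tilt, y_tilt):
    """Return (cc_number, cc_value) pairs for one tablet sample.

    Tilt is assumed to arrive in degrees (-60..60); x, y and pressure
    are assumed pre-normalized to 0..1 by the tablet driver."""
    sample = {
        "x": (x, 0.0, 1.0),
        "y": (y, 0.0, 1.0),
        "pressure": (pressure, 0.0, 1.0),
        "x_tilt": (x_tilt, -60.0, 60.0),
        "y_tilt": (y_tilt, -60.0, 60.0),
    }
    return [(CC_MAP[k], to_cc(v, lo, hi)) for k, (v, lo, hi) in sample.items()]
```

Each sample yields five simultaneous CC messages, which any MIDI-capable audio or video software can then map onto its own parameters.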

    Ambient Gestures

    We present Ambient Gestures, a novel gesture-based system designed to support ubiquitous ‘in the environment’ interactions with everyday computing technology. Hand gestures and audio feedback allow users to control computer applications without reliance on a graphical user interface, and without having to switch from the context of a non-computer task to the context of the computer. The Ambient Gestures system is composed of a vision recognition software application, a set of gestures to be processed by a scripting application and a navigation and selection application that is controlled by the gestures. This system allows us to explore gestures as the primary means of interaction within a multimodal, multimedia environment. In this paper we describe the Ambient Gestures system, define the gestures and the interactions that can be achieved in this environment and present a formative study of the system. We conclude with a discussion of our findings and future applications of Ambient Gestures in ubiquitous computing.
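The middle stage of the three-part architecture (vision recognition → scripting → navigation/selection) can be sketched as a simple dispatch table. The gesture and action names below are invented for illustration; the paper defines its own gesture set.

```python
# Illustrative sketch of the scripting layer that sits between the
# vision recognizer and the navigation/selection application.
# Gesture and action names are hypothetical, not the paper's.

GESTURE_ACTIONS = {
    "swipe_left": "previous_item",
    "swipe_right": "next_item",
    "hold": "select_item",
}

def dispatch(gesture, actions=GESTURE_ACTIONS):
    """Map a recognized gesture to an application action.

    Returns None for unrecognized input, so stray movements during a
    non-computer task do not trigger spurious application behaviour."""
    return actions.get(gesture)
```

In the full system, the returned action would drive the navigation application and be confirmed to the user by audio feedback rather than a GUI.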

    PLXTRM : prediction-led eXtended-guitar tool for real-time music applications and live performance

    This article presents PLXTRM, a system tracking picking-hand micro-gestures for real-time music applications and live performance. PLXTRM taps into the existing gesture vocabulary of the guitar player. On the first level, PLXTRM provides a continuous controller that doesn’t require the musician to learn and integrate extrinsic gestures, avoiding additional cognitive load. Beyond the possible musical applications using this continuous control, the second aim is to harness PLXTRM’s predictive power. Using a reservoir network, string onsets are predicted within a certain time frame, based on the spatial trajectory of the guitar pick. In this time frame, manipulations to the audio signal can be introduced, prior to the string actually sounding, “prefacing” note onsets. Thirdly, PLXTRM facilitates the distinction of playing features such as up-strokes vs. down-strokes, string selections and the continuous velocity of gestures, and thereby explores new expressive possibilities.
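The reservoir-based onset prediction can be sketched as a minimal echo state network driven by the pick's spatial trajectory. All sizes, the leak rate, the threshold, and the random readout below are illustrative assumptions; the actual system would train its readout weights on recorded picking gestures.

```python
# Minimal echo-state-network sketch for predicting string onsets from
# pick-trajectory samples, loosely in the spirit of the reservoir
# approach described above. Not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_RES = 3, 50            # pick x, y, z position -> reservoir units
W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.uniform(-0.5, 0.5, (N_RES, N_RES))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius < 1 (echo state)
W_out = rng.uniform(-0.5, 0.5, N_RES)       # would be learned by regression

def run(trajectory, leak=0.3, threshold=0.5):
    """Return per-sample onset predictions for a (T, 3) pick trajectory."""
    state = np.zeros(N_RES)
    preds = []
    for u in trajectory:
        # leaky-integrator reservoir update
        state = (1 - leak) * state + leak * np.tanh(W_in @ u + W @ state)
        preds.append(float(W_out @ state) > threshold)
    return preds
```

A prediction that fires within the look-ahead window gives the audio engine time to apply its manipulation before the string actually sounds.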

    Introduction to Gestural Similarity in Music. An Application of Category Theory to the Orchestra

    Mathematics, and more generally the computational sciences, intervene in several aspects of music. Mathematics describes the acoustics of sounds, giving formal tools to physics, and the matter of music itself in terms of compositional structures and strategies. Mathematics can also be applied to the entire making of music, from the score to the performance, connecting compositional structures to the acoustical reality of sounds. Moreover, the precise concept of gesture has a decisive role in understanding musical performance. In this paper, we apply some concepts of category theory to compare gestures of orchestral musicians, and to investigate the relationship between orchestra and conductor, as well as between listeners and conductor/orchestra. To this aim, we will introduce the concept of gestural similarity. The mathematical tools used can be applied to gesture classification, and to interdisciplinary comparisons between music and visual arts. (The final version of this paper has been published in the Journal of Mathematics and Music.)
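The underlying formalization can be sketched, in broad strokes, from the gesture-theoretic tradition this work builds on; the paper's precise definitions may differ, so the following is only an orienting sketch.

```latex
% In mathematical gesture theory, a gesture is commonly formalized as a
% digraph morphism into the "spatial digraph" of a topological space X:
\[
  g \colon \Delta \longrightarrow \vec{X},
\]
% where \Delta is a digraph of abstract moves, and \vec{X} has the points
% of X as vertices and continuous curves in X as arrows. Two gestures
% g_1 \colon \Delta_1 \to \vec{X} and g_2 \colon \Delta_2 \to \vec{Y}
% can then be compared via morphisms between their digraphs and spaces,
% which is the categorical setting in which a notion of gestural
% similarity can be stated.
```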

    Sensing and mapping for interactive performance

    This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to facilitate an intuitive and non-intrusive interactive multimedia performance interface that offers the users or performers real-time control of multimedia events using their physical movements. It is intended to be a highly dynamic real-time performance tool, sensing and tracking activities and changes, in order to provide interactive multimedia performances. From a straightforward definition of the TDM framework, this paper reports several implementations and multi-disciplinary collaborative projects using the proposed framework, including a motion and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses different aspects of mapping strategies in their context. Plausible future directions, developments and exploration with the proposed framework, including stage augmentation, virtual and augmented reality, which involve sensing and mapping of physical and non-physical changes onto multimedia control events, are discussed.
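One of the implementations mentioned, a sensor-based system for triggering musical events, can be sketched as a stateful threshold mapper from sensed movement to a musical event. The threshold, note number, and event format are illustrative assumptions, not details from the paper.

```python
# Sketch of a trans-domain mapping: a sensed physical change rising
# through a threshold is translated into a musical event. Values and
# event names are hypothetical.

def make_trigger(threshold, note):
    """Return a stateful mapper that fires a note-on the moment the
    sensed value rises through the threshold, and nothing otherwise
    (edge-triggered, so a held pose does not re-fire)."""
    state = {"above": False}

    def step(value):
        fired = None
        if value >= threshold and not state["above"]:
            fired = ("note_on", note)
        state["above"] = value >= threshold
        return fired

    return step

trigger = make_trigger(0.7, 60)  # e.g. fire middle C when motion exceeds 0.7
events = [trigger(v) for v in [0.1, 0.8, 0.9, 0.3, 0.75]]
```

Edge-triggering is the essential design choice here: it converts a continuous sensed signal in one domain into discrete events in the musical domain.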
