Rethinking live electronic music: a DJ perspective
The author critiques the conventional understanding of live electronic music through empirical research on his own DJ practice and an investigation of others working in the field. In reviewing the opinions of theorists and practitioners in both the live electronic music genre and DJing, he argues against the body/machine dialectic that has determined much of the thinking in the former. The author develops a notion of the DJ as a real-time composer who works beyond traditional binary distinctions and brings the human body and machine into a mutual relationship. Through practice-led research he charts an investigation beginning in physical human gesture and culminating in digital machine repetition. He concludes that mechanical and digital repetition do not obscure human agency in the production of live works and that this concern is imaginary.
Multiple Media Interfaces for Music Therapy
This article describes interfaces (and the supporting technological infrastructure) used to create audiovisual instruments for music therapy. In considering how the multidimensional nature of sound requires multidimensional input control, we propose a model to help designers manage the complex mapping between input devices and multiple media software. We also itemize a research agenda.
Advanced Media Control Through Drawing: Using a graphics tablet to control complex audio and video data in a live context
This paper demonstrates the results of the authors' Wacom tablet MIDI user interface. This application enables users' drawing actions on a graphics tablet to control audio and video parameters in real time. The programming affords five degrees of concurrent control (x, y, pressure, x tilt, y tilt) for use in any audio or video software capable of receiving and processing MIDI data. Drawing gesture can therefore form the basis of dynamic control simultaneously in the auditory and visual realms. This creates a play of connections between parameters in both media, and illustrates a direct correspondence between drawing action and media transformation that is immediately apparent to viewers.
The paper considers the connection between drawing technique and media control both generally and specifically, postulating that dynamic drawing in a live context creates a performance mode not dissimilar to performing on a musical instrument or conducting with a baton. The use of a dynamic and physical real-time media interface re-inserts body actions into live media performance in a compelling manner. Performers can learn to 'draw/play' the graphics tablet as a musical and visual 'instrument', creating a new and uniquely idiomatic form of electronic drawing. The paper also discusses how to practically program the application and presents examples of its use as a media manipulation tool.
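The five concurrent degrees of control described in this abstract lend themselves to a simple mapping layer between tablet samples and MIDI control-change messages. The sketch below is a minimal illustration of that idea, not the authors' actual application: the CC numbers and raw value ranges are assumptions chosen for the example.

```python
# Hypothetical sketch of mapping the five tablet parameters
# (x, y, pressure, x tilt, y tilt) onto MIDI continuous controllers.
# CC numbers and raw ranges are illustrative assumptions.

def to_cc(value, lo, hi):
    """Clamp a raw tablet value to [lo, hi] and scale it to a 7-bit MIDI CC value (0-127)."""
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * 127)

# Illustrative CC assignments: (CC number, raw min, raw max).
CC_MAP = {
    "x":        (20, 0.0, 1.0),
    "y":        (21, 0.0, 1.0),
    "pressure": (22, 0.0, 1023.0),
    "x_tilt":   (23, -60.0, 60.0),   # tilt in degrees
    "y_tilt":   (24, -60.0, 60.0),
}

def tablet_event_to_midi(event):
    """Translate one tablet sample into a list of (cc_number, cc_value) pairs."""
    return [(cc, to_cc(event[name], lo, hi))
            for name, (cc, lo, hi) in CC_MAP.items()]

sample = {"x": 0.5, "y": 0.25, "pressure": 1023.0, "x_tilt": 0.0, "y_tilt": 30.0}
print(tablet_event_to_midi(sample))
```

Each tablet sample thus becomes a small batch of CC messages, which any MIDI-capable audio or video software can route to its own parameters.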
Ambient Gestures
We present Ambient Gestures, a novel gesture-based system designed to support ubiquitous 'in the environment' interactions with everyday computing technology. Hand gestures and audio feedback allow users to control computer applications without reliance on a graphical user interface, and without having to switch from the context of a non-computer task to the context of the computer. The Ambient Gestures system is composed of a vision recognition software application, a set of gestures to be processed by a scripting application and a navigation and selection application that is controlled by the gestures. This system allows us to explore gestures as the primary means of interaction within a multimodal, multimedia environment. In this paper we describe the Ambient Gestures system, define the gestures and the interactions that can be achieved in this environment and present a formative study of the system. We conclude with a discussion of our findings and future applications of Ambient Gestures in ubiquitous computing.
Seeking out the spaces between: Using improvisation for collaborative composition and interactive technology
Copyright © 2010 ISAST. This article presents findings from experiments in piano performance with live electronics undertaken by the author since early 2007. The use of improvisation has infused every step of the process, both as a methodology to obtain meaningful results using interactive technology and as a way to generate and characterize a collaborative musical space with composers. The technology used has included pre-built MIDI interfaces such as the PianoBar, actuators such as miniature DC motors, and sensor interfaces including iCube and the Wii controller. Collaborators have included researchers at the Centre for Digital Music (QMUL), Richard Barrett, Pierre Alexandre Tremblay and Atau Tanaka. In seeking to create responsive 'performance environments' at the piano, I explore live, performative control of electronics to create better connections for both performer (providing the same level of interpretive freedom as with a 'pure' instrumental performance) and audience (communicating clearly to them). I have been lucky to witness first-hand many live interactive performances and to work with various empathetic composers/performers in flexible working environments. Collaborating with experienced technologists and musicians, I have witnessed time and again what, for me, is a fundamental truth in interactive instrumental performance: as a living, spontaneous form it must be nurtured and informed by the performer's physicality and imagination as much as by the creativity or knowledge of the composer and/or technologist. Specifically in the case of sensors, their dependence on the detail of each person's body and reactions is so refined as to necessitate, I would argue, an entirely collaborative approach, and therefore one that involves at least directed improvisation and, more likely, fairly extensive improvised exploration.
The fundamentally personal and intimate nature of sensor readings (the amount of tension created by each performer, the shape of the ancillary gestures, or the level of emotional involvement, especially relevant when using galvanic skin response or EEG) makes creating pieces with sensors extremely difficult for a composer to do in isolation. Improvisation therefore provides a way for performer and composer to generate a common musical and gestural language. Related to these issues is the fact that the technical or notational parameters in interactive music are not yet (and may never be) standardized, thereby creating a very real and practical need for improvisation to figure at least somewhere in the process. This study is funded by the Arts and Humanities Research Council.
PLXTRM : prediction-led eXtended-guitar tool for real-time music applications and live performance
Peer reviewed. This article presents PLXTRM, a system that tracks picking-hand micro-gestures for real-time music applications and live performance. PLXTRM taps into the existing gesture vocabulary of the guitar player. On the first level, PLXTRM provides a continuous controller that doesn't require the musician to learn and integrate extrinsic gestures, avoiding additional cognitive load. Beyond the possible musical applications of this continuous control, the second aim is to harness PLXTRM's predictive power. Using a reservoir network, string onsets are predicted within a certain time frame, based on the spatial trajectory of the guitar pick. In this time frame, manipulations to the audio signal can be introduced before the string actually sounds, 'prefacing' note onsets. Thirdly, PLXTRM facilitates the distinction of playing features such as up-strokes vs. down-strokes, string selections and the continuous velocity of gestures, thereby exploring new expressive possibilities.
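The reservoir-network idea in this abstract can be illustrated with a minimal echo state network sketch: a fixed random recurrent layer is driven by the pick's spatial trajectory, and a readout over the reservoir state flags an upcoming onset. Everything here is an assumption for illustration (sizes, scaling, and especially the untrained readout weights); it does not reproduce PLXTRM's actual model or features.

```python
import numpy as np

# Minimal echo state network sketch: predict an upcoming string onset
# from the pick's spatial trajectory. All sizes, scalings, and the
# readout weights are illustrative assumptions, not PLXTRM's model.

rng = np.random.default_rng(0)
N_IN, N_RES = 3, 50          # 3-D pick position in, 50 reservoir units

W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.uniform(-0.5, 0.5, (N_RES, N_RES))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius below 1 (echo state property)
W_out = rng.uniform(-0.1, 0.1, N_RES)       # stands in for a trained readout

def step(state, pick_xyz):
    """One reservoir update from a new pick-position sample."""
    return np.tanh(W_in @ pick_xyz + W @ state)

def predict_onset(trajectory, threshold=0.5):
    """Run a trajectory through the reservoir; flag an onset when the
    readout crosses the threshold, leaving time to 'preface' the note."""
    state = np.zeros(N_RES)
    for sample in trajectory:
        state = step(state, sample)
        if W_out @ state > threshold:
            return True
    return False
```

In a real system the readout would be trained (typically by ridge regression) on recorded trajectories labeled with string onsets; the fixed random reservoir is what makes that training cheap enough for real-time use.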
Introduction to Gestural Similarity in Music. An Application of Category Theory to the Orchestra
Mathematics, and more generally the computational sciences, intervene in several aspects of music. Mathematics describes the acoustics of sounds, giving formal tools to physics, and the matter of music itself in terms of compositional structures and strategies. Mathematics can also be applied to the entire making of music, from the score to the performance, connecting compositional structures to the acoustical reality of sounds. Moreover, the precise concept of gesture has a decisive role in understanding musical performance. In this paper, we apply some concepts of category theory to compare gestures of orchestral musicians, and to investigate the relationship between orchestra and conductor, as well as between listeners and conductor/orchestra. To this aim, we introduce the concept of gestural similarity. The mathematical tools used can be applied to gesture classification, and to interdisciplinary comparisons between music and the visual arts. Comment: The final version of this paper has been published by the Journal of Mathematics and Music.
Sensing and mapping for interactive performance
This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to facilitate an intuitive and non-intrusive interactive multimedia performance interface that offers the users or performers real-time control of multimedia events using their physical movements. It is intended to be a highly dynamic real-time performance tool, sensing and tracking activities and changes, in order to provide interactive multimedia performances.
From a straightforward definition of the TDM framework, this paper reports several implementations and multi-disciplinary collaborative projects using the proposed framework, including a motion and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses different aspects of mapping strategies in their context.
Plausible future directions, developments and explorations of the proposed framework, including stage augmentation and virtual and augmented reality, which involve sensing and mapping of physical and non-physical changes onto multimedia control events, are discussed.
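The trans-domain mapping described in this abstract, translating a meaningful activity in one domain into a control event in another, can be sketched as a normalize-then-map pipeline. The example below is an assumption-laden illustration (a motion-energy sensor mapped to a note trigger), not the authors' implementation; the ranges, threshold, and event format are all invented for the sketch.

```python
# Illustrative trans-domain mapping sketch (not the TDM framework itself):
# motion intensity from a camera is normalized and mapped onto a musical
# trigger event. Ranges, threshold, and event fields are assumptions.

def normalize(value, lo, hi):
    """Clamp and scale a raw sensor reading to [0, 1]."""
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def map_motion_to_note(motion_energy, threshold=0.3):
    """Trigger a note whose velocity follows motion intensity."""
    level = normalize(motion_energy, lo=0.0, hi=500.0)
    if level < threshold:
        return None                      # below threshold: no event
    return {"event": "note_on", "pitch": 60, "velocity": int(level * 127)}

print(map_motion_to_note(50.0))    # quiet motion: no trigger
print(map_motion_to_note(400.0))   # strong motion: triggers a note
```

Separating normalization from the mapping step mirrors the paper's point about mapping strategies: the same normalized activity stream could just as easily drive a video parameter instead of a note event.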