420 research outputs found
Seeking out the spaces between: Using improvisation for collaborative composition and interactive technology
Copyright © 2010 ISAST. This article presents findings from experiments in piano performance with live electronics undertaken by the author since early 2007. The use of improvisation has infused every step of the process---both as a methodology for obtaining meaningful results with interactive technology and as a way to generate and characterize a collaborative musical space with composers. The technology used has included pre-built MIDI interfaces such as the PianoBar, actuators such as miniature DC motors, and sensor interfaces including the iCube and the Wii controller. Collaborators have included researchers at the Centre for Digital Music (QMUL), Richard Barrett, Pierre Alexandre Tremblay and Atau Tanaka. In seeking to create responsive “performance environments” at the piano, I explore live, performative control of electronics to create better connections for both the performer (providing the same level of interpretive freedom as in a “pure” instrumental performance) and the audience (communicating clearly to them). I have been lucky to witness first-hand many live interactive performances and to work with various empathetic composers/performers in flexible working environments. Collaborating with experienced technologists and musicians, I have witnessed time and again what, for me, is a fundamental truth in interactive instrumental performance: as a living, spontaneous form it must be nurtured and informed by the performer’s physicality and imagination as much as by the creativity or knowledge of the composer and/or technologist. Specifically in the case of sensors, their dependence on the detail of each person’s body and reactions is so refined as to necessitate, I would argue, an entirely collaborative approach, and therefore one that involves at least directed improvisation and, more likely, fairly extensive improvised exploration.
The fundamentally personal and intimate nature of sensor readings---the amount of tension created by each performer, the shape of the ancillary gestures or the level of emotional involvement (especially relevant when using galvanic skin response or EEG)---makes creating pieces with sensors extremely difficult for a composer to do in isolation. Improvisation therefore provides a way for performer and composer to generate a common musical and gestural language. Related to these issues is the fact that the technical and notational parameters of interactive music are not yet (and may never be) standardized, creating a very real and practical need for improvisation to figure at least somewhere in the process. This study is funded by the Arts and Humanities Research Council.
AQUA-G: a universal gesture recognition framework
In this thesis, I describe a software architecture and implementation designed to ease the process of 1) developing gesture-enabled applications and 2) using multiple disparate interaction devices simultaneously to create gestures. Developing gesture-enabled applications from scratch can be a time-consuming process involving obtaining input from novel input devices, processing that input in order to recognize gestures, and connecting this information to the application. Previously, developers have turned to gesture recognition systems to assist them in developing these applications; however, existing systems are limited in flexibility and adaptability. I propose AQUA-G, a universal gesture recognition framework that utilizes a unified event architecture to communicate with a limitless variety of input devices. AQUA-G provides an abstraction of gesture recognition and allows developers to write custom gestures. Its features have been driven in part by previous architectures and are partially based on a needs assessment with a sample of developers. This research contributes a scalable and reliable software system for gesture-enabled application development, which makes developing and prototyping novel interaction styles more accessible to a larger development community.
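As a rough illustration of what such a unified event architecture can look like (the class and method names below are illustrative assumptions, not AQUA-G's actual API), device adapters can emit one normalized event type that recognizer plug-ins consume, so a gesture written once works across devices:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

@dataclass
class InputEvent:
    device: str      # e.g. "wiimote", "kinect", "touchscreen"
    kind: str        # e.g. "button", "position", "acceleration"
    values: tuple    # device-independent payload

class GestureBus:
    """Routes normalized events through registered recognizer plug-ins
    and fires application callbacks when a gesture is reported."""

    def __init__(self) -> None:
        self._recognizers: List[Callable[[InputEvent], Optional[str]]] = []
        self._handlers: Dict[str, Callable[[], None]] = {}

    def add_recognizer(self, fn: Callable[[InputEvent], Optional[str]]) -> None:
        self._recognizers.append(fn)

    def on_gesture(self, name: str, handler: Callable[[], None]) -> None:
        self._handlers[name] = handler

    def feed(self, event: InputEvent) -> None:
        # every recognizer sees every normalized event, regardless of device
        for recognize in self._recognizers:
            gesture = recognize(event)
            if gesture is not None and gesture in self._handlers:
                self._handlers[gesture]()
```

Because recognizers see only `InputEvent`, the application never deals with device-specific input formats directly, which is the kind of flexibility the thesis argues earlier systems lacked.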
Conducting a virtual ensemble with a Kinect device
This paper presents a gesture-based interaction technique for the implementation of an orchestra conductor and a virtual ensemble, using a 3D camera-based sensor to capture the user’s gestures. In particular, a human-computer interface has been developed to recognize conducting gestures using a Microsoft Kinect device. The system allows the conductor to control both the tempo of the piece played and the dynamics of each instrument set independently. In order to modify the tempo of the playback, a time-frequency processing-based algorithm is used. Finally, an experiment was conducted to assess users’ opinion of the system and to confirm experimentally whether its features effectively improved the user experience. This work has been funded by the Ministerio de Economia y Competitividad of the Spanish Government under Project No. TIN2010-21089-C03-02 and Project No. IPT-2011-0885-430000, and by the Junta de Andalucia under Project No. P11-TIC-7154. The work has been done at Universidad de Malaga, Campus de Excelencia Internacional Andalucia Tech.
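The time-frequency tempo modification mentioned above can be sketched as a frame-level phase vocoder: analysis frames are read at a fractional rate while per-bin phase is accumulated so sinusoids stay coherent, changing tempo without shifting pitch. This is a generic textbook sketch, not the authors' actual algorithm, and all function names are assumptions:

```python
import numpy as np

def stft(x, win, hop):
    """Complex analysis frames of signal x (real FFT per windowed frame)."""
    n = len(win)
    return np.array([np.fft.rfft(win * x[i:i + n])
                     for i in range(0, len(x) - n, hop)])

def stretch_frames(spec, rate, hop):
    """Resample STFT frames at fractional positions (rate < 1 slows the
    tempo), accumulating each bin's phase by its measured increment."""
    n_frames, n_bins = spec.shape
    fft_size = 2 * (n_bins - 1)
    # nominal phase advance per hop for each frequency bin
    expected = 2 * np.pi * hop * np.arange(n_bins) / fft_size
    positions = np.arange(0, n_frames - 1, rate)
    phase = np.angle(spec[0])
    out = np.empty((len(positions), n_bins), dtype=complex)
    for k, t in enumerate(positions):
        lo = int(t)
        frac = t - lo
        # linearly interpolate magnitudes between the two nearest frames
        mag = (1 - frac) * np.abs(spec[lo]) + frac * np.abs(spec[lo + 1])
        out[k] = mag * np.exp(1j * phase)
        # deviation of the measured phase increment from the nominal one
        dphi = np.angle(spec[lo + 1]) - np.angle(spec[lo]) - expected
        dphi -= 2 * np.pi * np.round(dphi / (2 * np.pi))
        phase += expected + dphi
    return out
```

An inverse STFT with overlap-add at the same hop turns the stretched frames back into audio; feeding `rate=0.5` roughly halves the playback tempo.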
Separating gesture detection and application control concerns with a multimodal architecture
Paper presented at the international conference CIT 2015, held 26-28 October 2015. Gesture-controlled applications are typically tied to specific gestures, and also to specific recognition methods and specific gesture-detection devices. We propose a concern-separation architecture that mediates the following concerns: gesture acquisition, gesture recognition, and gestural control. It enables application developers to respond to gesture-independent commands, recognized using plug-in gesture-recognition modules that process gesture data via both device-dependent and device-independent data formats and callbacks. Its feasibility is demonstrated with a sample implementation.
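The three mediated concerns can be sketched as decoupled stages: acquisition adapters translate device-specific samples into a device-independent format, a recognition module maps those samples to a named command, and the application binds only command handlers. All names, formats and thresholds here are illustrative assumptions, not the paper's implementation:

```python
# Acquisition concern: device-dependent adapters normalize to unit coordinates.
def kinect_adapter(raw):
    x, y, _z = raw                 # metres from the sensor, depth ignored here
    return {"x": x, "y": y}

def mouse_adapter(raw):
    px, py = raw                   # pixels on an assumed 1920x1080 screen
    return {"x": px / 1920, "y": py / 1080}

# Recognition concern: sees only the device-independent format.
def swipe_right_recognizer(samples):
    """Report "next_page" when x grows monotonically by more than 0.5."""
    xs = [s["x"] for s in samples]
    if xs == sorted(xs) and xs[-1] - xs[0] > 0.5:
        return "next_page"
    return None

# Control concern: the application responds to commands, not gestures.
class Dispatcher:
    def __init__(self):
        self.handlers = {}

    def bind(self, command, fn):
        self.handlers[command] = fn

    def dispatch(self, command):
        if command in self.handlers:
            self.handlers[command]()
```

Swapping the Kinect for a mouse changes only the adapter; the recognizer and the application's command bindings are untouched, which is the point of separating the concerns.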
Wii Remote-based Collaborative Interfaces for Music
The Wii Remote is the main controller for Nintendo's Wii console. Thanks to its accelerometer and optical sensor technology, it offers motion-sensing capability, enabling gesture recognition and intuitive pointing. Such a controller is user-friendly, inexpensive and easily available, owing to the growing popularity of the Wii console. These features make the Wii Remote a good device for creating and manipulating both music and audio in a home entertainment environment. In this paper, the most interesting characteristics of the controller will be reviewed. A case study will be presented, namely the creation of a virtual music instrument controlled collaboratively through a set of Wii Remotes.
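A minimal sketch of the kind of accelerometer-to-music mapping such an instrument relies on (the mapping itself is my assumption, not the paper's): with the controller held roughly still, the 3-axis reading is dominated by gravity, so pitch and roll angles can be derived and quantized onto MIDI notes.

```python
import math

def tilt_angles(ax, ay, az):
    """Pitch/roll in radians from a static 3-axis accelerometer sample
    (in g units), assuming gravity is the only force acting."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def roll_to_midi_note(roll, base=60, span=12):
    """Map roll in [-pi/2, pi/2] onto MIDI notes [base, base + span]."""
    clamped = max(-math.pi / 2, min(math.pi / 2, roll))
    return base + round((clamped + math.pi / 2) / math.pi * span)
```

A level controller (reading `(0, 0, 1)`) lands mid-range; tilting to one side sweeps up the octave. In a collaborative setting each Wii Remote would feed its own instance of this mapping, one per player.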
User-Defined Gestural Interaction: a Study on Gesture Memorization
In this paper we study the memorization of user-created gestures for 3DUI. Wide-public applications mostly use standardized gestures for interactions with simple contents. This work is motivated by two application cases for which a standardized approach is not possible and thus user-specific or dedicated interfaces are needed. The first is applications for people with limited sensory-motor abilities, for whom generic interaction methods may not be adapted. The second is creative arts applications, for which gestural freedom is part of the creative process. In this work, users are asked to create gestures for a set of tasks, in a specific phase, prior to using the system. We propose a user study to explore the question of gesture memorization. Gestures are recorded and recognized with a Hidden Markov Model. Results show that it seems difficult to recall more than two abstract gestures. Affordances strongly improve memorization, whereas the use of co-localization has no significant effect.
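The HMM recognition step can be sketched as follows: each gesture gets its own discrete HMM, and an observation sequence is labelled with the model that assigns it the highest log-likelihood via the scaled forward algorithm. The toy models in the test are illustrative, not the study's trained ones:

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm for a discrete HMM.
    obs: symbol indices; pi: initial state distribution;
    A: state transition matrix; B: per-state emission probabilities."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()            # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

def classify(obs, models):
    """Label obs with the gesture model of maximum log-likelihood."""
    return max(models, key=lambda name: log_likelihood(obs, *models[name]))
```

In practice the observation symbols would come from quantizing the recorded motion (e.g. direction codes per frame) before scoring.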
Virtual Kompang: mapping in-air hand gestures for music interaction using a gestural musical controller
The introduction of new gesture interfaces has been expanding the possibilities for creating new Digital Musical Instruments (DMI). However, the interfaces created so far have mainly focused on modern Western musical instruments such as the piano, drums and guitar. This paper presents a virtual musical instrument, the Virtual Kompang, based on a traditional Malay percussion instrument. The interface design and its implementation are presented in this paper. The results of a guessability study, used to elicit end-user hand movements to map onto commands, are also presented. The study demonstrated the existence of common hand gestures among users when mapping to the selected commands. A consensus set of gestures is presented as the outcome of this study.
Keywords: Digital Musical Instrument, Virtual Environment, Gestural Control, Leap Motion, Virtual Instrument
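One plausible core of such an in-air percussion instrument (a sketch under my own assumptions, not the paper's implementation) is an onset detector over the tracked palm height: a "hit" triggers when the hand moves downward faster than a threshold and then rebounds, mimicking a strike on the kompang's skin.

```python
def detect_hits(heights, dt=1 / 60, threshold=-1.0):
    """Detect strike onsets from per-frame palm heights (metres).
    dt: frame period (60 fps assumed); threshold: downward velocity
    in m/s that arms a strike. Returns the frame indices of hits."""
    hits = []
    striking = False
    for i in range(1, len(heights)):
        v = (heights[i] - heights[i - 1]) / dt
        if v < threshold:
            striking = True            # fast downward motion arms a strike
        elif striking and v >= 0:
            hits.append(i)             # rebound: register the strike
            striking = False
    return hits
```

With a Leap Motion-style tracker supplying palm positions each frame, every detected index would trigger a drum sample, and the horizontal hit position could select between the kompang's two characteristic tones.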
Virtual Conductor for String Quartet Practice
This paper presents a system that emulates an ensemble conductor for string quartets. The application has been developed as a support tool for individual and group practice, so that users of any age range, regular musicians and students alike, can use it to further hone their skills. The virtual conductor can offer indications similar to those given by a real ensemble conductor regarding beat times, dynamics, etc. The application allows users to rehearse their performance without needing an actual conductor present, and also gives access to additional tools that further support the learning/practice process, such as a tuner or a melody evaluator. The system supports both solo and group practice. A set of tests was conducted to check the usefulness of the application as a practice support tool. A group of musicians from the Chamber Orchestra of Malaga, including an ensemble conductor, tested the system and reported finding it a very useful tool within an educational environment, one that helps address the lack of this kind of educational tool in a self-learning setting. This work has been funded by the Ministerio de Economia y Competitividad of the Spanish Government under Project No. TIN2010-21089-C03-02 and Project No. IPT-2011-0885-430000 and by the Ministerio de Industria, Turismo y Comercio under Project No. TSI-090100-2011-25.
Music conducting pedagogy and technology: a document analysis on best practices
This document analysis was designed to investigate pedagogical practices of music conducting teachers in conjunction with research of technologists on the use of various technologies as teaching tools. I sought to discern how conducting teachers and pedagogues are applying recent technological advancements into their teaching strategies. I also sought to understand what paths research is taking about the use of software, hardware, and computer systems applied to the teaching of music conducting technique. This dissertation was guided by four main research questions: (1) How has technology been used to aid in the teaching of conducting? (2) What is the role of technology in the context of conducting pedagogy? (3) Given that conducting is a performative act, how can it be developed through technological means? (4) What technological possibilities exist in the teaching of music conducting technique? Data were collected through music conducting syllabi, conducting textbooks, and research articles. Documents were selected through purposive sampling procedures. Analysis of documents through the constant comparative approach identified emerging themes and differences across the three types of documents. Based on a synthesis of information, I discussed implications for conducting pedagogy and made suggestions for conducting educators. Includes bibliographical references.