4 research outputs found

    From expressive gesture to sound: the development of an embodied mapping trajectory inside a musical interface

    This paper contributes to the development of a multimodal musical tool that extends the natural action range of the human body to communicate expressiveness in the virtual music domain. The core of this musical tool is a low-cost, highly functional computational model built on the Max/MSP platform that (1) captures real-time movement of the human body in a 3D coordinate system on the basis of the orientation output of any inertial sensor system that is OSC-compatible, (2) extracts low-level movement features that quantify contraction/expansion as a measure of how a subject uses the surrounding space, (3) recognizes these movement features as expressive gestures, and (4) creates a mapping trajectory between these expressive gestures and a sound synthesis process that adds harmonically related voices to an originally monophonic voice. A user-oriented and intuitive mapping strategy was of central importance; it was achieved by conducting an empirical experiment grounded in theoretical concepts from the embodied music cognition paradigm. Based on this empirical evidence, the paper proposes a mapping trajectory that facilitates the interaction between a musician and the instrument, artistic collaboration between (multimedia) artists, and the communication of expressiveness in a social, musical context.
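    The pipeline described in the abstract (orientation data in, a contraction/expansion feature out, harmonically related voices added) can be sketched roughly as follows. This is a minimal illustrative sketch in Python, not the authors' Max/MSP model; the joint set, the bounding-box measure, the normalisation range and the voice-count rule are assumptions made for the example.

```python
# Hypothetical sketch: mapping a contraction/expansion feature to added harmonic voices.
# The joint list, normalisation range and harmonisation rule are illustrative assumptions,
# not the paper's actual Max/MSP patch.
import numpy as np

def contraction_expansion(joints_xyz: np.ndarray) -> float:
    """Amount of surrounding space used by the body, measured here as the volume
    of the axis-aligned bounding box around the tracked 3D joint positions."""
    extent = joints_xyz.max(axis=0) - joints_xyz.min(axis=0)
    return float(np.prod(extent))

def voices_for_gesture(expansion: float, max_voices: int = 4) -> int:
    """Map the expansion measure to how many harmonically related voices are
    layered on the (originally monophonic) input voice."""
    level = min(max(expansion / 2.0, 0.0), 1.0)  # assumed 0..2 m^3 working range
    return int(round(level * max_voices))

# Example: five tracked joints in metres (head, hands, feet), e.g. derived from
# OSC-compatible inertial sensor orientations plus a skeletal model.
joints = np.array([[0.0, 1.7, 0.0],
                   [-0.6, 1.2, 0.2], [0.6, 1.2, 0.2],
                   [-0.2, 0.0, 0.0], [0.2, 0.0, 0.0]])
print(voices_for_gesture(contraction_expansion(joints)))  # -> number of added voices
```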

    MOTIONCOMPOSER: A DEVICE THAT TURNS MOVEMENT INTO MUSIC

    Using video-based motion tracking technology, it is possible to make music without touching a musical instrument, but instead by gesturing in space. With a focus on therapeutic, healthcare and pedagogic contexts, the MotionComposer (MC) is one of a small but growing number of tools being developed today that let all kinds of users participate, including those with different abilities. The aim of the MotionComposer is not simply to provide an entertaining trick, but to engage the deeper muscle-music relationships that lie at the heart of healthy music and dance expression. After briefly describing the device, this paper discusses some therapeutic and pedagogical aspects of its use, including its ease of use; the possibility of adapting the device to the movement of different parts of the body; the ability to program different modes of use; a causality that lets users understand the relationship between gesture and sound; the possibility of playing aesthetically interesting music regardless of a person's abilities; and its use with multiple users.
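    As a generic illustration of the underlying principle (video-based motion detection driving sound events), and not the MotionComposer's actual algorithm, the sketch below measures per-frame motion by frame differencing and decides whether to trigger a sound. The frame size, noise floor, threshold and trigger rule are all assumptions.

```python
# Illustrative sketch only: turning video motion into note triggers by frame
# differencing. Thresholds and the trigger rule are assumptions, not the
# MotionComposer's implementation.
import numpy as np

def motion_amount(prev_frame: np.ndarray, frame: np.ndarray, noise_floor: float = 10.0) -> float:
    """Fraction of pixels whose grey value changed by more than the noise floor."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float((diff > noise_floor).mean())

def should_trigger_note(motion: float, threshold: float = 0.02) -> bool:
    """Trigger a sound event when enough of the image is in motion."""
    return motion > threshold

# Example with two synthetic 120x160 greyscale frames: a small bright region appears.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[40:60, 50:70] = 255  # moving region
m = motion_amount(prev, curr)
print(m, should_trigger_note(m))
```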

    Interactive computer music: a performer's guide to issues surrounding Kyma with live clarinet input

    Musicians are familiar with interaction in the rehearsal and performance of music. Technology has become sophisticated and affordable to the point where interaction with a computer in real-time performance is also possible. The nature of live interactive electronic music has blurred the distinction between the formerly exclusive realms of composition and performance. It is quite possible for performers to participate in the genre, but currently little information is available for those wishing to explore it. This document contains a definition of interaction, a discussion of how it occurs in traditional music-making, and a brief history of the emergence of live interaction in computer music. It also discusses the concept of live interaction and its aesthetic value, and highlights the possibilities of live interactive computer music using clarinet and the Kyma system, revealing ways a performer may maximize the interactive experience. Written from a player's perspective, the document describes possible methods of interaction with Kyma and live clarinet input in two areas: the clarinet as a controller and the clarinet as a source of sound. Information is provided on technical issues such as the speaker system, performance-space acoustics and diffusion options, possible interactive inputs, and microphone choices for clarinet in particular. Little information exists for musicians contemplating the use of Kyma; clarinetists especially will find in this paper a practical guide to many aspects of live electronic interaction and will be better informed to explore the field. This area has the potential to expand not only performing opportunities but also economic development. Interactive music technology can be applied in a traditional recital, in collaborative work with other art forms, in installation projects and even in music therapy. Knowledge of these programs also opens possibilities for sound design in theatre, film and other commercial applications.
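    As a rough, generic illustration of the "clarinet as a controller" idea (not Kyma code or the document's actual patches), the sketch below estimates the pitch of an incoming monophonic audio block by autocorrelation and maps it to a normalised control value that could drive a synthesis parameter. The sample rate, block size, pitch range and mapping are assumptions.

```python
# Illustrative sketch: using a live monophonic input (e.g. clarinet) as a
# controller by tracking its pitch and mapping it to a 0..1 control value.
# This is not Kyma's API; the signal chain and mapping are assumptions.
import numpy as np

SR = 44100  # assumed sample rate

def estimate_pitch(block: np.ndarray, fmin: float = 80.0, fmax: float = 1200.0) -> float:
    """Very simple autocorrelation pitch estimate for a monophonic block."""
    block = block - block.mean()
    ac = np.correlate(block, block, mode="full")[len(block) - 1:]
    lag_min, lag_max = int(SR / fmax), int(SR / fmin)
    lag = lag_min + int(np.argmax(ac[lag_min:lag_max]))
    return SR / lag

def pitch_to_control(freq: float, lo: float = 146.8, hi: float = 880.0) -> float:
    """Map pitch (roughly the clarinet's D3..A5 range) to a 0..1 control value."""
    x = (np.log2(freq) - np.log2(lo)) / (np.log2(hi) - np.log2(lo))
    return float(min(max(x, 0.0), 1.0))

# Example: a synthetic 440 Hz tone standing in for one block of live clarinet input.
t = np.arange(2048) / SR
block = np.sin(2 * np.pi * 440.0 * t)
f = estimate_pitch(block)
print(round(f, 1), round(pitch_to_control(f), 3))
```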