Sensing and mapping for interactive performance
This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to provide an intuitive, non-intrusive interactive multimedia performance interface that offers users or performers real-time control of multimedia events through their physical movements. It is intended as a highly dynamic real-time performance tool that senses and tracks activities and changes in order to drive interactive multimedia performances.
Starting from a straightforward definition of the TDM framework, this paper reports several implementations and multi-disciplinary collaborative projects built on it, including a motion- and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses different aspects of the mapping strategies in each context.
Plausible future directions and developments for the proposed framework, including stage augmentation and virtual and augmented reality, which involve sensing and mapping physical and non-physical changes onto multimedia control events, are also discussed.
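The abstract does not give implementation details, but the core idea it describes, translating a sensed physical quantity into a multimedia control value, can be sketched minimally. The function name, parameter ranges, and the MIDI-volume example below are illustrative assumptions, not taken from the paper.

```python
def linear_map(value, in_lo, in_hi, out_lo, out_hi):
    """Clamp a sensed value to its expected range, then linearly
    rescale it into a target multimedia control range."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# Hypothetical example: map hand speed (0-2 m/s) onto MIDI volume (0-127)
volume = linear_map(1.2, 0.0, 2.0, 0, 127)
```

A real trans-domain mapping would layer many such parameter mappings (possibly nonlinear, possibly many-to-many) between sensing and output domains; this shows only the smallest building block.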
A Motion Recognition Method for a Wearable Dancing Musical Instrument
Abstract In this paper, we construct a system for realizing a new style of dance performance in which dancers play music by dancing. A pilot study showed that motion recognition for dance performance must stay synchronized with the background music (BGM). We therefore propose a new motion recognition method specialized for dance performances. Its key techniques are (1) adaptive sizing of the recognition window so that motions are recognized in sync with the BGM, and (2) two-phase (rough and detailed) motion recognition to keep accuracy high at high recognition speed. Data were recorded using 3-axis wireless accelerometers mounted on both shoes. We evaluated the method on a dataset of 5 different dance steps (each repeated 100 times). The results show that the method improves recognition for all steps (in one case from 62% to 99%) while retaining a feeling of seamless connection between movement and sound.
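The two key techniques in the abstract, a BGM-synced recognition window and rough-then-detailed classification, can be sketched as follows. The function names, the energy threshold, and the nearest-template second phase are assumptions for illustration; the paper's actual classifiers and parameters are not specified here.

```python
import math

def window_size(bpm, sample_rate_hz, beats_per_step=1):
    """Derive the recognition-window length (in samples) from the BGM
    tempo, so each window spans one dance step."""
    seconds_per_beat = 60.0 / bpm
    return int(seconds_per_beat * beats_per_step * sample_rate_hz)

def rough_phase(window, energy_threshold=1.0):
    """Phase 1 (rough): cheap screen -- is there enough acceleration
    energy in this window to be a step at all?"""
    energy = sum(a * a for a in window) / len(window)
    return energy >= energy_threshold

def detailed_phase(window, templates):
    """Phase 2 (detailed): nearest-template matching, run only on
    windows that pass the rough phase."""
    def dist(w, t):
        n = min(len(w), len(t))
        return math.sqrt(sum((w[i] - t[i]) ** 2 for i in range(n)) / n)
    return min(templates, key=lambda name: dist(window, templates[name]))
```

Deciding the window from the tempo (rather than a fixed length) is what keeps recognition in sync with the music; the two-phase split keeps the expensive matching off windows with no motion.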
jn4.gesture: An interactive composition for dance.
jn4.gesture is an interactive multimedia composition for dancer and computer designed to extend the possibilities of artistic expression through the use of contemporary technology. The software produces the audiovisual materials, which are controlled by the movement of the dancer on a custom rug sensor. The software that produces the graphic and sonic material uses a modular design. During run-time, the software's modules are directed by a scripting language developed to control and adjust the audiovisual production through time. The visual material provides the only illumination of the performer, and the projections follow the performer's movements. The human form is isolated in darkness and remains the focal point of the visual environment. These same movements are used to create the sonic material and control the diffusion of sound in an eight-channel sound system. The video recording of the performance was made on April 22, 2002. The work was produced in a specialized performance space using two computer projectors and a state-of-the-art sound system. Arleen Sugano designed the costumes, choreographed, and performed the composition in the Merrill Ellis Intermedia Theatre (MEIT) at the University of North Texas. The paper focuses on the design of the program that controls the production of the audiovisual environment, beginning with a discussion of background issues associated with gesture capture. A brief discussion of the human-computer interface and its relationship with the arts provides an overview of the software design and the mapping scenarios used to create the composition. The design of the score, a graphical user interface, is discussed as the element that synchronizes the media creation in "scenes" of the composition. This score delivers a hybrid script to the modules that control the production of audiovisual elements.
Particular modules are discussed and related to sensor-mapping routines that give multiple mapping control to computer functions, enabling a single gesture to have multiple outcomes.
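The one-gesture-to-many-outcomes routing described above can be sketched minimally. The class and gesture names below are hypothetical, not from the paper, and the handlers stand in for whatever audio, lighting, or video modules the real system drives.

```python
class GestureRouter:
    """Route one recognized gesture to several output handlers at once,
    so a single movement can drive sound, light, and video together."""

    def __init__(self):
        self.handlers = {}

    def bind(self, gesture, handler):
        """Attach one more handler to a gesture; order of binding
        is the order of dispatch."""
        self.handlers.setdefault(gesture, []).append(handler)

    def dispatch(self, gesture, value):
        """Fan the gesture's control value out to every bound handler
        and collect their results."""
        return [h(value) for h in self.handlers.get(gesture, [])]
```

For example, binding a hypothetical "sweep" gesture to both a volume handler and a brightness handler makes one movement change both at once, which is the multiple-mapping behaviour the abstract describes.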