A Conceptual Framework for Motion Based Music Applications
Imaginary projections are the core of the framework for motion-based music applications presented in this paper. Their design depends on the space covered by the motion-tracking device, but also on the musical feature involved in the application. They are a powerful tool because they allow not only the image of a traditional acoustic instrument to be projected into the virtual environment, but also any spatially defined abstract concept to be expressed. The system pipeline starts from the musical content and, through a geometrical interpretation, arrives at its projection in the physical space. Three case studies involving different motion-tracking devices and different musical concepts are analyzed. The three examined applications have been programmed and already tested by the authors. They aim, respectively, at expressive musical interaction (Disembodied Voices), tonal-music knowledge (Harmonic Walk) and twentieth-century music composition (Hand Composer).
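The notion of an imaginary projection can be illustrated with a toy sketch: assuming a tracker that reports a hand position normalized to a [0, 1] volume (a hypothetical interface, not the authors' implementation), a one-octave keyboard is "projected" onto the x axis and the tracked position is quantized to a MIDI note.

```python
# Toy sketch of an "imaginary projection": a one-octave keyboard
# projected onto the x axis of a normalized tracking volume.
# The tracker interface is hypothetical; real devices report
# coordinates in their own units and need their own normalization.

def project_to_note(x, low_note=60, n_keys=13):
    """Map a normalized x position (0.0-1.0) to a MIDI note number."""
    x = min(max(x, 0.0), 1.0)            # clamp to the tracked volume
    key = min(int(x * n_keys), n_keys - 1)
    return low_note + key

# A hand sweeping left to right plays an ascending chromatic scale.
notes = [project_to_note(x / 10) for x in range(11)]
```

The same scheme extends to any spatially defined concept: replace the key grid with regions standing for chords, timbres, or abstract musical states.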
Sensing and mapping for interactive performance
This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to facilitate an intuitive and non-intrusive interactive multimedia performance interface that offers the users or performers real-time control of multimedia events using their physical movements. It is intended to be a highly dynamic real-time performance tool, sensing and tracking activities and changes, in order to provide interactive multimedia performances.
From a straightforward definition of the TDM framework, this paper reports several implementations and multi-disciplinary collaborative projects using the proposed framework, including a motion and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses different aspects of mapping strategies in their context.
Plausible future directions, developments and exploration with the proposed framework, including stage augmentation and virtual and augmented reality, which involve sensing and mapping of physical and non-physical changes onto multimedia control events, are discussed.
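The core of any trans-domain mapping is a rescaling from a sensed range onto a control range. A minimal sketch, with illustrative values not taken from the paper (colour hue mapped onto a MIDI-style 0-127 control value):

```python
# Minimal trans-domain mapping sketch: a feature sensed in one
# domain (here, colour hue in degrees) is normalized and rescaled
# onto a control parameter in another domain (a MIDI-style 0-127
# value). Feature choice and ranges are illustrative assumptions.

def tdm_map(value, src_lo, src_hi, dst_lo, dst_hi):
    """Linearly map value from [src_lo, src_hi] to [dst_lo, dst_hi]."""
    t = (value - src_lo) / (src_hi - src_lo)
    t = min(max(t, 0.0), 1.0)            # clamp out-of-range readings
    return dst_lo + t * (dst_hi - dst_lo)

# A hue of 180 degrees lands at the middle of the control range.
cc = tdm_map(180.0, 0.0, 360.0, 0, 127)
```

In a real performance system this scalar mapping would be one layer among several, with higher layers deciding *which* sensed activity drives *which* multimedia event.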
Tangible user interfaces: past, present and future directions
In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from cognitive science, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research.
Towards 3-D Sound: Spatial Presence and the Space Vacuum
This chapter demonstrates the evolution of relationships between sound design and music in cinematic representations of the interstellar space vacuum. Mera provides a framework for understanding how audiences believe they are physically present in the represented environment and argues that, in the late 2000s, we move towards three-dimensional (3-D) sound, an aesthetic and technical extension of the superfield and the ultrafield as defined by Chion and Kerins, respectively. 3-D sound's primary characteristic is the emancipation of music from a fixed sound-stage spatialization, resulting in greater fluidity between sound design and music. This chapter examines the relationship between two types of spatial presence, articulating both the audience's suspension of disbelief within a film's narrative world and the spatial presence of sound and music within a multichannel cinema environment.
Preserving today for tomorrow: A case study of an archive of Interactive Music Installations
This work presents the problems addressed and the first results obtained by a project aimed at the preservation of Interactive Music Installations (IMI). Preservation requires that, besides all the components necessary for the (re)production of a performance, the knowledge about these components is also kept, so that the original process can be repeated at any given time. This work proposes a multilevel approach for the preservation of IMI. As case studies, Pinocchio Square (installed at EXPO 2002) and Il Caos delle Sfere are considered.
Harmony and Technology Enhanced Learning
New technologies offer rich opportunities to support education in harmony. In this chapter we consider theoretical perspectives and underlying principles behind technologies for learning and teaching harmony. Such perspectives help in matching existing and future technologies to educational purposes, and in inspiring the creative re-appropriation of technologies.
Conducting a virtual ensemble with a Kinect device
This paper presents a gesture-based interaction technique for the interaction between an orchestra conductor and a virtual ensemble, using a 3D camera-based sensor to capture the user's gestures. In particular, a human-computer interface has been developed to recognize conducting gestures using a Microsoft Kinect device. The system allows the conductor to control both the tempo of the piece played and the dynamics of each instrument set independently. In order to modify the tempo of the playback, a time-frequency processing-based algorithm is used. Finally, an experiment was conducted to assess users' opinion of the system and to confirm experimentally whether the features of the system effectively improved the user experience. This work has been funded by the Ministerio de Economia y Competitividad of the Spanish Government under Project No. TIN2010-21089-C03-02 and Project No. IPT-2011-0885-430000, and by the Junta de Andalucia under Project No. P11-TIC-7154. The work was done at Universidad de Malaga, Campus de Excelencia Internacional Andalucia Tech.
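The abstract does not specify the time-scaling method, so as a stand-in for the time-frequency algorithm mentioned, here is a minimal overlap-add (OLA) time-stretch sketch: analysis frames are read at a rate-scaled hop and overlap-added at a fixed hop, changing duration without resampling pitch.

```python
import numpy as np

# Minimal overlap-add (OLA) time-stretch sketch; a stand-in, not the
# paper's algorithm. rate > 1 speeds playback up, rate < 1 slows it
# down, by reading analysis frames faster or slower than they are
# written back with a fixed synthesis hop.

def ola_stretch(x, rate, frame=1024, hop=256):
    """Time-stretch a mono signal x by the given rate using OLA."""
    win = np.hanning(frame)
    n_frames = max(1, int((len(x) - frame) / (hop * rate)))
    out = np.zeros(n_frames * hop + frame)
    for i in range(n_frames):
        src = int(i * hop * rate)        # rate-scaled analysis position
        seg = x[src:src + frame]
        if len(seg) < frame:             # zero-pad the final frame
            seg = np.pad(seg, (0, frame - len(seg)))
        out[i * hop:i * hop + frame] += win * seg
    return out

# Half-rate playback roughly doubles the duration.
x = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
slow = ola_stretch(x, 0.5)
```

Plain OLA can smear transients; phase-vocoder methods operating on the STFT (a true time-frequency approach) address this at the cost of phase bookkeeping.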
Toward a model of computational attention based on expressive behavior: applications to cultural heritage scenarios
Our project goals consisted of the development of attention-based analysis of human expressive behavior and the implementation of real-time algorithms in EyesWeb XMI, in order to improve the naturalness of human-computer interaction and context-based monitoring of human behavior. To this aim, a perceptual model that mimics human attentional processes was developed for expressivity analysis and modeled by entropy. Museum scenarios were selected as an ecological test-bed for three experiments focusing on visitor profiling and visitor-flow regulation.
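The entropy-based modelling idea can be sketched as follows: the variability of an expressive feature (here, frame-by-frame motion energy, an illustrative choice) is summarized by the Shannon entropy of its empirical distribution. Feature extraction itself (done in EyesWeb XMI in the project) is outside this sketch.

```python
import numpy as np

# Sketch of entropy as a summary of expressive variability: the
# Shannon entropy of the histogram of a movement feature. The
# feature (motion energy per frame) is an illustrative assumption.

def feature_entropy(values, bins=16):
    """Shannon entropy (bits) of a feature's empirical distribution."""
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                         # drop empty bins (0*log 0 = 0)
    return float(-(p * np.log2(p)).sum())

# Erratic movement spreads over many bins (high entropy); a still
# pose collapses into one bin (zero entropy).
rng = np.random.default_rng(0)
erratic = feature_entropy(rng.uniform(0, 1, 1000))
still = feature_entropy(np.full(1000, 0.5))
```

An attention model can then treat high-entropy behavior as salient, e.g. to profile visitors or trigger flow-regulation cues in a museum scenario.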