12,951 research outputs found

    Interacting with 3D Reactive Widgets for Musical Performance

    While virtual reality and 3D interaction open new prospects for musical performance, existing immersive virtual instruments are often limited to single-process instruments or musical navigation tools. We believe that immersive virtual environments can be used to design expressive and efficient multi-process instruments. In this paper we present 3D reactive widgets: graphical elements that enable efficient and simultaneous control and visualization of musical processes. We then describe Piivert, a novel input device we have developed to manipulate these widgets, and several techniques for 3D musical interaction.

    A Conceptual Framework for Motion Based Music Applications

    Imaginary projections are the core of the framework for motion-based music applications presented in this paper. Their design depends on the space covered by the motion-tracking device, but also on the musical feature involved in the application. They are a powerful tool because they allow one not only to project the image of a traditional acoustic instrument into the virtual environment, but also to express any spatially defined abstract concept. The system pipeline starts from the musical content and, through a geometrical interpretation, arrives at its projection in the physical space. Three case studies involving different motion-tracking devices and different musical concepts are analyzed. The three examined applications have been programmed and already tested by the authors. They aim respectively at expressive musical interaction (Disembodied Voices), tonal music knowledge (Harmonic Walk) and twentieth-century music composition (Hand Composer).
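    As a toy illustration of such a geometrical interpretation, the sketch below places the twelve pitch classes on a circle of fifths in a 2D physical space. This is a hypothetical mapping chosen for illustration, not the projection actually used by the authors.

```python
import math

def project_pitch_classes(radius=1.0):
    """Place the 12 pitch classes on a circle of fifths in a 2D
    physical space -- one plausible 'geometrical interpretation'
    of musical content (illustrative; not the paper's mapping)."""
    fifths = [(i * 7) % 12 for i in range(12)]  # C, G, D, A, ...
    positions = {}
    for k, pc in enumerate(fifths):
        angle = 2 * math.pi * k / 12  # evenly spaced around the circle
        positions[pc] = (radius * math.cos(angle), radius * math.sin(angle))
    return positions

pos = project_pitch_classes()
print(pos[0])  # pitch class 0 (C) sits at angle 0: (1.0, 0.0)
```

    A walker's tracked position could then be matched against the nearest projected point to select a harmonic region, in the spirit of the Harmonic Walk application.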

    Gulliver project: performers and visitors

    This paper discusses two projects in our research environment: the Gulliver project, an ambitious project conceived by artists connected to our research efforts, and the Aveiro project, also ambitious, but with goals achievable through technological developments rather than dependent on artistic and 'political' (read: financial) sources. Both projects concern virtual and augmented reality. The main goal is to design inhabited environments, where 'inhabited' refers to autonomous agents and agents that represent humans, real-time or off-line, visiting the virtual environment and interacting with other agents. The Gulliver environment has been designed by two artists, Matjaz Stuk and Alena Hudcovicova. The Aveiro project is a research effort by a group of researchers designing models of intelligence and interaction underlying the behavior of (groups of) agents inhabiting virtual worlds. In this paper we survey the current state of both projects and discuss current and future attempts to stage music performances by virtual and real performers in these environments.

    Moveable worlds/digital scenographies

    This is the author's accepted manuscript; the final published article is available from the link below. Copyright @ Intellect Ltd 2010. The mixed-reality choreographic installation UKIYO explored in this article reflects an interest in scenographic practices that connect physical space to virtual worlds and explore how performers can move between material and immaterial spaces. The spatial design for UKIYO is inspired by Japanese hanamichi and Western fashion runways, emphasizing the research production company's commitment to creative crossovers between movement languages, innovative wearable design for interactive performance, acoustic and electronic sound processing, and digital image objects that have a plastic as well as an immaterial/virtual dimension. The work integrates various forms of making art in order to visualize things that are not in themselves visual, or that connect visual and kinaesthetic/tactile/auditory experiences. The 'moveable worlds' in this essay are also reflections of the narrative spaces, subtexts and auditory relationships in the mutating matrix of an installation space that invites the audience to move around and follow its sensorial experiences, drawn near to the bodies of the dancers. Funded by Brunel University, the British Council, and the Japan Foundation.

    Screen-based musical instruments as semiotic machines

    The ixi software project started in 2000 with the intention to explore new interactive patterns and virtual interfaces in computer music software. The aim of this paper is not to describe these programs, as they have been described elsewhere, but rather to explicate the theoretical background that underlies the design of these screen-based instruments. After an analysis of the similarities and differences in the design of acoustic and screen-based instruments, the paper describes how the creation of an interface is essentially the creation of a semiotic system that affects and influences the musician and the composer. Finally, the terminology of this semiotics is explained as an interaction model.

    Conducting a virtual ensemble with a kinect device

    This paper presents a gesture-based interaction technique for the implementation of an orchestra conductor and a virtual ensemble, using a 3D camera-based sensor to capture the user's gestures. In particular, a human-computer interface has been developed to recognize conducting gestures using a Microsoft Kinect device. The system allows the conductor to control both the tempo of the piece played and the dynamics of each instrument set independently. To modify the tempo of the playback, a time-frequency processing-based algorithm is used. Finally, an experiment was conducted to assess users' opinion of the system and to confirm experimentally whether its features effectively improved the user experience. This work has been funded by the Ministerio de Economia y Competitividad of the Spanish Government under Project No. TIN2010-21089-C03-02 and Project No. IPT-2011-0885-430000, and by the Junta de Andalucia under Project No. P11-TIC-7154. The work was done at Universidad de Malaga, Campus de Excelencia Internacional Andalucia Tech.
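    One small ingredient of such a system can be sketched as follows: estimating the conducted tempo from the timestamps of detected beat gestures. The function name and the simple averaging strategy are assumptions for illustration; the paper's Kinect gesture recogniser and time-frequency stretching algorithm are not reproduced here.

```python
def tempo_from_beats(beat_times):
    """Estimate tempo in BPM from timestamps (seconds) of detected
    conducting beat gestures (hypothetical helper, not the paper's code)."""
    if len(beat_times) < 2:
        raise ValueError("need at least two beat gestures")
    # mean inter-beat interval in seconds
    intervals = [t2 - t1 for t1, t2 in zip(beat_times, beat_times[1:])]
    mean_ibi = sum(intervals) / len(intervals)
    return 60.0 / mean_ibi

print(tempo_from_beats([0.0, 0.5, 1.0, 1.5]))  # 0.5 s per beat -> 120.0 BPM
```

    The estimated BPM would then drive the time-stretching stage, so that the playback speed follows the conductor without altering pitch.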

    Sensing and mapping for interactive performance

    This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to facilitate an intuitive and non-intrusive interactive multimedia performance interface that offers users or performers real-time control of multimedia events using their physical movements. It is intended to be a highly dynamic real-time performance tool, sensing and tracking activities and changes in order to provide interactive multimedia performances. Starting from a straightforward definition of the TDM framework, this paper reports several implementations and multi-disciplinary collaborative projects using the proposed framework, including a motion- and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses different aspects of mapping strategies in their context. Plausible future directions and developments with the proposed framework, including stage augmentation and virtual and augmented reality, which involve sensing and mapping physical and non-physical changes onto multimedia control events, are also discussed.
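    The simplest mapping strategy of this kind is a linear rescaling from a sensed range to a control range. The sketch below is a minimal illustration under assumed ranges (tracked hand height to MIDI velocity); the TDM framework itself supports much richer, domain-specific mappings.

```python
def map_domain(value, src_lo, src_hi, dst_lo, dst_hi, clamp=True):
    """Linearly map a sensed value (e.g. hand height in metres) onto a
    control range (e.g. MIDI velocity 0-127). Illustrative sketch of one
    mapping strategy; ranges here are assumptions, not the paper's."""
    if src_hi == src_lo:
        raise ValueError("degenerate source range")
    t = (value - src_lo) / (src_hi - src_lo)  # normalise to 0..1
    if clamp:
        t = max(0.0, min(1.0, t))  # keep out-of-range input in bounds
    return dst_lo + t * (dst_hi - dst_lo)

# hand at 1.5 m within an assumed 1.0-2.0 m tracking range -> mid velocity
print(map_domain(1.5, 1.0, 2.0, 0, 127))  # 63.5
```

    Clamping matters in practice: tracking glitches outside the calibrated range would otherwise produce out-of-range control values.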

    Straddling the intersection

    Music technology straddles the intersection between art and science, presenting those who choose to work within its sphere with many practical challenges as well as creative possibilities. The paper focuses on four main areas: secondary education, higher education, practice and research, and collaboration. It emphasises the importance of collaboration in tackling the challenges of interdisciplinarity and in influencing future technological developments.

    Sampling the past: a tactile approach to interactive musical instrument exhibits in the heritage sector

    In the last decade, the heritage sector has had to adapt to a shifting cultural landscape of public expectations and attitudes towards ownership and intellectual property. One way it has done this is to focus on each visitor's encounter and provide them with a sense of experiential authenticity. There is a clear desire by the public to engage with music collections in this way, and a sound museological rationale for providing such access, but the approach raises particular curatorial problems, specifically: how do we meaningfully balance access with the duty to preserve objects for future generations? This paper charts the development of one such project. Based at Fenton House in Hampstead, and running since 2008, the project seeks to model digitally the keyboard instruments in the Benton Fletcher Collection and provide a dedicated interactive exhibit, which allows visitors to view all of the instruments in situ and then play them through a custom-built two-manual MIDI controller with touch-screen interface. We discuss the approach to modelling, which uses high-definition sampling, and highlight the strengths and weaknesses of the exhibit as it currently stands, with particular focus on its key shortcoming: at present, there is no way to effectively model the key feel of a historic keyboard instrument. This issue is of profound importance, since the feel of any instrument is fundamental to its character and shapes the way performers relate to it. The issue is further compounded if a single dedicated keyboard is to serve as the primary interface for several instrument models of different classes, each with its own characteristic feel. We conclude by proposing an outline solution to this problem, detailing early work on a real-time adaptive haptic keyboard interface that changes its action in response to sampled resistance curves, measured on a key-by-key basis from the original instruments.
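    A sampled resistance curve of the kind described could be consumed by interpolating between measured (depth, force) samples to decide the actuator force at any key depth. The sketch below is a hypothetical illustration of that idea, with made-up sample values; it is not the project's implementation.

```python
def key_resistance(depth_mm, curve):
    """Linearly interpolate a sampled resistance curve to get the
    force (N) a haptic key actuator should apply at a given key depth
    (mm). `curve` is a sorted list of (depth_mm, force_N) samples,
    e.g. measured from a historic instrument (illustrative sketch)."""
    if not curve:
        raise ValueError("empty resistance curve")
    if depth_mm <= curve[0][0]:   # before the first sample
        return curve[0][1]
    if depth_mm >= curve[-1][0]:  # past the last sample
        return curve[-1][1]
    for (d0, f0), (d1, f1) in zip(curve, curve[1:]):
        if d0 <= depth_mm <= d1:
            t = (depth_mm - d0) / (d1 - d0)
            return f0 + t * (f1 - f0)

# hypothetical per-key measurement: a slight dip mid-travel, as in
# instruments with an escapement-like feel
curve = [(0.0, 0.3), (2.0, 0.5), (6.0, 0.45), (10.0, 0.8)]
print(key_resistance(4.0, curve))  # halfway between 0.5 and 0.45 -> 0.475
```

    In a real-time controller this lookup would run inside the haptic control loop, with a separate curve per key and per instrument model.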