
    A Conceptual Framework for Motion Based Music Applications

    Imaginary projections are the core of the framework for motion-based music applications presented in this paper. Their design depends on the space covered by the motion tracking device, but also on the musical feature involved in the application. They are a powerful tool because they allow one not only to project the image of a traditional acoustic instrument into the virtual environment, but also to express any spatially defined abstract concept. The system pipeline starts from the musical content and, through a geometrical interpretation, arrives at its projection in the physical space. Three case studies involving different motion tracking devices and different musical concepts are analyzed. The three applications have been programmed and already tested by the authors. They aim respectively at expressive musical interaction (Disembodied Voices), tonal music knowledge (Harmonic Walk) and 20th-century music composition (Hand Composer)

    Interactive Spaces. Models and Algorithms for Reality-based Music Applications

    Reality-based interfaces link the user's physical space with the computer's digital content, bringing intuition, plasticity and expressiveness. Moreover, applications designed around motion and gesture tracking technologies involve many psychological factors, such as spatial cognition and implicit knowledge. All these elements form the background of the three music applications presented here, each employing the characteristics of a different interactive space: a user-centered three-dimensional space, a bi-dimensional camera space on the floor, and a small sensor-centered three-dimensional space. The basic idea is to exploit each application's spatial properties to convey musical knowledge, allowing users to act inside the designed space and to learn through it in an enactive way

    Virtual kompang: mapping in-air hand gestures for music interaction using gestural musical controller

    The introduction of new gesture interfaces has expanded the possibilities for creating new Digital Musical Instruments (DMIs). However, existing interfaces focus mainly on modern Western musical instruments such as the piano, drums and guitar. This paper presents a virtual musical instrument, Virtual Kompang, based on a traditional Malay percussion instrument. The interface design and its implementation are presented, together with the results of a guessability study conducted to elicit end-user hand movements to map onto commands. The study demonstrated the existence of common hand gestures among users for the selected commands, and a consensus set of gestures is presented as its outcome.
    Keywords: Digital Musical Instrument, Virtual Environment, Gestural Control, Leap Motion, Virtual Instrument

    Demonstrating Interactive Machine Learning Tools for Rapid Prototyping of Gestural Instruments in the Browser

    These demonstrations allow visitors to prototype gestural, interactive musical instruments in the browser. Different browser-based synthesisers can be controlled by either a Leap Motion sensor or a Myo armband. Visitors can use an interactive machine learning toolkit to quickly and iteratively explore different interaction possibilities. The demonstrations show how interactive, browser-based machine learning tools can be used to rapidly prototype gestural controllers for audio, and showcase RapidLib, a browser-based machine learning library developed through the RAPID-MIX project
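
The interactive-machine-learning workflow described above (record a few sensor-input/synth-parameter examples, then map new inputs against them) can be sketched as follows. This is a minimal nearest-neighbour illustration of the train-then-run paradigm only; the function names are hypothetical and it is not the RapidLib API.

```python
# Sketch of the interactive-ML workflow: record a few
# (sensor input -> synth parameter) example pairs, then map new
# inputs to the output of the nearest recorded example.

def train(examples):
    """examples: list of (input_vector, output_vector) pairs."""
    return list(examples)

def run(model, x):
    """Return the output of the training example whose input is closest to x."""
    def dist(a):
        return sum((ai - xi) ** 2 for ai, xi in zip(a, x))
    _, y = min(model, key=lambda ex: dist(ex[0]))
    return y

# Map a 2-D hand position to a (pitch_hz, volume) pair.
model = train([((0.0, 0.0), (220.0, 0.2)),
               ((1.0, 1.0), (880.0, 0.9))])
run(model, (0.9, 0.8))   # -> (880.0, 0.9)
```

A real toolkit would typically interpolate between examples (e.g. regression) rather than snap to the nearest one, but the performer-facing loop of demonstrate, train, and play is the same.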

    Rhythmic Micro-Gestures: Discreet Interaction On-the-Go

    We present rhythmic micro-gestures: micro-movements of the hand repeated in time with a rhythm. A user study investigated how well users can perform rhythmic micro-gestures and whether they can use them eyes-free with non-visual feedback. Users could successfully use the interaction technique (97% success rate across all gestures) with short interaction times, and rated it as low in difficulty. Simple audio cues that convey only the rhythm outperformed animations showing the hand movements, supporting rhythmic micro-gestures as an eyes-free input technique

    Plays of proximity and distance: Gesture-based interaction and visual music

    This thesis presents the relations between gestural interfaces and artworks that deal with real-time, simultaneous performance of dynamic imagery and sound, the so-called visual music practices. These relations extend across historical, practical and theoretical viewpoints, which this study aims to cover at least partially. They are exemplified by two artistic projects developed by the author, which serve as a starting point for analysing the issues around the two main topics. The principles, patterns, challenges and concepts that structured the two artworks are extracted, analysed and discussed, providing elements for comparison and evaluation that may be useful for future research on the topic

    Electrifying Opera, Amplifying Agency: Designing a performer-controlled interactive audio system for opera singers

    This artistic research project examines the artistic, technical, and pedagogical challenges of developing a performer-controlled interactive technology for real-time vocal processing of the operatic voice. As a classically trained singer-composer, I have explored ways to merge the compositional aspects of transforming electronic sound with the performative aspects of embodied singing. I set out to design, develop, and test a prototype for an interactive vocal processing system using sampling and audio processing methods. The aim was to foreground and accommodate an unamplified operatic voice interacting with the room's acoustics and the extended disembodied voices of the same performer. The iterative prototyping explored the performer's relationship to the acoustic space, the relationship between the embodied acoustic voice and the disembodied processed voice(s), and the relationship to memory and time. One of the core challenges was to design a system that would accommodate mobility and allow interaction based on auditory and haptic cues rather than visual ones; in other words, a system allowing the singer to control their sonic output without standing behind a laptop. I wished to highlight and amplify the performer's agency with a system that would enable nuanced and variable vocal processing and be robust, teachable, and suitable for use in various settings: solo performances, ensembles of various types and sizes, and opera. This entailed mediating the different needs, training, and working methods of both electronic music and opera practitioners. One key finding was that even simple audio processing could achieve complex musical results. The audio processes used were primarily combinations of feedback and delay lines, yet performers could quickly obtain complex musical results through continuous gestural control and the ability to route signals to four channels.
    This complexity sometimes led to surprising results, eliciting improvisatory responses even from singers without musical improvisation experience. The project has resulted in numerous vocal solo, chamber, and operatic performances in Norway, the Netherlands, Belgium, and the United States. The research contributes to developing emerging technologies for live electronic vocal processing in opera, to developing the improvisational performance skills needed to engage with those technologies, and to exploring alternatives for sound diffusion conducive to working with unamplified operatic voices.
    Links: Exposition and documentation of PhD research in the Research Catalogue: Electrifying Opera, Amplifying Agency. Artistic results. Reflection and Public Presentations (PhD) (2023): https://www.researchcatalogue.net/profile/show-exposition?exposition=2222429
    Home/Reflections: https://www.researchcatalogue.net/view/2222429/2222460
    Mapping & Prototyping: https://www.researchcatalogue.net/view/2222429/2247120
    Space & Speakers: https://www.researchcatalogue.net/view/2222429/2222430
    Presentations: https://www.researchcatalogue.net/view/2222429/2247155
    Artistic Results: https://www.researchcatalogue.net/view/2222429/222248
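
The abstract notes that the audio processes were primarily combinations of feedback and delay lines. A minimal sketch of such a feedback delay line, on a mono sample buffer, might look like the following; the parameter names and values are illustrative assumptions, not the project's actual implementation.

```python
# Minimal feedback delay line: each input sample is mixed with a
# delayed copy of the signal, and the delayed copy is fed back into
# the buffer so echoes decay by `feedback` on each repeat.

def feedback_delay(signal, delay_samples, feedback=0.5, mix=0.5):
    """Apply a single feedback delay line to a mono sample sequence."""
    buf = [0.0] * delay_samples          # circular delay buffer
    out = []
    idx = 0
    for x in signal:
        delayed = buf[idx]               # sample written delay_samples ago
        buf[idx] = x + feedback * delayed  # feed input plus echo back in
        idx = (idx + 1) % delay_samples
        out.append((1 - mix) * x + mix * delayed)
    return out

# An impulse repeats every delay_samples samples, halving each time.
impulse = [1.0] + [0.0] * 9
echoes = feedback_delay(impulse, delay_samples=4, feedback=0.5, mix=1.0)
# echoes[4] == 1.0, echoes[8] == 0.5
```

Chaining a few such lines with different delay times, and exposing `delay_samples`, `feedback`, and `mix` to continuous gestural control, is enough to produce the kind of complex, evolving results the project describes.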

    GestureChords: Transparency in gesturally controlled digital musical instruments through iconicity and conceptual metaphor

    This paper presents GestureChords, a mapping strategy for chord selection in freehand gestural instruments. The strategy maps chord variations to a series of hand postures using the concepts of iconicity and conceptual metaphor, influenced by their use in American Sign Language (ASL) to encode meaning in gestural signs. The mapping uses the conceptual metaphors MUSICAL NOTES ARE POINTS IN SPACE and INTERVALS BETWEEN NOTES ARE SPACES BETWEEN POINTS, mapped respectively to the number of extended fingers in a performer's hand and the abduction or adduction between them. The strategy is incorporated into a digital musical instrument and tested for transparency in a preliminary study with both performers and spectators, with promising results for the technique
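
The two metaphors above suggest a simple mapping: extended fingers select how many chord tones sound, and abduction (finger spread) widens the intervals between them. The sketch below illustrates that idea only; the specific interval values are assumptions, not the paper's published mapping.

```python
# Illustrative GestureChords-style mapping: fingers -> chord tones,
# abduction -> interval width between adjacent tones.

def chord_from_gesture(root, extended_fingers, abducted):
    """Return MIDI note numbers for a chord derived from a hand posture."""
    step = 4 if abducted else 3      # wider interval when fingers spread
    return [root + i * step for i in range(extended_fingers)]

# Three adducted fingers: a stack of minor thirds (diminished triad).
chord_from_gesture(60, 3, abducted=False)   # [60, 63, 66]
# Three abducted fingers: a stack of major thirds (augmented triad).
chord_from_gesture(60, 3, abducted=True)    # [60, 64, 68]
```

The point of such a mapping is transparency: a spectator who grasps the points-in-space metaphor can read the chord's size and spacing directly off the hand posture.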

    Towards Engaging Intangible Holographic Public Displays

    Public displays are among the most challenging interfaces to design because of two key requirements. First, the experience should be engaging, to attract and maintain users' attention. Second, interaction with the display should be natural, meaning that users should be able to obtain the desired output with little or no training. Holographic displays are increasingly popular in public spaces such as museums and concert halls, but there is little published research on users' experiences with them. Previous research has suggested both tangible and intangible inputs as engaging and natural options for holographic displays, but there is no conclusive evidence on their relative merits. Hence, we ran a study investigating the user experience with a holographic display, comparing the level of engagement and the feeling of naturalness during interaction. We used a mix of surveys, interviews, video recordings, and task-based metrics to measure users' performance on a specific task, perceived usability, and levels of engagement and satisfaction. Our findings suggest that tangible input was reported as more natural than intangible input; however, both were found to be equally engaging. These findings contribute to the design of intangible public holographic displays and other interactive systems that take health safety into consideration, especially in the Covid-19 pandemic era, in which transmission can occur through tangible, physical interaction between users and public displays, without sacrificing engagement compared to the tangible experience