    Immersion and togetherness: How live visualization of audience engagement can enhance music events

    This paper evaluates the influence of an additional visual aesthetic layer on the experience of concertgoers during a live event. The visual layer incorporates musical features as well as bio-sensing data collected during the concert and coordinated by our audience engagement monitoring technology. The technology was deployed at a real jazz concert, and the collected measurements were used in an experiment with 32 participants comparing two forms of visualization: one factoring in music amplitude, sensor-measured audience engagement, and the dynamic atmosphere of the event; the other relying purely on the beat of the music. The findings indicate that such a visual layer, used during a live concert, could add value to the experience, providing a higher level of immersion and a stronger feeling of togetherness among the audience.
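    The abstract does not publish the mapping from signals to visuals, but the two visualization conditions suggest a simple structure: blend normalized music amplitude, sensed engagement, and beat phase into one intensity value, with the beat-only condition as a special case. The Python sketch below is a hypothetical illustration; the function names, weights, and beat model are assumptions, not the authors' implementation.

        import math

        def visual_intensity(amplitude, engagement, beat_phase,
                             w_amp=0.5, w_eng=0.3, w_beat=0.2):
            # Blend three normalized signals (each in [0, 1]) into one
            # intensity value driving the visual layer. The weights are
            # illustrative assumptions, not values from the paper.
            beat_pulse = 0.5 * (1.0 + math.sin(2.0 * math.pi * beat_phase))
            return w_amp * amplitude + w_eng * engagement + w_beat * beat_pulse

        # The beat-only comparison condition ignores amplitude and engagement.
        def beat_only_intensity(beat_phase):
            return visual_intensity(0.0, 0.0, beat_phase,
                                    w_amp=0.0, w_eng=0.0, w_beat=1.0)

        print(visual_intensity(0.7, 0.9, 0.25))  # 0.82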

    Refining personal and social presence in virtual meetings

    Virtual worlds show promise for conducting meetings and conferences without the need for physical travel. Current experience suggests that the major limitation to wider adoption and acceptance of virtual conferences is the failure of existing environments to provide a sense of immersion and engagement, of ‘being there’. These limitations are largely related to the appearance and control of avatars, and to the absence of means to convey the non-verbal cues of facial expression and body language. This paper reports on a study involving the use of a mass-market motion sensor (Kinectℱ) to map participant action in the real world onto avatar behaviour in the virtual world, coupled with full-motion video representation of participants’ faces on their avatars to resolve both identity and facial expression issues. The outcomes of a small-group trial meeting based on this technology show a very positive reaction from participants and the potential for further exploration of these concepts.
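    As an illustration of the real-to-virtual mapping described here, the hypothetical Python sketch below translates tracked joint positions from sensor space into an avatar's coordinate frame with a uniform scale and offset; the joint names and the linear mapping are assumptions, not the study's implementation.

        def map_skeleton_to_avatar(kinect_joints, scale=1.0,
                                   offset=(0.0, 0.0, 0.0)):
            # Map sensor-space joint positions onto avatar-space
            # coordinates. A real system would also retarget rotations
            # and filter jitter; this shows only the mapping idea.
            ox, oy, oz = offset
            return {name: (x * scale + ox, y * scale + oy, z * scale + oz)
                    for name, (x, y, z) in kinect_joints.items()}

        frame = {"head": (0.1, 1.6, 2.0), "hand_right": (0.4, 1.1, 1.8)}
        avatar_pose = map_skeleton_to_avatar(frame, scale=0.9,
                                             offset=(0.0, 0.2, 0.0))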

    Sensing and mapping for interactive performance

    This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to provide an intuitive and non-intrusive interactive multimedia performance interface that offers users or performers real-time control of multimedia events through their physical movements. It is intended as a highly dynamic real-time performance tool that senses and tracks activities and changes in order to drive interactive multimedia performances. Starting from a straightforward definition of the TDM framework, the paper reports several implementations and multi-disciplinary collaborative projects built on it, including a motion- and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses the mapping strategies used in each context. Plausible future directions for the framework, including stage augmentation and virtual and augmented reality, which involve sensing and mapping physical and non-physical changes onto multimedia control events, are also discussed.
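    One way to picture a trans-domain mapping is as a rule table pairing predicates over sensed activity with multimedia control actions. The Python sketch below is a hypothetical reading of that idea; the class and rule format are assumptions, not the framework's actual API.

        class TransDomainMapper:
            # Hypothetical rule table: each rule pairs a predicate over
            # sensed activity with a multimedia action to fire on a match.
            def __init__(self):
                self._rules = []

            def add_rule(self, predicate, action):
                self._rules.append((predicate, action))

            def process(self, reading):
                for predicate, action in self._rules:
                    if predicate(reading):
                        action(reading)

        mapper = TransDomainMapper()
        # Map fast physical movement onto a musical trigger.
        mapper.add_rule(lambda r: r["motion"] > 0.8,
                        lambda r: print("trigger note, velocity", r["motion"]))
        mapper.process({"motion": 0.93})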

    Kinect-ed Piano

    We describe a gesturally-controlled improvisation system for an experimental pianist, developed over several laboratory sessions and used during a performance [1] at the 2011 Conference on New Interfaces for Musical Expression (NIME). We discuss the architecture, performative advantages, and limitations of the system, and reflect on the lessons learned throughout its development. KEYWORDS: piano; improvisation; gesture recognition; machine learning.
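    The abstract does not detail the recognition pipeline; a nearest-template classifier, as in the hypothetical Python sketch below, is one simple way gesture features could be mapped to improvisation controls. The feature vectors and gesture names are invented for illustration.

        import math

        def classify_gesture(features, templates):
            # Return the name of the template gesture nearest to the
            # observed feature vector (Euclidean distance). A stand-in,
            # not the system's actual machine-learning component.
            def dist(a, b):
                return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
            return min(templates, key=lambda name: dist(features, templates[name]))

        templates = {"lift": [0.9, 0.1], "sweep": [0.2, 0.8]}
        print(classify_gesture([0.85, 0.2], templates))  # -> lift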
    • 

    corecore