39 research outputs found

    Back To The Cross-Modal Object: A Look Back At Early Audiovisual Performance Through The Lens Of Objecthood

    This paper looks at two early digital audiovisual performance works, the solo work Overbow and the work of the group Sensors_Sonics_Sights (S.S.S), and describes the compositional and performance strategies behind each one. We draw upon the concept of audiovisual objecthood proposed by Kubovy and Schutz to think about the different ways in which linkages between vision and audition can be established, and how audio-visual objects can be composed from the specific attributes of auditory and visual perception. The model is used as a means to analyze these live audio-visual works performed using sensor-based instruments. The fact that gesture is both a visual component in these performances and the common source articulating sound and visual output extends the classical two-way audiovisual object into a three-way relationship between gesture, sound, and image, fulfilling a potential of cross-modal objects.
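    The three-way relationship described above can be suggested with a small illustration. The following is a minimal sketch, not taken from Overbow or S.S.S: it assumes a single normalised sensor reading and shows how one gesture signal can simultaneously drive a sonic and a visual attribute, so that sound and image share a common gestural source. All names and mapping curves are hypothetical.

```python
# Minimal sketch of a three-way gesture -> sound / image mapping.
# All names and mapping curves are illustrative assumptions,
# not taken from Overbow or S.S.S.

import math
from dataclasses import dataclass

@dataclass
class AudioVisualFrame:
    """One synchronous audio/visual state derived from a gesture sample."""
    pitch_hz: float      # sonic attribute driven by the gesture
    brightness: float    # visual attribute driven by the same gesture (0..1)

def map_gesture(sensor_value: float) -> AudioVisualFrame:
    """Map a normalised sensor reading (0..1) to coupled audio and visual
    parameters, so both modalities share the gesture as their common source."""
    sensor_value = max(0.0, min(1.0, sensor_value))
    pitch = 110.0 * math.pow(2.0, sensor_value * 3.0)   # ~3-octave sweep
    brightness = sensor_value ** 2                      # steeper perceptual ramp
    return AudioVisualFrame(pitch_hz=pitch, brightness=brightness)

if __name__ == "__main__":
    for v in (0.0, 0.5, 1.0):
        print(v, map_gesture(v))
```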

    NIME Identity from the Performer’s Perspective

    The term ‘NIME’ - New Interfaces for Musical Expression - has come to signify both technical and cultural characteristics. Not all new musical instruments are NIMEs, and not all NIMEs are defined as such merely for the ephemeral condition of being new. So, what are the typical characteristics of NIMEs and what are their roles in performers’ practice? Is there a typical NIME repertoire? This paper aims to address these questions with a bottom-up approach. We reflect on the answers of 78 NIME performers to an online questionnaire discussing their performance experience with NIMEs. The results of our investigation explore the role of NIMEs in the performers’ practice and identify the values that are common among performers. We find that most NIMEs are viewed as exploratory tools created by and for performers, and that they are constantly in development and almost never in a finished state. The findings of our survey also reflect upon virtuosity with NIMEs, whose peculiar performance practice results in learning trajectories that often do not lead to the development of virtuosity as it is commonly understood in traditional performance.

    On Mapping EEG Information into Music

    With the rise of ever-more affordable EEG equipment available to musicians, artists and researchers, designing and building a Brain-Computer Music Interface (BCMI) system has recently become a realistic achievement. This chapter discusses previous research in the fields of mapping, sonification and musification in the context of designing a BCMI system and will be of particular interest to those who seek to develop their own. Design of a BCMI requires unique considerations due to the characteristics of the EEG as a human interface device (HID). This chapter analyses traditional strategies for mapping control from brainwaves alongside previous research in biofeedback musical systems. Advances in music technology have helped provide more complex approaches with regard to how music can be affected and controlled by brainwaves. This, paralleled with developments in our understanding of brainwave activity, has helped push brain-computer music interfacing into innovative realms of real-time musical performance, composition and applications for music therapy.
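    As an illustration of the kind of mapping strategy such a chapter surveys, the sketch below scales the relative alpha-band power of a short EEG window onto a MIDI-style control value. The window length, band limits, channel handling and scaling are assumptions for illustration, not a method prescribed by the chapter.

```python
# Sketch of one common BCMI mapping strategy: relative alpha-band power from
# a raw EEG window is scaled onto a MIDI-style 0-127 control value.
# All parameters here are illustrative assumptions.

import numpy as np

def band_power(eeg_window: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Mean spectral power of `eeg_window` between `lo` and `hi` Hz."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[mask].mean())

def alpha_to_control(eeg_window: np.ndarray, fs: float = 256.0) -> int:
    """Map relative alpha power (8-12 Hz) to a 0-127 control value."""
    alpha = band_power(eeg_window, fs, 8.0, 12.0)
    total = band_power(eeg_window, fs, 1.0, 40.0)
    rel = alpha / total if total > 0 else 0.0
    return int(round(min(1.0, rel) * 127))

if __name__ == "__main__":
    # Synthetic 2-second window: a 10 Hz "alpha" oscillation plus noise.
    fs = 256.0
    t = np.arange(0, 2.0, 1.0 / fs)
    fake_eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(len(t))
    print(alpha_to_control(fake_eeg, fs))
```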

    BCI for Music Making: Then, Now, and Next

    Brain–computer music interfacing (BCMI) is a growing field with a history of experimental applications derived from the cutting edge of BCI research as adapted to music making and performance. BCMI offers some unique possibilities over traditional music making, including applications for emotional music selection and emotionally driven music creation for individuals as communicative aids (either in cases where users might have physical or mental disabilities that otherwise preclude them from taking part in music making, or in music therapy cases where emotional communication between a therapist and a patient by means of traditional music making might otherwise be impossible). This chapter presents an overview of BCMI and its uses in such contexts, including existing techniques as they are adapted to musical control, from P300 and SSVEP (steady-state visually evoked potential) in EEG (electroencephalogram) to asymmetry, hybrid systems, and joint fMRI (functional magnetic resonance imaging) studies correlating affective induction (by means of music) with neurophysiological cues. Some suggestions for further work are also volunteered, including the development of collaborative platforms for music performance by means of BCMI.
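    To make the asymmetry approach mentioned above more tangible, here is a hedged sketch in which a frontal alpha-asymmetry index is thresholded to choose between two playlists for emotional music selection. The channel pairing, band limits and decision rule are illustrative assumptions rather than the chapter's specification.

```python
# Illustrative sketch of an asymmetry-based selection rule: a frontal
# alpha-asymmetry index (ln(right) - ln(left)) picks between two playlists.
# Channels, band limits and the threshold are assumptions for illustration.

import numpy as np

def alpha_power(signal: np.ndarray, fs: float) -> float:
    """Alpha-band (8-12 Hz) power of one EEG channel window."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return float(spectrum[(freqs >= 8.0) & (freqs <= 12.0)].sum())

def select_music(left_frontal: np.ndarray, right_frontal: np.ndarray,
                 fs: float = 256.0) -> str:
    """Crude valence estimate from frontal alpha asymmetry; positive values
    are treated here as higher valence and mapped to an upbeat playlist."""
    asym = (np.log(alpha_power(right_frontal, fs) + 1e-12)
            - np.log(alpha_power(left_frontal, fs) + 1e-12))
    return "upbeat_playlist" if asym > 0 else "calming_playlist"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.standard_normal(512)
    right = rng.standard_normal(512)
    print(select_music(left, right))
```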

    Cinema Fabriqué : a gestural environment for realtime video performance

    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2003. Includes bibliographical references (p. [107]-[108]). This thesis presents an environment that enables a single person to improvise video and audio programming in real time through gesture control. The goal of this system is to provide the means to compose and edit video stories for a live audience with an interface that is exposed and engaging to watch. Many of the software packages used today for realtime audio-visual performance were not built with this use in mind, and have been repurposed or modified with plug-ins to meet the performer's needs. Also, these applications are typically controlled by standard keyboard, mouse, or MIDI inputs, which were not designed for precise video control or live spectacle. As an alternative I built a system called Cinema Fabriqué which integrates video editing and effects software and hand gesture tracking methods into a single system for audio-visual performance. By Justin Manor. S.M.
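    The kind of gesture-to-video mapping the thesis describes can be suggested with a small sketch. This is not the Cinema Fabriqué implementation, only an assumed mapping from normalised hand-tracking coordinates to two hypothetical video parameters.

```python
# A minimal sketch (not the thesis system) of gesture-driven video control:
# normalised hand-tracking coordinates drive playback speed and effect mix.
# Parameter names and ranges are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class VideoParams:
    playback_speed: float   # 0.25x .. 4x
    effect_mix: float       # dry/wet amount, 0..1

def hand_to_video(x: float, y: float) -> VideoParams:
    """Map a tracked hand position (x, y in 0..1 screen space) to video
    controls: horizontal position scrubs speed, vertical sets effect mix."""
    x = max(0.0, min(1.0, x))
    y = max(0.0, min(1.0, y))
    speed = 0.25 * (4.0 / 0.25) ** x   # exponential sweep from 0.25x to 4x
    return VideoParams(playback_speed=speed, effect_mix=y)

if __name__ == "__main__":
    print(hand_to_video(0.5, 0.8))
```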

    The Murray State News, October 20, 1989
