
    The Body as Musical Instrument

    This chapter explores the possibility of thinking of the human body as a musical instrument. It builds on the philosophy of phenomenology to discuss body schemata that might be considered "instrumental" and draws on the diversity of bodies proposed by body theory to consider the incorporation of digital technology. Concepts of embodied interaction from the scientific field of human-computer interaction are discussed with an eye toward musical application. The history of gestural musical instruments is presented, from the Theremin to instruments from the STEIM studio. The text then focuses on the use of physiological signals to create music, from historical works by Lucier and Rosenboom to recent performances by the authors. The body as musical instrument is discussed in a dynamic of coadaptation between performer and instrument in different configurations of body and technology.

    Interaction Design for Digital Musical Instruments

    The thesis aims to elucidate the process of designing interactive systems for musical performance that combine software and hardware in an intuitive and elegant fashion. The original contribution to knowledge consists of: (1) a critical assessment of recent trends in digital musical instrument design, (2) a descriptive model of interaction design for the digital musician, and (3) a highly customisable multi-touch performance system that was designed in accordance with the model. Digital musical instruments are composed of a separate control interface and a sound generation system that exchange information. When designing the way in which a digital musical instrument responds to the actions of a performer, we are creating a layer of interactive behaviour that is abstracted from the physical controls. Often, the structure of this layer depends heavily upon: (1) the accepted design conventions of the hardware in use, (2) established musical systems, acoustic or digital, and (3) the physical configuration of the hardware devices and the grouping of controls that such configuration suggests. This thesis proposes an alternative way to approach the design of digital musical instrument behaviour: examining the implicit characteristics of its composite devices. When we separate the conversational ability of a particular sensor type from its hardware body, we can look in a new way at the actual communication tools at the heart of the device. We can subsequently combine these separate pieces using a series of generic interaction strategies in order to create rich interactive experiences that are not immediately obvious or directly inspired by the physical properties of the hardware. This research ultimately aims to enhance and clarify the existing toolkit of interaction design for the digital musician.
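
    As an illustration of the abstraction described above, the sketch below is a hypothetical Python example with invented names, not the system built in the thesis: sensor streams are treated as hardware-agnostic, normalised signals and are combined through generic strategies before reaching synthesis parameters.

```python
# Illustrative only: a hardware-agnostic "interaction layer" in the spirit of the
# thesis, where every sensor is reduced to a normalised signal and generic
# strategies combine those signals into synthesis parameters. All names invented.

class SensorStream:
    """A control signal detached from its hardware body, normalised to 0.0-1.0."""
    def __init__(self, name: str):
        self.name = name
        self.value = 0.0

    def update(self, raw: float, lo: float, hi: float) -> None:
        # Normalise the raw reading into the 0-1 range every strategy expects.
        self.value = min(1.0, max(0.0, (raw - lo) / (hi - lo)))

def scale(stream: SensorStream, out_lo: float, out_hi: float) -> float:
    """Generic strategy: linear map of one stream onto a parameter range."""
    return out_lo + stream.value * (out_hi - out_lo)

def crossfade(a: SensorStream, b: SensorStream, mix: SensorStream) -> float:
    """Generic strategy: blend two streams under the control of a third."""
    return a.value * (1.0 - mix.value) + b.value * mix.value

# Usage: pressure and tilt, whatever hardware they come from, jointly drive a
# filter cutoff, while tilt alone sets the reverb mix.
pressure, tilt, blend = SensorStream("pressure"), SensorStream("tilt"), SensorStream("blend")
pressure.update(612, 0, 1023)
tilt.update(-0.2, -1.0, 1.0)
blend.update(0.7, 0.0, 1.0)
cutoff_hz = 200 + crossfade(pressure, tilt, blend) * (8000 - 200)
reverb_mix = scale(tilt, 0.0, 0.6)
print(f"cutoff: {cutoff_hz:.0f} Hz, reverb: {reverb_mix:.2f}")
```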

    Sensor-rich real-time adaptive gesture and affordance learning platform for electronic music control

    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2004. Includes bibliographical references (p. [151]-156). Acoustic musical instruments have traditionally featured static mappings from input gesture to output sound, their input affordances being tied to the physics of their sound-production mechanism. More recently, the advent of digital sound synthesizers and electronic music controllers has abolished the tight coupling between input gesture and resultant sound, making an exponentially large range of input-to-output mappings possible, as well as an infinite set of possible timbres. This revolutionary change in the way sound can be produced and controlled brings with it the burden of design: compelling and natural mappings from gesture to sound must now be created in order to create a playable electronic music instrument. The goal of this thesis is to present a device that allows flexible assignment of input gesture to output sound, acting as a laboratory to help further understanding of the connection from gesture to sound. An embodied multi-degree-of-freedom gestural input device was constructed. The device was built to support six-degree-of-freedom inertial sensing, five isometric buttons, two digital buttons, two-axis bend sensing, isometric rotation sensing, and isotonic electric field sensing of position. Software was written to handle the incoming serial data and to implement a trainable interface by which a user can explore the sounds possible with the device, associate a custom inertial gesture with a sound for later playback, make custom input degree-of-freedom (DOF) to effect modulation mappings, and play with the resulting configuration. A user study with 25 subjects was run to evaluate the system in terms of its engagingness, enjoyability, ability to inspire interest in future play and performance, ease of gesturing, and novelty. In addition to these subjective measures, implicit data was collected about the types of gesture-to-sound and input-DOF-to-effect mappings that the subjects created. Favorable and interesting results were found in the data from the study, indicating that a flexible trainable musical instrument is not only a compelling performance tool, but also a useful laboratory for understanding the connection between human gesture and sound. By Jeffrey Merrill. S.M.
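
    The sketch below is a minimal, assumed illustration of the trainable gesture-to-sound idea described above, not Merrill's implementation: recorded inertial traces are stored as labelled templates and later gestures are matched to the nearest one.

```python
# A toy "trainable interface": record an inertial gesture, label it with a sound,
# and match later gestures to the nearest stored template. A resampled Euclidean
# distance stands in for whatever recogniser the actual system used.

import numpy as np

def resample(trace: np.ndarray, length: int = 32) -> np.ndarray:
    """Resample a (time, channels) inertial trace to a fixed number of frames."""
    t_old = np.linspace(0.0, 1.0, len(trace))
    t_new = np.linspace(0.0, 1.0, length)
    return np.stack([np.interp(t_new, t_old, trace[:, c]) for c in range(trace.shape[1])], axis=1)

class GestureTrainer:
    def __init__(self):
        self.templates = []                       # list of (resampled trace, sound name)

    def train(self, trace: np.ndarray, sound: str) -> None:
        self.templates.append((resample(trace), sound))

    def recognise(self, trace: np.ndarray) -> str:
        query = resample(trace)
        dists = [np.linalg.norm(query - tmpl) for tmpl, _ in self.templates]
        return self.templates[int(np.argmin(dists))][1]

# Record two fake 6-DOF gestures (three accelerometer + three gyro channels),
# then "play back" by performing a noisy version of the first gesture.
rng = np.random.default_rng(0)
shake = rng.normal(0, 1, (50, 6)).cumsum(axis=0)
twist = rng.normal(0, 1, (40, 6)).cumsum(axis=0)
trainer = GestureTrainer()
trainer.train(shake, "snare_sample")
trainer.train(twist, "pad_chord")
print(trainer.recognise(shake + rng.normal(0, 0.1, shake.shape)))   # expected: snare_sample
```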

    Advancing performability in playable media: a simulation-based interface as a dynamic score

    When designing playable media with a non-game orientation, alternative play scenarios to gameplay scenarios must be accompanied by alternative mechanics to game mechanics. The problems of designing playable media with a non-game orientation are stated as the problems of designing a platform for creative explorations and creative expressions. For such design problems, two requirements are articulated: 1) play state transitions must be dynamic in non-trivial ways in order to achieve a significant level of engagement, and 2) pathways for players' experience from exploration to expression must be provided. The transformative pathway from creative exploration to creative expression is analogous to the pathways for game players' skill acquisition in gameplay. The paper first describes a concept of simulation-based interface, and then binds that concept with the concept of dynamic score. The former partially accounts for the first requirement, the latter for the second. The paper describes the prototype and realization of the two concepts' binding. "Score" is here defined as a representation of cue organization through a transmodal abstraction. A simulation-based interface is presented with swarm mechanics, and its function as a dynamic score is demonstrated with an interactive musical composition and performance.
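
    The following sketch is a rough, hypothetical illustration of a swarm acting as a dynamic score, not the paper's prototype: agent positions evolve under simple flocking-style rules and aggregate properties of the swarm are read out as musical cues.

```python
# A toy swarm whose aggregate state is read out as musical cues: not the paper's
# prototype, just an illustration of "simulation-based interface as dynamic score".

import numpy as np

def step(pos, vel, cohesion=0.01, jitter=0.05):
    """One swarm update: drift toward the centroid plus a little random motion."""
    centroid = pos.mean(axis=0)
    vel = vel + cohesion * (centroid - pos) + jitter * np.random.randn(*pos.shape)
    return pos + vel, vel * 0.95                  # mild damping keeps velocities bounded

def score_cues(pos):
    """Transmodal readout: centroid height becomes pitch, spatial spread becomes density."""
    pitch = 48 + int(np.clip(pos[:, 1].mean(), 0.0, 1.0) * 24)   # MIDI-style pitch
    density = float(np.clip(pos.std() * 2.0, 0.0, 1.0))          # 0 = sparse, 1 = dense
    return {"pitch": pitch, "density": round(density, 2)}

pos = np.random.rand(30, 2)                       # 30 agents in a unit square
vel = np.zeros_like(pos)
for _ in range(100):                              # let the simulation evolve...
    pos, vel = step(pos, vel)
print(score_cues(pos))                            # ...and read the current cues
```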

    Interaction and the Art of User-Centered Digital Musical Instrument Design

    This thesis documents the formulation of a research-based practice in multimedia art, technology and digital musical instrument design. The primary goal of my research was to investigate the principles and methodologies involved in the structural design of new interactive digital musical instruments aimed at performance by members of the general public, and to identify ways that the design process could be optimized to increase user adoption of these new instruments. The research was performed over three years and moved between studies at the University of Maine, internships in New York, and specialized research at the Input Devices and Music Interaction Laboratory at McGill University. My work is presented in two sections. The first covers early studies in user interaction and exploratory works in web and visual design, sound art, installation, and music performance. While not specifically tied to the research topic of user adoption of digital musical instruments, this work serves as the conceptual and technical background for the dedicated work to follow. The second section is dedicated to focused research on digital musical instrument design through two major projects carried out as a Graduate Research Trainee at McGill University. The first was the design and prototyping of the Noisebox, a new digital musical instrument. The purpose of this project was to learn the various stages of instrument design through practical application. A working prototype has been presented and tested, and a second version is currently being built. The second project was a user study that surveyed musicians about digital musical instrument use. It asked questions about background, instrument choice, music styles played, and experiences with and attitudes towards new digital musical instruments. Based on the results of the two research projects, a model of digital musical instrument design is proposed that adopts a user-centered focus, soliciting user input and feedback throughout the design process from conception to final testing. This approach aims to narrow the gap between the conceptual design of new instruments and technologies and the actual musicians who would use them.

    The composer as technologist: an investigation into compositional process

    This work presents an investigation into compositional process. It is undertaken through a study of musical gesture, certain areas of cognitive musicology, computer vision technologies and object-oriented programming, which provides the basis for the composer (the author) to assume the role of a technologist and acquire knowledge and skills to that end. In particular, it focuses on the development of a video gesture recognition heuristic and its application to the compositional problems posed. The result is the creation of an interactive musical work with score for violin and electronics that supports the research findings. In addition, the investigative approach to developing technology to solve musical problems, exploring practical compositional and aesthetic challenges, is detailed.
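
    By way of illustration only, the sketch below shows a generic frame-differencing heuristic of the kind this research area relies on; the thesis's actual video gesture recognition heuristic and its mapping into the violin-and-electronics work are not reproduced here.

```python
# Generic frame-differencing, for illustration only: the fraction of changed pixels
# between video frames stands in for the thesis's gesture recognition heuristic.
# Synthetic frames are used so the sketch runs without a camera.

import numpy as np

def motion_amount(prev: np.ndarray, curr: np.ndarray, threshold: int = 25) -> float:
    """Fraction of pixels whose brightness changed by more than `threshold`."""
    diff = np.abs(curr.astype(int) - prev.astype(int))
    return float((diff > threshold).mean())

def gesture_to_event(motion: float) -> str:
    """Map the amount of motion onto a coarse compositional decision."""
    if motion > 0.10:
        return "trigger electronics cue"
    if motion > 0.02:
        return "modulate violin processing"
    return "hold"

# Two fake 8-bit grayscale frames; the second contains a bright moving region.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[40:80, 60:120] = 200
print(gesture_to_event(motion_amount(prev, curr)))   # expected: trigger electronics cue
```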

    FLOW. INTERACTIVE SONIC ART: THE CREATION AND USE OF RESPONSIVE STRATEGIES TO RE-IMAGINE THE PERFORMER/SPECTATOR RELATIONSHIP AND CREATE VISITOR INCLUSIVE SONIC ENVIRONMENTS.

    FLOW operates on two levels, firstly as an engaging live performance environment and secondly as a vehicle to discuss a number of philosophical ideas relating to sound as art. As a performance piece, FLOW exists to provide an inclusive interactive environment for musicians and casual visitors alike. A series of sensors allows those who enter the arena to make interventions in an immersive soundscape through their movements, opening up possibilities for the exploration of sound and gestural action within the space. The piece challenges the conventional roles of performer and spectator and offers interactive technology as a means of uniting the two. The artist creates a re-imagination of the performance paradigm based on active engagement rather than passive observance, through the establishment of a circular discourse between human and computer. The following paper will also examine the nature of sound as art, suggesting that the poststructural ideas of Derrida and Deleuze and Guattari can be used as a conduit to define sonic emergences and morphologies within a human/computer discourse, both in terms of timbral nature and spatial diffusion. Central to this is the concept that the relationship between human and machine in interactive sonic art is one of energy transfer, from organic fluidity to digital regulation and back to energy in the form of processed sound, according to the processes put in place. This leads into a final discussion of the nature of experimental compositional process, the choice between the determinate and the stochastic, and the compromises between these that may need to be made to retain artistic coherence.

    The cyber-guitar system: a study in technologically enabled performance practice

    A thesis submitted to the Faculty of Humanities, University of the Witwatersrand, in fulfilment of the requirements for the degree of Doctor of Philosophy, March 2017. This thesis documents the development and realisation of an augmented instrument, expressed through the processes of artistic practice as research. The research project set out to extend my own creative practice on the guitar by technologically enabling and extending the instrument. This process was supported by a number of creative outcomes (performances, compositions and recordings), running parallel to the interrogation of theoretical areas emerging out of the research. In the introduction I present a timeline for the project and situate the work in the field of artistic practice as research, explaining the relationship between the traditional and creative practices. Following on from this, chapter one, Notation, Improvisation and the Cyber-Guitar System, discusses the impact of notation on my own education as a musician, unpacking how the nature of notation impacted on improvisation both historically and within my own creative work. Analysis of fields such as graphic notation led to the creation of the composition Hymnus Caesus Obcessiones, a central work in this research. In chapter two, Noise, Music and the Creative Boundary, I consider the boundary and relationship between noise and music, beginning with the futurist composer Luigi Russolo. The construction of the augmented instrument was informed by this boundary and aimed to bring it into focus in my own practice, recognising what I have termed the ephemeral noise boundary. I argue that the boundary line between them yields the most fertile place of sonic and technological engagement. Chapter three focuses on the instrumental development and a new understanding of organology. It locates an understanding of the position of the musical instrument historically with reference to the values emerging from the studies of notation and noise. It also considers the impacts of technology and gestural interfacing. Chapter four documents the physical process of designing and building the guitar. Included in the Appendix are three CDs and a live DVD of the various performances undertaken across the years of research.

    Interactive Spaces: Model for Motion-based Music Applications

    With the extensive utilization of touch screens, smartphones and various reactive surfaces, reality-based and intuitive interaction styles have now become customary. The employment of larger interactive areas, like floors or peripersonal three-dimensional spaces, further increases the reality-based interaction affordances, allowing full-body involvement and the development of a co-located, shared user experience. Embodied and spatial cognition play a fundamental role for interaction in this kind of space, where users act in reality with no device in their hands and obtain audio and graphical output depending on their movements. Starting from the early experiments of Myron Krueger in 1971, responsive floors have been developed through various technologies, including sensorized tiles and computer vision systems, to be employed in learning environments, entertainment, games and rehabilitation. Responsive floors allow the spatial representation of concepts and for this reason are suitable for immediate communication and engagement. As many musical features have meaningful spatial representations, they can easily be reproduced in the physical space through a conceptual blending approach and be made available to a great number of users. This is the key idea for the design of the original music applications presented in this thesis. The applications, devoted to music learning, production and active listening, introduce a novel creative approach to music, which can further be assumed as a general paradigm for the design of motion-based learning environments. Application assessment with upper elementary and high school students has shown that user engagement and bodily interaction have a high learning power, which can be a valid resource for deeper music knowledge and more creative learning processes. Although further interface tests showed that touch screen interaction performs better than full-body interaction, some important guidelines for the design of reactive floor applications have been obtained on the basis of these test results. Moreover, the conceptual framework developed for the design of music applications can represent a valid paradigm also in the general field of human-computer interaction.
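
    As a toy example of the conceptual blending described above, the sketch below maps a position on a responsive floor to pitch and loudness; the ranges and the mapping itself are assumptions for illustration, not the thesis's applications.

```python
# A made-up example of blending spatial and musical features: a visitor's position
# on a responsive floor (in metres) becomes a pitch (left-right) and a loudness
# (near-far). The scale, ranges and mapping are assumptions, not the thesis's apps.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]        # one octave of MIDI note numbers

def floor_to_note(x: float, y: float, width: float = 4.0, depth: float = 3.0):
    """Map a floor position to a (MIDI note, velocity) pair."""
    degree = min(len(C_MAJOR) - 1, int(x / width * len(C_MAJOR)))
    velocity = min(127, int(30 + (y / depth) * 90))   # walking forward plays louder
    return C_MAJOR[degree], velocity

# A visitor walking diagonally across the floor produces an ascending, swelling line.
for step in range(5):
    x, y = step * 0.8, step * 0.6
    print(floor_to_note(x, y))
```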

    Multiparametric interfaces for fine-grained control of digital music

    Digital technology provides a very powerful medium for musical creativity, and the way in which we interface and interact with computers has a huge bearing on our ability to realise our artistic aims. The standard input devices available for the control of digital music tools tend to afford a low quality of embodied control; they fail to realise our innate expressiveness and dexterity of motion. This thesis looks at ways of capturing more detailed and subtle motion for the control of computer music tools; it examines how this motion can be used to control music software, and evaluates musicians' experience of using these systems. Two new musical controllers were created, based on a multiparametric paradigm where multiple, continuous, concurrent motion data streams are mapped to the control of musical parameters. The first controller, Phalanger, is a markerless video tracking system that enables the use of hand and finger motion for musical control. EchoFoam, the second system, is a malleable controller, operated through the manipulation of conductive foam. Both systems use machine learning techniques at the core of their functionality. These controllers are front ends to RECZ, a high-level mapping tool for multiparametric data streams. The development of these systems and the evaluation of musicians' experience of their use constructs a detailed picture of multiparametric musical control. This work contributes to the developing intersection between the fields of computer music and human-computer interaction. The principal contributions are the two new musical controllers, and a set of guidelines for the design and use of multiparametric interfaces for the control of digital music. This work also acts as a case study of the application of HCI user experience evaluation methodology to musical interfaces. The results highlight important themes concerning multiparametric musical control. These include the use of metaphor and imagery, choreography and language creation, individual differences and uncontrol. They highlight how this style of interface can fit into the creative process, and advocate a pluralistic approach to the control of digital music tools where different input devices fit different creative scenarios.
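
    The sketch below is not Phalanger, EchoFoam or RECZ; it is a toy illustration, with invented numbers, of a multiparametric mapping with a learned core: a few controller poses are paired with synthesis presets, and a least-squares linear map interpolates between them as the performer moves.

```python
# A toy multiparametric mapping with a learned core: controller "poses" (rows of
# continuous sensor values) are paired with synthesis presets, and a least-squares
# linear map interpolates between them as the performer moves. Numbers are invented.

import numpy as np

# Training examples: 5 concurrent control channels in, 3 synth parameters out
# (say cutoff, grain size, reverb mix), everything normalised to 0-1.
poses = np.array([[0.1, 0.2, 0.1, 0.0, 0.3],
                  [0.9, 0.8, 0.7, 0.9, 0.6],
                  [0.5, 0.1, 0.9, 0.4, 0.2]])
presets = np.array([[0.2, 0.1, 0.8],
                    [0.9, 0.7, 0.1],
                    [0.5, 0.9, 0.4]])

# Fit presets ~ poses @ W (with a bias column) by ordinary least squares.
X = np.hstack([poses, np.ones((len(poses), 1))])
W, *_ = np.linalg.lstsq(X, presets, rcond=None)

def map_pose(reading: np.ndarray) -> np.ndarray:
    """Concurrent continuous streams in, concurrent synthesis parameters out."""
    return np.clip(np.append(reading, 1.0) @ W, 0.0, 1.0)

print(map_pose(np.array([0.3, 0.3, 0.3, 0.3, 0.3])))   # a pose between the trained ones
```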