
    An Introduction to Interactive Music for Percussion and Computers

    Composers began combining acoustic performers with electronically produced sounds in the early twentieth century. As computer processing power increased, so did the potential for significant musical communication. Despite the body of research concerning electronic music, performing a composition with a computer partner remains intimidating for performers. The purpose of this paper is to provide an introductory method for interacting with a computer. This document first traces the parallel evolution of percussion and electronics to reveal how each medium influenced the other. The following section defines interaction and explains how it applies to musical communication between humans and computers. The next section introduces specific techniques used to cultivate human-computer interaction. The roles of performer, instrument, composer, and conductor are then defined as they apply to the human performer and the computer. Performers who are aware of these roles will develop richer communication that can enhance both the performer's and the audience's recognition of human-computer interaction. In the final section, works for percussion and computer are analyzed to reveal varying levels of interaction and the shifting roles of the performer. Three compositions illustrate this point: 120bpm from neither Anvil nor Pulley by Dan Trueman, It's Like the Nothing Never Was by Von Hansen, and Music for Snare Drum and Computer by Cort Lippe. These three pieces form a continuum of increasing interaction, moving from interaction within a fully defined score, to improvisation with digital synthesis, to the manipulation of computerized compositional algorithms using performer input. The unique ways each composer creates interaction expose the vast possibilities for performing with interactive music systems.

    Towards a Better Understanding of Emotion Communication in Music: An Interactive Production Approach.

    It has been well established that composers and performers are able to encode certain emotional expressions in music, which in turn are decoded by listeners and, in general, successfully recognised. There is still much to discover, however, about how musical cues combine to shape different emotions in music, since previous literature has tended to focus on a limited number of cues and emotional expressions. The work in this thesis investigates how combinations of tempo, articulation, pitch, dynamics, brightness, mode, and, later, instrumentation are used to shape sadness, joy, calmness, anger, fear, power, and surprise in Western tonal music. In addition, new tools for music and emotion research are presented with the aim of providing an efficient production approach for exploring a large cue-emotion space in a relatively short time. To this end, a new interactive interface called EmoteControl was created, which allows users to alter musical pieces in real time through the available cues. Moreover, musical pieces were composed specifically to be used as stimuli. Empirical experiments were then carried out with the interface to determine how participants shaped different emotions in the pieces using the available cues. Specific cue combinations for the different emotions were produced. Findings revealed that, overall, mode and tempo were the strongest contributors to the conveyed emotion, whilst brightness was the least effective cue. However, the importance of the cues varied depending on the intended emotion. Finally, a comparative evaluation of production and traditional approaches showed that similar results may be obtained with both, though the production approach allowed a larger cue-emotion space to be navigated in a shorter time. In sum, the production approach allowed participants to show us directly how they think emotional expressions should sound and how they are shaped in music.