
    Physical Interactions with Digital Strings - A hybrid approach to a digital keyboard instrument

    A new hybrid approach to digital keyboard playing is presented, where the actual acoustic sounds from a digital keyboard are captured with contact microphones and applied as excitation signals to a digital model of a prepared piano, i.e., an extended wave-guide model of strings with the possibility of stopping and muting the strings at arbitrary positions. The parameters of the string model are controlled through TouchKeys multitouch sensors on each key, combined with MIDI data and acoustic signals from the digital keyboard frame, using a novel mapping. The instrument is evaluated from a performing musician's perspective, and emerging playing techniques are discussed. Since the instrument is a hybrid acoustic-digital system with several feedback paths between the domains, it provides for expressive and dynamic playing, with qualities approaching those of an acoustic instrument, yet with new kinds of control. The contributions are two-fold. First, the use of acoustic sounds from a physical keyboard for excitations and resonances results in a novel hybrid keyboard instrument in itself. Second, the digital model of "inside piano" playing, using multitouch keyboard data, allows for performance techniques going far beyond conventional keyboard playing.
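    The abstract does not give implementation details, but the core idea (an external acoustic excitation fed into an extended waveguide string model, with adjustable stopping and muting) can be sketched roughly as below. This is a minimal Karplus-Strong style approximation under our own assumptions; the function and parameter names (waveguide_string, stop_position, mute) are illustrative and not taken from the paper.

```python
# Minimal sketch, not the authors' implementation: a single digital waveguide
# string driven by an external excitation buffer (standing in for the
# contact-microphone signal), with adjustable stop position and muting.
import numpy as np

def waveguide_string(excitation, sample_rate=44100, f0=220.0,
                     stop_position=1.0, mute=0.0, feedback=0.995):
    """Run an excitation signal through a Karplus-Strong style string.

    stop_position in (0, 1]: fraction of the string left vibrating, so
    "stopping" the string closer to the bridge shortens the delay line.
    mute in [0, 1): extra loop damping, emulating a muted string.
    """
    delay_len = max(2, int(sample_rate / f0 * stop_position))
    delay = np.zeros(delay_len)
    out = np.zeros(len(excitation))
    loop_gain = feedback * (1.0 - mute)
    idx, prev = 0, 0.0
    for n, x in enumerate(excitation):
        y = delay[idx]
        # Averaging low-pass filter in the feedback loop models string losses.
        delay[idx] = x + loop_gain * 0.5 * (y + prev)
        prev = y
        out[n] = y
        idx = (idx + 1) % delay_len
    return out

# Example: excite the string with a short noise burst followed by silence.
if __name__ == "__main__":
    burst = np.concatenate([np.random.uniform(-1, 1, 200), np.zeros(44100)])
    audio = waveguide_string(burst, f0=110.0, stop_position=0.5, mute=0.1)
```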

    The creative act of live coding practice in music performance

    Live coding is the creative act of interactive code evaluations and online multimodal assessments. In the context of music performance, novel code evaluations become part of the running program and are interrelated with acoustic sounds. The ability of performers and audiences to experience these novel auditory percepts may involuntarily engage our attention. In this study, we discuss how live coding is related to auditory and motor perception and how gestural interactions may influence musical algorithmic structures. Furthermore, we examine how musical live coding practices may bring forth emergent qualities of musical gestures on potentially equivalent systems. The main contribution of this study is a preliminary conceptual framework for the evaluation of live coding systems. We discuss several live coding systems which exhibit broad variations on the proposed dimensional framework, and two cases which go beyond the expressive capacity of the framework.

    An analytical framework for musical live coding systems based on gestural interactions in performance practices

    Gestural interaction in live coding performance is still in its infancy, despite the long tradition of music performance studies. Computational challenges in musical live coding have been motivating the research community to develop novel programming languages and interfaces. On the other hand, given the maturity of many music systems, there is an increasing demand for theory building on live coding systems and practices. Here, we present an observational study based on videos of live performances available online, and we introduce an analytical framework for live coding music systems. We begin by examining how performance practices differ on potentially equivalent systems. The spotlight of the framework is on gestural interactions, viewed through the prism of music psychology and perception. We examined several systems with respect to three main processes: (i) interface design, (ii) gestural mapping and (iii) user's interaction. These processes are presented as an orthogonal three-dimensional framework, so as to facilitate visualization and readers' understanding. Preliminary assessments of the systems in question agree with ground-truth knowledge of the computational classification of the systems. Furthermore, we analyze a few notable systems that stretch the boundaries of our dimensional framework, indicating that more dimensions may be required. Finally, we discuss the analytical framework in relation to a higher-level description of live coding music performance, and we discuss future studies that may be conducted to assess the validity of this approach.
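    To make the three-dimensional framework concrete, here is a minimal sketch assuming each system is scored along the three processes named above; the 0-1 rating scale, the system names and the placements are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of placing live coding systems in the orthogonal
# three-dimensional framework (interface design, gestural mapping,
# user's interaction). Scores and system names are made up for illustration.
from dataclasses import dataclass

@dataclass
class LiveCodingSystem:
    name: str
    interface_design: float   # dimension (i)
    gestural_mapping: float   # dimension (ii)
    user_interaction: float   # dimension (iii)

    def as_point(self):
        """Coordinates of the system in the 3-D framework."""
        return (self.interface_design, self.gestural_mapping, self.user_interaction)

systems = [
    LiveCodingSystem("system_a", 0.8, 0.3, 0.6),
    LiveCodingSystem("system_b", 0.4, 0.9, 0.5),
]
for s in systems:
    print(s.name, s.as_point())
```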

    Circle Squared and Circle Keys: Performing on and with an unstable live algorithm for the Disklavier

    Two related versions of an unstable live algorithm for the Disklavier player piano are presented. The underlying generative feedback system consists of four virtual musicians, listening to each other in a circular configuration. There is no temporal form, and all parameters of the system are controlled by the performer through an intricate but direct mapping, in an attempt to combine the experienced musician's physical control of gesture and phrasing with the structural complexities and richness of generative music. In the first version, Circle Squared, the interface is an array of pressure sensors, and the performer performs on the system without participating directly, like a puppet master. In the second version, control parameters are derived directly from playing on the same piano that performs the output of the system. Here, the performer both plays with and on the system in an intricate dance with the unpredictable output of the unstable virtual ensemble. The underlying mapping strategies are presented, together with the structure of the generative system. Experiences from a series of performances are discussed, primarily from the perspective of the improvising musician.
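    As a rough illustration of the circular listening topology described above (and only that; the actual generative system is not specified in the abstract), the following sketch has four virtual players, each deriving its next pitch from the previous player's output, with a performer-controlled coupling parameter standing in for the direct mapping. All names are hypothetical.

```python
# Illustrative sketch (assumptions, not the authors' system): four virtual
# musicians in a circle, each "listening" to its neighbour's previous output.
import random

def circular_ensemble(steps=16, coupling=0.7, base_pitch=60):
    n_players = 4
    pitches = [base_pitch + random.randint(-5, 5) for _ in range(n_players)]
    score = []
    for _ in range(steps):
        new_pitches = []
        for i in range(n_players):
            heard = pitches[(i - 1) % n_players]   # neighbour in the circle
            drift = random.choice([-2, -1, 1, 2])  # the player's own impulse
            # Blend imitation of the neighbour with the player's own drift;
            # `coupling` stands in for a performer-controlled parameter.
            new_pitches.append(round(coupling * heard +
                                     (1 - coupling) * (pitches[i] + drift)))
        pitches = new_pitches
        score.append(list(pitches))
    return score  # one four-note "chord" per step, e.g. to send out as MIDI

if __name__ == "__main__":
    for chord in circular_ensemble(steps=8):
        print(chord)
```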

    Generative comics: a character evolution approach for creating fictional comics

    Comics can be a suitable form of representation for generative narrative. This paper provides an argument for this based on an analysis of the properties of the comics medium, and describes a tool for character design and comic strip creation that applies interactive evolution methods to characters in a virtual environment. The system is used to interactively create artificial characters with extreme personality traits inspired by well-known comics characters.

    Computational Systems for Music Improvisation

    Computational music systems that afford improvised creative interaction in real time are often designed for a specific improviser and performance style. As such, the field is diverse, fragmented and lacks a coherent framework. Through analysis of examples in the field we identify key areas of concern in the design of new systems, which we use as categories in the construction of a taxonomy. From our broad overview of the field we select significant examples to analyse in greater depth. This analysis serves to derive principles that may help designers scaffold their work on existing innovation. We explore successful evaluation techniques from other fields and describe how they may be applied to iterative design processes for improvisational systems. We hope that by developing a more coherent design and evaluation process, we can support the next generation of improvisational music systems.

    A MutaSynth in Parameter Space: Interactive Composition Through Evolution

    In this vision paper I will discuss a few questions concerning the use of generative processes in composition and automatic music creation. Why do I do it, and does it really work? I discuss the problems involved, focusing on the use of interactivity, and describe the use of interactive evolution as a way of introducing interactivity in composition. The installation MutaSynth is presented as an implementation of this idea.

    Creating and Exploring the Huge Space Called Sound: Interactive Evolution as a Composition Tool

    This paper introduces a program that applies the principles of interactive evolution, i.e., the process of repeatedly selecting preferred individuals from a population of genetically bred sound objects, to different synthesis and pattern generation algorithms. This allows for aural real-time exploration of complex sound spaces, and introduces the task of constructing sound engines and instruments customized for this kind of creation and exploration. Several such sound engines are presented together with sound examples and a discussion of compositional applications.
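    The selection-and-breeding loop described above can be sketched as follows, under the assumption that a sound object is encoded as a flat vector of normalized synthesis parameters; render and choose are placeholders for the sound engine and the listener's choice, and all names are illustrative rather than taken from the program itself.

```python
# Minimal sketch of interactive evolution over synthesis parameters.
import random

def mutate(genome, rate=0.1, amount=0.2):
    """Return a mutated copy of a parameter vector, genes clipped to [0, 1]."""
    return [min(1.0, max(0.0, g + random.uniform(-amount, amount)))
            if random.random() < rate else g
            for g in genome]

def breed_population(parent, size=9, **kwargs):
    """Keep the selected parent and surround it with mutated offspring."""
    return [parent] + [mutate(parent, **kwargs) for _ in range(size - 1)]

def interactive_evolution(render, choose, n_params=16, generations=10):
    """Repeatedly render a population, let the user pick one, and breed on."""
    parent = [random.random() for _ in range(n_params)]
    for _ in range(generations):
        population = breed_population(parent)
        sounds = [render(genome) for genome in population]  # audition candidates
        parent = population[choose(sounds)]                  # preferred individual
    return parent
```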