
    SPATIAL SWARM GRANULATION

    This paper presents an implementation for dynamic two- or three-dimensional spatial distribution of granulated sound (granular synthesis) over an arbitrary loudspeaker system.
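
    The abstract does not detail the paper's own panning method; as a minimal sketch of distributing grains over an arbitrary layout, the following assumes a distance-based amplitude panning (DBAP-like) rule and illustrative speaker coordinates, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Distance-based amplitude panning (DBAP-like) of grains over an
# arbitrary 2-D loudspeaker layout. Illustrative stand-in only.
speakers = np.array([[1.0, 1.0], [-1.0, 1.0], [-1.0, -1.0],
                     [1.0, -1.0], [0.0, 1.5]])        # any layout, any count

def grain_gains(grain_pos, rolloff=1.0):
    """Per-speaker gains for one grain at grain_pos; nearer speakers get more signal."""
    d = np.linalg.norm(speakers - grain_pos, axis=1) + 1e-3
    g = 1.0 / d ** rolloff
    return g / np.linalg.norm(g)                      # normalise to preserve power

# Each grain carries its own (possibly moving) position.
grain_positions = rng.uniform(-1.0, 1.0, (64, 2))
gains = np.array([grain_gains(p) for p in grain_positions])   # (n_grains, n_speakers)
```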

    Gestural control of sonic swarms: Composing with grouped sound objects

    This paper outlines an alternative controller designed to diffuse and manipulate a swarm of sounds in 3-dimensional space and discusses the compositional issues that emerge from its use. The system uses an algorithm from a nature-derived model describing the spatial behaviour of a swarm. The movement of the swarm is mapped in 3-dimensional space, and a series of sound transformation functions for the sonic agents are implemented. The notion of causal relationships between the spatial movement of the swarm and the sound transformation of the agents is explored by employing the physical controller as a performance, compositional and diffusion tool.
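
    The abstract names a nature-derived swarm model without specifying it; the sketch below assumes a boids-style model whose agents are steered by a gesture-controlled attractor and whose positions are then mapped to illustrative sound-transformation parameters. All names and mappings are assumptions, not the paper's.

```python
import numpy as np

# Minimal boids-style swarm in 3-D; agent positions are later mapped
# to sound-transformation parameters. Illustrative only.
class Swarm:
    def __init__(self, n_agents=16, seed=0):
        rng = np.random.default_rng(seed)
        self.pos = rng.uniform(-1, 1, (n_agents, 3))
        self.vel = rng.normal(0, 0.05, (n_agents, 3))

    def step(self, attractor, dt=0.05):
        cohesion = self.pos.mean(axis=0) - self.pos       # pull toward swarm centre
        to_attr = attractor - self.pos                    # pull toward gesture-controlled point
        # Separation: push away from near neighbours (inverse-square weighting).
        diff = self.pos[:, None, :] - self.pos[None, :, :]
        dist = np.linalg.norm(diff, axis=-1) + 1e-9
        separation = (diff / dist[..., None] ** 2).sum(axis=1)
        self.vel += dt * (0.5 * cohesion + 1.0 * to_attr + 0.05 * separation)
        self.vel *= 0.95                                  # damping keeps the swarm bounded
        self.pos += dt * self.vel

def to_sound_params(pos):
    """Map one agent position to hypothetical (azimuth, pitch, amplitude)."""
    x, y, z = pos
    azimuth = np.degrees(np.arctan2(y, x))   # spatial placement
    pitch = 440.0 * 2 ** z                   # height -> transposition
    amp = float(np.clip(1.0 - np.linalg.norm(pos) / 2, 0.0, 1.0))
    return azimuth, pitch, amp

swarm = Swarm()
for _ in range(100):
    swarm.step(attractor=np.array([0.3, -0.2, 0.5]))  # e.g. from a gestural controller
params = [to_sound_params(p) for p in swarm.pos]
```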

    Self-Organised Music

    Self-organisation, as manifested, for example, by swarms, flocks, herds and other collectives, is a powerful natural force, capable of generating large and sustained structures. Yet the individuals who participate in these social groups may not even be aware of the structures that they are creating. Almost certainly, these structures emerge through the application of simple, local interactions. Improvised music is an uncertain activity, characterised by a lack of top-down organisation and busy, local activity between improvisers. Emerging structures may only be perceivable at a (temporal) distance. The development of higher-level musical structure arises from interactions at lower levels, and we propose here that the self-organisation of social animals provides a very suggestive analogy. This paper builds a model of interactivity based on stigmergy, the process by which social insects communicate indirectly by environment modification. The improvisational element of our model arises from the dynamics of a particle swarm. A process called interpretation extracts musical parameters from the aural sound environment, and uses these parameters to place attractors in the environment of the swarm, after which stigmergy can take place. The particle positions are reinterpreted as parameterised audio events. This paper describes this model and two applications: Swarm Music and Swarm Granulator.
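
    As a rough illustration of the described pipeline (interpretation places attractors, the particle swarm responds, particle positions become audio events), the sketch below uses simplified stand-ins: pitch and loudness as the extracted parameters, a 2-D environment, and exponentially decaying attractor strengths as the stigmergic trace. The mappings are assumptions, not those of Swarm Music or Swarm Granulator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stigmergic environment: attractors deposited by "interpretation" of
# incoming audio features, decaying over time as the swarm responds.
attractors = []                                # list of [position, strength]
particles = rng.uniform(0, 1, (12, 2))
velocities = np.zeros_like(particles)

def interpret(pitch_hz, loudness):
    """Place an attractor whose position encodes the heard pitch/loudness.
    The mapping is an assumption, not the paper's."""
    pos = np.array([np.clip(np.log2(pitch_hz / 55.0) / 5.0, 0, 1),
                    np.clip(loudness, 0, 1)])
    attractors.append([pos, 1.0])

def step(dt=0.1):
    global attractors
    for i, p in enumerate(particles):
        force = np.zeros(2)
        for pos, strength in attractors:
            d = pos - p
            force += strength * d / (np.linalg.norm(d) + 1e-6)
        velocities[i] = 0.9 * velocities[i] + dt * force + rng.normal(0, 0.01, 2)
        particles[i] = np.clip(p + dt * velocities[i], 0, 1)
    # Older traces lose influence: the stigmergic medium decays.
    attractors = [[pos, s * 0.95] for pos, s in attractors if s > 0.05]

def render_events():
    """Reinterpret particle positions as (pitch, amplitude) audio events."""
    return [(55.0 * 2 ** (5 * x), y) for x, y in particles]

interpret(pitch_hz=220.0, loudness=0.6)   # a heard note places an attractor
for _ in range(50):
    step()
events = render_events()
```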

    Creative Computers, Improvisation and Intimacy

    Autonomous musical machine partners, live algorithms, are able to collaborate with human improvisers on an equal footing. Adaptability can be a significant factor in human/machine interaction in this context. Intimacy is an additional factor; intimacy might be achieved if human and machine performers can adapt to each other and learn from one another. Previously associated in computer music with ideas of embodiment and HCI, intimacy, as more widely understood, refers to the interpersonal process enjoyed between individuals, in which personal self-disclosure finds validation through a partner's response. Real intimacies are learned over time, not designed, and are based upon an evident reciprocity and emergent mutuality. In the context of musical expression, a social rather than a biological/technological discourse can be applied to live algorithms with a capacity for learning. This possibility is explored with reference to the author's various improvisation/composition systems, including au(or)a, piano_prosthesis, and oboe_prosthesis.

    Goldsmiths Electronic Music Studios: 40 Years

    This year marks the 40th anniversary of the founding of the Electronic Music Studios (EMS) at Goldsmiths, University of London. The 1968 studio placed Goldsmiths at the forefront of such developments in the UK university sector. 2008 also marks the launch of our EMS Research Group, which brings together a diverse range of interests and activities in computer music research, creative practice and music technology.

    Stimulating creative flow through computational feedback


    Clap-along: A negotiation strategy for creative musical interaction with computational systems

    This paper describes Clap-along, an interactive system for theorising about creativity in improvised musical performance. It explores the potential for negotiation between human and computer participants in a cyclical rhythmic duet. Negotiation is seen as one of a set of potential interactive strategies, but one that ensures the most equitable correspondence between human and machine. Through mutual negotiation (involving listening/feature extraction and adaptation) the two participants attempt to satisfy their own and each other's target outcome, without knowing the other's goal. Each iteration is evaluated by both participants and compared to their target. In this model of negotiation, we query the notion of 'flow' as an objective of creative human-computer collaboration. This investigation suggests the potential for sophisticated applications for real-time creative computational systems.
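
    The negotiation strategy can be caricatured as two agents that each hold a private target for the combined rhythmic result, evaluate each cycle against that target, and adapt their own contribution without seeing the other's goal. The sketch below is a minimal numerical stand-in, with feature vectors in place of extracted rhythm features; it is not the Clap-along implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two participants in a cyclical duet, each holding a private target for
# the *combined* result and adapting without seeing the other's goal.
N_FEATURES = 8   # stand-in for extracted rhythm features

class Participant:
    def __init__(self):
        self.target = rng.uniform(0, 1, N_FEATURES)       # private goal
        self.pattern = rng.uniform(0, 1, N_FEATURES)      # current contribution

    def evaluate(self, combined):
        """Distance of the heard combination from the private target."""
        return float(np.mean((combined - self.target) ** 2))

    def adapt(self, combined, rate=0.2):
        # Nudge own pattern so the combination drifts toward the private target.
        self.pattern = np.clip(self.pattern + rate * (self.target - combined), 0, 1)

human, machine = Participant(), Participant()
for cycle in range(40):
    combined = 0.5 * (human.pattern + machine.pattern)    # the cycle both participants hear
    errors = (human.evaluate(combined), machine.evaluate(combined))
    human.adapt(combined)
    machine.adapt(combined)
# Over repeated cycles the combination settles near a compromise between
# the two hidden targets, with neither goal ever disclosed.
```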

    Flow Fields and Agents for Immersive Interaction in Mutator VR: Vortex

    This paper discusses the challenges in creating Mutator VR: Vortex, a virtual reality experience based on interaction with semi-autonomous, organically-inspired agents. The work allows the immersant to morph between a vast number of procedurally-generated microworlds, each with its own visual elements, sounds, agent dynamics, and user interactions. We outline two methods used for procedural generation that are based fundamentally on integration of different modalities. Curve-based synthesis is used for simultaneous generation of entity sounds and shape, and flow grains are employed to determine both agent dynamics and user interaction with the agents.
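
    The abstract does not define "flow grains" precisely; the sketch below assumes one plausible reading of field-driven agent dynamics: a procedurally generated, time-varying 2-D vector field advects the agents, with a local interaction force blended in around a hypothetical hand position.

```python
import numpy as np

# A procedurally generated flow field advects agents; the same field
# could also weight user-interaction forces. Illustrative sketch only.
def flow(p, t):
    """Analytic swirling field sampled at positions p (shape (..., 2)) and time t."""
    x, y = p[..., 0], p[..., 1]
    u = np.sin(y * 3.0 + t) - 0.3 * x
    v = np.cos(x * 3.0 - t) - 0.3 * y
    return np.stack([u, v], axis=-1)

rng = np.random.default_rng(3)
agents = rng.uniform(-1, 1, (32, 2))

def step(agents, t, hand=None, dt=0.02):
    vel = flow(agents, t)
    if hand is not None:                      # immersant interaction: local attraction
        d = hand - agents
        vel += 2.0 * d * np.exp(-4.0 * np.sum(d * d, axis=-1, keepdims=True))
    return agents + dt * vel

t = 0.0
for _ in range(200):
    agents = step(agents, t, hand=np.array([0.2, 0.4]))
    t += 0.02
```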

    An Exploration of Creative Audio Spatialisation Tools for Ableton Live

    Electronic music composers working within Ableton Live lack integrated spatialisation tools that give global control over spatial behaviour. Popular spatialisation tools like GRM Tools Spaces (2011) and Ableton Live's Surround Panner are tied to specific speaker layouts, which presents several drawbacks. Firstly, the tools cannot be chained together as is standard practice with stereo plugins, limiting their creative potential. Secondly, Ableton Live channels are restricted to stereo, making the setup of these tools a complicated and slow process, requiring many additional channels to route spatial audio signals. Other spatialisation tools such as the IEM Plug-in Suite (2020) and Envelop (2020) use ambisonics to enable the chaining of effects but are not sufficient for composers, primarily due to their utility-focused nature or unintuitive user interfaces. This thesis proposes a solution utilising the Max for Live device format and 5th order ambisonic audio encoding to decouple the spatialisation from a specified speaker layout and enable chaining of spatial effects. The new tools integrate effects into the spatialisation process and enable a more rapid workflow for composers. Audio examples demonstrate the creative potential of the tools.
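
    The thesis works at 5th-order ambisonics inside Max for Live devices; as a layout-independent illustration of the underlying idea, the sketch below encodes a mono signal to first-order ambisonics (ACN channel order, SN3D weights) and applies a yaw rotation in the encoded domain, showing how spatial operations can be chained before any speaker decode. This is a generic sketch, not the thesis's tools.

```python
import numpy as np

# First-order ambisonic encoding (ACN order W, Y, Z, X; SN3D weights)
# plus a yaw rotation applied in the encoded domain. Any number of such
# operations can be chained before decoding to a concrete speaker layout.
def encode_foa(mono, azimuth_deg, elevation_deg=0.0):
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    gains = np.array([1.0,                        # W (omnidirectional)
                      np.sin(az) * np.cos(el),    # Y
                      np.sin(el),                 # Z
                      np.cos(az) * np.cos(el)])   # X
    return gains[:, None] * mono[None, :]         # shape (4, n_samples)

def rotate_yaw(bformat, angle_deg):
    """Rotate the whole sound field about the vertical axis (a chainable effect)."""
    a = np.radians(angle_deg)
    w, y, z, x = bformat
    return np.stack([w,
                     np.cos(a) * y + np.sin(a) * x,
                     z,
                     -np.sin(a) * y + np.cos(a) * x])

sr = 44100
mono = np.sin(2 * np.pi * 220.0 * np.arange(sr) / sr)  # one second of a test tone
scene = encode_foa(mono, azimuth_deg=30.0)
scene = rotate_yaw(scene, angle_deg=45.0)              # effects chained in the ambisonic domain
```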