
    The Translocal Event and the Polyrhythmic Diagram

    This thesis identifies and analyses the key creative protocols in translocal performance practice, and ends with suggestions for new forms of transversal live and mediated performance practice, informed by theory. It argues that ontologies of emergence in dynamic systems nourish contemporary practice in the digital arts. Feedback in self-organised, recursive systems and organisms elicits change, and change transforms. The arguments trace concepts from chaos and complexity theory to virtual multiplicity, relationality, intuition and individuation (in the work of Bergson, Deleuze, Guattari, Simondon, Massumi, and other process theorists). It then examines the intersection of methodologies in philosophy, science and art and the radical contingencies implicit in the technicity of real-time, collaborative composition. Simultaneous forces or tendencies such as perception/memory, content/expression and instinct/intellect produce composites (experience, meaning, and intuition, respectively) that affect the sensation of interplay. The translocal event is itself a diagram - an interstice between the forces of the local and the global, between the tendencies of the individual and the collective. The translocal is a point of reference for exploring the distribution of affect, parameters of control and emergent aesthetics. Translocal interplay, enabled by digital technologies and network protocols, is ontogenetic and autopoietic; diagrammatic and synaesthetic; intuitive and transductive. KeyWorx is a software application developed for real-time, distributed, multimodal media processing. As a technological tool created by artists, KeyWorx supports this intuitive type of creative experience: a real-time, translocal “jamming” that transduces the lived experience of a “biogram,” a synaesthetic hinge-dimension. The emerging aesthetics are processual – intuitive, diagrammatic and transversal.

    Computers in Support of Musical Expression

    The ixiQuarks: merging code and GUI in one creative space

    This paper reports on ixiQuarks: an environment of instruments and effects built on top of the audio programming language SuperCollider. The rationale of these instruments is to explore alternative ways of designing musical interaction in screen-based software, and to investigate how semiotics in interface design affects the musical output. The ixiQuarks are part of the external libraries available to SuperCollider through the Quarks system. They are software instruments based on a non-realist design ideology that rejects the simulation of acoustic instruments or music hardware and focuses on experimentation at the level of musical interaction. In this environment we try to merge the graphical with the textual in the same instruments, allowing the user to reprogram and change parts of them at runtime. After a short introduction to SuperCollider and the Quarks system, we describe the ixiQuarks and the philosophical basis of their design. We conclude by looking at how they can be seen as epistemic tools that influence the musician in a complex hermeneutic circle of interpretation and signification.
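
    The merging of graphical and textual control described in this abstract can be pictured with a small sketch. The snippet below is not ixiQuarks itself (which is written in SuperCollider); it is a hypothetical Python/tkinter illustration of one of its central ideas: a graphical control whose mapping code remains editable while the instrument keeps running.

```python
# Hypothetical sketch of a "GUI + code in one instrument" idea (not ixiQuarks).
# A slider drives a parameter; the text field holds the mapping code, which can
# be re-evaluated at runtime without stopping the instrument.
import tkinter as tk


class ReprogrammableSlider:
    """A slider whose value-to-parameter mapping is live-editable text."""

    def __init__(self, root):
        self.mapping_src = "lambda x: 100 + x * 900"  # default: map 0..1 to 100..1000 Hz
        self.mapping = eval(self.mapping_src)          # eval is fine for a toy sketch only

        self.slider = tk.Scale(root, from_=0.0, to=1.0, resolution=0.01,
                               orient="horizontal", command=self.on_move)
        self.slider.pack(fill="x")

        # The textual half of the instrument: edit the mapping and press Return.
        self.code = tk.Entry(root)
        self.code.insert(0, self.mapping_src)
        self.code.bind("<Return>", self.recompile)
        self.code.pack(fill="x")

        self.readout = tk.Label(root, text="freq: -")
        self.readout.pack()

    def recompile(self, _event):
        # Re-evaluate the user's code while running; keep the old mapping on error.
        try:
            self.mapping = eval(self.code.get())
        except Exception as err:
            self.readout.config(text=f"code error: {err}")

    def on_move(self, value):
        freq = self.mapping(float(value))
        # A real instrument would send this to a synthesis engine; here we only display it.
        self.readout.config(text=f"freq: {freq:.1f} Hz")


if __name__ == "__main__":
    root = tk.Tk()
    root.title("GUI + code in one instrument (sketch)")
    ReprogrammableSlider(root)
    root.mainloop()
```

    Moving the slider applies the current mapping; editing the expression and pressing Return re-evaluates it without interrupting the interface, which is the kind of runtime reprogrammability the paper explores at the scale of a full instrument set.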

    Music Information Retrieval in Live Coding: A Theoretical Framework

    The work presented in this article was partly conducted while the first author was at Georgia Tech from 2015 to 2017, with the support of the School of Music, the Center for Music Technology and Women in Music Tech at Georgia Tech. Another part of this research was conducted while the first author was at Queen Mary University of London from 2017 to 2019, with the support of the AudioCommons project, funded by the European Commission through the Horizon 2020 programme, research and innovation grant 688382. Music information retrieval (MIR) has great potential in musical live coding because it can help the musician–programmer to make musical decisions based on audio content analysis and to explore new sonorities by means of MIR techniques. Real-time MIR techniques can be computationally demanding and have therefore rarely been used in live coding; when they have been used, the focus has been on low-level feature extraction. This article surveys and discusses the potential of MIR applied to live coding at a higher musical level. We propose a conceptual framework of three categories: (1) audio repurposing, (2) audio rewiring, and (3) audio remixing. We explored the three categories in live performance through MIRLC, an application programming interface library written in SuperCollider. We found that it is still a technical challenge to use high-level features in real time, yet using rhythmic and tonal properties (mid-level features) in combination with text-based information (e.g., tags) helps to achieve a closer perceptual level centered on pitch and rhythm when using MIR in live coding. We discuss challenges and future directions of utilizing MIR approaches in the computer music field.
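
    As a rough illustration of the mid-level analysis this article argues for, the hypothetical Python sketch below extracts a rhythmic property (tempo) and a tonal property (the dominant pitch class of the mean chroma) from an audio file with librosa. It is not the MIRLC SuperCollider library the authors present, and the file name is a placeholder; it only shows the kind of descriptors a live coder could base pitch and rhythm decisions on.

```python
# Minimal sketch of mid-level feature extraction for live-coding decisions.
# Assumes librosa is installed; "loop.wav" is a placeholder file name.
import librosa
import numpy as np


def midlevel_features(path):
    y, sr = librosa.load(path, mono=True)

    # Rhythmic property: global tempo estimate derived from onset strength.
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)
    tempo, beats = librosa.beat.beat_track(onset_envelope=onset_env, sr=sr)

    # Tonal property: mean chroma vector and its most prominent pitch class.
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)
    mean_chroma = chroma.mean(axis=1)
    pitch_classes = ["C", "C#", "D", "D#", "E", "F",
                     "F#", "G", "G#", "A", "A#", "B"]
    dominant_pc = pitch_classes[int(np.argmax(mean_chroma))]

    return {"tempo_bpm": float(tempo),
            "n_beats": int(len(beats)),
            "dominant_pitch_class": dominant_pc}


if __name__ == "__main__":
    print(midlevel_features("loop.wav"))
```

    Querying descriptors like these before retargeting or layering a sample corresponds roughly to the audio repurposing category of the framework, as opposed to reacting to raw low-level features.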

    AUMI-Futurism: the Elsewhere and "Elsewhen" of (Un)Rolling the Boulder and Turning the Page

    This article discusses two performances that used the movement-to-music technology known as the "Adaptive Use Musical Instrument" or AUMI to allow differently-abled participants to collaborate with one another: (Un)Rolling the Boulder: Improvising New Communities, a multimedia, mixed-ability improvisation that was staged at the University of Kansas in October 2013, and Turning the Page, an interdisciplinary musical theatre piece premiered in Ottawa, Canada in April 2014. We theorize these performances as examples of "AUMI-Futurism", combining insights gleaned from two different sources: the Afrofuturist philosophy of composer, improviser, and bandleader Sun Ra, and the work of disability studies scholar Alison Kafer. This essay examines the collaborative, improvisatory processes that surrounded (Un)Rolling the Boulder and Turning the Page, focusing in particular on the role that the AUMI software played in imagining and performing new communities.

    MINDtouch embodied ephemeral transference: Mobile media performance research

    The aim of the author's media art research has been to uncover new understandings of the sensations of liveness and presence that may emerge in participatory networked performance, using mobile phones and physiological wearable devices. To investigate these concepts practically, a mobile media performance series called MINDtouch was created. The MINDtouch project proposed that the mobile videophone become a new way to communicate non-verbally, visually and sensually across space. It explored notions of ephemeral transference, distance collaboration and participant as performer to study presence and liveness emerging from the use of wireless mobile technologies within real-time, mobile performance contexts. Through participation by in-person and remote interactors creating mobile video-streamed mixes, the project interweaves and embodies a daisy chain of technologies through the network space. As part of practice-based Ph.D. research conducted at the SMARTlab Digital Media Institute at the University of East London, MINDtouch has been under the direction of Professor Lizbeth Goodman and sponsored by BBC R&D. The aim of this article is to discuss the project research, conducted and recently completed for submission, in terms of the technical and aesthetic developments from 2008 to the present, as well as the final phase of staging the events from July 2009 to February 2010. This piece builds on an earlier article (Baker 2008), which focused on the outcomes of phase 1 of the research project and initial developments in phase 2. The outcomes from phases 2 and 3 of the project are discussed in this article.

    Designing Group Music Improvisation Systems: A Decade of Design Research in Education

    In this article we discuss Designing Group Music Improvisation Systems (DGMIS), a design education activity that investigates the contemporary challenge design faces when we move beyond single-user, single-artefact interactions. DGMIS examines how to design for systems of interdependent artefacts and human actors from the perspective of improvised music. We have explored this challenge in design research and education for over a decade, in different Industrial Design education contexts, at various geographic locations, and in several formats. In this article we describe our experiences and discuss our general observations, corrective measures, and the lessons we learned for (teaching) the design of novel, interactive, systemic products.