
    Computers in Support of Musical Expression


    Evaluation of live human-computer music-making: Quantitative and qualitative approaches

    NOTICE: this is the author's version of a work that was accepted for publication in the International Journal of Human-Computer Studies. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in International Journal of Human-Computer Studies, Vol. 67, Iss. 11 (2009), DOI: 10.1016/j.ijhcs.2009.05.00

    Networks of Liveness in Singer-Songwriting: A practice-based enquiry into developing audio-visual interactive systems and creative strategies for composition and performance.

    This enquiry explores the creation and use of computer-based, real-time interactive audio-visual systems for the composition and performance of popular music by solo artists. Using a practice-based methodology, research questions are identified that relate to the impact of incorporating interactive systems into the songwriting process and the liveness of the performances with them. Four approaches to the creation of interactive systems are identified: creating explorative-generative tools, multiple tools for guitar/vocal pieces, typing systems and audio-visual metaphors. A portfolio of ten pieces that use these approaches was developed for live performance. A model of the songwriting process is presented that incorporates system-building, and strategies are identified for reconciling the indeterminate, electronic audio output of the system with composed popular music features and instrumental/vocal output. The four system approaches and ten pieces are compared in terms of four aspects of liveness, derived from current theories. It was found that, in terms of overall liveness, a unified approach to system design facilitated both technological and aesthetic connections between the composition, the system processes and the audio and visual outputs. However, there was considerable variation between the four system approaches in terms of the different aspects of liveness. The enquiry concludes by identifying strategies for maximising liveness in the different system approaches and discussing the connections between liveness and the songwriting process.

    Designing instruments towards networked music practices

    It is commonly noted in New Interfaces for Musical Expression (NIME) research that few of these interfaces reach the mainstream or are adopted by the general public. Some research in Sound and Music Computing (SMC) suggests that a lack of humanistic research guiding technological development may be one of the causes. Many new technologies are invented with no real aim other than technical innovation, whereas successful products emphasise user-friendliness and user involvement in the design process, or User-Centred Design (UCD), which seeks to guarantee that innovation addresses real, existing needs among users. Such an approach includes not only traditionally quantifiable usability goals, but also qualitative, psychological, philosophical and musical ones. The latter approach has come to be called experience design, while the former is referred to as interaction design. Although the Human-Computer Interaction (HCI) community in general has recognised the significance of qualitative needs and experience design, NIME has been slower to adopt this new paradigm. This thesis therefore investigates its relevance to NIME, and specifically to Computer Supported Cooperative Work (CSCW) for music applications, by devising a prototype for group music-making based on needs identified from pianists engaged in piano duets, one of the more common forms of group creation in the Western musical tradition. These needs, some of which are socio-emotional in nature, are addressed through our prototype in the context of computers and global networks, allowing composers from all over the world to submit music to a group concert on a Yamaha Disklavier located in Porto, Portugal. Although this prototype is not a new gestural controller per se, and therefore not a traditional NIME, but rather a platform that interfaces groups of composers with a remote audience, the aim of this research is to investigate how contextual parameters such as venue, audience, joint concert and technologies affect the overall user experience of such a system. The results of this research have been important not only for understanding the processes, services, events and environments in which NIMEs operate, but also for understanding reciprocity, creativity and experience design in Networked Music practices.
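    The prototype itself is only sketched in this abstract; purely as an illustration of the kind of pipeline it implies, the hedged Python sketch below renders a remotely submitted note list into a standard MIDI file with mido, the sort of artifact that could then be queued for playback on a Disklavier. The function name, note format and file path are hypothetical and are not the thesis's implementation.

```python
# Hypothetical sketch: turn a remotely submitted note list into a MIDI file
# that a player piano such as a Disklavier could later perform.
# Not the thesis prototype; names and note format are illustrative assumptions.
import mido

def submission_to_midi(notes, path="submission.mid", ticks_per_beat=480):
    """notes: list of (midi_note, velocity, start_beats, length_beats)."""
    mid = mido.MidiFile(ticks_per_beat=ticks_per_beat)
    track = mido.MidiTrack()
    mid.tracks.append(track)

    # Expand each note into note_on/note_off events at absolute beat times.
    events = []
    for note, vel, start, length in notes:
        events.append((start, mido.Message("note_on", note=note, velocity=vel)))
        events.append((start + length, mido.Message("note_off", note=note, velocity=0)))
    events.sort(key=lambda e: e[0])

    # MIDI tracks store delta times in ticks, so convert as we append.
    previous = 0.0
    for beat_time, msg in events:
        delta = int(round((beat_time - previous) * ticks_per_beat))
        track.append(msg.copy(time=delta))
        previous = beat_time

    mid.save(path)
    return path

# Example: a submitted C major arpeggio, one note per beat.
submission_to_midi([(60, 80, 0, 1), (64, 80, 1, 1), (67, 80, 2, 1), (72, 80, 3, 1)])
```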

    A Faust-Built Mobile Vocoder Instrument

    The growing power and ubiquity of mobile devices open up more and more possibilities for the creation of New Interfaces for Musical Expression (NIMEs). However, since smartphones were not conceived for musical purposes, they come with some limitations. This work develops a mobile version of a vocoder using the Faust programming language, in order to test the limits and opportunities smartphones offer for a portable version of this classic musical effect. Both a custom app and a purpose-designed phone case prototype were developed. The vocoder app reconstructs words clearly, with the effect's pleasant and well-known timbre. However, some difficulties were encountered during development; in particular, some mobile devices are not powerful enough to handle a high level of polyphony.
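    The instrument described above is written in Faust, which is not reproduced here; as a rough, language-agnostic illustration of the channel-vocoder idea behind it, the sketch below imposes the band-wise amplitude envelopes of a voice signal onto a synthetic carrier using NumPy and SciPy. The band count, filter orders and cutoff values are illustrative assumptions, not the paper's parameters.

```python
# Simplified channel vocoder: impose the band-wise amplitude envelope of a
# modulator (voice) onto a carrier (e.g. a sawtooth). Illustrative only;
# the paper's actual instrument is implemented in Faust.
import numpy as np
from scipy.signal import butter, sosfilt

def vocode(modulator, carrier, sr, n_bands=16, f_lo=80.0, f_hi=8000.0):
    edges = np.geomspace(f_lo, f_hi, n_bands + 1)   # logarithmic band edges
    out = np.zeros_like(carrier)
    for lo, hi in zip(edges[:-1], edges[1:]):
        band_sos = butter(4, [lo, hi], btype="bandpass", fs=sr, output="sos")
        mod_band = sosfilt(band_sos, modulator)
        car_band = sosfilt(band_sos, carrier)
        # Crude envelope follower: rectify, then low-pass at ~50 Hz.
        env_sos = butter(2, 50.0, btype="lowpass", fs=sr, output="sos")
        envelope = sosfilt(env_sos, np.abs(mod_band))
        out += car_band * envelope
    return out / np.max(np.abs(out))

# Example: the "voice" is gated noise, the carrier is a 110 Hz sawtooth drone.
sr = 44100
t = np.arange(sr * 2) / sr
carrier = 2 * (t * 110 % 1.0) - 1.0
modulator = np.random.randn(len(t)) * (np.sin(2 * np.pi * 2 * t) > 0)
audio = vocode(modulator, carrier, sr)
```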

    That Syncing Feeling: Networked Strategies for Enabling Ensemble Creativity in iPad Musicians

    The group experience of synchronisation is a key aspect of ensemble musical performance. This paper presents a number of strategies for syncing performance information across networked iPad-instruments to enable creativity among an ensemble of improvising musicians. Acoustic instrumentalists sync without mechanical intervention. Electronic instruments frequently synchronise rhythm using MIDI or OSC connections. In contrast, our system syncs other aspects of performance, such as tonality, instrument functions, and gesture classifications, to support and enhance improvised performance. Over a number of performances with an iPad and percussion group, Ensemble Metatone, various syncing scenarios have been explored that support, extend, and disrupt ensemble creativity.
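    The paper's message formats are not given in this abstract; assuming an OSC transport (here via the python-osc package), the sketch below shows the general idea of broadcasting non-rhythmic performance state, such as a shared tonality or a classified gesture, to every instrument in the ensemble. The addresses, fields, hosts and ports are hypothetical.

```python
# Hypothetical sketch of syncing ensemble state (tonality, instrument mode,
# a classified gesture) over OSC, rather than syncing rhythm via MIDI clock.
# Addresses, fields, hosts and ports are illustrative assumptions.
from pythonosc.udp_client import SimpleUDPClient

ENSEMBLE = [("192.168.1.11", 9000), ("192.168.1.12", 9000), ("192.168.1.13", 9000)]

def broadcast(address, *args):
    """Send the same state update to every networked iPad-instrument."""
    for host, port in ENSEMBLE:
        SimpleUDPClient(host, port).send_message(address, list(args))

# A new shared tonality for the whole group.
broadcast("/ensemble/tonality", "D", "mixolydian")
# One player's instrument switches function; the others can respond.
broadcast("/ensemble/instrument", "player2", "bells")
# A classified touch gesture, shared so others can support or disrupt it.
broadcast("/ensemble/gesture", "player1", "fast_taps", 0.87)
```

    Keeping this state on its own channel, separate from any rhythmic clock, is what lets higher-level musical context be shared without touching the timing layer.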

    Evaluation of Drum Rhythmspace in a Music Production Environment

    In modern computer-based music production, vast musical data libraries are essential. However, their presentation via subpar interfaces can hinder creativity, complicating the selection of ideal sequences. While low-dimensional space solutions have been suggested, their evaluations in real-world music production remain limited. In this study, we focus on Rhythmspace, a two-dimensional platform tailored for the exploration and generation of drum patterns in symbolic MIDI format. Our primary objectives encompass two main aspects: first, the evolution of Rhythmspace into a VST tool specifically designed for music production settings, and second, a thorough evaluation of this tool to ascertain its performance and applicability within the music production scenario. The tool's development necessitated transitioning the existing Rhythmspace, which operates in Puredata and Python, into a VST compatible with Digital Audio Workstations (DAWs) using the JUCE (C++) framework. Our evaluation encompassed a series of experiments, starting with a composition test in which participants crafted drum sequences, followed by a listening test in which participants ranked the sequences from the initial experiment. The results show that Rhythmspace and similar tools are beneficial, facilitating the exploration and creation of drum patterns in a user-friendly and intuitive manner and enhancing the creative process for music producers. These tools not only streamline drum sequence generation but also offer a fresh perspective, often serving as a source of inspiration in the dynamic realm of electronic music production.
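    Rhythmspace's internals are not detailed in the abstract; as a minimal sketch of the underlying idea, placing known drum patterns at fixed coordinates and generating a new pattern from a cursor position by distance-weighted interpolation, the Python below uses hypothetical onset patterns and a simple inverse-distance blend.

```python
# Minimal sketch of a 2D "rhythm space": known drum patterns sit at fixed
# coordinates, and a cursor position yields a new pattern by inverse-distance
# weighting of their step values. Patterns and layout are hypothetical.
import numpy as np

# 16-step patterns (kick, snare, hi-hat rows), each anchored at an (x, y).
ANCHORS = {
    (0.1, 0.1): np.array([[1,0,0,0, 1,0,0,0, 1,0,0,0, 1,0,0,0],   # four-on-the-floor
                          [0,0,0,0, 1,0,0,0, 0,0,0,0, 1,0,0,0],
                          [1,0,1,0, 1,0,1,0, 1,0,1,0, 1,0,1,0]], float),
    (0.9, 0.2): np.array([[1,0,0,1, 0,0,1,0, 0,1,0,0, 1,0,0,0],   # breakbeat-ish
                          [0,0,0,0, 1,0,0,1, 0,0,1,0, 1,0,0,0],
                          [1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1]], float),
    (0.5, 0.9): np.array([[1,0,0,0, 0,0,1,0, 0,0,1,0, 0,0,0,0],   # sparse half-time
                          [0,0,0,0, 0,0,0,0, 1,0,0,0, 0,0,0,0],
                          [1,0,0,0, 1,0,0,0, 1,0,0,0, 1,0,0,0]], float),
}

def pattern_at(x, y, threshold=0.5):
    """Blend anchor patterns by inverse distance, then threshold to onsets."""
    weights = []
    blended = np.zeros_like(next(iter(ANCHORS.values())))
    for (ax, ay), pat in ANCHORS.items():
        w = 1.0 / (np.hypot(x - ax, y - ay) + 1e-6)
        weights.append(w)
        blended += w * pat
    blended /= sum(weights)
    return (blended >= threshold).astype(int)

print(pattern_at(0.3, 0.4))   # a hybrid of the nearby anchor patterns
```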

    A hybrid keyboard-guitar interface using capacitive touch sensing and physical modeling

    This paper was presented at the 9th Sound and Music Computing Conference, Copenhagen, Denmark. This paper presents a hybrid interface based on a touch-sensing keyboard which gives detailed expressive control over a physically-modeled guitar. Physical modeling allows realistic guitar synthesis incorporating many expressive dimensions commonly employed by guitarists, including pluck strength and location, plectrum type, hand damping and string bending. Often, when a physical model is used in performance, most control dimensions go unused when the interface fails to provide a way to intuitively control them. Techniques as foundational as strumming lack a natural analog on the MIDI keyboard, and few digital controllers provide the independent control of pitch, volume and timbre that even novice guitarists achieve. Our interface combines gestural aspects of keyboard and guitar playing. Most dimensions of guitar technique are controllable polyphonically, some of them continuously within each note. Mappings are evaluated in a user study of keyboardists and guitarists, and the results demonstrate its playability by performers of both instruments.
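    The guitar model described above is far richer than this (pluck location, plectrum type, hand damping, string bending), but as a compact, hedged illustration of physically-modeled string synthesis in general, the sketch below implements a basic Karplus-Strong pluck in Python; it is a stand-in, not the authors' model.

```python
# Basic Karplus-Strong plucked-string synthesis: a noise burst circulates in a
# delay line whose length sets the pitch, with a two-point average acting as
# frequency-dependent damping. A stand-in for the paper's fuller guitar model.
import numpy as np

def pluck(freq, duration, sr=44100, damping=0.996, pluck_strength=1.0):
    n = int(sr / freq)                                   # delay line ~ one period
    buf = pluck_strength * np.random.uniform(-1, 1, n)   # the "pluck" excitation
    out = np.empty(int(sr * duration))
    for i in range(len(out)):
        out[i] = buf[i % n]
        # Averaging adjacent samples low-passes the loop, so high partials decay faster.
        buf[i % n] = damping * 0.5 * (buf[i % n] + buf[(i + 1) % n])
    return out

# A short E minor arpeggio; write with soundfile or sounddevice to listen.
notes = [82.41, 123.47, 164.81, 196.00]                  # E2, B2, E3, G3
audio = np.concatenate([pluck(f, 0.5) for f in notes])
```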