    Audio technology and mobile human-computer interaction: from space and place, to social media, music, composition and creation

    Audio-based mobile technology is opening up a range of new interactive possibilities. This paper brings some of those possibilities to light by offering a range of perspectives grounded in this area. It is not only the technical systems that are developing: novel approaches to the design and understanding of audio-based mobile systems are also evolving, offering new perspectives on interaction and design and supporting the application of such systems in areas such as the humanities.

    Visual collaging of music in a digital library

    This article explores the role visual browsing can play within a digital music library. The context for the work is provided through a review of related techniques drawn from the fields of digital libraries and human-computer interaction. Implemented within the open source digital library toolkit Greenstone, a prototype system is described that combines images located through textual metadata with a visualisation technique known as collaging to provide a leisurely, undirected interaction with a music collection. Emphasis in the article is given to the augmentations of the basic technique needed for it to work in the musical domain.
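
    As a rough illustration of the collaging idea described above, the sketch below scatters cover images retrieved through a textual-metadata lookup onto a canvas at random positions and sizes. It is a minimal, hypothetical Python sketch using Pillow; the function names and the metadata lookup are stand-ins for illustration only and do not reflect Greenstone's actual API.

```python
# Illustrative sketch only: a metadata-driven image collage, not Greenstone's API.
# Assumes Pillow is installed; find_cover_images is a hypothetical stand-in
# for a lookup that returns images whose textual metadata matches a query.
import random
from PIL import Image

def find_cover_images(query: str) -> list[str]:
    """Hypothetical: return paths of cover images whose metadata matches the query."""
    return [f"covers/{query}_{i}.jpg" for i in range(12)]  # placeholder paths

def build_collage(image_paths: list[str],
                  size: tuple[int, int] = (1200, 800)) -> Image.Image:
    """Paste thumbnails at random positions and scales to give the
    leisurely, undirected feel of a collage rather than a grid."""
    canvas = Image.new("RGB", size, "white")
    for path in image_paths:
        try:
            img = Image.open(path)
        except OSError:
            continue  # skip images that cannot be read
        side = random.randint(120, 320)     # random thumbnail size
        img.thumbnail((side, side))         # preserves aspect ratio
        x = random.randint(0, size[0] - img.width)
        y = random.randint(0, size[1] - img.height)
        canvas.paste(img, (x, y))
    return canvas

if __name__ == "__main__":
    build_collage(find_cover_images("jazz")).save("collage.png")
```

    In the system the article describes, the images come from the collection's own metadata and the collage evolves over time; this static version only conveys the layout idea.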

    Embodied Musical Interaction

    Music is a natural partner to human-computer interaction, offering tasks and use cases for novel forms of interaction. The richness of the relationship between a performer and their instrument in expressive musical performance can provide valuable insight to human-computer interaction (HCI) researchers interested in applying these forms of deep interaction to other fields. Despite the longstanding connection between music and HCI, it is not an automatic one, and its history arguably points to as many differences as it does overlaps. Music research and HCI research both encompass broad issues, and utilize a wide range of methods. In this chapter I discuss how the concept of embodied interaction can be one way to think about music interaction. I propose how the three "paradigms" of HCI and three design accounts from the interaction design literature can serve as a lens through which to consider types of music HCI. I use this conceptual framework to discuss three different musical projects: Haptic Wave, Form Follows Sound, and BioMuse.

    Improvisation, Computers, and Interaction : Rethinking Human-Computer Interaction Through Music

    Interaction is an integral part of all music: it is part of listening, of playing, of composing, and even of thinking about music. In this thesis the multiplicity of modes in which one may engage interactively in, through and with music is the starting point for rethinking Human-Computer Interaction in general and Interactive Music in particular. I propose that in Human-Computer Interaction the methodology of control, interaction-as-control, should in certain cases be given up in favor of a more dynamic and reciprocal mode of interaction, interaction-as-difference: interaction as an activity concerned with inducing differences that make a difference. Interaction-as-difference suggests a kind of parallelism rather than click-and-response. In essence, the movement from control to difference was a result of rediscovering the power of improvisation as a method for organizing and constructing musical content, and it is not to be understood as an opposition: it is rather a broadening of the more common paradigm of direct manipulation in Human-Computer Interaction. Improvisation is at the heart of all the sub-projects included in this thesis, including, in fact, those that are not immediately related to music but are geared more towards computation. Trusting the self-organizing aspect of musical improvisation, and allowing it to diffuse into other areas of my practice, constitutes the pivotal change that has radically influenced my artistic practice. Furthermore, the work-in-movement is (re-)introduced as a kind of work that encompasses radically open works. The work-in-movement, presented and exemplified by a piece for guitar and computer, requires different modes of representation, as the traditional musical score is too restrictive and cannot communicate its most central aspects: collaboration, negotiation and interaction. The Integra framework and a relational database model with a corresponding XML representation are proposed as a means to produce annotated scores that carry past performances and versions with them. The common denominator, and the prerequisite, for interaction-as-difference and for an improvisatory, self-organizing attitude towards musical practice is the notion of giving up the Self: interaction-as-difference becomes possible only if the Self is able and willing to accept the loss of the priority of interpretation (as composer) or of faithfulness to ideology or idiomatics (as performer), and only if one is willing to forget. Among the artistic works produced as part of this inquiry are some experimental tools in the form of computer software that support the proposed concepts of interactivity. These, along with the more traditional musical works, make up both the object and the method of this PhD project. The sub-projects contained within the frame of the thesis, some (most) of which are still works-in-progress, are used to make inquiries into the larger question of the significance of interaction in the context of artistic practice involving computers.
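
    To make concrete the idea of an annotated score that carries its past performances and versions with it, the sketch below builds such a document using Python's standard xml.etree module. This is a hypothetical illustration: the element and attribute names are invented and do not follow the Integra framework's actual schema or database model.

```python
# Illustrative sketch only: an annotated, open "work-in-movement" score that keeps
# past versions and performances alongside the notation. Element names are
# hypothetical and are not the Integra schema.
import xml.etree.ElementTree as ET

def make_annotated_score() -> ET.Element:
    work = ET.Element("work", title="Piece for Guitar and Computer")

    # The notation itself, deliberately left open and underspecified.
    score = ET.SubElement(work, "score", notation="open-form")
    ET.SubElement(score, "section", id="A",
                  instruction="improvise over sustained computer drone")

    # Each version records how the open work was negotiated and changed.
    versions = ET.SubElement(work, "versions")
    v1 = ET.SubElement(versions, "version", id="1", date="2008-05-12")
    ET.SubElement(v1, "annotation").text = "Section A shortened after rehearsal discussion."

    # Performances are stored with the work, so the score accumulates its own
    # history of collaboration, negotiation and interaction.
    performances = ET.SubElement(work, "performances")
    p = ET.SubElement(performances, "performance", date="2008-06-01",
                      performers="guitar, live electronics")
    ET.SubElement(p, "annotation").text = "Computer part diverged from version 1; kept as a new variant."
    return work

if __name__ == "__main__":
    root = make_annotated_score()
    ET.indent(root)  # pretty-print (Python 3.9+)
    print(ET.tostring(root, encoding="unicode"))
```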

    Greenstone as a music digital library toolkit

    Greenstone is an open source digital library system that has developed and matured since its inception in 1995. Today it is used in over 60 countries, with a strong emphasis on humanitarian aid. The software is also used as a framework for research in other fields such as human-computer interaction, text mining, and ethnography. This article provides a summary of Greenstone's uses to date with music documents. First we discuss incorporating musical formats into the Greenstone system; then we describe provision for searching and browsing in a music collection.

    An Introduction to Interactive Music for Percussion and Computers

    Composers began combining acoustic performers with electronically produced sounds in the early twentieth century. As computer processing power increased, the potential for significant musical communication developed. Despite the body of research concerning electronic music, performing a composition with a computer partner remains intimidating for performers. The purpose of this paper is to provide an introductory method for interacting with a computer. This document will first follow the parallel evolution of percussion and electronics in order to reveal how each medium was influenced by the other. The following section will define interaction and explain how it applies to musical communication between humans and computers. The next section introduces specific techniques used to cultivate human-computer interaction. The roles of performer, instrument, composer and conductor will then be defined as they apply to the human performer and the computer. If performers are aware of these roles, they will develop richer communication that can enhance both the performer's and the audience member's recognition of human-computer interaction. In the final section, works for percussion and computer will be analyzed to reveal varying levels of interaction and the shifting roles of the performer. Three compositions will illustrate this point: 120bpm from neither Anvil nor Pulley by Dan Trueman, It's Like the Nothing Never Was by Von Hansen, and Music for Snare Drum and Computer by Cort Lippe. These three pieces trace a continuum of increasing interaction, moving from interaction within a fully defined score, to improvisation with digital synthesis, to the manipulation of computerized compositional algorithms using performer input. The unique ways each composer creates interaction expose the vast possibilities for performing with interactive music systems.
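
    The last level of interaction described above, where performer input manipulates a compositional algorithm, can be pictured with a small sketch. The Python example below is hypothetical and not drawn from the pieces analyzed in the paper: simulated hit amplitudes, standing in for onset-detector output from a snare drum, steer the density and register of a simple stochastic note generator.

```python
# Hypothetical sketch: performer input steering a generative algorithm.
# Amplitudes stand in for onset-detector output; not taken from the works above.
import random

class StochasticGenerator:
    """Generates MIDI-style pitches whose density and register follow
    the performer's recent playing intensity."""

    def __init__(self) -> None:
        self.intensity = 0.0  # smoothed estimate of how hard/often the performer plays

    def hear(self, amplitude: float) -> None:
        """Feed one detected hit (0.0-1.0) and smooth it into the running intensity."""
        self.intensity = 0.8 * self.intensity + 0.2 * amplitude

    def next_notes(self) -> list[int]:
        """Louder, denser playing yields more notes in a wider, higher register."""
        count = int(self.intensity * 8)                 # 0..8 notes per beat
        low, high = 36, 36 + int(self.intensity * 48)   # register widens with intensity
        return [random.randint(low, high) for _ in range(count)]

if __name__ == "__main__":
    gen = StochasticGenerator()
    for amp in [0.1, 0.4, 0.9, 0.7, 0.2]:  # simulated hit amplitudes
        gen.hear(amp)
        print(f"intensity={gen.intensity:.2f} notes={gen.next_notes()}")
```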

    Expanding the Human Bandwidth Through Subvocalization and Other Methods

    This paper looks at human bandwidth and how it applies to human-human interaction and human-computer interaction. It discusses what human bandwidth means and what must be done to expand it. Current methods of expanding bandwidth are discussed, including detection of subvocal activity, facial expression detection, eye tracking, emotion detection in digital music, pen-based musical input systems, and augmented reality. After explaining these methods, the paper focuses on using some of the technologies together to give an idea of what the future of interaction with computers might look like. These proposed ideas include emotion-based music, various uses for augmented reality, and composing music with the mind.