Screen-based musical instruments as semiotic machines
The ixi software project started in 2000 with the intention to explore new interactive patterns and virtual interfaces in computer music software. The aim of this paper is not to describe these programs, as they have been described elsewhere, but rather to explicate the theoretical background that underlies the design of these screen-based instruments. After an analysis of the similarities and differences in the design of acoustic and screen-based instruments, the paper describes how the creation of an interface is essentially the creation of a semiotic system that affects and influences the musician and the composer. Finally, the terminology of this semiotics is explained as an interaction model.
Tangible user interfaces: past, present and future directions
In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from cognitive science, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research.
Interfacing the Network: An Embedded Approach to Network Instrument Creation
This paper discusses the design, construction, and development of a multi-site collaborative instrument, The Loop, developed by the JacksOn4 collective during 2009-10 and formally presented in Oslo at the arts.on.wires and NIME conferences in 2011. The development of this instrument is primarily a reaction to historical network performance that either attempts to present traditional acoustic practice in a distributed format or utilises the network as a conduit to shuttle acoustic and performance data amongst participant nodes. In both scenarios the network is an integral and indispensable part of the performance; however, the network is not perceived as an instrument per se. The Loop is an attempt to create a single, distributed hybrid instrument retaining traditionally acoustic interfaces and resonant bodies that are mediated by the network. The embedding of the network into the body of the instrument raises many practical and theoretical discussions, which are explored in this paper through a reflection upon the notion of the distributed instrument and the way in which its design impacts the behaviour of the participants (performers and audiences); the mediation of musical expression across networks; the bi-directional relationship between instrument and design; as well as how the instrument assists in the realisation of the creators' compositional and artistic goals.
CreaTable Content and Tangible Interaction in Aphasia
Multimedia digital content (combining pictures, text and music) is ubiquitous. The process of creating such content using existing tools typically requires complex, language-laden interactions which pose a challenge for users with aphasia (a language impairment following brain injury). Tangible interactions offer a potential means to address this challenge; however, there has been little work exploring their potential for this purpose. In this paper, we present CreaTable, a platform that enables us to explore tangible interaction as a means of supporting digital content creation for people with aphasia. We report details of the co-design of CreaTable and findings from a digital creativity workshop. Workshop findings indicated that CreaTable enabled people with aphasia to create something they would not otherwise have been able to. We report how users' aphasia profiles affected their experience, describe tensions in collaborative content creation and provide insight into more accessible content creation using tangibles.
Music Information Retrieval in Live Coding: A Theoretical Framework
The work presented in this article has been partly conducted while the first author was at Georgia Tech from 2015–2017 with the support of the School of Music, the Center for Music Technology and Women in Music Tech at Georgia Tech.
Another part of this research has been conducted while the first author was at Queen Mary University of London from 2017–2019 with the support of the AudioCommons project, funded by the European Commission through the Horizon 2020 programme, research and innovation grant 688382.
Music information retrieval (MIR) has a great potential in musical live coding because it can help the musician-programmer to make musical decisions based on audio content analysis and to explore new sonorities by means of MIR techniques. The use of real-time MIR techniques can be computationally demanding and thus they have been rarely used in live coding; when they have been used, it has been with a focus on low-level feature extraction. This article surveys and discusses the potential of MIR applied to live coding at a higher musical level. We propose a conceptual framework of three categories: (1) audio repurposing, (2) audio rewiring, and (3) audio remixing. We explored the three categories in live performance through an application programming interface library written in SuperCollider, MIRLC. We found that it is still a technical challenge to use high-level features in real time, yet using rhythmic and tonal properties (midlevel features) in combination with text-based information (e.g., tags) helps to achieve a closer perceptual level centered on pitch and rhythm when using MIR in live coding. We discuss challenges and future directions of utilizing MIR approaches in the computer music field.
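To make the distinction concrete: the "low-level feature extraction" the abstract mentions refers to simple signal descriptors computed directly from audio samples, as opposed to midlevel (rhythmic, tonal) or high-level (semantic) features. The sketch below, which is an illustration and not part of the MIRLC library (MIRLC itself is written in SuperCollider), computes one classic low-level feature, the zero-crossing rate, on a synthetic sine tone using only the Python standard library.

```python
import math

def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs whose signs differ.

    A cheap low-level feature: noisy or high-frequency content
    yields a higher rate than low, tonal content.
    """
    crossings = sum(
        1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0)
    )
    return crossings / (len(frame) - 1)

# Synthesise one second of a 440 Hz sine at a 44.1 kHz sample rate.
sr = 44100
signal = [math.sin(2 * math.pi * 440 * n / sr) for n in range(sr)]

# A 440 Hz sine crosses zero about twice per cycle, i.e. ~880 times
# per second, so the rate is roughly 880 / 44100, about 0.02.
print(f"{zero_crossing_rate(signal):.3f}")
```

In a live-coding setting such a descriptor would be computed per audio block rather than over a whole buffer, which is where the real-time cost the article discusses comes from.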
On the resistance of the instrument
I examine the role that the musical instrument plays in shaping a performer's expressive activity and emotional state. I argue that the historical development of the musical instrument has fluctuated between two key values: that of sharing with other musicians, and that of creatively exploring new possibilities. I introduce 'the mood organ': a sensor-based computer instrument that automatically turns signals of the wearer's emotional state into expressive music.
A Systemic Approach to Music Performance Learning with Multimodal Technology
- …