Sonic City: Prototyping a wearable experience
Sonic City is a project exploring mobile interaction and wearable technology for everyday music creation. A wearable system has been developed that creates electronic music in real time based on sensing bodily and environmental factors; thus, a personal soundscape is co-produced by physical movement, local activity, and urban ambiance simply by walking through the city. Applying multi-disciplinary methods, we have developed the wearable from a scenario-driven, aesthetic and lifestyle perspective. A garment has been crafted for 'trying on' interaction and wearability options with users on-site in the city. With this prototype, we have been able to explore and rapidly iterate the context and content, and the social and human factors, of the wearable application.
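The abstract stops short of implementation detail, but its core idea, continuous sensor input shaping a generative soundscape, can be sketched in a few lines. The sensor names, ranges, and mappings below are illustrative assumptions, not the Sonic City system itself.

```python
# A minimal sketch of sensing-driven music generation, assuming three
# normalised (0..1) inputs; not the actual Sonic City implementation.

def map_sensors_to_sound(step_rate, light_level, noise_level):
    """Map hypothetical bodily/environmental readings onto synthesis parameters."""
    return {
        "tempo_bpm": 60 + step_rate * 80,        # faster walking -> faster pulse
        "brightness": light_level,               # darker streets -> duller timbre
        "density": min(1.0, noise_level * 1.5),  # busier ambiance -> denser texture
    }

print(map_sensors_to_sound(step_rate=0.4, light_level=0.7, noise_level=0.2))
```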
A wireless, real-time, social music performance system for mobile phones
The paper reports on the Cellmusic system: a real-time, wireless, distributed composition and performance system designed for domestic mobile devices. During a performance, each mobile device communicates with the others, and may create sonic events in a passive (non-interactive) mode or may influence the output of other devices. Cellmusic distinguishes itself from other mobile phone performance environments in that it is intended for performance in ad hoc locations, with services and performances automatically and dynamically adapting to the number of devices within a given proximity. It is designed to run on a number of mobile phone platforms to allow as wide a distribution as possible, again distinguishing itself from other mobile performance systems, which primarily run on a single device. Rather than performances being orchestrated or managed, it is intended that users will access the system and create a performance in the same manner that they use mobile phones for social interaction at different times throughout the day. However, this does not preclude the system being used in a more traditional performance environment. This accessibility and portability make it an ideal platform for sonic artists who choose to explore a variety of physical environments (such as parks and other public spaces).
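The adaptive behaviour described, services scaling to the number of devices in proximity, can be illustrated with a short sketch. The peer-discovery stub and density rule below are assumptions for illustration, not the Cellmusic protocol.

```python
# A hedged sketch of proximity-adaptive performance: each device plays
# less often as more peers join, keeping the ensemble's overall density
# roughly stable. Discovery is stubbed out with random peer counts.

import random

def discover_peers():
    # Stand-in for wireless proximity discovery (an assumption, not
    # Cellmusic's actual mechanism): return the ids of nearby devices.
    return [f"phone-{i}" for i in range(random.randint(0, 8))]

def local_event_probability(peer_count, base=0.5):
    """Probability that this device emits a sonic event on a given beat."""
    return base / (1 + peer_count)

peers = discover_peers()
print(f"{len(peers)} peers -> p = {local_event_probability(len(peers)):.3f}")
```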
Towards musical interaction: 'Schismatics' for e-violin and computer.
This paper discusses the evolution of the Max/MSP patch used in schismatics (2007, rev. 2010) for electric violin (Violectra) and computer, by composer Sam Hayden in collaboration with violinist Mieko Kanno. schismatics involves a standard performance paradigm of a fixed notated part for the e-violin with sonically unfixed live computer processing. Hayden was unsatisfied with the early version of the piece: the use of attack detection on the live e-violin playing to trigger stochastic processes led to an essentially reactive behaviour in the computer, resulting in a somewhat predictable one-to-one sonic relationship between them. It demonstrated little internal relationship between the two beyond an initial e-violin 'action' causing a computer 'event'. The revisions in 2010, enabled by an AHRC Practice-Led research award, aimed to achieve 1) a more interactive performance situation and 2) a subtler and more 'musical' relationship between live and processed sounds. This was realised through the introduction of sound analysis objects, in particular machine listening and learning techniques developed by Nick Collins. One aspect of the programming was the mapping of analysis data to synthesis parameters, enabling the computer transformations of the e-violin to be directly related to Kanno's interpretation of the piece in performance.
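The mapping step the paper highlights, analysis data steering synthesis parameters, is the crux of the 2010 revision. A minimal sketch of that idea follows; the feature names and mapping curves are illustrative assumptions, not Hayden and Collins's actual Max/MSP patch.

```python
# Machine-listening features from the live e-violin (normalised to 0..1
# by an assumed upstream analysis stage) steering synthesis parameters,
# so the processing follows the performer's interpretation rather than
# merely reacting to attacks.

def map_analysis_to_synthesis(loudness, spectral_centroid, onset_density):
    return {
        "grain_size_ms": 10 + (1 - onset_density) * 190,     # busy playing -> shorter grains
        "filter_cutoff_hz": 200 + spectral_centroid * 7800,  # brighter tone -> open filter
        "wet_dry": 0.2 + loudness * 0.6,                     # louder playing -> more processing
    }

print(map_analysis_to_synthesis(loudness=0.6, spectral_centroid=0.4, onset_density=0.8))
```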
Collaborative music interaction on tabletops: an HCI approach
With the advent of tabletop interaction, collaborative activities are better supported than they are on single-user PCs: there is a physical, shareable space, and interaction with digital data is more embodied and social. In sound and music computing, collaborative music making has traditionally been done over interconnected networks, but using separate computers. Musical tabletops introduce opportunities for playing in collaboration by physically sharing the same musical interface. However, few tabletop musical interfaces (e.g. the Reactable) exploit this collaborative potential. We are interested in how collaboration in music performance can be fully supported by means of musical tabletops, in contrast with more traditional settings. We are also looking at whether collective musical engagement can be enhanced by providing interfaces better suited to collaboration. In HCI and software development, design and evaluation proceed iteratively: evaluation identifies key issues that can be addressed in the next design iteration of the system. Using a similar iterative approach, we plan to design and evaluate a series of tabletop musical interfaces. The aim is to understand which design choices can enhance and enrich collaboration and collective musical engagement on these systems. In this paper, we explain the evaluation methodologies we have applied in three preliminary pilot studies, and the lessons we have learned. Initial findings indicate that evaluating tabletop musical interfaces is a complex endeavour that requires an approach as close as possible to a real context, combined with an interdisciplinary approach informed by interaction analysis techniques.
Theatre Noise Conference
Three days of performances, installations, residencies, round-table discussions, presentations and workshops.
More than an academic conference, Theatre Noise is a diverse collection of events exploring the sound of theatre, from performance to the spaces in between.
Featuring keynote presentations, artists in residence, electroacoustic, percussive and digital performances, industry workshops and installations, Theatre Noise is an immersive journey into sound.
Editorial: Perceptual issues surrounding the electroacoustic listening experience
"It's cleaner, definitely": Collaborative Process in Audio Production.
Working from vague client instructions, how do audio producers collaborate to diagnose what specifically is wrong with a piece of music, where the problem lies, and what to do about it? This paper presents a design ethnography that uncovers some of the ways in which two music producers co-ordinate their understanding of complex representations of pieces of music while working together in a studio. Our analysis shows that audio producers constantly make judgements based on audio and visual evidence while working with complex digital tools, which can lead to ambiguity in their assessments of issues. We show how multimodal conduct guides the process of work and how complex media objects are integrated as elements of interaction by the music producers. The findings provide an understanding of how people currently collaborate when producing audio, to support the design of better tools and systems for collaborative audio production in the future.
miMic: The microphone as a pencil
miMic, a sonic analogue of paper and pencil, is proposed: an augmented microphone for vocal and gestural sonic sketching. Vocalizations are classified and interpreted as instances of sound models, which the user can play with by vocal and gestural control. The physical device is based on a modified microphone with embedded inertial sensors and buttons. Sound models can be selected by vocal imitations that are automatically classified, and each model is mapped to vocal and gestural features for real-time control. With miMic, the sound designer can explore a vast sonic space and quickly produce expressive sonic sketches, which may be turned into sound prototypes by further adjustment of model parameters.
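The interaction loop described, where a vocal imitation selects a model and voice and gesture then drive it, can be sketched compactly. The feature set, prototypes, and nearest-prototype classifier below are placeholder assumptions, not the published miMic implementation.

```python
# A hedged sketch of a miMic-style loop: classify a vocal imitation to
# pick a sound model, then map live vocal/gestural features to parameters.

PROTOTYPES = {  # assumed per-model feature prototypes: (pitchiness, noisiness)
    "wind":   (0.2, 0.9),
    "engine": (0.5, 0.6),
    "water":  (0.3, 0.8),
}

def classify_imitation(features):
    """Select the sound model whose prototype is nearest the imitation."""
    return min(PROTOTYPES, key=lambda m: sum((a - b) ** 2
                                             for a, b in zip(features, PROTOTYPES[m])))

def control_parameters(model, pitch, tilt):
    """Map live vocal pitch and microphone tilt onto model parameters."""
    return {"model": model, "intensity": pitch, "spread": tilt}

model = classify_imitation((0.28, 0.82))          # -> "water"
print(control_parameters(model, pitch=0.6, tilt=0.3))
```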
Seeking out the spaces between: Using improvisation for collaborative composition and interactive technology
Copyright © 2010 ISAST. This article presents findings from experiments into piano performance with live electronics undertaken by the author since early 2007. The use of improvisation has infused every step of the process, both as a methodology to obtain meaningful results using interactive technology and as a way to generate and characterize a collaborative musical space with composers. The technology used has included pre-built MIDI interfaces such as the PianoBar, actuators such as miniature DC motors, and sensor interfaces including iCube and the Wii controller. Collaborators have included researchers at the Centre for Digital Music (QMUL), Richard Barrett, Pierre Alexandre Tremblay and Atau Tanaka. In seeking to create responsive 'performance environments' at the piano, I explore live, performative control of electronics to create better connections for both the performer (providing the same level of interpretive freedom as in a 'pure' instrumental performance) and the audience (communicating clearly to them). I have been lucky to witness first-hand many live interactive performances and to work with various empathetic composers/performers in flexible working environments. Collaborating with experienced technologists and musicians, I have witnessed time and again what, for me, is a fundamental truth in interactive instrumental performance: as a living, spontaneous form it must be nurtured and informed by the performer's physicality and imagination as much as by the creativity or knowledge of the composer and/or technologist. Specifically in the case of sensors, their dependence on the detail of each person's body and reactions is so refined as to necessitate, I would argue, an entirely collaborative approach, and therefore one that involves at least directed improvisation and, more likely, fairly extensive improvised exploration. The fundamentally personal and intimate nature of sensor readings, such as the amount of tension created by each performer, the shape of the ancillary gestures or the level of emotional involvement (especially relevant when using galvanic skin response or EEG), makes creating pieces with sensors extremely difficult for a composer to do in isolation. Improvisation therefore provides a way for performer and composer to generate a common musical and gestural language. Related to these issues is the fact that the technical and notational parameters of interactive music are not yet (and may never be) standardized, creating a very real and practical need for improvisation to figure at least somewhere in the process. This study is funded by the Arts and Humanities Research Council.
- …