117 research outputs found
Towards an interactive environment for the performance of Dubstep music
This Masters by Research project explores the integration of different concepts relating to the presence of the human body in Dubstep music performance. Three intended performance systems propose that the body is the logical site for the interactive control of live Dubstep music. The physicality and gestures of instrumentalists, choreographed dancers, and audience members will be examined in order to develop new and exciting ways to perform this genre in a live setting.
The systems take a three-tiered hierarchical approach on two levels: the extraction of gestural information from human body movements, and the importance – and length – of the musical phenomena and parameters under control. The characteristics of Dubstep music are defined and maintained within each interactive music system. A model for each proposed system is examined, including discussion of the technology and methodology employed to apply the two hierarchies and create the interactive environment.
Examining the effects of experimental/academic electroacoustic and popular electronic musics on the evolution and development of human–computer interaction in music
This article focuses on how the development of human–computer interaction in music has been aided and influenced by both experimental/academic electroacoustic art music and popular electronic music. These two genres have shaped this ever-changing process of evolution in different ways, but together they have been paramount to the establishment of interactivity in music as we understand it today. That interactivity is itself having wide-ranging implications for the modern-day musical landscape as a whole: in the way that we, as listeners and audience members, purchase and consume music, as well as in how we conceptualise and think about it.
Musical Parameter Manipulation Possibilities of a Homemade Reactable
Musical parameter control is an important part of live interactive electronic computer music. Due to the increasing availability and affordability of music technology, including powerful computer software, advances in this area are being made to enable easier and more effective parameter control. The purpose of this paper is to investigate and discuss the musical parameter manipulation possibilities of a homemade instrument with a tangible tabletop interface based on the technology of the reacTable. The design and construction of the instrument is documented, including the physical build as well as the software component of the system, which incorporates the computer software ReacTIVision, Max/MSP and Reason. The core of the paper discusses parameter manipulation abilities by way of a comparison between two controllers: the homemade instrument and the Korg nanoKONTROL. Mapping strategies – in an interactive music sense – are explored in detail, while the execution and capabilities of parameter control via the physical interface devices of the two controllers are assessed.
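The mapping strategies discussed in this context are often categorised as one-to-one (each control drives a single parameter) or one-to-many (one control fans out to several parameters). The contrast can be sketched as follows; this is an illustrative Python sketch with made-up parameter names and ranges, not code from the paper or either controller:

```python
# Illustrative sketch of two common mapping strategies for interactive
# music controllers (parameter names and ranges are assumptions, not
# taken from the paper): one-to-one assigns each control to a single
# musical parameter, while one-to-many fans one control out to several
# coupled parameters at once.

def one_to_one(slider: float) -> dict:
    """Map a normalised slider value in [0, 1] to a single parameter."""
    return {"filter_cutoff_hz": 200.0 + slider * 7800.0}  # 200 Hz .. 8 kHz

def one_to_many(slider: float) -> dict:
    """Fan one control out to several perceptually linked parameters."""
    return {
        "filter_cutoff_hz": 200.0 + slider * 7800.0,
        "reverb_mix": 0.1 + slider * 0.4,   # brighter sound -> slightly wetter
        "amplitude": 0.5 + slider * 0.5,    # brighter sound -> louder
    }
```

A one-to-many mapping tends to feel more "instrument-like", since a single physical gesture produces a correlated change across several sound qualities, whereas one-to-one mappings are easier to learn and to assign on a mixer-style controller such as the nanoKONTROL.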
An interactive music system based on the technology of the reacTable
The purpose of this dissertation is to investigate and document a research project undertaken in the design, construction and performance of an interactive music system. The project involved building a multi-user electro-acoustic music instrument with a tangible user interface, based on the technology of the reacTable. The main concept of the instrument was to integrate the ideas of 1) interpreting gestural movement into music, 2) multi-touch/multi-user technology, and 3) the exploration of timbre in computer music. The dissertation discusses the definition, basics and essentials of interactive music systems and examines the history and key features of the three main concepts mentioned above. The original instrument is observed in detail, including the design and construction of the table-shaped physical build, along with an in-depth look at the computer software (ReacTIVision, Max/MSP and Reason) employed. The fundamentals and workings of the instrument – sensing/processing/response, control and feedback, and mapping – are described at length, examining how tangible objects are used to generate and control parameters of music, while its instrumental limitations are also noted. How the three main concepts relate to, and are expressed within, the instrument is also discussed. An original piece of music with an accompanying video, entitled Piece for homemade reacTable, was composed and performed on the instrument in support of this dissertation. It acts as a basic demonstration of how the interactive music system works, showcasing all the main concepts and how they are put into practice to create and perform new electronic music.
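The core sensing-to-mapping idea can be sketched in a few lines: reacTIVision tracks fiducial markers on the tangible objects and reports each one's normalised table position and rotation angle, which a mapping layer then translates into musical parameters. The sketch below is illustrative only, with assumed parameter names and ranges, not the dissertation's actual Max/MSP patch:

```python
import math
from dataclasses import dataclass

# Illustrative sketch only (not the dissertation's actual patch):
# reacTIVision reports each tangible object's normalised (x, y) position
# and rotation angle. A simple mapping layer might use y for pitch,
# x for stereo pan, and rotation for amplitude; all names and ranges
# here are assumptions for the example.

@dataclass
class Fiducial:
    x: float      # 0.0 (left) .. 1.0 (right)
    y: float      # 0.0 (far edge) .. 1.0 (near edge)
    angle: float  # rotation in radians, 0 .. 2*pi

def map_to_params(f: Fiducial) -> dict:
    midi_note = round(36 + (1.0 - f.y) * 48)     # C2..C6: further away = higher
    return {
        "freq_hz": 440.0 * 2 ** ((midi_note - 69) / 12),  # equal temperament
        "pan": f.x * 2.0 - 1.0,                  # -1 (left) .. +1 (right)
        "amp": f.angle / (2 * math.pi),          # one full turn = full volume
    }

params = map_to_params(Fiducial(x=0.5, y=0.5, angle=math.pi))
```

An object placed at the centre of the table, rotated half a turn, would thus sound middle C, panned centre, at half volume; moving and turning the object continuously reshapes the sound, which is the sensing/processing/response loop described above in miniature.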
mixiTUI: A Tangible Sequencer for Electronic Live Performances
With the rise of crowdsourcing and mobile crowdsensing techniques, a large number of crowdsourcing applications or platforms (CAPs) have appeared. In the meantime, CAP-related models and frameworks based on different research hypotheses are rapidly emerging, and they usually address specific issues from a certain perspective. Due to different settings and conditions, different models are not compatible with each other; however, CAPs urgently need to combine these techniques to form a unified framework. In addition, these models need to be learned and updated online as crowdsourced data and task types grow, requiring a unified architecture that integrates lifelong-learning concepts and breaks down the barriers between different modules. This paper draws on the idea of ubiquitous operating systems and proposes a novel OS (CrowdOS), an abstract software layer running between the native OS and the application layer. In particular, based on an in-depth analysis of the complex crowd environment and the diverse characteristics of heterogeneous tasks, we construct the OS kernel and three core frameworks: the Task Resolution and Assignment Framework (TRAF), Integrated Resource Management (IRM), and Task Result quality Optimization (TRO). In addition, we validate the usability of CrowdOS, its module correctness and its development efficiency. Our evaluation further reveals that TRO brings an enormous improvement in efficiency and a reduction in energy consumption.
Tabletop Tangible Interfaces for Music Performance: Design and Evaluation
This thesis investigates a new generation of collaborative systems: tabletop tangible interfaces (TTIs) for music performance, or musical tabletops. Musical tabletops are designed for professional musical performance as well as for casual interaction in public settings. These systems support co-located collaboration through a shared interface. However, we still know little about their challenges and opportunities for collaborative musical practice: in particular, how best to support beginners, experts, or both.
This thesis explores the nature of collaboration on TTIs for music performance between beginners, experts, or both. Empirical work was done in two stages: 1) an exploratory stage; and 2) an experimental stage. In the exploratory stage we studied the Reactable, a commercial musical tabletop designed for beginners and experts. In particular, we explored its use in two environments: a multi-session study with expert musicians in a casual lab setting; and a field study with casual visitors in a science centre. In the experimental stage we conducted a controlled experiment with mixed groups using a bespoke musical tabletop interface, SoundXY4. The design of this study was informed by findings from the exploratory stage, which indicated a need to better support real-time awareness of the group activity (workspace awareness) in early interactions. In all three studies, the groups' musical improvisation was video-captured unobtrusively with the aim of understanding natural uses during group musical practice. The rich video data were carefully analysed, focusing on the nature of social interaction and how workspace awareness was manifested.
The findings suggest that musical tabletops can support peer learning during multiple sessions; fluid between-group social interaction in public settings; and a democratic and ecological approach to music performance. The findings also point to how workspace awareness can be enhanced in early interactions with TTIs using auditory feedback with ambisonics spatialisation.
The thesis concludes with theoretical, methodological, and practical implications for future research in New Interfaces for Musical Expression (NIME), tabletop studies, and Human-Computer Interaction (HCI).
Multiparametric interfaces for fine-grained control of digital music
Digital technology provides a very powerful medium for musical creativity, and the way in which we interface and interact with computers has a huge bearing on our ability to realise our artistic aims. The standard input devices available for the control of digital music tools tend to afford a low quality of embodied control; they fail to realise our innate expressiveness and dexterity of motion. This thesis looks at ways of capturing more detailed and subtle motion for the control of computer music tools; it examines how this motion can be used to control music software, and evaluates musicians' experience of using these systems.
Two new musical controllers were created, based on a multiparametric paradigm where multiple, continuous, concurrent motion data streams are mapped to the control of musical parameters. The first controller, Phalanger, is a markerless video tracking system that enables the use of hand and finger motion for musical control. EchoFoam, the second system, is a malleable controller, operated through the manipulation of conductive foam. Both systems use machine learning techniques at the core of their functionality. These controllers are front ends to RECZ, a high-level mapping tool for multiparametric data streams.
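The multiparametric paradigm described above, in which multiple continuous motion streams concurrently drive musical parameters, can be illustrated with a simple many-to-many blend. This is a hedged sketch under assumed stream and parameter names, not RECZ's actual implementation:

```python
# Hedged sketch of a many-to-many multiparametric mapping (illustrative,
# not RECZ's actual implementation): several concurrent, continuous
# motion streams are blended through a weight matrix so that each
# musical parameter responds to a combination of inputs.

def map_streams(streams, weights):
    """streams: n normalised inputs in [0, 1];
    weights: one row of n weights per output parameter.
    Returns one value per parameter, clamped to [0, 1]."""
    out = []
    for row in weights:
        value = sum(w * s for w, s in zip(row, streams))
        out.append(min(1.0, max(0.0, value)))
    return out

# Three motion streams (e.g. hand x, hand y, finger spread) driving two
# parameters (e.g. grain density and filter cutoff):
params = map_streams([0.2, 0.8, 0.5],
                     [[0.5, 0.5, 0.0],    # density: mean of hand x and y
                      [0.0, 0.2, 0.8]])   # cutoff: mostly finger spread
```

Even this minimal form shows why such mappings suit expressive control: because every parameter listens to a weighted blend of gestures, small coordinated movements of the hand change several sound qualities at once rather than one fader's worth.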
The development of these systems and the evaluation of musiciansā experience of their use constructs a detailed picture of multiparametric musical control. This work contributes to the developing intersection between the fields of computer music and human-computer interaction. The principal contributions are the two new musical controllers, and a set of guidelines for the design and use of multiparametric interfaces for the control of digital music. This work also acts as a case study of the application of HCI user experience evaluation methodology to musical interfaces.
The results highlight important themes concerning multiparametric musical control. These include the use of metaphor and imagery, choreography and language creation, individual differences, and uncontrol. They highlight how this style of interface can fit into the creative process, and advocate a pluralistic approach to the control of digital music tools where different input devices fit different creative scenarios.
- …