Interaction Design for Digital Musical Instruments
The thesis aims to elucidate the process of designing interactive systems for musical performance that combine software and hardware in an intuitive and elegant fashion. The original contribution to knowledge consists of: (1) a critical assessment of recent trends in digital musical instrument design, (2) a descriptive model of interaction design for the digital musician and (3) a highly customisable multi-touch performance system that was designed in accordance with the model.
Digital musical instruments are composed of a separate control interface and a sound generation system that exchange information. When designing the way in which a digital musical instrument responds to the actions of a performer, we are creating a layer of interactive behaviour that is abstracted from the physical controls. Often, the structure of this layer depends heavily upon:
1. The accepted design conventions of the hardware in use
2. Established musical systems, acoustic or digital
3. The physical configuration of the hardware devices and the grouping of controls that such configuration suggests
This thesis proposes an alternate way to approach the design of digital musical instrument behaviour – examining the implicit characteristics of its composite devices. When we separate the conversational ability of a particular sensor type from its hardware body, we can look in a new way at the actual communication tools at the heart of the device. We can subsequently combine these separate pieces using a series of generic interaction strategies in order to create rich interactive experiences that are not immediately obvious or directly inspired by the physical properties of the hardware.
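The idea of decoupling a sensor's communicative capacity from its hardware body and recombining the resulting streams through generic interaction strategies can be pictured as a simple mapping layer. The following sketch is illustrative only; the class, signal and parameter names are assumptions for the example, not drawn from the thesis:

```python
# Illustrative sketch of a mapping layer: sensor streams are abstracted
# from their hardware bodies and recombined by generic strategies into
# synthesis parameters. All names are hypothetical.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale a sensor reading into a synthesis parameter range."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

class MappingLayer:
    """Combines abstract sensor streams into synthesis parameters,
    independently of which physical device each stream comes from."""

    def __init__(self):
        self.strategies = []  # list of (sensor_name, param_name, transform)

    def connect(self, sensor, param, transform):
        self.strategies.append((sensor, param, transform))

    def update(self, readings):
        """Map a dict of raw sensor readings to synthesis parameters."""
        params = {}
        for sensor, param, transform in self.strategies:
            if sensor in readings:
                params[param] = transform(readings[sensor])
        return params

layer = MappingLayer()
# A touch position and a pressure stream, decoupled from their hardware,
# drive pitch and loudness respectively.
layer.connect("touch_x", "pitch_hz", lambda v: scale(v, 0.0, 1.0, 110.0, 880.0))
layer.connect("pressure", "amp", lambda v: scale(v, 0.0, 127.0, 0.0, 1.0))

params = layer.update({"touch_x": 0.5, "pressure": 64})
```

Because each strategy names only an abstract stream, the same layer could be fed by a fader, an accelerometer axis or a touch surface without changing the mapping logic.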
This research ultimately aims to enhance and clarify the existing toolkit of interaction design for the digital musician.
Design Strategies for Adaptive Social Composition: Collaborative Sound Environments
In order to develop successful collaborative music systems, a variety of subtle interactions need to be identified and integrated. Gesture capture, motion tracking, real-time synthesis, environmental parameters and ubiquitous technologies can each be used effectively to develop innovative approaches to instrument design, sound installations, interactive music and generative systems. Current solutions tend to prioritise one or more of these approaches, refining a particular interface technology, software design or compositional approach developed for a specific composition, performer or installation environment. Within this diverse field a group of novel controllers, described as 'Tangible Interfaces', have been developed. These are intended for use by novices and in many cases follow a simple model of interaction, controlling synthesis parameters through simple user actions. Other approaches offer sophisticated compositional frameworks, but many of these are idiosyncratic and highly personalised; as such they are difficult to engage with and ineffective for groups of novices. The objective of this research is to develop effective design strategies for implementing collaborative sound environments, using key terms and vocabulary drawn from the available literature. This is articulated by combining an empathic design process with controlled sound-perception and interaction experiments. The identified design strategies have been applied to the development of a new collaborative digital instrument. A range of technical and compositional approaches was considered to define this process, which can be described as Adaptive Social Composition.
Dan Livingston
Interaction and the Art of User-Centered Digital Musical Instrument Design
This thesis documents the formulation of a research-based practice in multimedia art, technology and digital musical instrument design. The primary goal of my research was to investigate the principles and methodologies involved in the structural design of new interactive digital musical instruments aimed at performance by members of the general public, and to identify ways that the design process could be optimized to increase user adoption of these new instruments. The research was performed over three years and moved between studies at the University of Maine, internships in New York, and specialized research at the Input Devices and Music Interaction Laboratory at McGill University.
My work is presented in two sections. The first covers early studies in user interaction and exploratory works in web and visual design, sound art, installation, and music performance. While not specifically tied to the research topic of user adoption of digital musical instruments, this work serves as the conceptual and technical background for the dedicated work to follow. The second section is dedicated to focused research on digital musical instrument design through two major projects carried out as a Graduate Research Trainee at McGill University. The first was the design and prototype of the Noisebox, a new digital musical instrument. The purpose of this project was to learn the various stages of instrument design through practical application. A working prototype has been presented and tested, and a second version is currently being built. The second project was a user study that surveyed musicians about digital musical instrument use. It asked questions about background, instrument choice, music styles played, and experiences with and attitudes towards new digital musical instruments.
Based on the results of the two research projects, a model of digital musical instrument design is proposed that adopts a user-centered focus, soliciting user input and feedback throughout the design process from conception to final testing. This approach aims to narrow the gap between the conceptual design of new instruments and technologies and the actual musicians who would use them.
Computational interaction techniques for 3D selection, manipulation and navigation in immersive VR
3D interaction provides a natural interplay for HCI. Many techniques involving diverse sets of hardware and software components have been proposed, generating an explosion of Interaction Techniques (ITes), Interactive Tasks (ITas) and input devices, and thus increasing the heterogeneity of tools in 3D User Interfaces (3DUIs). Moreover, most of those techniques are based on general formulations that fail to fully exploit human capabilities for interaction: while 3D interaction enables naturalness, it also produces complexity and limitations when using 3DUIs.
In this thesis, we aim to generate approaches that better exploit human capabilities for interaction by combining human factors, mathematical formalisations and computational methods. Our approach is focussed on the exploration of the close coupling between specific ITes and ITas while addressing common issues of 3D interaction.
We specifically focused on the stages of interaction within Basic Interaction Tasks (BITas), i.e., data input, manipulation, navigation and selection. Common limitations of these tasks are: (1) the complexity of mapping generation for input devices, (2) fatigue in mid-air object manipulation, (3) space constraints in VR navigation; and (4) low accuracy in 3D mid-air selection.
Along with two chapters of introduction and background, this thesis presents five main works. Chapter 3 focusses on the design of mid-air gesture mappings based on human tacit knowledge. Chapter 4 presents a solution to address user fatigue in mid-air object manipulation. Chapter 5 addresses space limitations in VR navigation. Chapter 6 describes an analysis and a correction method for the drift effects involved in scale-adaptive VR navigation, and Chapter 7 presents a hybrid 3D/2D technique that allows for precise selection of virtual objects in highly dense environments (e.g., point clouds). Finally, we conclude by discussing how the contributions obtained from this exploration provide techniques and guidelines to design more natural 3DUIs.
The UFO controller: Gestural music performance
This thesis introduces The UFO Controller, a free-space gestural controller for performing electronic music. It documents the design process and the main features of the UFO, analyses my experiences of performing with the controller and compares the UFO to other known free-space control instruments. The thesis also examines the domain of electronic music, critically analyzes the live performances in that field and investigates the importance of body gestures for the performances.
The UFO is a MIDI controller that uses ultrasonic rangefinder sensors for detecting the hand gestures of a performer. It is a non-tactile controller that is played without physically touching the device. The sensors measure the distance of the performer's hands moving on top of the device and convert that into control data, which can be mapped to any music software or synthesizer.
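As a rough illustration of this kind of distance-to-control mapping (a sketch under assumed sensor ranges, not the UFO's actual implementation), a rangefinder reading might be clamped and rescaled into a 7-bit MIDI Control Change value:

```python
# Hypothetical sketch: converting an ultrasonic rangefinder's distance
# reading into a MIDI Control Change message. The usable range and all
# names are assumptions for the example.

SENSOR_MIN_CM = 5.0    # closest usable hand distance (assumed)
SENSOR_MAX_CM = 60.0   # beyond this, the hand is treated as out of range

def distance_to_cc(distance_cm):
    """Clamp and rescale a distance reading to a 7-bit CC value (0-127)."""
    d = min(max(distance_cm, SENSOR_MIN_CM), SENSOR_MAX_CM)
    t = (d - SENSOR_MIN_CM) / (SENSOR_MAX_CM - SENSOR_MIN_CM)
    return round(t * 127)

def cc_message(channel, controller, distance_cm):
    """Build a 3-byte MIDI Control Change message for the reading."""
    status = 0xB0 | (channel & 0x0F)  # 0xBn = Control Change on channel n
    return bytes([status, controller & 0x7F, distance_to_cc(distance_cm)])

# A hand hovering 32.5 cm above the sensor, mapped to CC 1 (mod wheel):
msg = cc_message(channel=0, controller=1, distance_cm=32.5)
```

In practice such a stream would also be smoothed and rate-limited before being sent to the synthesizer, but the clamp-and-rescale step above is the core of turning hand distance into control data.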
The use of body gestures, commonly reported to be lacking from live performances of electronic music, is crucially important for engaging live music performances. The laptop computer has become the de facto instrument of the concert stages where electronic music is performed. The UFO can help make electronic music performances more interesting by moving them in a more gestural direction. This thesis aims to validate the following claims: firstly, that a novel free-space controller makes electronic music performances more compelling both for the audience and the performer; secondly, that the use of body gestures is important for the largely disembodied performance of electronic music.
The UFO has been seen and heard on concert stages around the world with my band Phantom. Audiences have been excited and thrilled by it, and the UFO has become a subject of curiosity for many. Without a doubt, the UFO has raised the bar of my own live performances and helped Phantom stand out among the mass of new electronic indie bands. Furthermore, the UFO has attracted the attention of various online technology and music blogs (e.g., TechCrunch, Create Digital Music, Synthtopia, NME and The Line Of Best Fit).
Exploring The Impact Of Configuration And Mode Of Input On Group Dynamics In Computing
Objectives: Large displays and new technologies for interacting with computers offer a rich area for the development of new tools to facilitate collaborative concept mapping activities. In this thesis, WiiConcept is described as a tool designed to allow the use of multiple WiiRemotes for the collaborative creation of concept maps, with and without gestures. Subsequent investigation of participants' use of the system considers the effect of single and multiple input streams when using the software with and without gestures and the impact upon group concept mapping process outcomes and interactions when using a large display.
Methods: Data are presented from an exploratory study of twenty-two students who used the tool. Half of the pairs used two WiiRemotes, while the remainder used one WiiRemote. All pairs created one map without gestures and one map with gestures. Data about their maps, interactions and responses to the tool were collected.
Results: Analysis of coded transcripts indicates that one controller afforded higher levels of interaction, with the use of gestures also increasing the number of interactions seen. The results also indicated significantly more interactions in the 'shows solidarity', 'gives orientation' and 'gives opinion' categories (defined by Bales' interaction process analysis) when using one controller as opposed to two, and more interactions in the 'shows solidarity', 'tension release', 'gives orientation' and 'shows tension' categories when using gestures as opposed to not using gestures. There were no significant differences in the perceived dominance of individuals, as measured on the social dominance scales, for the amount of interaction displayed. However, there was a significant main effect of group conversational control score on the 'gives orientation' construct, with a higher number of interactions for low, mixed and high scores of this type when dyads had one controller as opposed to two. There was also a significant interaction effect of group conversational control score on the 'shows solidarity' construct, with a higher number of interactions for all scores of this type when dyads had one controller as opposed to two.
The results also indicate that, for the WiiConcept, there was no difference between the number of controllers in the detail of the maps, and that all users found the tool useful for the collaborative creation of concept maps. At the same time, engaging in disagreement was related to the number of nodes created, with disagreement leading to more nodes being created.
Conclusions: Use of one controller afforded higher levels of interaction, with gestures also increasing the number of interactions seen. If a particular type of interaction is associated with more nodes, there may also be an argument for using only one controller with gestures enabled, to promote cognitive conflict within groups. All participants responded that the tool was relatively easy to use and engaging, which suggests that it could be integrated into collaborative concept mapping activities, allowing for greater collaborative knowledge building and sharing of knowledge due to the increased levels of interaction with one controller. As research has shown that concept mapping can be useful for promoting the understanding of complex ideas, the adoption of the WiiConcept tool as part of a small-group learning activity may lead to deeper levels of understanding. Additionally, the results for gestures suggest that this mode of input does not affect the number of words, nodes and edges created in a concept map. Further research, over a longer period of time, may see improvement with this form of interaction, with increased mastery of gestural movement leading to greater detail in concept mapping.
Interactive Spaces: Natural interfaces supporting gestures and manipulations in interactive spaces
This doctoral dissertation focuses on the development of interactive spaces through the use of natural interfaces based on gestures and manipulative actions. In the real world people use their senses to perceive the external environment, and they use manipulations and gestures to explore the world around them, communicate and interact with other individuals. From this perspective, the use of natural interfaces that exploit human sensorial and explorative abilities helps to fill the gap between the physical and digital worlds.
In the first part of this thesis we describe the work done to improve interfaces and devices for tangible, multi-touch and free-hand interactions. The idea is to design devices that also work in uncontrolled environments, and in situations where control is mostly physical, so that even the least experienced users can express their manipulative exploration and gestural communication abilities.
We also analyse how these techniques can be mixed to create an interactive space specifically designed for teamwork, where the natural interfaces are distributed in order to encourage collaboration.
We then give some examples of how these interactive scenarios can host various types of applications facilitating, for instance, the exploration of 3D models, the enjoyment of multimedia content and social interaction.
Finally, we discuss our results and put them in a wider context, focusing particularly on how the proposed interfaces actually improve people's lives and activities, and how interactive spaces become places of aggregation where we can pursue objectives that are both personal and shared with others.
Redesigning the human-robot interface: intuitive teleoperation of anthropomorphic robots
A novel interface for robotic teleoperation was developed to enable accurate and highly efficient teleoperation of the Industrial Reconfigurable Anthropomorphic Dual-arm (IRAD) system and other robotic systems. In order to achieve a revolutionary increase in operator productivity, the bilateral/master-slave approach must give way to shared autonomy and unilateral control; autonomy must be employed where possible, and appropriate sensory feedback only where autonomy is impossible; and today's low-information/high-feedback model must be replaced by one that emphasizes feedforward precision and minimal corrective feedback. This is emphasized for task spaces outside of the traditional anthropomorphic scale, such as mobile manipulation (i.e. large task spaces) and high-precision tasks (i.e. very small task spaces). The system is demonstrated using an anthropomorphically dimensioned industrial manipulator working in task spaces from one meter to less than one millimeter, in both simulation and hardware. This thesis discusses the design requirements and philosophy of this interface, and provides a summary of prototype teleoperation hardware, the simulation environment, test-bed hardware, and experimental results.