
    Buttons, Handles, and Keys: Advances in Continuous-Control Keyboard Instruments

    This is the peer-reviewed version of the following article: Buttons, Handles, and Keys: Advances in Continuous-Control Keyboard Instruments, which has been published in final form at http://dx.doi.org/10.1162/COMJ_a_00297. This article may be used for non-commercial purposes in accordance with MIT Press Journals' Terms and Conditions for Self-Archiving. © 2015, MIT Press Journals

    Multiple Media Interfaces for Music Therapy

    This article describes interfaces (and the supporting technological infrastructure) to create audiovisual instruments for use in music therapy. In considering how the multidimensional nature of sound requires multidimensional input control, we propose a model to help designers manage the complex mapping between input devices and multiple media software. We also itemize a research agenda

    Beyond key velocity: Continuous sensing for expressive control on the Hammond Organ and Digital keyboards

    In this thesis we seek to explore the potential for continuous key position to be used as an expressive control in keyboard musical instruments, and how preexisting skills can be adapted to leverage this additional control. Interaction between performer and sound generation on a keyboard instrument is often restricted to a number of discrete events on the keys themselves (note onsets and offsets), while complementary continuous control is provided via additional interfaces, such as pedals, modulation wheels and knobs. The rich vocabulary of gestures that skilled performers can achieve on the keyboard is therefore often simplified to a single, discrete velocity measurement. A limited number of acoustic and electromechanical keyboard instruments do, however, afford continuous key control, so that the role of the key is not limited to delivering discrete events: its instantaneous position is, to a certain extent, an element of expressive control. Recent advances in sensing technologies make it possible to leverage continuous key position as an expressive element in the sound generation of digital keyboard musical instruments. We start by exploring the expression available on the keys of the Hammond organ, where nine contacts close at different points of the key throw for each key onset, and we find that the velocity and percussiveness of the touch affect the way the contacts close and bounce, producing audible differences in the onset transient of each note. We develop an embedded hardware and software environment for low-latency sound generation controlled by continuous key position, which we use to create two digital keyboard instruments. The first of these emulates the sound of a Hammond and can be controlled with continuous key position, allowing for arbitrary mappings between the key position and the nine virtual contacts of the digital sound generator. 
A study with 10 musicians shows that, when exploring the instrument on their own, the players can appreciate the differences between different settings and tend to develop a personal preference for one of them. In the second instrument, continuous key position is the fundamental means of expression: percussiveness, key position and multi-key gestures control the parameters of a physical model of a flute. In a study with 6 professional musicians playing this instrument we gather insights on the adaptation process, the limitations of the interface and the transferability of traditional keyboard playing techniques
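    The contact mechanism described above can be sketched in code. The following is a minimal, illustrative model of mapping continuous key position onto nine virtual contact closures; the threshold values and hysteresis margin are assumptions for illustration, not the thesis's actual settings.

```python
def make_contact_tracker(thresholds, hysteresis=0.02):
    """Return a function that, given a key position in [0, 1]
    (0 = key at rest, 1 = fully depressed), reports which of the
    virtual contacts are currently closed."""
    closed = [False] * len(thresholds)

    def update(position):
        for i, th in enumerate(thresholds):
            if not closed[i] and position >= th:
                closed[i] = True   # contact closes on the way down
            elif closed[i] and position < th - hysteresis:
                closed[i] = False  # reopens with hysteresis on release
        return list(closed)

    return update

# Nine evenly spaced closure points along the key throw (an assumption;
# the instrument described allows arbitrary mappings).
tracker = make_contact_tracker([i / 10 for i in range(1, 10)])
print(tracker(0.35))  # first three contacts closed, the rest open
```

    In a real instrument each closure would gate a harmonic into the sound generator, so the trajectory of the key through its throw, not just its velocity, shapes the onset transient.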

    Physical Interactions with Digital Strings - A hybrid approach to a digital keyboard instrument

    A new hybrid approach to digital keyboard playing is presented, in which the actual acoustic sounds from a digital keyboard are captured with contact microphones and applied as excitation signals to a digital model of a prepared piano, i.e., an extended waveguide model of strings with the possibility of stopping and muting the strings at arbitrary positions. The parameters of the string model are controlled through TouchKeys multi-touch sensors on each key, combined with MIDI data and acoustic signals from the digital keyboard frame, using a novel mapping. The instrument is evaluated from a performing musician's perspective, and emerging playing techniques are discussed. Since the instrument is a hybrid acoustic-digital system with several feedback paths between the domains, it provides for expressive and dynamic playing, with qualities approaching those of an acoustic instrument, yet with new kinds of control. The contributions are twofold. First, the use of acoustic sounds from a physical keyboard for excitations and resonances results in a novel hybrid keyboard instrument in itself. Second, the digital model of "inside piano" playing, using multi-touch keyboard data, allows for performance techniques going far beyond conventional keyboard playing

    Not All Gestures Are Created Equal: Gesture and Visual Feedback in Interaction Spaces.

    As multi-touch mobile computing devices and open-air gesture sensing technology become increasingly commoditized and affordable, they are also becoming more widely adopted. It has therefore become necessary to create new interaction designs specifically for gesture-based interfaces to meet the growing needs of users. However, a deeper understanding of the interplay between gesture and visual and sonic output is needed to make meaningful advances in design. This thesis addresses this crucial step by investigating the interrelation between gesture-based input and visual representation and feedback in gesture-driven creative computing. It underscores that not all gestures are created equal, and that multiple factors affect their performance. For example, a drag gesture in a visual programming scenario performs differently than in a target acquisition task. The work presented here (i) examines the role of visual representation and mapping in gesture input, (ii) quantifies user performance differences in gesture input to examine the effect of multiple factors on gesture interactions, and (iii) develops tools and platforms for exploring visual representations of gestures. A range of gesture spaces and scenarios, from continuous sound control with open-air gestures to mobile visual programming with discrete gesture-driven commands, was assessed. Findings from this thesis reveal a rich space of complex interrelations between gesture input and visual feedback and representation. The contributions of this thesis also include the development of an augmented musical keyboard with 3-D continuous gesture input and projected visualization, as well as a touch-driven visual programming environment for interactively constructing dynamic interfaces. 
These designs were evaluated in a series of user studies in which gesture-to-sound mapping was found to have a significant effect on user performance, along with other factors such as the choice of visual representation and device size. A number of counter-intuitive findings point to potentially complex interactions between factors such as device size, task, and scenario, which exposes the need for further research. For example, the size of the device was found to have contradictory effects in two different scenarios. Furthermore, this work presents a multi-touch gestural environment to support the prototyping of gesture interactions.
    PhD thesis, Computer Science and Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/113456/1/yangqi_1.pd

    Interaction Design for Digital Musical Instruments

    The thesis aims to elucidate the process of designing interactive systems for musical performance that combine software and hardware in an intuitive and elegant fashion. The original contribution to knowledge consists of: (1) a critical assessment of recent trends in digital musical instrument design, (2) a descriptive model of interaction design for the digital musician and (3) a highly customisable multi-touch performance system that was designed in accordance with the model. Digital musical instruments are composed of a separate control interface and a sound generation system that exchange information. When designing the way in which a digital musical instrument responds to the actions of a performer, we are creating a layer of interactive behaviour that is abstracted from the physical controls. Often, the structure of this layer depends heavily upon:
    1. The accepted design conventions of the hardware in use
    2. Established musical systems, acoustic or digital
    3. The physical configuration of the hardware devices and the grouping of controls that such configuration suggests
    This thesis proposes an alternative way to approach the design of digital musical instrument behaviour: examining the implicit characteristics of its composite devices. When we separate the conversational ability of a particular sensor type from its hardware body, we can look in a new way at the actual communication tools at the heart of the device. We can subsequently combine these separate pieces using a series of generic interaction strategies in order to create rich interactive experiences that are not immediately obvious or directly inspired by the physical properties of the hardware. This research ultimately aims to enhance and clarify the existing toolkit of interaction design for the digital musician

    Exciting Instrumental Data: Toward an Expanded Action-Oriented Ontology for Digital Music Performance

    Musical performance using digital musical instruments has obscured the relationship between observable musical gestures and the resultant sound, because the sound-producing mechanisms of digital musical instruments are hidden within the digital music-making system. The difficulty in observing embodied artistic expression is especially acute for musical instruments composed entirely of digital components. Despite this characteristic of digital music performance practice, this thesis argues that it is possible to bring digital musical performance further within our action-oriented ontology by understanding the digital musician through the lens of Lévi-Strauss’ notion of the bricoleur. Furthermore, by examining musical gestures with these instruments through a multi-tiered analytical framework that accounts for the physical computing elements necessarily present in all digital music-making systems, we can further understand and appreciate the intricacies of digital music performance practice and culture

    Illuminating music : impact of color hue for background lighting on emotional arousal in piano performance videos

    This study sought to determine whether hues overlaid on a video recording of a piano performance would systematically influence perception of its emotional arousal level. The hues were artificially added to a series of four short video excerpts of different performances using video editing software. Across two experiments, 106 participants were sorted into four conditions, each viewing a different combination of musical excerpt (two excerpts with nominally high arousal and two with nominally low arousal) and hue (red or blue). Participants rated the emotional arousal depicted by each excerpt. Results indicated that the overall arousal ratings were consistent with the nominal arousal of the selected excerpts. However, the hues added to the video produced no significant effect on arousal ratings, contrary to predictions. This may be because the combined effects of other channels of information (e.g., the music and player movement) dominated the hypothesized influence of hue on the perceived performance (red was expected to enhance and blue to reduce the arousal of the performance). To our knowledge this is the first study to investigate the impact of these hues upon the perceived arousal of a music performance, and it has implications for musical performers and stage lighting. Further research investigating reactions during live performance and manipulating a wider range of lighting hues, saturation and brightness levels, and editing techniques is recommended to further test these findings

    Musical Gesture through the Human Computer Interface: An Investigation using Information Theory

    This study applies information theory to investigate the human ability to communicate using continuous control sensors, with a particular focus on informing the design of digital musical instruments. There is an active practice of building and evaluating such instruments, for instance in the New Interfaces for Musical Expression (NIME) conference community. The fidelity of the instruments can depend on the included sensors, and although much anecdotal evidence and craft experience informs the use of these sensors, relatively little is known about the ability of humans to control them accurately. This dissertation addresses this issue and related concerns, including continuous control performance with increasing degrees of freedom, pursuit tracking in comparison with pointing, and how accurately musical interface designers and researchers estimate human performance with continuous control sensors. The methodology models the human-computer system as an information channel, applying concepts from information theory to performance data collected in studies of human subjects using sensing devices. These studies not only add to knowledge about human abilities, but also inform questions of musical mapping, ergonomics, and usability