212 research outputs found

    Musical Haptics

    Haptic Musical Instruments; Haptic Psychophysics; Interface Design and Evaluation; User Experience; Musical Performance

    This Open Access book offers an original interdisciplinary overview of the role of haptic feedback in musical interaction. Divided into two parts, Part I examines the tactile aspects of music performance and perception, discussing how they affect user experience and performance in terms of usability, functionality and perceived quality of musical instruments. Part II presents engineering, computational, and design approaches and guidelines that have been applied to render and exploit haptic feedback in digital musical interfaces. Musical Haptics introduces an emerging field that brings together engineering, human-computer interaction, applied psychology, musical aesthetics, and music performance. The latter, defined as the complex system of sensory-motor interactions between musicians and their instruments, presents a well-defined framework in which to study basic psychophysical, perceptual, and biomechanical aspects of touch, all of which will inform the design of haptic musical interfaces. Tactile and proprioceptive cues enable embodied interaction and inform sophisticated control strategies that allow skilled musicians to achieve high performance and expressivity. The use of haptic feedback in digital musical interfaces is expected to enhance user experience and performance, improve accessibility for disabled persons, and provide an effective means for musical tuition and guidance.

    Multi-Sensory Interaction for Blind and Visually Impaired People

    This book conveys the visual elements of artwork to the visually impaired through various sensory elements, opening a new perspective for appreciating visual artwork. In addition, it explores the technique of expressing a color code by integrating patterns, temperatures, scents, music, and vibrations, and presents future research topics. A holistic experience, acquired through multi-sensory interaction, was provided to people with visual impairment to convey the meaning and contents of the work through rich multi-sensory appreciation. A method that allows people with visual impairments to engage with artwork using a variety of senses, including touch, temperature, tactile pattern, and sound, helps them appreciate artwork at a deeper level than can be achieved with hearing or touch alone. The development of such art appreciation aids for the visually impaired will ultimately improve their cultural enjoyment and strengthen their access to culture and the arts. The development of these new aids also expands opportunities for the non-visually impaired, as well as the visually impaired, to enjoy works of art, and breaks down the boundaries between the disabled and the non-disabled in the field of culture and the arts through continuous efforts to enhance accessibility. In addition, the developed multi-sensory expression and delivery tools can be used as educational tools to increase product and artwork accessibility and usability through multi-modal interaction. Training with the multi-sensory experiences introduced in this book may lead to more vivid visual imagery, or seeing with the mind’s eye.

    Ashitaka: an audiovisual instrument

    This thesis looks at how sound and visuals may be linked in a musical instrument, with a view to creating such an instrument. Though it appears to be an area of significant interest, at the time of writing there is very little existing written or theoretical research available in this domain. Therefore, based on Michel Chion’s notion of synchresis in film, the concept of a fused, inseparable audiovisual material is presented. The thesis then looks at how such a material may be created and manipulated in a performance situation. A software environment named Heilan was developed in order to provide a base for experimenting with different approaches to the creation of audiovisual instruments. The software and a number of experimental instruments are presented prior to a discussion and evaluation of the final ‘Ashitaka’ instrument. This instrument represents the culmination of the work carried out for this thesis, and is intended as a first step in identifying the issues and complications involved in the creation of such an instrument.

    Human-Computer interaction methodologies applied in the evaluation of haptic digital musical instruments

    Recent developments in interactive technologies have seen major changes in the manner in which artists, performers, and creative individuals interact with digital music technology; this is due to the increasing variety of interactive technologies that are readily available today. Digital Musical Instruments (DMIs) present musicians with performance challenges that are unique to this form of computer music. One of the most significant deviations from conventional acoustic musical instruments is the level of physical feedback conveyed by the instrument to the user. Currently, new interfaces for musical expression are not designed to be as physically communicative as acoustic instruments. Specifically, DMIs are often devoid of haptic feedback and therefore lack the ability to impart important performance information to the user. Moreover, there is currently no standardised way to measure the effect of this lack of physical feedback. Best practice suggests that there should be a set of methods to effectively, repeatedly, and quantifiably evaluate the functionality, usability, and user experience of DMIs. Earlier theoretical and technological applications of haptics have tried to address device performance issues associated with the lack of feedback in DMI designs, and it has been argued that the level of haptic feedback presented to a user can significantly affect the user’s overall emotive feeling towards a musical device. The outcomes of the investigations contained within this thesis are intended to inform new haptic interface design.

    Tangibility and Richness in Digital Musical Instrument Design

    The sense of touch plays a fundamental role in musical performance: alongside hearing, it is the primary sensory modality used when interacting with musical instruments. Learning to play a musical instrument is one of the most developed haptic cultural practices, and within acoustic musical practice at large, the importance of touch and its close relationship to virtuosity and expression is well recognised. With digital musical instruments (DMIs) – instruments involving a combination of sensors and a digital sound engine – touch-mediated interaction remains the foremost means of control, but the interfaces of such instruments do not yet engage with the full spectrum of sensorimotor capabilities of a performer. This poses compelling questions for digital instrument design: how does the nuance and richness of physical interaction with an instrument manifest itself in the digital domain? Which design parameters are most important for haptic experience, and how do these parameters affect musical performance? Built around three practical studies which utilise DMIs as technology probes, this thesis addresses these questions from the point of view of design, of empirical musicology, and of tangible computing. In the first study, musicians played a DMI with continuous pitch control and vibrotactile feedback in order to understand how dynamic tactile feedback can be implemented and how it influences musician experience and performance. The results suggest that certain vibrotactile feedback conditions can increase musicians’ tuning accuracy, but also disrupt temporal performance. The second study examines the influence of asynchronies between audio and haptic feedback. Two groups of musicians, amateurs and professional percussionists, were tasked with performing on a percussive DMI with variable action-sound latency. Differences between the two groups in terms of temporal accuracy and quality judgements illustrate the complex effects of asynchronous multimodal feedback. In the third study, guitar-derivative DMIs with variable levels of control richness were observed with non-musicians and guitarists. The results from this study help clarify the relationship between tangible design factors, sensorimotor expertise and instrument behaviour. This thesis introduces a descriptive model of performer-instrument interaction, the projection model, which unites the design investigations from each study and provides a series of reflections and suggestions on the role of touch in DMI design.
    Doctoral Training Centre for Media and Arts Technology

    Creating and evaluating embodied interactive experiences: case studies of full-body, sonic and tactile enaction.

    This thesis contributes to the field of embodied and multimodal interaction by presenting the development of different original interactive systems. Using a constructive approach, a variety of real-time user interaction situations were designed and tested: two cases of human-virtual character bodily interaction, two interactive sonifications of trampoline jumping, collaborative interaction in mobile music performance, and tangible and tactile interaction with virtual sounds. While diverse in terms of application, all the explored interaction techniques belong to the context of augmentation and are grounded in the theory of embodiment and strategies for natural human-computer interaction (HCI). The cases have been contextualized under the umbrella of enaction, a paradigm of cognitive science that addresses the user as an embodied agent situated in an environment and coupled to it through sensorimotor activity. This activity of sensing and action is studied through different modalities (auditory, tactile and visual) and combinations of these. The designed applications aim at a natural interaction with the system, being full-body, tangible and spatially aware. Sonic interaction in particular has been explored in the context of music creation, sports and auditory display. These technology-mediated scenarios are evaluated in order to understand what the adopted interaction techniques can bring to the user experience and how they modify impressions and enjoyment. The publications also discuss the enabling technologies used for the development, including motion tracking and programmed hardware for the tactile-sonic interaction and the sonic and tangible interaction. Results show that combining full-body interaction with auditory augmentation and sonic interaction can modify perception, observed behavior and emotion during the experience. Using spatial interaction together with tangible interaction or tactile feedback provides a multimodal experience of exploring a mixed reality environment where audio can be accessed and manipulated through natural interaction. Embodied and spatial interaction brings playfulness to mobile music improvisation, shifting the focus of the experience from music-making towards movement-based gaming. Finally, two novel implementations of full-body interaction based on the enactive paradigm are presented. In these designed scenarios of enaction the participant is motion tracked and a virtual character, rendered as a stick figure, is displayed in front of her on a screen. Results from the user studies show how the involvement of the body is crucial in understanding the behavior of a virtual character or a digital representation of the self in a gaming scenario.

    Design Strategies for Adaptive Social Composition: Collaborative Sound Environments

    In order to develop successful collaborative music systems, a variety of subtle interactions need to be identified and integrated. Gesture capture, motion tracking, real-time synthesis, environmental parameters and ubiquitous technologies can each be effectively used for developing innovative approaches to instrument design, sound installations, interactive music and generative systems. Current solutions tend to prioritise one or more of these approaches, refining a particular interface technology, software design or compositional approach developed for a specific composition, performer or installation environment. Within this diverse field a group of novel controllers, described as ‘Tangible Interfaces’, has been developed. These are intended for use by novices and in many cases follow a simple model of interaction, controlling synthesis parameters through simple user actions. Other approaches offer sophisticated compositional frameworks, but many of these are idiosyncratic and highly personalised. As such, they are difficult to engage with and ineffective for groups of novices. The objective of this research is to develop effective design strategies for implementing collaborative sound environments, using key terms and vocabulary drawn from the available literature. This is articulated by combining an empathic design process with controlled sound perception and interaction experiments. The identified design strategies have been applied to the development of a new collaborative digital instrument. A range of technical and compositional approaches was considered to define this process, which can be described as Adaptive Social Composition.
    Dan Livingston

    Sonic interactions in virtual environments

    This book tackles the design of 3D spatial interactions from an audio-centered and audio-first perspective, providing the fundamental notions related to the creation and evaluation of immersive sonic experiences. The key elements that enhance the sensation of place in a virtual environment (VE) are: immersive audio, the computational aspects of the acoustical-space properties of Virtual Reality (VR) technologies; sonic interaction, the human-computer interplay through auditory feedback in VEs; and VR systems, which naturally support multimodal integration, impacting different application domains. Sonic Interactions in Virtual Environments will feature state-of-the-art research on real-time auralization, sonic interaction design in VR, quality of the experience in multimodal scenarios, and applications. Contributors and editors include interdisciplinary experts from the fields of computer science, engineering, acoustics, psychology, design, humanities, and beyond. Their mission is to shape an emerging new field of study at the intersection of sonic interaction design and immersive media, embracing an archipelago of existing research spread across different audio communities, and to increase awareness among VR communities, researchers, and practitioners of the importance of sonic elements when designing immersive environments.