
    Sampling the past: a tactile approach to interactive musical instrument exhibits in the heritage sector

    In the last decade, the heritage sector has had to adapt to a shifting cultural landscape of public expectations and attitudes towards ownership and intellectual property. One way it has done this is to focus on each visitor’s encounter and provide them with a sense of experiential authenticity. There is a clear desire by the public to engage with music collections in this way, and a sound museological rationale for providing such access, but the approach raises particular curatorial problems, specifically: how do we meaningfully balance access with the duty to preserve objects for future generations? This paper charts the development of one such project. Based at Fenton House in Hampstead, and running since 2008, the project seeks to model digitally the keyboard instruments in the Benton Fletcher Collection and provide a dedicated interactive exhibit, which allows visitors to view all of the instruments in situ, and then play them through a custom-built two-manual MIDI controller with touch-screen interface. We discuss the approach to modelling, which uses high-definition sampling, and highlight the strengths and weaknesses of the exhibit as it currently stands, with particular focus on its key shortcoming: at present, there is no way to effectively model the key feel of a historic keyboard instrument. This issue is of profound importance, since the feel of any instrument is fundamental to its character, and shapes the way performers relate to it. The issue is further compounded if we are to consider a single dedicated keyboard as being the primary mode of interface for several instrument models of different classes, each with its own characteristic feel. We conclude by proposing an outline solution to this problem, detailing early work on a real-time adaptive haptic keyboard interface that changes its action in response to sampled resistance curves, measured on a key-by-key basis from the original instruments.
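
    The adaptive action described above lends itself to a simple control idea: store each key's measured resistance curve as force-versus-depth samples and interpolate between them inside the haptic control loop. The sketch below only illustrates that idea; it is not the project's code, and the class name, data layout and sample values are hypothetical.

        # Minimal sketch, assuming each key's resistance curve is stored as
        # sampled (depth in mm, force in N) pairs measured from the original
        # instrument. Names and sample values are hypothetical.
        import numpy as np

        class AdaptiveKeyAction:
            """Reproduce a sampled key-resistance curve on a haptic key actuator."""

            def __init__(self, depths_mm, forces_n):
                self.depths = np.asarray(depths_mm, dtype=float)
                self.forces = np.asarray(forces_n, dtype=float)

            def target_force(self, key_depth_mm):
                # Linearly interpolate between sampled points; np.interp clamps
                # to the end values outside the measured range.
                return float(np.interp(key_depth_mm, self.depths, self.forces))

        # Switching instrument models swaps in a different set of per-key curves,
        # so one controller can approximate several different actions.
        middle_c = AdaptiveKeyAction([0.0, 2.0, 4.5, 9.0], [0.05, 0.45, 0.20, 0.25])
        print(middle_c.target_force(3.0))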

    Haptic Wave

    We present the Haptic Wave, a device that allows cross-modal mapping of digital audio to the haptic domain, intended for use by audio producers and engineers with visual impairments. We describe a series of participatory design activities adapted to non-sighted users, where the act of prototyping facilitates dialog. A series of workshops scoping user needs, together with testing of a technology mock-up and a low-fidelity prototype, fed into the design of a final high-spec prototype. The Haptic Wave was tested in the laboratory, then deployed in real-world settings in recording studios and audio production facilities. The cross-modal mapping is kinesthetic and allows the direct manipulation of sound without the translation of an existing visual interface. The research gleans insight into working with users with visual impairments, and shifts perspective to regard them as experts in non-visual interfaces for all users. The work received the Best Paper Award at CHI 2016, the most prestigious human-computer interaction conference and one of the top-ranked conferences in computer science.
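
    As a rough illustration of one possible kinesthetic mapping of this kind (not the published design), the sketch below reduces an audio buffer to an amplitude envelope and turns that envelope into the resistance felt as a motorised fader is moved along the timeline. The function names and force values are assumptions.

        # Hedged sketch of an amplitude-to-resistance mapping; names hypothetical.
        import numpy as np

        def amplitude_envelope(samples, n_positions=512):
            """Reduce a mono audio buffer to one RMS value per fader position."""
            chunks = np.array_split(np.asarray(samples, dtype=float), n_positions)
            return np.array([np.sqrt(np.mean(c ** 2)) for c in chunks])

        def fader_resistance(envelope, fader_pos, max_force_n=3.0):
            """Map a normalised fader position (0..1) to a resisting force in newtons."""
            idx = min(int(fader_pos * len(envelope)), len(envelope) - 1)
            peak = envelope.max()
            return max_force_n * envelope[idx] / peak if peak > 0 else 0.0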

    A novel haptic model and environment for maxillofacial surgical operation planning and manipulation

    This paper presents a practical method and a new haptic model to support manipulation of bones and their segments during the planning of a surgical operation in a virtual environment using a haptic interface. To perform an effective dental surgery, it is important to have all the operation-related information about the patient available beforehand, in order to plan the operation and avoid complications. A haptic interface with an accurate virtual patient model to support the planning of bone cuts is therefore critical, useful and necessary for surgeons. The proposed system uses DICOM images taken from a digital tomography scanner and creates a mesh model of the filtered skull, from which the jaw bone can be isolated for further use. A novel solution for cutting the bones has been developed: the haptic tool is used to determine and define the cutting plane in the bone, and this approach creates three new meshes from the original model. In this way the computational cost is kept low and real-time feedback can be achieved during all bone manipulations. While the cut meshes are moved, a novel predefined friction profile in the haptic system simulates the force-feedback feel of different densities in the bone.
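
    The density-dependent feel can be pictured as a friction term whose coefficient grows with the bone density sampled from the DICOM volume at the tool tip. The sketch below is only an illustration of that idea under assumed Hounsfield ranges; it is not the paper's friction profile, and all names and constants are hypothetical.

        # Hedged sketch: map sampled bone density (Hounsfield units) to a
        # Coulomb-style friction force opposing the cutting tool's motion.
        import numpy as np

        def friction_force(density_hu, tool_velocity, mu_min=0.2, mu_max=1.5,
                           hu_range=(300.0, 1900.0), normal_force_n=1.0):
            """Return a 3D friction force opposing the tool's velocity."""
            lo, hi = hu_range
            t = np.clip((density_hu - lo) / (hi - lo), 0.0, 1.0)
            mu = mu_min + t * (mu_max - mu_min)      # denser bone -> more friction
            v = np.asarray(tool_velocity, dtype=float)
            speed = np.linalg.norm(v)
            if speed < 1e-9:
                return np.zeros(3)
            return -mu * normal_force_n * v / speed  # oppose the direction of motion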

    Feeling for Sound: Mapping Sonic Data to Haptic Perceptions

    This paper presents a system for exploring different dimensions of a sound through the use of haptic feedback. The Novint Falcon force-feedback interface is used to scan through sound files as a subject moves their hand horizontally from left to right, and to relay information about volume, frequency content, noisiness, or potentially any analysable parameter back to the subject through forces acting on their hand. General practicalities of mapping sonic elements to physical forces are considered, such as the problem of representing detailed data through vague physical sensation, approaches to applying forces to the hand that do not interfere with the smooth operation of the device, and the relative merits of discrete and continuous mappings. Three approaches to generating the force vector are discussed: 1) the use of simulated detents to identify areas where an audio parameter exceeds a certain threshold, 2) applying friction proportional to the level of the audio parameter along the axis of movement, and 3) creating forces perpendicular to the subject's hand movements. Presentation of audio information in this manner could be beneficial for 'pre-feeling' as a method for selecting material to play during a live performance, for assisting visually impaired audio engineers, and as a general augmentation of standard audio editing environments.
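
    The three force-generation strategies listed above can be summarised in a single dispatch function. The sketch below is a schematic reading of the abstract rather than the authors' implementation; the gains, threshold and axis conventions are assumptions.

        # Hedged sketch of the three mappings: detents, axial friction, and
        # perpendicular force. Parameter names and gains are hypothetical.
        import numpy as np

        def force_vector(mode, param, velocity, threshold=0.5, gain=2.0):
            """param: analysed audio value (0..1) at the current hand position."""
            if mode == "detent":
                # 1) While over a region whose value exceeds the threshold, oppose
                #    horizontal motion so the hand "catches" as if in a notch.
                fx = -gain * np.sign(velocity[0]) if param > threshold else 0.0
                return np.array([fx, 0.0, 0.0])
            if mode == "friction":
                # 2) Resist motion along the scanning axis in proportion to the value.
                return np.array([-gain * param * np.sign(velocity[0]), 0.0, 0.0])
            if mode == "perpendicular":
                # 3) Push at right angles to the left-right scan, leaving the
                #    scanning motion itself unimpeded.
                return np.array([0.0, gain * param, 0.0])
            return np.zeros(3)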

    Embodied Musical Interaction

    Music is a natural partner to human-computer interaction, offering tasks and use cases for novel forms of interaction. The richness of the relationship between a performer and their instrument in expressive musical performance can provide valuable insight to human-computer interaction (HCI) researchers interested in applying these forms of deep interaction to other fields. Despite the longstanding connection between music and HCI, it is not an automatic one, and its history arguably points to as many differences as it does overlaps. Music research and HCI research both encompass broad issues and utilize a wide range of methods. In this chapter I discuss how the concept of embodied interaction can be one way to think about music interaction. I propose how the three “paradigms” of HCI and three design accounts from the interaction design literature can serve as a lens through which to consider types of music HCI. I use this conceptual framework to discuss three different musical projects: Haptic Wave, Form Follows Sound, and BioMuse.