57 research outputs found

    Resurrecting the Tromba Marina: A Bowed Virtual Reality Instrument using Haptic Feedback and Accurate Physical Modelling

    This paper proposes a multisensory simulation of a tromba marina, a bowed string instrument, in virtual reality. The auditory feedback is generated by an accurate physical model, the haptic feedback is provided by the PHANTOM Omni, and the visual feedback is rendered through an Oculus Rift CV1 head-mounted display (HMD). Moreover, a user study is presented that explores the experience of interacting with a virtual bowed string instrument and evaluates the playability of the system. The study comprises both qualitative (observations, think-aloud and interviews) and quantitative (survey) data collection methods. The results indicate that the implementation was successful, offering participants realistic feedback and a satisfactory multisensory experience, allowing them to use the system as a musical instrument.

    Bowing virtual strings with realistic haptic feedback

    We present a music interface implementing a bowed string. The bow is realised using a commercially available haptic device consisting of a stylus attached to a robotic arm. While playing the virtual strings with the stylus, which reproduces the bow, users feel both the elastic force from the strings and the friction resulting from the interaction with their surfaces. The audio-haptic feedback is obtained from a physical model: four stiff strings are simulated using a finite difference time domain method, modelled as 1-D elements in the virtual 3-D space. The bow is modelled simply as a rigid cylinder that can move freely in this space and interact with the strings. Finally, the frictional interaction between these elements is modelled by a nonlinear friction model capable of reproducing the characteristic stick-slip phenomenon observed during string bowing. Moreover, the model can be dynamically controlled in one parameter so as to become more sticky or slippery. By turning the frictional feedback on and off, users can appreciate the significance of this interaction. A real-time visualisation of the bowed strings complements the audio-haptic display.
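    The stick-slip behaviour described above comes from a friction curve that peaks near zero relative velocity and falls off as the bow slides. As a hedged illustration (not necessarily the authors' exact model), the smooth friction characteristic common in the bowed-string FDTD literature can be sketched as follows; the stickiness parameter `a` is the kind of single control the abstract mentions:

```python
import numpy as np

def bow_friction(v_rel, a=100.0):
    """Smooth bow-friction characteristic often used in FDTD bowed-string
    models: phi(v) = sqrt(2a) * v * exp(-a*v^2 + 1/2).
    Larger `a` gives a sharper peak near v_rel = 0 ("stickier");
    smaller `a` flattens the curve ("slipperier")."""
    v_rel = np.asarray(v_rel, dtype=float)
    return np.sqrt(2.0 * a) * v_rel * np.exp(-a * v_rel**2 + 0.5)

# The curve is odd and peaks at |v_rel| = 1/sqrt(2a) with magnitude 1,
# so the friction force drops off once the string slips past the bow.
```

    In a full simulation this force couples nonlinearly into the string's finite-difference update at the bowing point, which is what produces the characteristic stick-slip motion.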

    A MODELLER-SIMULATOR FOR INSTRUMENTAL PLAYING OF VIRTUAL MUSICAL INSTRUMENTS

    This paper presents a musician-oriented modelling and simulation environment for designing physically modelled virtual instruments and interacting with them via a high-performance haptic device. In particular, our system restores the physical coupling between the user and the manipulated virtual instrument, a key factor for expressive playing of traditional acoustical instruments that is absent from the vast majority of computer-based musical systems. We first analyse the various uses of haptic devices in Computer Music and introduce the technologies involved in our system. We then present the modeller and simulation environments, and examples of musical virtual instruments created with this new environment.

    INTERACTIVE PHYSICAL DESIGN AND HAPTIC PLAYING OF VIRTUAL MUSICAL INSTRUMENTS

    In Computer Music, a common approach in many Digital Musical Instruments is to separate the gestural input stage from the sound synthesis stage. While these instruments offer many creative possibilities, they mark a strong rupture with traditional acoustic instruments, as the physical coupling between human and sound is broken. This coupling plays a crucial role in the expressive musical playing of acoustic instruments; we believe restoring it in a digital context is of equal importance for revealing the full expressive potential of digital instruments. This paper first presents haptic and physical modelling technologies for representing the mechano-acoustical instrumental situation in the context of DMIs. From these technologies, a prototype environment has been implemented for both designing virtual musical instruments and interacting with them via a force-feedback device, able to preserve the energetic coherency of the musician-sound chain.

    Rubbing a Physics Based Synthesis Model: From Mouse Control to Frictional Haptic Feedback

    This paper investigates three kinds of interactions for a friction-based virtual musical instrument. The sound synthesis model consists of a bank of mass-spring-dampers excited via rubbing. A nonlinear static friction model capable of reproducing the characteristic stick-slip phenomenon observed in frictional interaction is employed, allowing for dynamic variation of the sliding friction. The different controls developed allow for gradually increasing the interplay between performer and instrument. The key excitation parameters, namely the rubbing velocity and the rubbing normal force, are controlled using three different interfaces: a standard mouse, a Sensel Morph, and a 3D Systems Touch X. The Sensel Morph is a touchpad with pressure sensitivity, allowing for a natural exertion of the normal force; the 3D Systems Touch X is a haptic device that renders both resistance to the applied normal force and the stick-slip motion resulting from the friction interaction. A preliminary user study aiming to compare the experience of performing with the different interfaces was carried out. The results indicate that the haptic feedback provides a more intuitive and enjoyable experience. However, extra features do not necessarily improve the user interaction, as the results suggest a preference for the mouse over the Sensel.
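    A bank of mass-spring-dampers such as the one described can be sketched as a sum of two-pole resonant filters, one per mode, driven by an excitation signal. This is a minimal illustrative implementation, not the paper's code; the frequencies, decay times, and excitation are placeholders:

```python
import numpy as np

def resonator_bank(freqs, decays, fs, excitation):
    """Simulate a bank of mass-spring-damper resonators as two-pole filters.
    freqs: modal frequencies (Hz); decays: T60 decay times (s);
    fs: sample rate (Hz); excitation: force signal (e.g. from rubbing)."""
    out = np.zeros(len(excitation))
    for f, t60 in zip(freqs, decays):
        R = np.exp(-6.91 / (t60 * fs))       # pole radius from decay time
        theta = 2.0 * np.pi * f / fs         # pole angle from mode frequency
        y1 = y2 = 0.0
        y = np.empty(len(excitation))
        for n, x in enumerate(excitation):
            y0 = 2.0 * R * np.cos(theta) * y1 - R * R * y2 + x
            y[n] = y0
            y2, y1 = y1, y0
        out += y                             # modal outputs sum at the output
    return out

# Toy use: strike the bank with an impulse and let the modes ring out.
exc = np.zeros(1000)
exc[0] = 1.0
out = resonator_bank([440.0, 660.0], [0.05, 0.03], 44100, exc)
```

    In the rubbing case the impulse would be replaced by a continuous force computed from the friction model, closing the loop between excitation and resonator state.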

    Multisensory instrumental dynamics as an emergent paradigm for digital musical creation

    The nature of human/instrument interaction is a long-standing area of study, drawing interest from fields as diverse as philosophy, cognitive sciences, anthropology, human-computer interaction, and artistic creation. In particular, the case of the interaction between performer and musical instrument provides an enticing framework for studying the instrumental dynamics that allow for embodiment, skill acquisition and virtuosity with (electro-)acoustical instruments, and for questioning how such notions may be transferred into the realm of digital music technologies and virtual instruments. This paper offers a study of concepts and technologies allowing for instrumental dynamics with Digital Musical Instruments, through an analysis of haptic-audio creation centred on (a) theoretical and conceptual frameworks, (b) technological components, namely physical modelling techniques for the design of virtual mechanical systems and force-feedback technologies allowing mechanical coupling with them, and (c) a corpus of artistic works based on this approach. Through this retrospective, we argue that artistic works created in this field over the last 20 years, and those yet to come, may be of significant importance to the haptics community as new objects that question physicality, tangibility, and creativity from a fresh and rather singular angle. We then discuss the convergence of efforts in this field, the challenges still ahead, and the possible emergence of a new transdisciplinary community focused on multisensory digital art forms.

    Physical modelling meets machine learning: performing music with a virtual string ensemble

    This dissertation describes a new method of computer performance of bowed string instruments (violin, viola, cello) using physical simulations and intelligent feedback control. Computer synthesis of music performed by bowed string instruments is a challenging problem. Unlike instruments whose notes originate with a single discrete excitation (e.g., piano, guitar, drum), bowed string instruments are controlled with a continuous stream of excitations (i.e., the bow scraping against the string). Most existing synthesis methods utilize recorded audio samples, which perform quite well for single-excitation instruments but not for continuous-excitation instruments. This work improves the realism of synthesis of violin, viola, and cello sound by generating audio through modelling the physical behaviour of the instruments. A string's wave equation is decomposed into 40 modes of vibration, which can be acted upon by three forms of external force: a bow scraping against the string, a left-hand finger pressing down, and/or a right-hand finger plucking. The vibration of each string exerts force against the instrument bridge; these forces are summed and convolved with the instrument body impulse response to create the final audio output. In addition, right-hand haptic output is created from the force of the bow against the string. Physical constants from ten real instruments (five violins, two violas, and three cellos) were measured and used in these simulations. The physical modelling was implemented in a high-performance library capable of simulating audio on a desktop computer one hundred times faster than real-time. The program also generates animated video of the instruments being performed. To perform music with the physical models, a virtual musician interprets the musical score and generates actions which are then fed into the physical model. The resulting audio and haptic signals are examined with a support vector machine, which adjusts the bow force in order to establish and maintain a good timbre. This intelligent feedback control is trained with human input, but after the initial training is completed the virtual musician performs autonomously. A PID controller is used to adjust the position of the left-hand finger to correct any flaws in the pitch. Some performance parameters (initial bow force, force correction, and lifting factors) require an initial value for each string and musical dynamic; these are calibrated automatically using the previously trained support vector machines. The timbre judgements are retained after each performance and are used to pre-emptively adjust bowing parameters to avoid or mitigate problematic timbre in future performances of the same music. The system is capable of playing sheet music with approximately the same ability level as a human music student after two years of training. Due to the number of instruments measured and the generality of the machine learning, music can be performed with ensembles of up to ten stringed instruments, each with a distinct timbre. This provides a baseline for future work in computer control and expressive music performance of virtual bowed string instruments.
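    The left-hand pitch correction described above is standard PID control. The sketch below is a generic PID regulator driving a toy first-order plant toward a setpoint; the gains and the plant are illustrative stand-ins, not values from the dissertation:

```python
class PID:
    """Textbook PID controller: u = kp*e + ki*integral(e) + kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy use: steer a first-order plant (stand-in for finger position) to a
# target (stand-in for the desired pitch).
target, x, dt = 1.0, 0.0, 0.01
pid = PID(kp=2.0, ki=1.0, kd=0.0, dt=dt)
for _ in range(2000):
    u = pid.step(target - x)
    x += dt * (u - x)          # simple first-order plant response
```

    The integral term removes steady-state pitch error, which is why a pure proportional correction would leave the finger slightly flat or sharp.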

    Re-Sonification of Objects, Events, and Environments

    Digital sound synthesis allows the creation of a great variety of sounds. Focusing on interesting or ecologically valid sounds for music, simulation, aesthetics, or other purposes limits the otherwise vast digital audio palette. Tools for creating such sounds vary from arbitrary methods of altering recordings to precise simulations of vibrating objects. In this work, methods of sound synthesis by re-sonification are considered. Re-sonification, herein, refers to the general process of analyzing, possibly transforming, and resynthesizing or reusing recorded sounds in meaningful ways, to convey information. Applied to soundscapes, re-sonification is presented as a means of conveying activity within an environment. Applied to the sounds of objects, this work examines modeling the perception of objects as well as their physical properties and the ability to simulate interactive events with such objects. To create soundscapes to re-sonify geographic environments, a method of automated soundscape design is presented. Using recorded sounds that are classified based on acoustic, social, semantic, and geographic information, this method produces stochastically generated soundscapes to re-sonify selected geographic areas. Drawing on prior knowledge, local sounds and those deemed similar comprise a locale's soundscape. In the context of re-sonifying events, this work examines processes for modeling and estimating the excitations of sounding objects. These include plucking, striking, rubbing, and any interaction that imparts energy into a system, affecting the resultant sound. A method of estimating a linear system's input, constrained to a signal subspace, is presented and applied toward improving the estimation of percussive excitations for re-sonification. To work toward robust recording-based modeling and re-sonification of objects, new implementations of banded waveguide (BWG) models are proposed for object modeling and sound synthesis. Previous implementations of BWGs use arbitrary model parameters and may produce a range of simulations that do not match digital waveguide or modal models of the same design. Subject to linear excitations, some models proposed here behave identically to other equivalently designed physical models. Under nonlinear interactions, such as bowing, many of the proposed implementations exhibit improvements in the attack characteristics of synthesized sounds.
    Dissertation/Thesis. Ph.D. Electrical Engineering 201
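    The signal-subspace-constrained input estimation mentioned above can be sketched as ordinary least squares after restricting the unknown input to a basis: with y ≈ H u and u = B c, solve for the coordinates c and reconstruct u. This is a generic linear-algebra illustration under assumed shapes, not the dissertation's method in detail:

```python
import numpy as np

def subspace_input_estimate(H, B, y):
    """Estimate the input u of a linear system y = H @ u, constrained to the
    column space of a basis B (u = B @ c). Solves least squares for c."""
    c, *_ = np.linalg.lstsq(H @ B, y, rcond=None)
    return B @ c

# Toy check: a true input lying in the subspace is recovered from the output.
rng = np.random.default_rng(0)
H = rng.standard_normal((100, 50))    # system response matrix (assumed known)
B = rng.standard_normal((50, 5))      # signal-subspace basis
u_true = B @ rng.standard_normal(5)   # true excitation inside the subspace
u_hat = subspace_input_estimate(H, B, H @ u_true)
```

    Constraining the estimate to a low-dimensional subspace regularizes the inverse problem, which is why it can improve excitation estimates for noisy percussive recordings.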