5 research outputs found

    Multisensory plucked instrument modeling in Unity3D: From Keytar to accurate string prototyping

    Keytar is a plucked guitar simulation mockup developed with Unity3D that provides auditory, visual, and haptic feedback to the player through a Phantom Omni robotic arm. Starting from a description of the implementation of the virtual instrument, we discuss our ongoing work. The ultimate goal is the creation of a set of software tools available for developing plucked instruments in Unity3D. Using such tools, sonic interaction designers can efficiently simulate plucked string prototypes and realize multisensory interactions with virtual instruments for unprecedented purposes, such as testing innovative plucked string interfaces or training machine learning algorithms with data about the dynamics of the performance, which are immediately accessible from the machine.

    No strings attached: Force and vibrotactile feedback in a virtual guitar simulation

    The poster describes a multisensory simulation of plucking guitar strings in virtual reality and a user study evaluating the simulation. Auditory feedback is generated by a physics-based simulation of guitar strings, and haptic feedback is provided by a combination of high-fidelity vibrotactile actuators and a Phantom Omni. The study compared four conditions: no haptic feedback, vibrotactile feedback, force feedback, and a combination of force and vibrotactile feedback. The results indicate that the combination of vibrotactile and force feedback elicits the most realistic experience, and during this condition, participants were less likely to inadvertently hit strings. Notably, no significant differences were found between the conditions involving either vibrotactile or force feedback.
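    The poster does not reproduce its synthesis code, but a physics-based plucked-string model of the kind it describes is commonly implemented with the Karplus-Strong algorithm. The sketch below is a minimal illustrative version; the sample rate, damping value, and function name are assumptions for the example, not parameters taken from the study.

    ```python
    import numpy as np

    def karplus_strong(freq_hz, duration_s, sample_rate=44100, damping=0.996):
        """Karplus-Strong plucked-string synthesis: a noise burst circulates
        through a delay line with an averaging lowpass in the feedback loop."""
        n_samples = int(duration_s * sample_rate)
        delay = int(sample_rate / freq_hz)       # delay-line length sets the pitch
        buf = np.random.uniform(-1, 1, delay)    # initial pluck: white-noise burst
        out = np.empty(n_samples)
        for i in range(n_samples):
            out[i] = buf[i % delay]
            # two-point average models energy loss at the string terminations
            buf[i % delay] = damping * 0.5 * (buf[i % delay] + buf[(i + 1) % delay])
        return out

    tone = karplus_strong(110.0, 1.0)  # roughly an open A string, one second
    ```

    The decaying, slightly inharmonic output is what makes this loop a popular starting point for interactive string prototypes, since the pluck excitation can be driven directly by controller or haptic-arm events.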

    Navigate as a bat. Real-time echolocation system in virtual reality

    No full text
    Several studies provide evidence that blind people orient themselves using echolocation, emitting signals with mouth clicks. Our previous study on embodiment in Virtual Reality (VR) showed that a Virtual Body Ownership (VBO) illusion can be enhanced over a body morphologically different from a human one in the presence of agency. In this paper, we explore real-time audio navigation with echolocation in a Virtual Environment (VE) in order to create the feeling of being a virtual bat. This includes imitation of the sonar system, which might help to achieve a stronger VBO illusion in the future, as well as to build an echolocation training simulator. Two pilot tests were conducted using a within-subject design, measuring time and traveled distance during spatial orientation in the VE. Both studies involved four conditions: early reflections, reverb, early reflections plus reverb (all with deprived visual cues), and finally vision. Test subjects with a musical background preferred the early-reflection pulses, while non-musicians favored reverberation alone when exposed to the walking-based task in the VE.
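    As an illustration of the early-reflections condition described above, reflections from nearby surfaces can be approximated by a sparse tapped delay line applied to a click-like impulse. The tap delays and gains below are hypothetical values chosen for the sketch, not the parameters used in the study.

    ```python
    import numpy as np

    def early_reflections(dry, sample_rate=44100,
                          taps_ms=(7.0, 11.0, 17.0, 23.0),
                          gains=(0.7, 0.5, 0.4, 0.3)):
        """Render early reflections as a sparse tapped delay line:
        each tap is one wall reflection arriving after a short delay."""
        max_delay = int(max(taps_ms) / 1000 * sample_rate)
        out = np.zeros(len(dry) + max_delay)
        out[:len(dry)] += dry                    # direct path (the mouth click)
        for t_ms, g in zip(taps_ms, gains):
            d = int(t_ms / 1000 * sample_rate)
            out[d:d + len(dry)] += g * dry       # delayed, attenuated copy
        return out

    # a short mouth-click-like impulse
    click = np.zeros(512)
    click[0] = 1.0
    wet = early_reflections(click)
    ```

    In a real-time system the tap delays and gains would be updated from the listener's position relative to the virtual walls, so that the timing of the echoes carries the spatial information the navigation task relies on.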