7 research outputs found

    Multisensory texture exploration at the tip of the pen

    A tool for the multisensory stylus-based exploration of virtual textures was used to investigate how different feedback modalities (static or dynamically deformed images, vibration, sound) affect exploratory gestures. To this end, we ran an experiment in which participants had to steer the stylus through a curved corridor on the surface of a graphic tablet/display, and we measured steering time, dispersion of trajectories, and applied force. Despite the variety of subjective impressions elicited by the different feedback conditions, we found that only nonvisual feedback induced significant variations in trajectories and an increase in movement time. In a post-experiment, using a paper-and-wood physical realization of the same texture, we recorded a variety of gestural behaviors markedly different from those found with the virtual texture. With the physical setup, movement time was shorter and texture-dependent lateral accelerations could be observed. This work highlights the limits of multisensory pseudo-haptic techniques in the exploration of surface textures.
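    The abstract does not include the analysis code; purely as a rough illustration, the sketch below (Python, assuming a log of time, x, y, force samples and a nearest-point definition of trajectory dispersion, neither of which is stated in the paper) shows how steering time, dispersion of trajectories, and mean applied force could be computed from logged stylus data.

```python
# Minimal sketch (not the authors' code). Assumed log format: rows of [t, x, y, force].
import numpy as np

def steering_metrics(samples, centerline):
    """samples: (N, 4) array of [t, x, y, force]; centerline: (M, 2) corridor centre line."""
    t, xy, force = samples[:, 0], samples[:, 1:3], samples[:, 3]
    steering_time = t[-1] - t[0]                     # total time spent steering the corridor

    # Distance from each stylus sample to the nearest centre-line point.
    d = np.linalg.norm(xy[:, None, :] - centerline[None, :, :], axis=2)
    deviation = d.min(axis=1)

    return {
        "steering_time": steering_time,
        "dispersion": deviation.std(),               # spread of the trajectory around the path
        "mean_force": force.mean(),                  # average applied pen pressure
    }
```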

    Intuitive Control of Scraping and Rubbing Through Audio-tactile Synthesis

    Intuitive control of synthesis processes is an ongoing challenge within the domain of auditory perception and cognition. Previous work on sound modelling combined with psychophysical tests has enabled our team to develop a synthesizer that provides intuitive control of actions and objects based on semantic descriptions of sound sources. In this demo we present an augmented version of the synthesizer in which we added tactile stimulation to increase the sensation of true continuous friction interactions (rubbing and scratching) with the simulated objects. This is of interest for several reasons. Firstly, it enables us to evaluate the realism of our sound model in the presence of stimulation from other modalities. Secondly, it enables us to compare tactile and auditory signal structures linked to the same evocation, and thirdly, it provides a tool to investigate multimodal perception and how stimulations from different modalities should be combined to provide realistic user interfaces.
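    The demo's synthesizer is not specified in detail here; as a hedged illustration only, a common way to produce rubbing/scratching sounds is band-pass-filtered noise scaled by the gesture speed. All parameter names and values below are assumptions, not the authors' model.

```python
# Illustrative friction-sound sketch: filtered noise modulated by rubbing speed.
import numpy as np
from scipy.signal import butter, lfilter

def friction_sound(speed, sr=44100, center_hz=1500.0, bandwidth_hz=800.0):
    """speed: per-sample rubbing speed (arbitrary units), shape (N,)."""
    noise = np.random.randn(len(speed))
    low = (center_hz - bandwidth_hz / 2) / (sr / 2)
    high = (center_hz + bandwidth_hz / 2) / (sr / 2)
    b, a = butter(2, [low, high], btype="band")       # resonant band for the friction noise
    excitation = lfilter(b, a, noise)
    return excitation * np.clip(speed, 0.0, None)     # louder when rubbing faster
```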

    A Real-time Synthesizer of Naturalistic Congruent Audio-Haptic Textures

    This demo paper presents a multi-modal device able to generate real-time audio-haptic signals in response to the user's motion and produce naturalistic sensations. The device consists of a touch screen with haptic feedback based on ultrasonic friction modulation and a sound synthesizer. The device will help investigate audio-haptic interaction. In particular, the system is built to allow different strategies for mapping audio and haptic signals to be explored, probing the limits of congruence. Such interactions could be the key to more informative and user-friendly touchscreens for Human-Machine Interfaces.
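    As a hedged sketch of what a congruent mapping strategy could look like (the device's actual mapping is not described here), the snippet below drives both the ultrasonic friction-modulation level and the audio gain from a single assumed texture profile, so the two channels stay aligned; the grating function and scaling constants are illustrative assumptions.

```python
# One possible congruent audio-haptic mapping: the same texture profile feeds both channels.
import numpy as np

def texture_profile(position_mm, spatial_period_mm=1.0):
    """Periodic grating sampled at the finger position (assumed texture model)."""
    return 0.5 * (1.0 + np.sin(2 * np.pi * position_mm / spatial_period_mm))

def audio_haptic_frame(position_mm, speed_mm_s):
    level = texture_profile(position_mm)
    haptic_friction = level                                # 0..1 friction-modulation level
    audio_gain = level * min(speed_mm_s / 100.0, 1.0)      # audio gain also scales with speed
    return haptic_friction, audio_gain
```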

    RealPen: Providing Realism in Handwriting Tasks on Touch Surfaces using Auditory-Tactile Feedback

    We present RealPen, an augmented stylus for capacitive tablet screens that recreates the physical sensation of writing on paper with a pencil, ball-point pen or marker pen. The aim is to create a more engaging experience when writing on touch surfaces, such as screens of tablet computers. This is achieved by regenerating the friction-induced oscillation and sound of a real writing tool in contact with paper. To generate realistic tactile feedback, our algorithm analyzes the frequency spectrum of the friction oscillation generated when writing with traditional tools, extracts principal frequencies, and uses the actuator's frequency response profile as an adjustment weighting function. We enhance the realism by providing sound feedback aligned with the writing pressure and speed. Furthermore, we investigated the effects of superposition and fluctuation of several frequencies on human tactile perception, evaluated the performance of RealPen, and characterized users' perception and preference of each feedback type.
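    The abstract outlines the tactile-feedback pipeline explicitly; the sketch below follows that outline (spectral analysis of the recorded friction oscillation, extraction of principal frequencies, weighting by the actuator's frequency response, pressure- and speed-scaled resynthesis), but the number of peaks, the weighting formula and the function names are assumptions rather than RealPen's actual implementation.

```python
# Hedged sketch of a RealPen-style pipeline (details are assumed, not taken from the paper).
import numpy as np

def principal_frequencies(recording, sr, n_peaks=5):
    """Extract the strongest spectral components of a recorded friction oscillation."""
    spectrum = np.abs(np.fft.rfft(recording))
    freqs = np.fft.rfftfreq(len(recording), 1.0 / sr)
    idx = np.argsort(spectrum)[-n_peaks:]                  # indices of the principal peaks
    return freqs[idx], spectrum[idx] / spectrum.max()

def synthesize_feedback(freqs, amps, actuator_response, pressure, speed, sr, dur=0.05):
    """Resynthesize the peaks, compensated by the actuator's frequency response."""
    t = np.arange(int(sr * dur)) / sr
    weights = amps / np.maximum(actuator_response(freqs), 1e-6)   # adjustment weighting
    signal = sum(w * np.sin(2 * np.pi * f * t) for f, w in zip(freqs, weights))
    return signal * pressure * speed                       # scale with writing pressure and speed
```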

    Tangibility and Richness in Digital Musical Instrument Design

    The sense of touch plays a fundamental role in musical performance: alongside hearing, it is the primary sensory modality used when interacting with musical instruments. Learning to play a musical instrument is one of the most developed haptic cultural practices, and within acoustic musical practice at large, the importance of touch and its close relationship to virtuosity and expression is well recognised. With digital musical instruments (DMIs) – instruments involving a combination of sensors and a digital sound engine – touch-mediated interaction remains the foremost means of control, but the interfaces of such instruments do not yet engage with the full spectrum of sensorimotor capabilities of a performer. This poses compelling questions for digital instrument design: how does the nuance and richness of physical interaction with an instrument manifest itself in the digital domain? Which design parameters are most important for haptic experience, and how do these parameters affect musical performance?

    Built around three practical studies which utilise DMIs as technology probes, this thesis addresses these questions from the point of view of design, of empirical musicology, and of tangible computing. In the first study, musicians played a DMI with continuous pitch control and vibrotactile feedback in order to understand how dynamic tactile feedback can be implemented and how it influences musician experience and performance. The results suggest that certain vibrotactile feedback conditions can increase musicians’ tuning accuracy, but also disrupt temporal performance. The second study examines the influence of asynchronies between audio and haptic feedback. Two groups of musicians, amateurs and professional percussionists, were tasked with performing on a percussive DMI with variable action-sound latency. Differences between the two groups in terms of temporal accuracy and quality judgements illustrate the complex effects of asynchronous multimodal feedback. In the third study, guitar-derivative DMIs with variable levels of control richness were observed with non-musicians and guitarists. The results from this study help clarify the relationship between tangible design factors, sensorimotor expertise and instrument behaviour.

    This thesis introduces a descriptive model of performer-instrument interaction, the projection model, which unites the design investigations from each study and provides a series of reflections and suggestions on the role of touch in DMI design.

    Doctoral Training Centre for Media and Arts Technology
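    The latency manipulation in the second study can be pictured with a very small sketch (the thesis' actual apparatus and latency values are not described here; the sample rate and zero-padding delay below are assumptions): the triggered percussion sound is simply delayed by a configurable number of milliseconds relative to the strike.

```python
# Illustrative action-sound latency simulation: delay the audio by latency_ms.
import numpy as np

def delay_audio(audio, latency_ms, sr=44100):
    """Prepend silence so the sound starts latency_ms after the strike."""
    pad = np.zeros(int(sr * latency_ms / 1000.0))
    return np.concatenate([pad, audio])

# Example (hypothetical): delayed = delay_audio(strike_sound, latency_ms=20)
```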