Multisensory texture exploration at the tip of the pen
A tool for the multisensory stylus-based exploration of virtual textures was used to investigate how different feedback modalities (static or dynamically deformed images, vibration, sound) affect exploratory gestures. To this end, we ran an experiment in which participants steered the stylus through a curved corridor on the surface of a graphic tablet/display, and we measured steering time, dispersion of trajectories, and applied force. Despite the variety of subjective impressions elicited by the different feedback conditions, we found that only non-visual feedback induced significant variations in trajectories and an increase in movement time. In a follow-up experiment using a paper-and-wood physical realization of the same texture, we recorded a variety of gestural behaviors markedly different from those found with the virtual texture. With the physical setup, movement time was shorter and texture-dependent lateral accelerations could be observed. This work highlights the limits of multisensory pseudo-haptic techniques in the exploration of surface textures.
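The trial metrics named above (movement time, trajectory dispersion) could be computed roughly as follows. This is a minimal sketch, not the authors' implementation: the array layout and the assumption that each stylus sample is already matched to its nearest point on the corridor's center line are illustrative choices.

```python
import numpy as np

def steering_metrics(t, xy, path_center):
    """Compute movement time and trajectory dispersion for one steering trial.

    t           : (N,) sample timestamps in seconds
    xy          : (N, 2) stylus positions on the tablet
    path_center : (N, 2) nearest center-line point for each sample
                  (pre-matching to the center line is an assumption here)
    """
    movement_time = t[-1] - t[0]
    # lateral deviation of each sample from the corridor's center line
    deviation = np.linalg.norm(xy - path_center, axis=1)
    dispersion = deviation.std()  # spread of the trajectory around the path
    return movement_time, dispersion

# toy trial: 1 s of samples oscillating slightly around a straight center line
t = np.linspace(0.0, 1.0, 100)
xy = np.stack([t, 0.01 * np.sin(2 * np.pi * 3 * t)], axis=1)
center = np.stack([t, np.zeros_like(t)], axis=1)
mt, disp = steering_metrics(t, xy, center)
```

A per-condition comparison of such metrics is what distinguishes the feedback modalities in the experiment described above.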
To "Sketch-a-Scratch"
A surface can be harsh and raspy, or smooth and silky, and everything in between. We are used to sensing these features with our fingertips as well as with our eyes and ears: the exploration of a surface is a multisensory experience. Tools, too, are often employed in the interaction with surfaces, since they augment our manipulation capabilities. "Sketch-a-Scratch" is a tool for the multisensory exploration and sketching of surface textures. The user's actions drive a physical sound model of real materials' response to interactions such as scraping, rubbing or rolling. Moreover, different input signals can be converted into 2D visual surface profiles, thus enabling users to experience them visually, aurally and haptically.
Multisensory Adaptations: Creating Art with Students Who Are Blind and Low Vision
The main purpose of this article is to present approaches and strategies for making 2-D visual arts instruction meaningful and accessible for students who are blind or have low vision. The suggestions provided within this article are based on current literature, researcher observations, and the contributions of an experienced, practicing art teacher at the Indiana School for the Blind and Visually Impaired.
Pictures in Your Mind: Using Interactive Gesture-Controlled Reliefs to Explore Art
Tactile reliefs offer many benefits over the more classic raised line drawings or tactile diagrams, as depth, 3D shape, and surface textures are directly perceivable. Although often created for blind and visually impaired (BVI) people, a wider range of people may benefit from such multimodal material. However, some reliefs are still difficult to understand without proper guidance or accompanying verbal descriptions, hindering autonomous exploration.
In this work, we present a gesture-controlled interactive audio guide (IAG) based on recent low-cost depth cameras that can be operated directly with the hands on relief surfaces during tactile exploration. The interactively explorable, location-dependent verbal and captioned descriptions promise rapid tactile accessibility to 2.5D spatial information in a home or education setting, to online resources, or as a kiosk installation at public places.
We present a working prototype, discuss design decisions, and present the results of two evaluation studies: the first with 13 BVI test users, and the second, a follow-up study, with 14 test users across a wide range of people with differences and difficulties associated with perception, memory, cognition, and communication. The participant-led research method of this latter study prompted new, significant, and innovative developments.
Intuitive Control of Scraping and Rubbing Through Audio-tactile Synthesis
Intuitive control of synthesis processes is an ongoing challenge within the domain of auditory perception and cognition. Previous work on sound modelling combined with psychophysical tests has enabled our team to develop a synthesizer that provides intuitive control of actions and objects based on semantic descriptions of sound sources. In this demo we present an augmented version of the synthesizer in which we added tactile stimulation to increase the sensation of true continuous friction interactions (rubbing and scratching) with the simulated objects. This is of interest for several reasons. Firstly, it enables us to evaluate the realism of our sound model in the presence of stimulation from other modalities. Secondly, it enables us to compare tactile and auditory signal structures linked to the same evocation, and thirdly, it provides a tool to investigate multimodal perception and how stimulation from different modalities should be combined to provide realistic user interfaces.
A Real-time Synthesizer of Naturalistic Congruent Audio-Haptic Textures
This demo paper presents a multimodal device able to generate real-time audio-haptic signals in response to the user's motion and produce naturalistic sensations. The device consists of a touch screen with haptic feedback based on ultrasonic friction modulation and a sound synthesizer. The device will help investigate audio-haptic interaction. In particular, the system is built to allow for an exploration of different strategies for mapping audio and haptic signals, to explore the limits of congruence. Such interactions could be the key to more informative and user-friendly touchscreens for human-machine interfaces.
RealPen: Providing Realism in Handwriting Tasks on Touch Surfaces using Auditory-Tactile Feedback
We present RealPen, an augmented stylus for capacitive tablet screens that recreates the physical sensation of writing on paper with a pencil, ball-point pen or marker pen. The aim is to create a more engaging experience when writing on touch surfaces, such as the screens of tablet computers. This is achieved by regenerating the friction-induced oscillation and sound of a real writing tool in contact with paper. To generate realistic tactile feedback, our algorithm analyzes the frequency spectrum of the friction oscillation generated when writing with traditional tools, extracts principal frequencies, and uses the actuator's frequency response profile as an adjustment weighting function. We enhance the realism by providing sound feedback aligned with the writing pressure and speed. Furthermore, we investigated the effects of superposition and fluctuation of several frequencies on human tactile perception, evaluated the performance of RealPen, and characterized users' perception and preference for each feedback type.
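The spectral pipeline described in the abstract above (analyze the friction oscillation, extract principal frequencies, weight them by the actuator's frequency response) could be sketched as follows. This is a simplified assumption-laden sketch, not RealPen's actual algorithm: the peak-picking rule and the inverse-response weighting are illustrative choices.

```python
import numpy as np

def tactile_drive_spectrum(friction_signal, fs, actuator_response, n_peaks=5):
    """Sketch: pick the strongest spectral components of a recorded friction
    oscillation and weight them by the actuator's frequency response.

    friction_signal  : (N,) recorded friction oscillation
    fs               : sampling rate in Hz
    actuator_response: callable mapping frequencies (Hz) to the actuator's
                       magnitude response (an assumed interface)
    """
    spectrum = np.abs(np.fft.rfft(friction_signal))
    freqs = np.fft.rfftfreq(len(friction_signal), d=1.0 / fs)
    # "principal frequencies" taken as the strongest bins (a simplification)
    idx = np.argsort(spectrum)[-n_peaks:]
    # adjustment weighting: boost components the actuator reproduces weakly
    weights = 1.0 / np.maximum(actuator_response(freqs[idx]), 1e-6)
    return freqs[idx], spectrum[idx] * weights

# toy example: a 200 Hz friction oscillation and a flat actuator response
fs = 8000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 200.0 * t)
flat = lambda f: np.ones_like(f)
peaks, amps = tactile_drive_spectrum(signal, fs, flat, n_peaks=1)
```

In a real driver, the weighted components would then be resynthesized and sent to the stylus actuator, scaled by writing pressure and speed as the abstract describes.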
- …