
    Haptic Experience and the Design of Drawing Interfaces

    Haptic feedback has the potential to enhance users’ sense of being engaged and creative in their artwork. Current work on providing haptic feedback in computer-based drawing applications has focused mainly on the realism of the haptic sensation rather than the users’ experience of that sensation in the context of their creative work. We present a study that focuses on user experience of three haptic drawing interfaces. These interfaces were based on two different haptic metaphors, one of which mimicked familiar drawing tools (such as pen, pencil or crayon on smooth or rough paper) and the other of which drew on abstract descriptors of haptic experience (roughness, stickiness, scratchiness and smoothness). It was found that users valued having control over the haptic sensation; that each metaphor was preferred by approximately half of the participants; and that the real-world metaphor interface was considered more helpful than the abstract one, whereas the abstract interface was considered to better support creativity. This suggests that future interfaces for artistic work should have user-modifiable interaction styles for controlling the haptic sensation.

    Haptic Stylus and Empirical Studies on Braille, Button, and Texture Display

    This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module, together with empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance, indicating that it can satisfactorily represent Braille numbers for both sighted and blind users. To demonstrate the haptic feedback capability of the stylus, an experiment was conducted in which impact feedback mimicked the click of a button. Since the developed device is small enough to be attached to a force-feedback device, its applicability to combined force and tactile feedback display in a pen-held haptic device is also investigated: the handle of a pen-held haptic interface was replaced by the pen-like interface to add tactile feedback capability to the device. Since the system provides a combination of force, tactile, and impact feedback, three haptic representation methods for texture display were compared on surfaces with three texture groups differing in direction, groove width, and shape. In addition, we evaluate the device's capacity to support touch-screen operations by providing tactile sensations when a user rubs against an image displayed on a monitor.

    RealPen: Providing Realism in Handwriting Tasks on Touch Surfaces using Auditory-Tactile Feedback

    We present RealPen, an augmented stylus for capacitive tablet screens that recreates the physical sensation of writing on paper with a pencil, ball-point pen or marker pen. The aim is to create a more engaging experience when writing on touch surfaces, such as the screens of tablet computers. This is achieved by regenerating the friction-induced oscillation and sound of a real writing tool in contact with paper. To generate realistic tactile feedback, our algorithm analyzes the frequency spectrum of the friction oscillation generated when writing with traditional tools, extracts the principal frequencies, and uses the actuator's frequency response profile as an adjustment weighting function. We enhance the realism by providing sound feedback aligned with the writing pressure and speed. Furthermore, we investigated the effects of superposition and fluctuation of several frequencies on human tactile perception, evaluated the performance of RealPen, and characterized users' perception and preference for each feedback type.
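
    The feedback-generation pipeline described above can be read as a short signal-processing recipe: record the friction oscillation of a real pen on paper, pick out its principal spectral components, weight them against the actuator's frequency response, and resynthesize a drive signal scaled by writing pressure. The sketch below (Python/NumPy) is illustrative only; the function names, the number of peaks, and the callable actuator_response gain profile are assumptions, not details taken from the paper.

        import numpy as np

        def principal_frequencies(friction_signal, fs, n_peaks=4):
            """Pick the strongest spectral components of a recorded friction signal."""
            windowed = friction_signal * np.hanning(len(friction_signal))
            spectrum = np.abs(np.fft.rfft(windowed))
            freqs = np.fft.rfftfreq(len(friction_signal), d=1.0 / fs)
            peaks = np.argsort(spectrum)[-n_peaks:]            # indices of the largest bins
            return freqs[peaks], spectrum[peaks] / spectrum.max()

        def synthesize_drive(freqs, amps, actuator_response, duration, fs, pressure=1.0):
            """Sum weighted sinusoids, compensating for the actuator's frequency response."""
            t = np.arange(int(duration * fs)) / fs
            drive = np.zeros_like(t)
            for f, a in zip(freqs, amps):
                weight = a * pressure / max(actuator_response(f), 1e-6)  # adjustment weighting
                drive += weight * np.sin(2 * np.pi * f * t)
            return drive / np.max(np.abs(drive))               # normalize to actuator range

    In practice the weighting would be tuned per actuator; here it is reduced to a simple division by the response gain for clarity.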

    Tactile Displays with Parallel Mechanism


    Electrostatic Friction Displays to Enhance Touchscreen Experience

    Touchscreens are versatile devices that can display visual content and receive touch input, but they lack the ability to provide programmable tactile feedback. This limitation has been addressed by a few approaches generally called surface haptics technology. This technology modulates the friction between a user’s fingertip and a touchscreen surface to create different tactile sensations as the finger explores the touchscreen. This functionality enables the user to see and feel digital content simultaneously, leading to improved usability and user experiences. One major approach in surface haptics relies on the electrostatic force induced between the finger and an insulating surface on the touchscreen by supplying a high AC voltage. The use of AC voltage also induces a vibration-like sensation in the user, called electrovibration. Electrostatic friction displays require only electrical components and provide uniform friction over the screen. This tactile feedback technology not only allows easy and lightweight integration into touchscreen devices but also enables dynamic, rich, and satisfying user interfaces. In this chapter, we review the fundamental operation of electrovibration technology as well as applications that have been built upon it.
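
    A common way to reason about this effect is the parallel-plate approximation: the applied voltage induces an attractive force between the finger and the insulated screen, which adds to the normal load and so scales the sliding friction; because the force grows with the square of the voltage, an AC drive produces an oscillating friction that is felt as electrovibration. The sketch below uses this textbook approximation with illustrative parameter values; it is not taken from the chapter.

        import numpy as np

        EPS0 = 8.854e-12  # permittivity of free space, F/m

        def electrostatic_force(voltage, eps_r=3.0, area=1e-4, gap=1e-6):
            """Parallel-plate estimate of the finger-surface attraction force (N)."""
            return EPS0 * eps_r * area * voltage**2 / (2 * gap**2)

        def friction_force(normal_force, voltage, mu=0.5):
            """Coulomb friction with the electrostatic force added to the normal load."""
            return mu * (normal_force + electrostatic_force(voltage))

        # Because the force depends on voltage squared, a sinusoidal drive at f Hz
        # modulates friction at 2f Hz, which the sliding finger feels as electrovibration.
        t = np.linspace(0.0, 0.01, 1000)
        v = 200.0 * np.sin(2 * np.pi * 250.0 * t)   # illustrative 250 Hz, 200 V amplitude
        f_t = friction_force(normal_force=0.5, voltage=v)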

    Impact of haptic 'touching' technology on cultural applications


    Tactile Arrays for Virtual Textures

    This thesis describes the development of three new tactile stimulators for active touch, i.e. devices to deliver virtual touch stimuli to the fingertip in response to exploratory movements by the user. All three stimulators are designed to provide spatiotemporal patterns of mechanical input to the skin via an array of contactors, each under individual computer control. Drive mechanisms are based on piezoelectric bimorphs in a cantilever geometry. The first of these is a 25-contactor array (5 × 5 contactors at 2 mm spacing). It is a rugged design with a compact drive system and is capable of producing strong stimuli when running from low-voltage supplies. Combined with a PC mouse, it can be used for active exploration tasks. Pilot studies were performed which demonstrated that subjects could successfully use the device for discrimination of line orientation, simple shape identification and line following tasks. A 24-contactor stimulator (6 × 4 contactors at 2 mm spacing) with improved bandwidth was then developed. This features control electronics designed to transmit arbitrary waveforms to each channel (generated on-the-fly, in real time) and software for rapid development of experiments. It is built around a graphics tablet, giving high-precision position capability over a large 2D workspace. Experiments using two-component stimuli (components at 40 Hz and 320 Hz) indicate that spectral balance within active stimuli is discriminable independent of overall intensity, and that the spatial variation (texture) within the target is easier to detect at 320 Hz than at 40 Hz. The third system developed (again 6 × 4 contactors at 2 mm spacing) was a lightweight modular stimulator developed for fingertip and thumb grasping tasks; furthermore, it was integrated with force feedback on each digit and a complex graphical display, forming a multi-modal Virtual Reality device for the display of virtual textiles. It is capable of broadband stimulation with real-time generated outputs derived from a physical model of the fabric surface. In an evaluation study, virtual textiles generated from physical measurements of real textiles were ranked in categories reflecting key mechanical and textural properties. The results were compared with a similar study performed on the real fabrics from which the virtual textiles had been derived. There was good agreement between the ratings of the virtual textiles and the real textiles, indicating that the virtual textiles are a good representation of the real ones and that the system delivers appropriate cues to the user.
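
    To make the two-component stimuli concrete, the sketch below shows one way a 6 × 4 contactor array might be driven: a 40 Hz and a 320 Hz component are mixed according to a spectral-balance parameter, and each contactor's amplitude is set by sampling a texture function at the pin's position under the finger. The function names, sampling rate and texture interface are assumptions for illustration, not the thesis's implementation.

        import numpy as np

        def contactor_waveforms(texture, x_mm, y_mm, balance, fs=4000, duration=0.05,
                                rows=6, cols=4, pitch_mm=2.0):
            """Two-component (40 Hz + 320 Hz) drive signals for a rows x cols pin array.

            texture(x, y) -> local amplitude in [0, 1]; balance in [0, 1] sets the
            relative weight of the 320 Hz component against the 40 Hz component.
            """
            t = np.arange(int(duration * fs)) / fs
            carrier = ((1.0 - balance) * np.sin(2 * np.pi * 40.0 * t)
                       + balance * np.sin(2 * np.pi * 320.0 * t))
            drives = np.zeros((rows, cols, t.size))
            for r in range(rows):
                for c in range(cols):
                    amp = texture(x_mm + c * pitch_mm, y_mm + r * pitch_mm)  # under each pin
                    drives[r, c] = amp * carrier
            return drives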

    Crossmodal audio and tactile interaction with mobile touchscreens

    Touchscreen mobile devices often use cut-down versions of desktop user interfaces, placing high demands on the visual sense, which may prove awkward in mobile settings. The research in this thesis addresses the problems encountered by situationally impaired mobile users by using crossmodal interaction to exploit the abundant similarities between the audio and tactile modalities. By making information available to both senses, users can receive the information in the most suitable way, without having to abandon their primary task to look at the device. This thesis begins with a literature review of related work followed by a definition of crossmodal icons. Two icons may be considered to be crossmodal if and only if they provide a common representation of data, which is accessible interchangeably via different modalities. Two experiments investigated possible parameters for use in crossmodal icons, with results showing that rhythm, texture and spatial location are effective. A third experiment focused on learning multi-dimensional crossmodal icons and the extent to which this learning transfers between modalities. The results showed identification rates of 92% for three-dimensional audio crossmodal icons when trained in the tactile equivalents, and identification rates of 89% for tactile crossmodal icons when trained in the audio equivalents. Crossmodal icons were then incorporated into a mobile touchscreen QWERTY keyboard. Experiments showed that keyboards with audio or tactile feedback produce fewer errors and greater speeds of text entry compared to standard touchscreen keyboards. The next study examined how environmental variables affect user performance with the same keyboard. The data showed that each modality performs differently under varying levels of background noise or vibration, and the exact levels at which these performance decreases occur were established. The final study involved a longitudinal evaluation of a touchscreen application, CrossTrainer, focusing on longitudinal effects on performance with audio and tactile feedback, the impact of context on performance and personal modality preference. The results show that crossmodal audio and tactile icons are a valid method of presenting information to situationally impaired mobile touchscreen users, with recognition rates of 100% over time. This thesis concludes with a set of guidelines on the design and application of crossmodal audio and tactile feedback to enable application and interface designers to employ such feedback in all systems.
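
    The defining property of a crossmodal icon, that the same parameters (rhythm, texture, spatial location) can be rendered interchangeably as audio or as vibration, can be captured in a small data structure. The sketch below is a hypothetical illustration; the class and its rendering targets are not taken from the thesis.

        from dataclasses import dataclass

        @dataclass
        class CrossmodalIcon:
            """One icon definition rendered to either modality from the same parameters."""
            rhythm: tuple      # pulse/note durations in seconds, e.g. (0.1, 0.1, 0.3)
            texture: str       # e.g. 'smooth' or 'rough' (timbre / vibration waveform)
            location: str      # e.g. 'left', 'centre' or 'right'

            def to_audio(self):
                # Rhythm becomes note onsets, texture becomes timbre, location becomes pan.
                return {"onsets": self.rhythm, "timbre": self.texture, "pan": self.location}

            def to_tactile(self):
                # The same parameters become vibration pulses, waveform roughness and
                # the position of the actuator that plays them.
                return {"pulses": self.rhythm, "waveform": self.texture,
                        "actuator": self.location}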