16 research outputs found

    Rendering of Pressure and Textures Using Wearable Haptics in Immersive VR Environments

    Haptic systems have only recently started to be designed with wearability in mind. Compact, unobtrusive, inexpensive, easy-to-wear, and lightweight haptic devices enable researchers to provide compelling touch sensations to multiple parts of the body, significantly increasing the applicability of haptics in many fields, such as robotics, rehabilitation, gaming, and immersive systems. In this respect, wearable haptics has great potential in the fields of virtual and augmented reality. Being able to touch virtual objects in a wearable and unobtrusive way may indeed open new and exciting avenues for the fields of haptics and VR. This work presents a novel wearable haptic system for immersive virtual reality experiences. It conveys the sensation of touching objects made of different materials, rendering pressure and texture stimuli through a moving platform and a vibrotactile motor. The device is composed of two platforms: one placed on the nail side of the finger and one in contact with the finger pad, connected by three cables. One small servomotor controls the length of the cables, moving the platform towards or away from the fingertip. One voice coil actuator, embedded in the platform, provides vibrotactile stimuli to the user.
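
    As a rough sketch of how the rendering pipeline described above might be driven (the constants, function names, and the spool model below are illustrative assumptions, not the authors' firmware), the pressure channel reduces to a cable-length command for the servomotor and the texture channel to a waveform for the voice coil:

    ```python
    import math

    # Illustrative device constants (assumptions, not published specifications).
    SPOOL_RADIUS_MM = 5.0         # radius of the spool the servo uses to wind the three cables
    MAX_PLATFORM_TRAVEL_MM = 4.0  # travel of the platform toward the finger pad
    SAMPLE_RATE_HZ = 1000.0       # update rate assumed for the voice coil drive

    def pressure_to_servo_angle(pressure_normalized: float) -> float:
        """Map a normalized pressure command (0..1) to a servo angle in degrees.

        Shortening the three cables by the same amount pulls the finger-pad
        platform toward the fingertip, increasing contact pressure.
        """
        travel_mm = max(0.0, min(1.0, pressure_normalized)) * MAX_PLATFORM_TRAVEL_MM
        return math.degrees(travel_mm / SPOOL_RADIUS_MM)  # arc length = radius * angle

    def texture_sample(t: float, grain_freq_hz: float, amplitude: float) -> float:
        """Sinusoidal drive for the embedded voice coil to suggest a surface grain."""
        return amplitude * math.sin(2.0 * math.pi * grain_freq_hz * t)

    # Example: medium pressure against a fine-grained virtual surface.
    servo_angle_deg = pressure_to_servo_angle(0.5)
    drive = [texture_sample(i / SAMPLE_RATE_HZ, 150.0, 0.3) for i in range(10)]
    ```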

    Doctor of Philosophy

    The study of haptic interfaces focuses on the use of the sense of touch in human-machine interaction. This document presents a detailed investigation of lateral skin stretch at the fingertip as a means of direction communication. Such tactile communication has applications in a variety of situations where traditional audio and visual channels are inconvenient, unsafe, or already saturated. Examples include handheld consumer electronics, where tactile communication would allow a user to control a device without having to look at it, or in-car navigation systems, where the audio and visual directions provided by existing GPS devices can distract the driver's attention away from the road. Lateral skin stretch, the displacement of the skin of the fingerpad in a plane tangent to the fingerpad, is a highly effective means of communicating directional information. Users are able to correctly identify the direction of skin stretch stimuli with skin displacements as small as 0.1 mm at rates as slow as 2 mm/s. Such stimuli can be rendered by a small, portable device suitable for integration into handheld devices. The design of the device-finger interface affects the ability of the user to perceive the stimuli accurately. A properly designed conical aperture effectively constrains the motion of the finger and provides an interface that is practical for use in handheld devices. When a handheld device renders directional tactile cues on the fingerpad, the user must often mentally rotate those cues from the reference frame of the finger to the world-centered reference frame where those cues are to be applied. Such mental rotation incurs a cognitive cost, requiring additional time to mentally process the stimuli. The magnitude of these cognitive costs is a function of the angle of rotation and of the specific orientations of the arm, wrist, and finger. Even with the difficulties imposed by the required mental rotations, lateral skin stretch is a promising means of communicating information using the sense of touch, with the potential to substantially improve certain types of human-machine interaction.
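
    The mental-rotation issue discussed above amounts to a change of reference frame. A minimal sketch of that transformation, assuming the finger's orientation relative to the world is known (the function names and the 0.1 mm displacement default are illustrative):

    ```python
    import math

    def world_to_finger_frame(cue_deg_world: float, finger_yaw_deg: float) -> float:
        """Express a world-frame direction cue in the finger's reference frame.

        If the finger (and hence the skin-stretch device) is rotated by
        finger_yaw_deg relative to the world, the stimulus must be rotated by
        the opposite amount so the user feels the intended world direction.
        """
        return (cue_deg_world - finger_yaw_deg) % 360.0

    def stretch_vector(direction_deg: float, displacement_mm: float = 0.1):
        """Tangential skin-stretch displacement as an (x, y) vector in mm."""
        rad = math.radians(direction_deg)
        return (displacement_mm * math.cos(rad), displacement_mm * math.sin(rad))

    # Example: a 90-degree (world "north") cue while the finger is yawed 30 degrees.
    print(stretch_vector(world_to_finger_frame(90.0, 30.0)))
    ```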

    Perspectives on the Evolution of Tactile, Haptic, and Thermal Displays

    In Contact:Pinching, Squeezing and Twisting for Mediated Social Touch

    First validation of the Haptic Sandwich: a shape changing handheld haptic navigation aid

    This paper presents the Haptic Sandwich, a handheld robotic device designed to provide pedestrian navigation instructions through a novel shape changing modality. The device resembles a cube with an articulated upper half that is able to rotate and translate (extend) relative to the bottom half, which is grounded in the user’s hand when the device is held. The poses assumed by the device simultaneously correspond to heading and proximity to a navigational target. The Haptic Sandwich provides an alternative to screen- and/or audio-based pedestrian navigation technologies for both visually impaired and sighted users. Unlike other robotic or haptic navigational solutions, the Haptic Sandwich is discreet in terms of both form and sensory stimulus. Due to the novel and unexplored nature of shape changing interfaces, two user studies were undertaken to validate the concept and device. In the first experiment, stationary participants attempted to identify poses assumed by the device, which was hidden from view. In the second experiment, participants attempted to locate a sequence of invisible navigational targets while walking with the device. Of 1080 pose presentations to 10 individuals in experiment one, 80% were correctly identified and 17.5% had the minimum possible error. Multi-DOF errors accounted for only 1.1% of all answers. The role of simultaneous versus independent actuator motion on final shape perception was tested, with no significant performance difference found. The rotation and extension DOFs had significantly different perception accuracy. In the second experiment, participants demonstrated good navigational ability with the device after minimal training and were able to locate all presented targets. Mean motion efficiency of the participants was between 32% and 56%. Participants made use of both DOFs.
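
    One plausible way to realize the heading/proximity encoding described above (the actuation limits and sign conventions here are assumptions for illustration, not the published device parameters) is sketched below:

    ```python
    # Illustrative actuation limits (assumptions, not the device's specification).
    MAX_ROTATION_DEG = 45.0   # rotation of the upper half, signed left/right
    MAX_EXTENSION_MM = 15.0   # linear extension of the upper half
    PROXIMITY_RANGE_M = 50.0  # distance at which the extension cue saturates

    def pose_for_target(bearing_deg: float, distance_m: float):
        """Map heading error and proximity to a (rotation, extension) pose.

        bearing_deg: signed heading error to the target (negative = target to the left).
        distance_m:  remaining distance to the target.
        """
        rotation = max(-MAX_ROTATION_DEG, min(MAX_ROTATION_DEG, bearing_deg))
        closeness = 1.0 - min(distance_m, PROXIMITY_RANGE_M) / PROXIMITY_RANGE_M
        extension = closeness * MAX_EXTENSION_MM
        return rotation, extension

    print(pose_for_target(-20.0, 12.5))  # target slightly to the left, fairly close
    ```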

    Design of a wearable skin stretch cutaneous device for the upper limb

    This paper presents a novel cutaneous device capable of providing independent skin stretches at the palmar, dorsal, ulnar, and radial sides of the arm. It consists of a lightweight bracelet with four servo motors. Each motor actuates a cylindrical end-effector that is able to rotate, generating skin stretch stimuli. To understand how to control and wear the device on the forearm so as to evoke the most effective cutaneous sensations, we carried out perceptual experiments evaluating its absolute and differential thresholds. Finally, we carried out a haptic navigation experiment to assess the effectiveness of our device as a navigation feedback system for guiding a desired rotation and translation of the forearm. Results demonstrate an average rotation and translation error of 1.87° and 2.84 mm, respectively. Moreover, all the subjects found our device easy to wear and comfortable. Nine out of ten found it effective in transmitting navigation information to the forearm.
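
    A hedged sketch of how rotation and translation guidance might be distributed over the four end-effectors (the pairing of actuators to cues, the gains, and the function name are assumptions, not the authors' control law):

    ```python
    def bracelet_command(rotation_error_deg: float, translation_error_mm: float,
                         gain_rot: float = 0.5, gain_trans: float = 0.3) -> dict:
        """Return skin-stretch commands for the four cylindrical end-effectors.

        One plausible convention: opposite stretches on the dorsal/palmar pair
        cue a forearm rotation, while a common stretch on the ulnar/radial pair
        cues a translation along the arm.
        """
        rot_cmd = gain_rot * rotation_error_deg
        trans_cmd = gain_trans * translation_error_mm
        return {
            "dorsal": +rot_cmd,
            "palmar": -rot_cmd,
            "ulnar": +trans_cmd,
            "radial": +trans_cmd,
        }

    print(bracelet_command(rotation_error_deg=4.0, translation_error_mm=10.0))
    ```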

    Optimization-Based Wearable Tactile Rendering

    Novel wearable tactile interfaces offer the possibility of simulating tactile interactions with virtual environments directly on our skin. But, unlike kinesthetic interfaces, for which haptic rendering is a well-explored problem, they pose new questions about the formulation of the rendering problem. In this work, we propose a formulation of tactile rendering as an optimization problem, which is general for a large family of tactile interfaces. Based on an accurate simulation of contact between a finger model and the virtual environment, we pose tactile rendering as the optimization of the device configuration, such that the contact surface between the device and the actual finger matches the contact surface in the virtual environment as closely as possible. We describe the optimization formulation in general terms, and we also demonstrate its implementation on a thimble-like wearable device. We validate the tactile rendering formulation by analyzing its force error, and we show that it outperforms other approaches.
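
    In code, the general idea reduces to minimizing a mismatch between the contact produced by the device and the contact computed in the virtual environment. The toy forward model, cost, and variables below are stand-ins (and SciPy is assumed available), not the paper's contact simulator:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Target contact description computed from the virtual environment
    # (stand-in values, e.g. penetration depth and contact-plane tilt).
    target_contact = np.array([1.2, 0.15, -0.05])

    def predicted_contact(device_config: np.ndarray) -> np.ndarray:
        """Toy forward model: device config (height, tilt_x, tilt_y) -> contact description."""
        height, tilt_x, tilt_y = device_config
        return np.array([height, 0.5 * tilt_x, 0.5 * tilt_y])

    def rendering_cost(device_config: np.ndarray) -> float:
        """Mismatch between the device-induced and virtual contact descriptions."""
        err = predicted_contact(device_config) - target_contact
        return float(err @ err)

    result = minimize(rendering_cost, x0=np.zeros(3), method="Nelder-Mead")
    print(result.x)  # device configuration that best reproduces the virtual contact
    ```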

    Doctor of Philosophy

    When interacting with objects, humans utilize their sense of touch to gather information about the object and its surroundings. However, in video games, virtual reality, and training exercises, humans do not always have information available through the sense of touch. Several types of haptic feedback devices have been created to provide touch information in these scenarios. This dissertation describes the use of tactile skin stretch feedback to provide cues that convey direction information to a user. The direction cues can be used to guide a user or provide information about the environment. The tactile skin stretch feedback devices described herein provide feedback directly to the hands, just as in many real-life interactions involving the sense of touch. The devices utilize a moving tactor (an actuated skin contact surface, also called a contactor) and surrounding material to give the user a sense of relative motion. Several game controller prototypes were constructed with skin stretch feedback embedded into the device to interface with the fingers. Experiments were conducted to evaluate user performance in moving the joysticks to match the direction of the stimulus. These experiments investigated stimulus masking effects with both skin stretch feedback and vibrotactile feedback. A controller with feedback on the thumb joysticks was found to yield higher user accuracy. Next, precision grip and power grip skin stretch feedback devices were created to investigate cues that convey motion in a three-dimensional space. Experiments were conducted to compare the two devices and to explore user accuracy in identifying different direction cue types. The precision grip device was found to be superior in communicating direction cues to users in four degrees of freedom. Finally, closed-loop control was implemented to guide users to a specific location and orientation within a three-dimensional space. Experiments were conducted to improve controller feedback, which in turn improved user performance. Experiments were also conducted to investigate the feasibility of providing multiple cues in succession, in order to guide a user with multiple motions of the hand. It was found that users can successfully reach multiple target locations and orientations in succession.
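
    The closed-loop guidance described above can be thought of as a proportional controller whose output is a skin-stretch direction cue. A minimal sketch under that assumption (the gain, travel limit, and function name are illustrative):

    ```python
    import numpy as np

    def guidance_cue(current_pos: np.ndarray, target_pos: np.ndarray,
                     max_tactor_mm: float = 1.0, gain: float = 0.5) -> np.ndarray:
        """Proportional closed-loop cue: stretch the skin toward the target.

        Returns a tactor displacement vector (mm) whose direction points from
        the current hand position toward the target and whose magnitude grows
        with the remaining error, saturated at the tactor's travel limit.
        """
        error = target_pos - current_pos
        distance = np.linalg.norm(error)
        if distance < 1e-9:
            return np.zeros(3)  # on target: return the tactor to center
        magnitude = min(max_tactor_mm, gain * distance)
        return (error / distance) * magnitude

    print(guidance_cue(np.array([0.0, 0.0, 0.0]), np.array([30.0, -10.0, 5.0])))
    ```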

    The S-BAN: insights into the perception of shape-changing haptic interfaces via virtual pedestrian navigation

    Screen-based pedestrian navigation assistance can be distracting or inaccessible to users. Shape-changing haptic interfaces can overcome these concerns. The S-BAN is a new handheld haptic interface that utilizes a parallel kinematic structure to deliver 2-DOF spatial information over a continuous workspace, with a form factor suited to integration with other travel aids. Its ability to pivot, extend, and retract its body opens up possibilities and questions around spatial data representation. We present a static study to understand user perception of absolute pose and relative motion for two spatial mappings, showing the highest sensitivity to relative motions in the cardinal directions. We then present an embodied navigation experiment in virtual reality. User motion efficiency when guided by the S-BAN was statistically equivalent to using a vision-based tool (a smartphone proxy). Although haptic trials were slower than visual trials, participants’ heads were more elevated with the S-BAN, allowing greater visual focus on the environment.
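
    The two spatial mappings studied above can be illustrated with a toy example: a polar mapping (pivot encodes bearing, extension encodes closeness) versus a Cartesian mapping (pivot encodes lateral offset, extension encodes forward offset). The limits and scale factors are assumptions, not the S-BAN's published parameters:

    ```python
    import math

    # Illustrative workspace limits (assumptions, not the S-BAN's specification).
    MAX_PIVOT_DEG = 30.0
    MAX_EXTENSION_MM = 10.0

    def polar_mapping(dx_m: float, dy_m: float, range_m: float = 25.0):
        """Pivot encodes bearing to the target; extension encodes closeness."""
        bearing_deg = math.degrees(math.atan2(dx_m, dy_m))  # 0 deg = straight ahead
        pivot = max(-MAX_PIVOT_DEG, min(MAX_PIVOT_DEG, bearing_deg))
        closeness = 1.0 - min(math.hypot(dx_m, dy_m), range_m) / range_m
        return pivot, closeness * MAX_EXTENSION_MM

    def cartesian_mapping(dx_m: float, dy_m: float, deg_per_m: float = 5.0,
                          mm_per_m: float = 0.5):
        """Pivot encodes lateral offset; extension encodes forward offset."""
        pivot = max(-MAX_PIVOT_DEG, min(MAX_PIVOT_DEG, deg_per_m * dx_m))
        extension = max(0.0, min(MAX_EXTENSION_MM, mm_per_m * dy_m))
        return pivot, extension

    # Target 3 m to the right and 10 m ahead of the pedestrian.
    print(polar_mapping(3.0, 10.0), cartesian_mapping(3.0, 10.0))
    ```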

    Master of Science

    Haptic interactions with smartphones are generally restricted to vibrotactile feedback, which offers limited distinction between delivered tactile cues. The lateral movement of a small, high-friction contactor at the fingerpad can be used to induce skin stretch tangent to the skin's surface. This method has been demonstrated to reliably communicate four cardinal directions with 1 mm translations of the device's contactor, when finger motion is properly restrained. While earlier research used a thimble to restrain the finger, this interface has been made portable by incorporating a simple conical hole as a finger restraint. An initial portable device design used RC hobby servos and the conical hole finger restraint, but the shape and size of this portable device were not compatible with smartphone form factors. This design also had significant compliance and backlash that had to be compensated for with additional control schemes. In contrast, this thesis presents the design, fabrication, and testing of a low-profile skin-stretch display (LPSSD) with a novel actuation design for delivering complex tactile cues with minimal backlash or hysteresis of the skin contactor, or "tactor." This flatter mechanism features embedded sensors for fingertip cursor control and selection. The device's nonlinear tactor motions are compensated for using table look-up and high-frequency open-loop control to create direction cues with 1.8 mm radial tactor displacements in 16 directions (distributed evenly every 22.5°) before returning to center. Two LPSSDs are incorporated into a smartphone peripheral and used in single-handed and bimanual tests to identify the 16 directions. Users also participated in "relative" identification tests where they were first provided a reference direction cue in the forward/north direction, followed by the cue direction that they were to identify. Tests were performed with the users' thumbs oriented in the forward direction and with thumbs angled inward slightly, similar to the angled-thumb orientation of console game controllers. Users were found to have increased performance with the angled-thumb orientation. They performed similarly when stimuli were delivered to their right or left thumbs, and had significantly better performance when judging direction cues with both thumbs simultaneously. Participants also performed slightly better in identifying the relative direction cues than the absolute cues.
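
    A minimal sketch of the table-look-up compensation described above, assuming a calibrated inverse model of the mechanism is available (the identity stand-in, names, and structure below are illustrative, not the thesis implementation):

    ```python
    import math

    NUM_DIRECTIONS = 16     # cues spaced every 22.5 degrees
    TACTOR_TRAVEL_MM = 1.8  # radial tactor displacement for each cue

    def build_lookup_table(compensation_fn):
        """Build the direction-index -> actuator-command table once, offline.

        compensation_fn is the calibrated inverse model of the mechanism's
        nonlinearity; here a stand-in identity function is used.
        """
        table = {}
        for i in range(NUM_DIRECTIONS):
            angle = math.radians(i * 22.5)
            desired_xy = (TACTOR_TRAVEL_MM * math.cos(angle),
                          TACTOR_TRAVEL_MM * math.sin(angle))
            table[i] = compensation_fn(desired_xy)
        return table

    LOOKUP = build_lookup_table(lambda xy: xy)  # stand-in: no compensation applied

    def render_direction_cue(direction_index: int):
        """Open-loop playback: command the stored displacement, then return to center."""
        return [LOOKUP[direction_index % NUM_DIRECTIONS], (0.0, 0.0)]

    print(render_direction_cue(3))  # cue at 67.5 degrees
    ```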