    Scalability of the Size of Patterns Drawn Using Tactile Hand Guidance

    Haptic feedback for handwriting training has been studied extensively, but with a primary focus on kinematic feedback. We provide vibrotactile feedback through a wrist-worn sleeve to guide the user in recreating unknown patterns, and study the impact of vibration duration (1, 2, or 3 seconds) on pattern scaling. The user traces a line, while attempting to maintain a constant speed, in the direction of the activated motor until a different motor activation is perceived, turning at 90° angles. Shape and size are two features of good letter formation. A study with three subjects showed that four vibrotactile motors can guide the hand towards correct shape formation with high accuracy (> 95%). The overall size of the letter was observed to scale linearly with the vibration duration. Implications for using vibrational feedback for handwriting correction are discussed.
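
    The abstract describes a simple mechanism: a constant tracing speed turns vibration duration into segment length, and four motors encode the four stroke directions. Below is a minimal, hypothetical sketch of that idea; the motor-to-direction mapping, speed, and function names are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of duration-coded guidance: four motors cue the four
# stroke directions, and with a constant tracing speed the traced segment
# length scales linearly with the vibration duration.

DIRECTIONS = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}

def predicted_trace(strokes, duration_s, speed_mm_per_s=20.0, start=(0.0, 0.0)):
    """Predict the drawn vertices if the user moves at a constant speed in the
    cued direction for the full vibration duration of each stroke."""
    x, y = start
    points = [(x, y)]
    for direction in strokes:
        dx, dy = DIRECTIONS[direction]
        length = speed_mm_per_s * duration_s  # linear scaling with duration
        x, y = x + dx * length, y + dy * length
        points.append((x, y))
    return points

# Example: an "L"-shaped pattern cued with 2-second vibrations.
print(predicted_trace(["down", "right"], duration_s=2.0))
```

    Doubling duration_s doubles every segment, which is the linear size scaling the study reports.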

    Personalising Vibrotactile Displays through Perceptual Sensitivity Adjustment

    Haptic displays are commonly limited to transmitting a discrete set of tactile motifs. In this paper, we explore the transmission of real-valued information through vibrotactile displays. We simulate spatial continuity with three perceptual models commonly used to create phantom sensations: the linear, logarithmic, and power models. We show that these generic models lead to limited decoding precision, and propose a method for model personalisation that adjusts to idiosyncratic and spatial variations in perceptual sensitivity. We evaluate this approach using two haptic display layouts: circular, worn around the wrist and the upper arm, and straight, worn along the forearm. Results of a user study measuring continuous value decoding precision show that users were able to decode continuous values with relatively high accuracy (4.4% mean error), that circular layouts performed particularly well, and that personalisation through sensitivity adjustment increased decoding precision.
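
    The abstract names the three phantom-sensation models but not their equations. The sketch below shows one common formulation of linear, logarithmic, and power (energy-based) panning between two adjacent actuators, as an assumption for illustration rather than the paper's exact models.

```python
import math

# Hedged sketch: common formulations of two-actuator "phantom sensation"
# panning. beta in [0, 1] is the normalized position of the virtual point
# between actuator 1 and actuator 2; A is the desired overall intensity.
# The exact models used in the paper may differ.

def linear_model(beta, A=1.0):
    return (1.0 - beta) * A, beta * A

def power_model(beta, A=1.0):
    # "Energy" model: keeps the summed squared intensity constant.
    return math.sqrt(1.0 - beta) * A, math.sqrt(beta) * A

def log_model(beta, A=1.0, k=20.0):
    # One possible logarithmic-perception variant: warp beta through a
    # log curve before linear panning (illustrative choice of k).
    warped = math.log1p(k * beta) / math.log1p(k)
    return (1.0 - warped) * A, warped * A

for beta in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(beta, linear_model(beta), power_model(beta), log_model(beta))
```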

    Sensory Augmentation for Balance Rehabilitation Using Skin Stretch Feedback

    This dissertation focuses on the development and evaluation of portable sensory augmentation systems that render skin-stretch feedback of posture for standing balance training and postural control improvement. Falling is one of the main causes of fatal injuries across the population. The high incidence of fall-related injuries also leads to high medical expenses, costing approximately $34 billion annually in the United States. People with neurological diseases, e.g., stroke, multiple sclerosis, and spinal cord injuries, as well as the elderly, are more prone to falling than healthy individuals. Falls among these populations can also lead to hip fracture, or even death. Thus, several balance and gait rehabilitation approaches have been developed to reduce the risk of falling. Traditionally, a balance-retraining program includes a series of exercises for trainees to strengthen their sensorimotor and musculoskeletal systems. Recent advances in technology have incorporated biofeedback, such as visual, auditory, or haptic feedback, to provide users with extra cues about their postural sway. Studies have also demonstrated the positive effects of biofeedback on balance control. However, current applications of biofeedback for interventions in people with impaired balance still lack some important characteristics, such as portability (in-home care), small size, and long-term viability. Inspired by the concept of light touch, a light, small, and wearable sensory augmentation system that detects body sway and supplements skin stretch on one’s fingertip pad was first developed. The addition of a shear tactile display could significantly enhance the sensation of body movement. Preliminary results have shown that the application of passive skin stretch feedback at the fingertip enhanced the standing balance of healthy young adults. Based on these findings, two research directions were initiated to investigate i) which dynamical information of postural sway could be more effectively conveyed by skin stretch feedback, and ii) how such a feedback device could be easily used in clinical settings or on a daily basis. The major sections of this research focus on understanding how skin stretch feedback affects standing balance and on quantifying the ability of humans to interpret the cutaneous feedback as cues about their physiological state. Experimental results from both static and dynamic balancing tasks revealed that healthy subjects were able to respond to the cues and subsequently correct their posture. However, postural sway did not generally improve in healthy subjects due to skin stretch feedback. A possible reason is that healthy subjects already had sufficiently high-quality sensory information, so the additional artificial biofeedback may have interfered with other sensory cues. Experiments incorporating simulated sensory deficits were further conducted, and it was found that subjects with perturbed sensory systems (e.g., standing on an unstable surface) showed improved balance due to skin stretch feedback when compared to the neutral standing conditions. Positive impacts on balance performance have also been demonstrated among multiple sclerosis patients when they received skin stretch feedback from a sensory augmentation walker. The findings of this research indicate that the skin stretch feedback rendered by the developed devices affected human balance and can potentially compensate for underlying neurological or musculoskeletal disorders, thereby enhancing quiet standing postural control.
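
    As a rough illustration of how such a device could render posture as skin stretch, the hypothetical sketch below maps a sway measurement to a tactor shear command with a dead zone and saturation; the gains and thresholds are invented for illustration, not taken from the dissertation.

```python
# Hedged sketch (not the dissertation's controller): map measured postural
# sway, e.g. trunk tilt or center-of-pressure displacement, to a shear
# (skin-stretch) displacement of a fingertip tactor. A dead zone suppresses
# feedback for small sway and saturation protects the skin and actuator.

def sway_to_stretch(sway_deg, gain_mm_per_deg=0.5, dead_zone_deg=0.3, max_mm=2.0):
    """Return the commanded tactor displacement in mm (signed, same axis as sway)."""
    if abs(sway_deg) < dead_zone_deg:
        return 0.0
    command = gain_mm_per_deg * sway_deg
    return max(-max_mm, min(max_mm, command))

# Example: an anterior lean of 1.2 degrees commands 0.6 mm of forward stretch.
print(sway_to_stretch(1.2))
```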

    Haptic Interaction with a Guide Robot in Zero Visibility

    Search and rescue operations are often undertaken in dark and noisy environments in which rescue teams must rely on haptic feedback for exploration and safe exit. However, little attention has been paid specifically to haptic sensitivity in such contexts or to the possibility of enhancing communicational proficiency in the haptic mode as a life-preserving measure. The potential of robot swarms for search and rescue has been shown by the Guardians project (EU, 2006-2010); however, the project also showed the problem of human-robot interaction in smoky (non-visibility) and noisy conditions. The REINS project (UK, 2011-2015) focused on human-robot interaction in such conditions. This research is a body of work (done as part of the REINS project) which investigates the haptic interaction of a person with a guide robot in zero visibility. The thesis firstly reflects upon real-world scenarios where people make use of the haptic sense to interact in zero visibility (such as interaction among firefighters and the symbiotic relationship between visually impaired people and guide dogs). In addition, it reflects on the sensitivity and trainability of the haptic sense to be used for the interaction. The thesis presents an analysis and evaluation of the design of a physical interface (designed by the consortium of the REINS project) connecting the human and the robotic guide in poor visibility conditions. Finally, it lays a foundation for the design of test cases to evaluate human-robot haptic interaction, taking into consideration the two aspects of the interaction, namely locomotion guidance and environmental exploration.

    Doctor of Philosophy

    When interacting with objects, humans utilize their sense of touch to provide information about the object and surroundings. However, in video games, virtual reality, and training exercises, humans do not always have information available through the sense of touch. Several types of haptic feedback devices have been created to provide touch information in these scenarios. This dissertation describes the use of tactile skin stretch feedback to provide cues that convey direction information to a user. The direction cues can be used to guide a user or provide information about the environment. The tactile skin stretch feedback devices described herein provide feedback directly to the hands, just as in many real-life interactions involving the sense of touch. The devices utilize a moving tactor (actuated skin contact surface, also called a contactor) and surrounding material to give the user a sense of the relative motion. Several game controller prototypes with skin stretch feedback embedded into the device to interface with the fingers were constructed. Experiments were conducted to evaluate user performance in moving the joysticks to match the direction of the stimulus. These experiments investigated stimulus masking effects with both skin stretch feedback and vibrotactile feedback. A controller with feedback on the thumb joysticks was found to have higher user accuracy. Next, precision grip and power grip skin stretch feedback devices were created to investigate cues to convey motion in a three-dimensional space. Experiments were conducted to compare the two devices and to explore user accuracy in identifying different direction cue types. The precision grip device was found to be superior in communicating direction cues to users in four degrees of freedom. Finally, closed-loop control was implemented to guide users to a specific location and orientation within a three-dimensional space. Experiments were conducted to improve controller feedback, which in turn improved user performance. Experiments were also conducted to investigate the feasibility of providing multiple cues in succession, in order to guide a user with multiple motions of the hand. It was found that users can successfully reach multiple target locations and orientations in succession.
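
    The closed-loop guidance described at the end of the abstract can be pictured as repeatedly choosing which direction cue to render from the remaining position and orientation error. The sketch below is a hypothetical version of such a selection step; the tolerances and the largest-error-first policy are assumptions, not the dissertation's controller.

```python
import math

# Hypothetical guidance step: given the hand's current and target position
# and yaw, pick the single translation or rotation cue to render next.
# Tolerances and the "largest normalized error first" policy are illustrative.

def next_cue(position, target_position, yaw, target_yaw,
             pos_tol=5.0, yaw_tol=math.radians(5)):
    """position/target_position: (x, y, z) in mm; yaw angles in radians."""
    errors = [t - p for p, t in zip(position, target_position)]
    yaw_err = math.atan2(math.sin(target_yaw - yaw), math.cos(target_yaw - yaw))

    # Done once every error is inside its tolerance.
    if all(abs(e) <= pos_tol for e in errors) and abs(yaw_err) <= yaw_tol:
        return None

    # Cue the degree of freedom with the largest normalized error.
    candidates = [("x", errors[0] / pos_tol), ("y", errors[1] / pos_tol),
                  ("z", errors[2] / pos_tol), ("yaw", yaw_err / yaw_tol)]
    axis, err = max(candidates, key=lambda c: abs(c[1]))
    return axis, ("+" if err > 0 else "-")

# Example: the hand must move 30 mm in +x and rotate 20 degrees.
print(next_cue((0, 0, 0), (30, -2, 1), 0.0, math.radians(20)))
```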

    Doctor of Philosophy

    The study of haptic interfaces focuses on the use of the sense of touch in human-machine interaction. This document presents a detailed investigation of lateral skin stretch at the fingertip as a means of direction communication. Such tactile communication has applications in a variety of situations where traditional audio and visual channels are inconvenient, unsafe, or already saturated. Examples include handheld consumer electronics, where tactile communication would allow a user to control a device without having to look at it, or in-car navigation systems, where the audio and visual directions provided by existing GPS devices can distract the driver's attention away from the road. Lateral skin stretch, the displacement of the skin of the fingerpad in a plane tangent to the fingerpad, is a highly effective means of communicating directional information. Users are able to correctly identify the direction of skin stretch stimuli with skin displacements as small as 0.1 mm at rates as slow as 2 mm/s. Such stimuli can be rendered by a small, portable device suitable for integration into handheld devices. The design of the device-finger interface affects the ability of the user to perceive the stimuli accurately. A properly designed conical aperture effectively constrains the motion of the finger and provides an interface that is practical for use in handheld devices. When a handheld device renders directional tactile cues on the fingerpad, the user must often mentally rotate those cues from the reference frame of the finger to the world-centered reference frame where those cues are to be applied. Such mental rotation incurs a cognitive cost, requiring additional time to mentally process the stimuli. The magnitude of these cognitive costs is a function of the angle of rotation, and of the specific orientations of the arm, wrist and finger. Even with the difficulties imposed by required mental rotations, lateral skin stretch is a promising means of communicating information using the sense of touch with potential to substantially improve certain types of human-machine interaction.
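
    The mental-rotation cost discussed above comes from the mismatch between the finger's reference frame and the world frame. The sketch below only illustrates that frame relationship: if software knows the finger's orientation, it can pre-rotate a world-frame cue into the finger frame; otherwise the user performs the rotation mentally. The 2D rotation shown is an illustrative simplification of the arm, wrist, and finger orientations studied in the dissertation.

```python
import math

# Illustrative 2D frame transform: rotate a world-frame direction cue into
# the finger's reference frame, given the finger frame's yaw relative to the
# world frame (counterclockwise positive).

def world_to_finger(cue_world, finger_yaw_rad):
    cx, cy = cue_world
    c, s = math.cos(-finger_yaw_rad), math.sin(-finger_yaw_rad)
    return (c * cx - s * cy, s * cx + c * cy)

# Example: a "north" cue, with the finger frame rotated 90 degrees
# counterclockwise relative to the world frame.
print(world_to_finger((0.0, 1.0), math.radians(90)))
```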

    Analysis of joint and hand impedance during teleoperation and free-hand task execution

    Teleoperated robotic surgery allows filtering and scaling the hand motion to achieve high precision during surgical interventions. Teleoperation represents a very complex sensory-motor task, mainly due to the kinematic and kinetic redundancies that characterize human motor control, and it requires an intensive training phase to acquire sufficient familiarity with the master-slave architecture. We estimated the hand stiffness modulation during the execution of a simulated suturing task in teleoperation, with two different master devices, and in free-hand. Kinematic data of eight right-handed users were acquired, using electromagnetic and optical tracking systems, and analysed using a musculoskeletal model. Through inverse dynamics, muscular activation was computed and used to obtain the joint torque and stiffness, leading to end-point stiffness estimation. The maximal stiffness value and its angular displacement with respect to the trajectory tangent were computed. The results show that there is a difference in how the main stiffness axis was modulated when using the two master devices with respect to free-hand, with higher values and variability for the serial-link manipulator. Moreover, a directional modulation of the hand stiffness along the trajectory was found, showing that the users were aligning the direction of the main stiffness axis perpendicularly to the trajectory.
    Buzzi, Jacopo; Gatti, Cecilia; Ferrigno, Giancarlo; De Momi, Elena
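
    For readers unfamiliar with the end-point stiffness estimation mentioned above, the sketch below shows the standard mapping from joint stiffness to hand (end-point) stiffness for a planar two-link arm, using the common simplification K_x = J^{-T} K_q J^{-1}, plus the extraction of the maximal stiffness axis and its angle to a trajectory tangent. The numbers, the planar model, and the dropped Jacobian-derivative term are assumptions for illustration; the paper itself relies on a full musculoskeletal model.

```python
import numpy as np

# Hedged sketch: map joint stiffness to end-point (hand) stiffness for a
# planar 2-link arm via K_x = J^{-T} K_q J^{-1}. Link lengths, joint angles,
# joint stiffness values, and the trajectory tangent are illustrative.

def jacobian(l1, l2, q1, q2):
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def endpoint_stiffness(K_q, J):
    J_inv = np.linalg.inv(J)
    return J_inv.T @ K_q @ J_inv

J = jacobian(l1=0.30, l2=0.25, q1=np.radians(40), q2=np.radians(60))
K_q = np.diag([15.0, 8.0])            # joint stiffness [Nm/rad]
K_x = endpoint_stiffness(K_q, J)      # end-point stiffness [N/m]

# Principal (maximal) stiffness axis and its angle to the trajectory tangent,
# mirroring the quantities analysed in the abstract.
eigvals, eigvecs = np.linalg.eigh(0.5 * (K_x + K_x.T))  # symmetrize for safety
major_axis = eigvecs[:, np.argmax(eigvals)]
tangent = np.array([1.0, 0.0])                          # assumed tangent direction
angle = np.degrees(np.arccos(abs(major_axis @ tangent)))
print(f"max stiffness {eigvals.max():.1f} N/m at {angle:.1f} deg from tangent")
```

    The printed angle corresponds to the angular displacement of the main stiffness axis with respect to the trajectory tangent that the abstract analyses, here for an arbitrary tangent direction.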

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science; haptic technology; and haptic applications.

    Understanding interaction mechanics in touchless target selection

    We use gestures frequently in daily life: to interact with people, pets, or objects. But interacting with computers using mid-air gestures continues to challenge the design of touchless systems. Traditional approaches to touchless interaction focus on exploring gesture inputs and evaluating user interfaces. I shift the focus from gesture elicitation and interface evaluation to touchless interaction mechanics. I argue for a novel approach to generate design guidelines for touchless systems: to use fundamental interaction principles, instead of a reactive adaptation to the sensing technology. In five sets of experiments, I explore visual and pseudo-haptic feedback, motor intuitiveness, handedness, and perceptual Gestalt effects. In particular, I study the interaction mechanics of touchless target selection. To that end, I introduce two novel interaction techniques: touchless circular menus that allow command selection using directional strokes, and interface topographies that use pseudo-haptic feedback to guide steering–targeting tasks. Results illuminate different facets of touchless interaction mechanics. For example, motor-intuitive touchless interactions explain how our sensorimotor abilities inform touchless interface affordances: we often make a holistic oblique gesture instead of several orthogonal hand gestures while reaching toward a distant display. Following the Gestalt theory of visual perception, we found that similarity between user interface (UI) components decreased user accuracy, while good continuity made users faster. Other findings include hemispheric asymmetry affecting transfer of training between dominant and nondominant hands, and pseudo-haptic feedback improving touchless accuracy. The results of this dissertation contribute design guidelines for future touchless systems. Practical applications of this work include the use of touchless interaction techniques in various domains, such as entertainment, consumer appliances, surgery, patient-centric health settings, smart cities, interactive visualization, and collaboration.
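
    The "interface topographies" mentioned above rely on pseudo-haptic feedback, i.e., visually induced resistance rather than physical force. One hypothetical way to realize this is to modulate the cursor's control-display gain by the slope of a virtual height field, as sketched below; the height field, gain law, and constants are illustrative assumptions, not the dissertation's implementation.

```python
import numpy as np

# Hedged sketch: produce pseudo-haptic "resistance" for touchless pointing by
# scaling cursor motion with a control-display gain that drops when moving
# uphill on a virtual height field, so targets placed in valleys feel as if
# they attract the cursor.

def gain(height_grad, motion, base_gain=1.0, strength=0.8):
    """Reduce gain when the motion climbs the gradient, raise it when descending."""
    uphill = float(np.dot(height_grad, motion)) / (np.linalg.norm(motion) + 1e-9)
    return max(0.2, base_gain - strength * uphill)

def step_cursor(cursor, hand_motion, height_grad):
    return cursor + gain(height_grad, hand_motion) * hand_motion

cursor = np.array([100.0, 100.0])
hand_motion = np.array([5.0, 0.0])          # raw hand displacement in px
grad_toward_target = np.array([-0.5, 0.0])  # height decreases toward the target
print(step_cursor(cursor, hand_motion, grad_toward_target))
```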