
    Hand-to-hand: an intermanual illusion of movement

    Apparent tactile motion has been shown to occur across many contiguous parts of the body, such as the fingers, forearms, and back. A recent study demonstrated the possibility of eliciting the illusion of movement from one hand to the other when the hands are interconnected by a tablet. In this paper, we explore intermanual apparent tactile motion without any object between the hands. In a series of psychophysical experiments, we determine the control space for generating smooth and consistent motion, using two vibrating handles, which we refer to as the Hand-to-Hand vibrotactile device. In the first experiment, we investigated the occurrence of the phenomenon (i.e., the movement illusion) and generated a perceptive model. In the second experiment, based on those results, we investigated the effect of hand posture on the illusion. Finally, in a third experiment, we explored two visuo-tactile matching tasks in a multimodal VR setting. Our results can be applied in VR applications with intermanual tactile interactions.
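The control space for apparent tactile motion is usually parameterized by burst duration and stimulus onset asynchrony (SOA). As a minimal sketch of how two actuators are scheduled, the classic Sherrick and Rogers relation for optimal SOA (SOA ≈ 0.32·d + 47.3 ms) stands in for the paper's own perceptive model, which is not reproduced here:

```python
# Illustrative sketch: scheduling two vibration bursts to elicit apparent
# tactile motion between two handles. The SOA formula is the classic
# Sherrick & Rogers relation, used here as a placeholder -- the paper
# derives its own perceptive model experimentally.

def optimal_soa_ms(duration_ms: float) -> float:
    """Optimal stimulus onset asynchrony (ms) for a given burst duration (ms)."""
    return 0.32 * duration_ms + 47.3

def schedule(duration_ms: float):
    """Return (onset, offset) times in ms for the first and second actuator."""
    soa = optimal_soa_ms(duration_ms)
    first = (0.0, duration_ms)
    second = (soa, soa + duration_ms)
    return first, second

first, second = schedule(100.0)
# For 100 ms bursts the second handle starts at ~79.3 ms, while the first
# is still vibrating -- that temporal overlap is what yields smooth motion.
```

Note that for durations above roughly 70 ms the two bursts overlap, which is generally required for a continuous rather than discrete sensation.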

    Creating an illusion of movement between the hands using mid-air touch

    Apparent tactile motion (ATM) has been shown to occur across many contiguous parts of the body, such as the fingers, forearms, and back. More recently, the illusion has also been elicited on non-contiguous parts of the body, such as from one hand to the other, whether or not the hands are interconnected by an object between them. Here we explore the reproducibility of the intermanual tactile illusion of movement between two free hands by employing mid-air tactile stimulation. We investigate the optimal parameters to generate continuous and smooth motion using two arrays of ultrasound speakers and two stimulation techniques (i.e., static vs. dynamic focal point). In the first experiment, we investigate the occurrence of the illusion when using a static focal point, and we define a perceptive model. In the second experiment, we examine the illusion using a dynamic focal point, defining a second perceptive model. Finally, we discuss the differences between the two techniques.
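The difference between the two stimulation techniques can be sketched in a few lines: a static focal point jumps between two fixed positions, while a dynamic focal point sweeps the ultrasound focus continuously from one hand to the other. The coordinates, sweep time, and update rate below are hypothetical, not the paper's parameters:

```python
# Illustrative sketch (not the paper's implementation) of a "dynamic focal
# point": the focus is resampled along a straight line between the two
# hands at the device update rate. Positions are hypothetical 3D
# coordinates in metres above the two ultrasound arrays.

def lerp(a, b, t):
    """Linear interpolation between two 3D points for t in [0, 1]."""
    return tuple(a_i + (b_i - a_i) * t for a_i, b_i in zip(a, b))

def dynamic_focal_path(left_hand, right_hand, sweep_ms, update_rate_hz=200):
    """Sample the focal-point trajectory at the device update rate."""
    n = max(2, int(sweep_ms / 1000 * update_rate_hz))
    return [lerp(left_hand, right_hand, i / (n - 1)) for i in range(n)]

# A 500 ms sweep at 200 Hz yields 100 focal positions from left to right.
path = dynamic_focal_path((-0.15, 0.0, 0.2), (0.15, 0.0, 0.2), sweep_ms=500)
```

A static focal point would instead emit only the two endpoints, each held for the burst duration, which is what makes the timing parameters (duration and onset asynchrony) the critical variables in that condition.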

    Representing Interpersonal Touch Directions by Tactile Apparent Motion Using Smart Bracelets

    We present a novel haptic interaction that vibrotactually connects an interpersonal touch using bracelet devices. A pair of bracelet devices identifies the user who is actively touching and the one who is passively touched, defining the direction as being from the former to the latter. By controlling the vibrational feedback, the pair induces a tactile apparent motion representing the direction between the two hands. Each bracelet comprises our developed interpersonal body area network module, an acceleration sensor, and a vibrator. The devices communicate with each other through electrical current flowing along the hands, identifying the direction by sharing accelerations just before a touch and synchronizing the feedback in less than ten milliseconds. Experiment 1 demonstrates that the vibration propagated from a bracelet device to the wearer's hand is perceivable by the other person. Experiment 2 determines sets of optimal actuation parameters, stimulus onset asynchrony, and vibration duration to induce the tactile apparent motion, based on a psychophysical approach. In addition, vibration propagation between hands is observed. Experiment 3 demonstrates the capability of the developed device to present the haptic interaction.

    Bimanual cross-coupling in space telerobotics

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2012. Cataloged from PDF version of thesis. Includes bibliographical references (p. 54-61).
    Astronauts spend much time training to control the robotic arm aboard the International Space Station, and must perform a variety of challenging, three-dimensional tasks. They use a unique, bimanual control system to control the velocity of the end-effector; the left hand controls translation in three axes while the right hand simultaneously controls rotation in three axes. Operator inputs to the bimanual controllers can cross-couple not only through intermanual neuromotor pathways, when movement of one hand affects movement of the other hand, but also through intramanual pathways, when movement of one hand affects movement of the same hand in an unintended control axis. We developed a measurement technique to quantify these directional cross-coupling pathways based on the detection of frequency-coded command signals in a bimanual tracking task. The technique allowed us to characterize the interactions among all six control axes in the form of a cross-coupling matrix of coupling strengths. An experiment using these techniques suggested two principal pathways of intermanual coupling and one of intramanual coupling. By combining information across 18 human subjects to typify the cross-coupling response due to the bimanual control system, we found that the two intermanual pathways exhibited 21% yaw-to-lateral-translation and 15% pitch-to-vertical-translation mean coupling even after significant training. The intramanual pathway exhibited 41% roll-to-yaw mean coupling. We found significant differences in bimanual cross-coupling between subjects, and demonstrated that subjects could significantly reduce intermanual cross-coupling with practice, suggesting that these metrics may be useful indicators of control device mastery. We found statistically significant negative correlations between early-stage intramanual coupling and subsequent performance in a simulated space telerobotics track-and-capture task, suggesting that an intramanual coupling metric may be useful as a predictor of human telerobotic performance. The test technique could ultimately be applied to evaluate cross-coupling during astronaut training, and to reduce undesired cross-coupling through improved hand controller design. Our results supported an ergonomic basis for intermanual cross-coupling that incorporates both biomechanical and sensorimotor effects.
    by Victor Wang. S.M.
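The frequency-coding idea behind the cross-coupling matrix can be sketched briefly: drive each control axis with a sinusoid at its own distinct frequency, then estimate the coupling from axis j to axis i as the amplitude of axis i's recorded command at axis j's frequency, normalised by axis j's amplitude at its own frequency. The frequencies, sample rate, and normalisation below are illustrative assumptions, not the thesis's actual parameters:

```python
import math

# Hedged sketch of frequency-coded cross-coupling estimation. The thesis
# uses a six-axis bimanual tracking task; here two synthetic axes suffice,
# with axis 1 "leaking" 20% of axis 0's 0.5 Hz command.

def amplitude_at(signal, freq_hz, fs_hz):
    """Single-bin DFT amplitude of `signal` at `freq_hz` (sample rate fs_hz)."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * freq_hz * k / fs_hz) for k, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq_hz * k / fs_hz) for k, x in enumerate(signal))
    return 2 * math.hypot(re, im) / n

def coupling_matrix(outputs, freqs_hz, fs_hz):
    """outputs[i] is the recorded command on axis i; freqs_hz[j] codes axis j."""
    own = [amplitude_at(outputs[j], freqs_hz[j], fs_hz) for j in range(len(freqs_hz))]
    return [[amplitude_at(outputs[i], freqs_hz[j], fs_hz) / own[j]
             for j in range(len(freqs_hz))] for i in range(len(outputs))]

fs = 50  # Hz; 10 s of data, with both test frequencies bin-aligned
t = [k / fs for k in range(500)]
ax0 = [math.sin(2 * math.pi * 0.5 * ti) for ti in t]
ax1 = [math.sin(2 * math.pi * 0.9 * ti) + 0.2 * math.sin(2 * math.pi * 0.5 * ti) for ti in t]
C = coupling_matrix([ax0, ax1], [0.5, 0.9], fs)
# C[1][0] recovers the injected 0.2 coupling strength from axis 0 to axis 1.
```

The reported percentages (e.g., 21% yaw-to-lateral-translation coupling) are ratios of exactly this kind, read off the off-diagonal entries of the matrix.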

    Sensorimotor experience in virtual environments

    The goal of rehabilitation is to reduce impairment and provide functional improvements, resulting in quality participation in activities of life. Plasticity and motor learning principles provide inspiration for therapeutic interventions, including movement repetition in a virtual reality environment. The objective of this research was to investigate function-specific measurements (kinematic, behavioral) and neural correlates of motor experience of hand gesture activities in virtual environments stimulating sensory experience (VE), using a hand agent model. The fMRI-compatible Virtual Environment Sign Language Instruction (VESLI) system was designed and developed to provide a number of rehabilitation and measurement features, to identify optimal learning conditions for individuals, and to track changes in performance over time. Therapies and measurements incorporated into VESLI target and track specific impairments underlying dysfunction. The goal of improved measurement is to develop targeted interventions embedded in higher-level tasks and to accurately track specific gains, in order to understand responses to treatment and the impact a response may have upon higher-level function, such as participation in life. To further clarify the biological model of motor experience, and to understand the added value and role of virtual sensory stimulation and feedback, which includes seeing one's own hand movement, functional brain mapping was conducted with simultaneous kinematic analysis in healthy controls and in stroke subjects. It is believed that through an understanding of these neural activations, rehabilitation strategies exploiting the principles of plasticity and motor learning will become possible. The present research assessed practice conditions promoting successful gesture learning in the individual.
    For the first time, functional imaging experiments mapped the neural correlates of human interactions with complex virtual reality hand avatars moving synchronously with the subject's own hands. Findings indicate that healthy control subjects learned intransitive gestures in virtual environments using first- and third-person avatars, picture and text definitions, and while viewing visual feedback of their own hands, virtual hand avatars, or, in the control condition, hidden hands. Moreover, exercise in a virtual environment with a first-person hand avatar recruited insular cortex activation over time, which might indicate that this activation is associated with a sense of agency. Sensory augmentation in virtual environments modulated activations of important brain regions associated with action observation and action execution. The quality of the visual feedback was modulated, and brain areas were identified where the amount of brain activation was positively or negatively correlated with the visual feedback. When subjects moved the right hand and saw an unexpected response (the left virtual avatar hand moved), neural activation increased in the motor cortex ipsilateral to the moving hand. This visual modulation might provide a helpful rehabilitation therapy for people with paralysis of a limb through visual augmentation of skills. A model was developed to study the effects of sensorimotor experience in virtual environments, yielding findings on the effect of that experience upon brain activity and related behavioral measures. The research model represents a significant contribution to neuroscience research and translational engineering practice. A model of neural activations correlated with kinematics and behavior can profoundly influence the delivery of rehabilitative services in the coming years by giving clinicians a framework for engaging patients in a sensorimotor environment that can optimally facilitate neural reorganization.

    EEG coherence between the verbal-analytical region (T3) and the motor-planning region (Fz) increases under stress in explicit motor learners but not implicit motor learners

    This journal supplement contains abstracts of NASPSPA 2010. Free Communications - Verbal and Poster: Motor Learning and Control. The Annual Conference of the North American Society for the Psychology of Sport and Physical Activity (NASPSPA 2010), Tucson, AZ, 10-12 June 2010. In Journal of Sport and Exercise Psychology, 2010, v. 32 suppl., p. S13.

    Haptics Rendering and Applications

    There has been significant progress in haptic technologies, but the incorporation of haptics into virtual environments is still in its infancy. A wide range of human activities, including communication, education, art, entertainment, commerce, and science, would change forever if we learned how to capture, manipulate, and reproduce haptic sensory stimuli that are nearly indistinguishable from reality. For the field to move forward, many commercial and technological barriers need to be overcome. By rendering how objects feel through haptic technology, we communicate information in a physically-based language that has never been explored before. Given constant improvement in haptics technology and increasing levels of research into, and development of, haptics-related algorithms, protocols, and devices, haptics technology has a promising future.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science; haptic technology; and haptic applications.