Tactile-STAR: A Novel Tactile STimulator And Recorder System for Evaluating and Improving Tactile Perception
Many neurological diseases impair the motor and somatosensory systems. While several different technologies are used in clinical practice to assess and improve motor functions, somatosensation is evaluated subjectively with qualitative clinical scales. Treatment of somatosensory deficits has received limited attention. To bridge the gap between the assessment and training of motor vs. somatosensory abilities, we designed, developed, and tested a novel, low-cost, two-component (bimanual) mechatronic system targeting tactile somatosensation: the Tactile-STAR, a tactile stimulator and recorder. The stimulator is an actuated pantograph structure driven by two servomotors, with an end-effector covered by a rubber material that can apply two different types of skin stimulation: brush and stretch. The stimulator has a modular design and can be used to test tactile perception on different parts of the body, such as the hand, arm, leg, or big toe. The recorder is a passive pantograph that can measure hand motion using two potentiometers. The recorder can serve multiple purposes: participants can move its handle to match the direction and amplitude of the tactile stimulator, or they can use it as a master manipulator to control the tactile stimulator as a slave. Our ultimate goal is to assess and affect tactile acuity and somatosensory deficits. To demonstrate the feasibility of our novel system, we tested the Tactile-STAR with 16 healthy individuals and with three stroke survivors using the skin-brush stimulation. We verified that the system enables the mapping of tactile perception on the hand in both populations. We also tested the extent to which 30 min of training in healthy individuals led to an improvement of tactile perception.
The results provide a first demonstration of the ability of this new system to characterize tactile perception in healthy individuals, as well as a quantification of the magnitude and pattern of tactile impairment in a small cohort of stroke survivors. The finding that short-term training with Tactile-STAR can improve the acuity of tactile perception in healthy individuals suggests that Tactile-STAR may have utility as a therapeutic intervention for somatosensory deficits.
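The matching task described above, in which participants move the recorder handle to match a stimulus stroke, lends itself to a simple directional-error analysis. The sketch below is an illustrative assumption, not the authors' code: it computes the signed angular error between a stimulus direction and a participant's matched response, both taken as 2-D vectors recovered from the pantograph kinematics.

```python
import math

def direction_error_deg(stim_xy, resp_xy):
    """Signed angular error (degrees) between a stimulus stroke direction
    and the participant's matched response direction (2-D vectors)."""
    a = math.atan2(stim_xy[1], stim_xy[0])
    b = math.atan2(resp_xy[1], resp_xy[0])
    d = math.degrees(b - a)
    # Wrap to [-180, 180) so errors are comparable across stroke directions.
    return (d + 180.0) % 360.0 - 180.0

def mean_absolute_error(trials):
    """Average |angular error| over (stimulus, response) vector pairs."""
    return sum(abs(direction_error_deg(s, r)) for s, r in trials) / len(trials)
```

A per-direction map of such errors over the hand would be one way to visualize the "mapping of tactile perception" the abstract mentions.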
Learning Haptic-based Object Pose Estimation for In-hand Manipulation Control with Underactuated Robotic Hands
Unlike traditional robotic hands, underactuated compliant hands are
challenging to model due to inherent uncertainties. Consequently, pose
estimation of a grasped object is usually performed based on visual perception.
However, visual perception of the hand and object can be limited in occluded or
partly-occluded environments. In this paper, we aim to explore the use of
haptics, i.e., kinesthetic and tactile sensing, for pose estimation and in-hand
manipulation with underactuated hands. Such a haptic approach would mitigate the
limitations of occluded environments, where a line of sight is not always available. We put an
emphasis on identifying the feature state representation of the system that
does not include vision and can be obtained with simple and low-cost hardware.
For tactile sensing, therefore, we propose a low-cost and flexible sensor that
is mostly 3D printed along with the finger-tip and can provide implicit contact
information. Taking a two-finger underactuated hand as a test-case, we analyze
the contribution of kinesthetic and tactile features along with various
regression models to the accuracy of the predictions. Furthermore, we propose a
Model Predictive Control (MPC) approach which utilizes the pose estimation to
manipulate objects to desired states solely based on haptics. We have conducted
a series of experiments that validate the ability to estimate poses of various
objects with different geometry, stiffness and texture, and show manipulation
to goals in the workspace with relatively high accuracy.
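The regression-plus-MPC pipeline summarized above can be sketched in miniature. The following is a hypothetical illustration rather than the paper's implementation: a linear least-squares pose regressor over haptic feature vectors, and a one-step MPC that scores candidate actions through an assumed learned dynamics model; the paper evaluates several regression models, and its MPC is more elaborate than this.

```python
import numpy as np

def fit_pose_regressor(features, poses):
    """features: (N, d) haptic feature vectors (e.g. motor angles, tactile
    readings); poses: (N, 3) object poses [x, y, theta].
    Returns weights W such that pose ~= [features, 1] @ W."""
    X = np.hstack([features, np.ones((features.shape[0], 1))])
    W, *_ = np.linalg.lstsq(X, poses, rcond=None)
    return W

def predict_pose(W, f):
    """Predict a pose from a single haptic feature vector."""
    return np.append(f, 1.0) @ W

def mpc_action(W, f, goal, actions, dynamics):
    """One-step MPC: pick the candidate action whose predicted next feature
    state yields the pose closest to the goal. `dynamics` stands in for a
    learned model mapping (features, action) -> next features."""
    costs = [np.linalg.norm(predict_pose(W, dynamics(f, a)) - goal)
             for a in actions]
    return actions[int(np.argmin(costs))]
```

In practice the regressor would be trained on motion-capture-labeled grasps, and the action set would be discretized actuator commands; both are left abstract here.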
In-home and remote use of robotic body surrogates by people with profound motor deficits
By controlling robots comparable to the human body, people with profound
motor deficits could potentially perform a variety of physical tasks for
themselves, improving their quality of life. The extent to which this is
achievable has been unclear due to the lack of suitable interfaces by which to
control robotic body surrogates and a dearth of studies involving substantial
numbers of people with profound motor deficits. We developed a novel, web-based
augmented reality interface that enables people with profound motor deficits to
remotely control a PR2 mobile manipulator from Willow Garage, which is a
human-scale, wheeled robot with two arms. We then conducted two studies to
investigate the use of robotic body surrogates. In the first study, 15 novice
users with profound motor deficits from across the United States controlled a
PR2 in Atlanta, GA to perform a modified Action Research Arm Test (ARAT) and a
simulated self-care task. Participants achieved clinically meaningful
improvements on the ARAT and 12 of 15 participants (80%) successfully completed
the simulated self-care task. Participants agreed that the robotic system was
easy to use, was useful, and would provide a meaningful improvement in their
lives. In the second study, one expert user with profound motor deficits had
free use of a PR2 in his home for seven days. He performed a variety of
self-care and household tasks, and also used the robot in novel ways. Taking
both studies together, our results suggest that people with profound motor
deficits can improve their quality of life using robotic body surrogates, and
that they can gain benefit with only low-level robot autonomy and without
invasive interfaces. However, methods to reduce the rate of errors and increase
operational speed merit further investigation.
Comment: 43 pages, 13 figures
W-FYD: a Wearable Fabric-based Display for Haptic Multi-Cue Delivery and Tactile Augmented Reality
Despite the importance of softness, there is no evidence of wearable haptic systems able to deliver controllable softness cues. Here, we present the Wearable Fabric Yielding Display (W-FYD), a fabric-based display for multi-cue delivery that can be worn on the user's finger and enables, for the first time, both active and passive softness exploration. It can also induce a sliding effect under the finger-pad. A given stiffness profile can be obtained by modulating the stretching state of the fabric through two motors. Furthermore, a lifting mechanism allows the fabric to be placed in contact with the user's finger-pad to enable passive softness rendering. In this paper, we describe the architecture of the W-FYD and provide a thorough characterization of its stiffness workspace, frequency response, and softness rendering capabilities. We also computed the device's Just Noticeable Difference (JND) in both active and passive exploratory conditions, for linear and non-linear stiffness rendering as well as for sliding-direction perception. The effect of device weight was also considered. Furthermore, participants' performance and their subjective quantitative evaluations in sliding-direction detection and softness discrimination tasks are reported. Finally, applications of the W-FYD in tactile augmented reality for open palpation are discussed, opening interesting perspectives in many fields of human-machine interaction.
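JND estimation from psychophysical data is a standard computation. As a minimal sketch, assuming a stiffness-comparison task with a measured psychometric function (the stimulus levels and response proportions below are illustrative, not data from the paper), the JND can be taken as half the distance between the 25% and 75% points of the curve:

```python
def interp_level(levels, p, target):
    """Linearly interpolate the stimulus level at which the proportion of
    'comparison judged stiffer' responses crosses `target`."""
    for (x0, y0), (x1, y1) in zip(zip(levels, p), zip(levels[1:], p[1:])):
        if y0 <= target <= y1:
            return x0 + (target - y0) * (x1 - x0) / (y1 - y0)
    raise ValueError("target not bracketed by the measured proportions")

def jnd(levels, p):
    """JND as half the 25%-75% interquartile width of the psychometric curve."""
    return 0.5 * (interp_level(levels, p, 0.75) - interp_level(levels, p, 0.25))
```

A fitted sigmoid would be more robust than linear interpolation on real staircase data; this keeps the arithmetic visible.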
Mechanosensitivity and Neural Adaptation in Human Somatosensory System
Magnetoencephalography (MEG) was used to characterize adaptation in the somatosensory cortical network due to repeated cutaneous tactile stimulation applied unilaterally to the face and hand using a custom-built pneumatic stimulator called the TAC-Cell. Face stimulation evoked neuromagnetic responses reflecting cortical activity in the contralateral primary somatosensory cortex (SI), while hand stimulation resulted in robust contralateral SI and posterior parietal cortex (PPC) activation. Activity was also observed in regions of the secondary somatosensory cortical areas (SII), although with reduced amplitude and higher variability across subjects. For hand stimulation, adaptation rates differed significantly between SI and higher-order sensory cortices such as the PPC. Adaptation was also significantly dependent on the stimulus frequency and the pulse index within the stimulus train for both hand and face stimulation. The latency of the peak responses was significantly dependent on stimulus site and response component (SI, PPC). The difference in the latency of peak SI and PPC responses may reflect a hierarchical, serial-processing network in the somatosensory cortex. Age- and sex-related changes of vibrotactile sensitivity in the orofacial and hand skin surfaces of healthy adults were demonstrated using an established psychophysical protocol. Vibrotactile thresholds increased as a function of age for finger stimulation but remained unaltered for the face. The increase in finger thresholds is attributed to age-related changes in the number and morphology of Pacinian corpuscles (which are absent in the face). Vibrotactile thresholds depend significantly on stimulation site, stimulus frequency, and participant sex. These differences are presumably due to dissimilarities in the type and density of mechanoreceptors present in the face and hand.
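One common way to quantify an adaptation rate across the pulses of a stimulus train is to fit an exponential decay to the per-pulse response amplitudes. The sketch below illustrates that generic approach; it is an assumption for exposition, not the analysis used in this study.

```python
import numpy as np

def adaptation_rate(amplitudes):
    """Fit A_n = A_1 * exp(-r * (n - 1)) to per-pulse response amplitudes
    by linear regression on log-amplitude; returns the decay rate r.
    Larger r means faster adaptation across the stimulus train."""
    a = np.asarray(amplitudes, dtype=float)
    n = np.arange(len(a))
    slope, _ = np.polyfit(n, np.log(a), 1)
    return -slope
```

Comparing r fitted per region (e.g. SI vs. PPC) and per stimulus frequency would express the reported dependencies as single numbers.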
A novel method was developed to couple fiber-optic displacement sensors with the pneumatic stimulator built in our laboratory, the TAC-Cell. This displacement sensor, commonly used in industrial applications, was successfully used to characterize the skin response to TAC-Cell stimulation. Skin displacement was significantly dependent on the input stimulus amplitude and varied as a function of the participants' sex. Power spectrum analysis and rise/fall-time measurements of the skin displacement showed that the TAC-Cell stimulus consists of a spectrally rich, high-velocity signal capable of evoking a cortical response through stimulation of the medial lemniscal and trigeminal pathways.
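The rise-time and power-spectrum measurements mentioned above can be illustrated generically. The following is a hedged sketch, not the laboratory's code: it estimates the 10-90% rise time of a displacement onset and the dominant spectral component of the sampled signal.

```python
import numpy as np

def rise_time(signal, fs):
    """10-90% rise time (seconds) of a rising displacement onset,
    sampled at fs Hz."""
    s = np.asarray(signal, dtype=float)
    s = (s - s.min()) / (s.max() - s.min())   # normalize to [0, 1]
    t10 = np.argmax(s >= 0.1)                 # first sample above 10%
    t90 = np.argmax(s >= 0.9)                 # first sample above 90%
    return (t90 - t10) / fs

def dominant_frequency(signal, fs):
    """Frequency (Hz) of the largest non-DC component of the power spectrum."""
    s = np.asarray(signal, dtype=float) - np.mean(signal)
    spec = np.abs(np.fft.rfft(s)) ** 2
    freqs = np.fft.rfftfreq(len(s), d=1.0 / fs)
    return freqs[1:][np.argmax(spec[1:])]
```

A full spectral-richness claim would look at the spread of power across bins rather than a single peak; the peak picker is kept for brevity.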