
    A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss.

    Many upper limb amputees experience incessant post-amputation "phantom limb pain" and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech "rubber hand" illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion, and new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds, such as the "BairClaw" presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger-object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions, while temperature, vibration, and skin deformation sensing is provided by a multimodal tactile sensor. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain, and increased prosthesis use due to improved functionality and reduced cognitive burden.

    Embodied Precision: Intranasal Oxytocin Modulates Multisensory Integration

    © 2018 Massachusetts Institute of Technology. Multisensory integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size-weight illusion (SWI), allow us to investigate how the brain resolves conflicting multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory body ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting multisensory input. Recent hypotheses propose that the precision of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine, and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied multisensory integration. The present double-blind, placebo-controlled, crossover study (N = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on multisensory integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, but only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased an embodied version of the SWI (quantified as estimation error during a weight estimation task). These findings suggest that oxytocin might modulate processes of visuotactile multisensory integration by increasing the precision of top-down signals against bottom-up sensory input.
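    To make the precision-weighting idea concrete, the sketch below shows textbook Bayesian fusion of two Gaussian cues, each weighted by its inverse variance. This is an illustrative model, not the study's analysis: when somatosensory precision is attenuated, the fused hand-position estimate is pulled toward the visual cue, as in the RHI.

```python
import numpy as np

def fuse(mu_v, sigma_v, mu_t, sigma_t):
    """Precision-weighted (inverse-variance) fusion of a visual cue
    (mu_v, sigma_v) and a tactile/proprioceptive cue (mu_t, sigma_t)."""
    w_v = sigma_t**2 / (sigma_v**2 + sigma_t**2)   # weight on vision
    mu = w_v * mu_v + (1.0 - w_v) * mu_t
    sigma = np.sqrt((sigma_v**2 * sigma_t**2) / (sigma_v**2 + sigma_t**2))
    return mu, sigma

# Vision locates the (rubber) hand at 0 cm; imprecise proprioception
# says 15 cm. The fused estimate drifts toward vision: mu = 1.5 cm.
mu, sigma = fuse(0.0, 1.0, 15.0, 3.0)
```

Increasing sigma_t (i.e., further attenuating somatosensory precision) pulls the fused estimate still closer to the visual cue, which is one way to read the hypothesized top-down precision modulation.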

    Perceiving Mass in Mixed Reality through Pseudo-Haptic Rendering of Newton's Third Law

    In mixed reality, real objects can be used to interact with virtual objects. However, unlike in the real world, real objects do not encounter any opposite reaction force when pushing against virtual objects. The lack of reaction force during manipulation prevents users from perceiving the mass of virtual objects. Although this could be addressed by equipping real objects with force-feedback devices, such a solution remains complex and impractical. In this work, we present a technique to produce an illusion of mass without any active force-feedback mechanism. This is achieved by simulating the effects of this reaction force in a purely visual way. A first study demonstrates that our technique indeed allows users to differentiate light virtual objects from heavy virtual objects. In addition, it shows that the illusion is immediately effective, with no prior training. In a second study, we measure the lowest mass difference (the just-noticeable difference, JND) that can be perceived with this technique. The effectiveness and ease of implementation of our solution provides an opportunity to enhance mixed reality interaction at no additional cost.
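    A common way to simulate such a reaction force purely visually is to scale the control/display ratio by the virtual mass, so that the visual proxy of a heavy object lags the real hand motion. The mapping below is a generic pseudo-haptic sketch with an assumed scaling constant, not necessarily the paper's exact technique:

```python
def displayed_displacement(real_displacement_cm: float,
                           virtual_mass_kg: float,
                           k: float = 1.0) -> float:
    """Pseudo-haptic control/display ratio: the heavier the virtual
    object, the less its visual proxy moves for the same real hand
    motion, which users read as a larger opposing reaction force."""
    cd_ratio = k / (k + virtual_mass_kg)  # 1.0 for massless, -> 0 for heavy
    return real_displacement_cm * cd_ratio
```

Because only the rendered motion is altered, no actuator is needed; the mass percept emerges from the visual discrepancy between the real and displayed displacement.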

    Touching on elements for a non-invasive sensory feedback system for use in a prosthetic hand

    Hand amputation results in the loss of motor and sensory functions, impacting activities of daily life and quality of life. Commercially available prosthetic hands restore the motor function but lack sensory feedback, which is crucial to receive information about the prosthesis state in real-time when interacting with the external environment. As a supplement to the missing sensory feedback, the amputee needs to rely on visual and audio cues to operate the prosthetic hand, which can be mentally demanding. This thesis revolves around finding potential solutions to contribute to an intuitive non-invasive sensory feedback system that could be cognitively less burdensome and enhance the sense of embodiment (the feeling that an artificial limb belongs to one’s own body), increasing acceptance of wearing a prosthesis. A sensory feedback system contains sensors to detect signals applied to the prosthesis. The signals are processed and encoded so that actuators on the skin can deliver a stimulus resembling the detected sensation. Implementing commercial sensors in a prosthetic finger is challenging: because of the prosthetic finger’s curvature, and because some prosthetic hands use a covering rubber glove, the sensor response would be inaccurate. This thesis shows that a pneumatic touch sensor integrated into a rubber glove eliminates these errors. This sensor provides a consistent reading independent of the incident angle of stimulus, has a sensitivity of 0.82 kPa/N, a hysteresis error of 2.39±0.17%, and a linearity error of 2.95±0.40%. For intuitive tactile stimulation, it has been suggested that the feedback stimulus should be modality-matched, with the intention to provide a sensation that can be easily associated with the real touch on the prosthetic hand, e.g., pressure on the prosthetic finger should provide pressure on the residual limb. A stimulus should also be spatially matched (e.g., position, size, and shape).
Electrotactile stimulation can provide various sensations because it has several adjustable parameters, which makes it a good candidate for the discrimination of textures. A microphone can detect texture-elicited vibrations to be processed, and by varying, e.g., the median frequency of the electrical stimulation, the signal can be presented on the skin. Participants in a study using electrotactile feedback showed a median accuracy of 85% in differentiating between four textures. During active exploration, electrotactile and vibrotactile feedback can provide spatially matched, continuous stimulation, although the sensation may be displaced or dispersed over a larger area. When commonly used stimulation modalities were evaluated using the Rubber Hand Illusion, modalities that resemble the intended sensation provided a more vivid illusion of ownership over the rubber hand. For potentially more intuitive sensory feedback, the stimulation can be somatotopically matched, where the stimulus is experienced as being applied at a site corresponding to the missing hand. This is possible for amputees who experience referred sensation on their residual stump. However, not all amputees experience referred sensations. Nonetheless, after a structured training period, it is possible to learn to associate touch with specific fingers, and the effect persisted after two weeks. This effect was evaluated on participants with intact limbs, so it remains to be evaluated in amputees. In conclusion, this thesis proposes suggestions on sensory feedback systems that could be helpful in future prosthetic hands to (1) reduce their complexity and (2) enhance the sense of body ownership, enhancing the overall sense of embodiment as an addition to an intuitive control system.
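    As an illustrative sketch of the texture-encoding step (the function below is an assumption for illustration, not the thesis's implementation), the median frequency of a microphone-captured vibration can be computed and then used to set the pulse frequency of the electrotactile stimulus:

```python
import numpy as np

def median_frequency(signal, fs):
    """Median frequency of the power spectrum: the frequency below which
    half of the total spectral power lies. A texture eliciting a
    higher-frequency vibration maps to a higher stimulation frequency."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    cumulative = np.cumsum(power)
    return freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)]
```

For example, a vibration dominated by a 100 Hz component yields a median frequency near 100 Hz, which would be presented as a correspondingly fast electrotactile pulse train.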

    From rubber hands to neuroprosthetics: Neural correlates of embodiment

    © 2023 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). Our interaction with the world rests on the knowledge that we are a body in space and time that can interact with the environment. This awareness is usually referred to as the sense of embodiment. For the good part of the past 30 years, the rubber hand illusion (RHI) has been a prime tool to study embodiment in healthy individuals and in people with a variety of clinical conditions. In this paper, we provide a critical overview of this research with a focus on the RHI paradigm as a tool to study prosthesis embodiment in individuals with amputation. The RHI relies on well-documented multisensory integration mechanisms based on sensory precision, where parietal areas are involved in resolving the visuo-tactile conflict, and premotor areas in updating the conscious bodily representation. This mechanism may be transferable to prosthesis ownership in amputees. We discuss how these results might transfer to the technological development of sensorised prostheses, which in turn might improve their acceptance by users.

    Electrocutaneous stimulation to close the loop in myoelectric prosthesis control

    Current commercially available prosthetic systems still lack sensory feedback, and amputees are forced to maintain eye contact with the prosthesis when interacting with their environment. Electrocutaneous stimulation is a promising approach to convey sensory feedback via the skin. However, when discussed in the context of prosthetic applications, it is often rejected due to its supposed incompatibility with myocontrol. This dissertation addresses electrocutaneous stimulation as a means of providing sensory feedback to prosthesis users, its implications for myoelectric control, its possible use for improved or accelerated mastering of prosthesis control through closing of the control loop, and its potential for aiding the embodiment of prosthetic components. First, different paradigms for encoding sensory feedback variables in electrocutaneous stimulation patterns were compared. For this, subjects' ability to employ spatially and intensity-coded electrocutaneous feedback in a simulated closed-loop control task was evaluated. The task was to stabilise an invisible virtual inverted pendulum under ideal feedforward control conditions (joystick). Pendulum inclination was either presented spatially (12 stimulation sites), encoded by stimulation strength (≥ 2 stimulation sites), or a combination of the two. The tests indicated that spatial encoding was perceived as more intuitive, but intensity encoding yielded better performance and lower energy expenditure. The second study investigated the detrimental influence of stimulation artefacts on myoelectric control of prostheses for a wide range of stimulation parameters and two prosthesis control approaches (pattern recognition of eight motion primitives, direct proportional control). Artefact blanking is introduced and discussed as a practical approach to handle stimulation artefacts and restore control performance to baseline.
This was shown with virtual and applied artefact blanking (pattern recognition on six electromyographic channels), as well as in a practical task-related test with a real prosthesis (proportional control). Another study investigated the information transfer of sensory feedback necessary to master a routine grasping task using electromyographic control of a prosthesis. Subjects controlled a real prosthesis to repeatedly grasp a dummy object, which emulated two different objects with previously unknown slip and fragility properties. Three feedback conditions (basic feedback on grasp success, visual grasp force feedback, tactile grasp force feedback) were compared with regard to their influence on subjects’ task performance and variability in exerted grasp force. It was found that online force feedback via a visual or tactile channel did not add significant advantages, and that basic feedback was sufficient and was employed by subjects to improve both performance and force variability over time. Importantly, there was no adverse effect of the additional feedback, either. This has important implications for other non-functional applications of sensory feedback, such as facilitation of embodiment of prosthetic devices. The final study investigated the impact of electrocutaneous stimulation on embodiment of an artificial limb. For this purpose, a sensor finger was employed in a rubber-hand-illusion-like experiment. Two independent groups (test, control) were compared with regard to two objective measures of embodiment: proprioceptive drift and change in skin temperature. Though proprioceptive drift measures did not reveal differences between conditions, they indicated trends generally associated with a successful illusion. Additionally, significant differences in skin temperature between the test and control groups indicated that embodiment of the artificial digit could be induced by providing sensory substitution feedback on the forearm.
In conclusion, it has been shown that humans can employ electrocutaneous stimulation feedback in challenging closed-loop control tasks. It was found that transitioning from simple, intuitive encodings (spatial) to those providing better resolution (intensity) further improves feedback exploitation. Blanking and segmentation approaches facilitate simultaneous application of electrocutaneous stimulation and electromyographic control of prostheses, using both pattern recognition and classic proportional approaches. While it was found that force feedback may not aid in the mastering of routine grasping, the presence of the feedback was also found not to impede user performance. This is an important implication for the application of feedback for non-functional purposes, such as facilitation of embodiment. Regarding this, it was shown that providing sensory feedback via electrocutaneous stimulation did indeed promote embodiment of an artificial finger, even if the feedback was applied to the forearm. Based on the results of this work, the next step should be the integration of sensory feedback into commercial devices, so that all amputees can benefit from its advantages. Electrocutaneous stimulation has been shown to be an ideal means for realising this. Existing concerns about the compatibility of electrocutaneous stimulation and myocontrol can be resolved with the presented methods for dealing with stimulation artefacts.
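    The intensity-coding scheme for the pendulum task can be sketched as follows; the amplitude range, site layout, and linear mapping are illustrative assumptions, not the dissertation's parameters:

```python
def intensity_encode(inclination_deg: float,
                     max_deg: float = 30.0,
                     amp_min_ma: float = 1.0,
                     amp_max_ma: float = 5.0):
    """Intensity coding with two stimulation sites: the sign of the
    pendulum inclination selects the site (left/right of midline), and
    its magnitude is mapped linearly onto stimulation amplitude (mA)."""
    site = "left" if inclination_deg < 0 else "right"
    level = min(abs(inclination_deg) / max_deg, 1.0)  # clamp to [0, 1]
    return site, amp_min_ma + level * (amp_max_ma - amp_min_ma)
```

A spatial encoding would instead discretise the inclination into one of 12 sites at fixed amplitude; the intensity scheme trades that intuitiveness for finer resolution on fewer sites.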

    The Human Touch: Skin Temperature During the Rubber Hand Illusion in Manual and Automated Stroking Procedures

    Rohde M, Wold A, Karnath H-O, Ernst MO. The Human Touch: Skin Temperature During the Rubber Hand Illusion in Manual and Automated Stroking Procedures. PLoS ONE. 2013;8(11): e80688. A difference in skin temperature between the hands has been identified as a physiological correlate of the rubber hand illusion (RHI). The RHI is an illusion of body ownership, where participants perceive body ownership over a rubber hand if they see it being stroked in synchrony with their own occluded hand. The current study set out to replicate this result (psychologically induced cooling of the stimulated hand) using an automated stroking paradigm, where stimulation was delivered by a robot arm (PHANToM™ force-feedback device). After we found no evidence for hand cooling in two experiments using this automated procedure, we reverted to a manual stroking paradigm, which is closer to the one employed in the study that first produced this effect. With this procedure, we observed a relative cooling of the stimulated hand in both the experimental and the control condition. The subjective experience of ownership, as rated by the participants, by contrast, was strictly linked to synchronous stroking in all three experiments. This implies that hand cooling is not a strict correlate of the subjective feeling of hand ownership in the RHI. Factors associated with the differences between the two designs (differences in pressure of tactile stimulation, presence of another person) that were thus far considered irrelevant to the RHI appear to play a role in bringing about this temperature effect.

    I’m sensing in the rain: spatial incongruity in visual-tactile mid-air stimulation can elicit ownership in VR users

    Major virtual reality (VR) companies are trying to enhance the sense of immersion in virtual environments by implementing haptic feedback in their systems (e.g., Oculus Touch). It is known that tactile stimulation adds realism to a virtual environment. In addition, when users are not limited by wearing any attachments (e.g., gloves), it is possible to create even more immersive experiences. Mid-air haptic technology provides contactless haptic feedback and offers the potential for creating such immersive VR experiences. However, one of the limitations of mid-air haptics resides in the need for freehand tracking systems (e.g., Leap Motion) to deliver tactile feedback to the user's hand. These tracking systems are not accurate, limiting designers' ability to deliver spatially precise tactile stimulation. Here, we investigated an alternative way to convey incongruent visual-tactile stimulation that can be used to create the illusion of a congruent visual-tactile experience, while participants experience the phenomenon of the rubber hand illusion in VR.