6,065 research outputs found

    A Review of Smart Materials in Tactile Actuators for Information Delivery

    As the largest organ of the human body, the skin provides an important sensory channel through which humans receive external stimuli via touch. From the information perceived through touch, people can feel and infer the properties of objects, such as weight, temperature, texture, and motion. These properties reach the brain as nerve stimuli received by different kinds of receptors in the skin; mechanical, electrical, and thermal stimuli can excite these receptors and cause different information to be conveyed through the nerves. Technologies for actuators that provide mechanical, electrical, or thermal stimuli have been developed, including static or vibrational actuation, electrostatic stimulation, focused ultrasound, and more. Smart materials, such as piezoelectric materials, carbon nanotubes, and shape memory alloys, play important roles in providing actuation for tactile sensation. This paper reviews the background biological knowledge of human tactile sensing, to give an understanding of how we sense and interact with the world through touch, as well as conventional and state-of-the-art tactile actuator technologies for tactile feedback delivery.
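    As a rough, hedged illustration of the actuation scales involved with one of the smart materials named above: the free stroke of a piezoelectric stack actuator can be estimated as Δl ≈ n · d33 · V from the piezoelectric coefficient d33, the number of layers n, and the drive voltage V. The values below are generic assumptions for illustration, not figures taken from the reviewed work.

        # Minimal sketch: free displacement of a piezoelectric stack actuator,
        # Delta_l ~= n * d33 * V. All values are illustrative assumptions.
        d33 = 500e-12    # m/V, order of magnitude for a soft PZT ceramic (assumed)
        n_layers = 200   # number of stacked layers (assumed)
        voltage = 120.0  # drive voltage in volts (assumed)

        displacement = n_layers * d33 * voltage  # metres
        print(f"Estimated free stroke: {displacement * 1e6:.1f} micrometres")
        # About 12 micrometres here, which is why amplification structures or
        # resonant drive are common in piezo-based tactile actuators.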

    Tactile-STAR: A Novel Tactile STimulator And Recorder System for Evaluating and Improving Tactile Perception

    Many neurological diseases impair the motor and somatosensory systems. While several different technologies are used in clinical practice to assess and improve motor functions, somatosensation is evaluated subjectively with qualitative clinical scales. Treatment of somatosensory deficits has received limited attention. To bridge the gap between the assessment and training of motor vs. somatosensory abilities, we designed, developed, and tested a novel, low-cost, two-component (bimanual) mechatronic system targeting tactile somatosensation: the Tactile-STAR, a tactile stimulator and recorder. The stimulator is an actuated pantograph structure driven by two servomotors, with an end-effector covered by a rubber material that can apply two different types of skin stimulation: brush and stretch. The stimulator has a modular design and can be used to test tactile perception in different parts of the body, such as the hand, arm, leg, and big toe. The recorder is a passive pantograph that can measure hand motion using two potentiometers. The recorder can serve multiple purposes: participants can move its handle to match the direction and amplitude of the tactile stimulator, or they can use it as a master manipulator to control the tactile stimulator as a slave. Our ultimate goal is to assess and affect tactile acuity and somatosensory deficits. To demonstrate the feasibility of our novel system, we tested the Tactile-STAR with 16 healthy individuals and with three stroke survivors using the skin-brush stimulation. We verified that the system enables the mapping of tactile perception on the hand in both populations. We also tested the extent to which 30 min of training in healthy individuals led to an improvement in tactile perception. The results provide a first demonstration of the ability of this new system to characterize tactile perception in healthy individuals, as well as a quantification of the magnitude and pattern of tactile impairment in a small cohort of stroke survivors. The finding that short-term training with the Tactile-STAR can improve the acuity of tactile perception in healthy individuals suggests that the Tactile-STAR may have utility as a therapeutic intervention for somatosensory deficits.
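    As a minimal sketch of how a two-servomotor pantograph stimulator of this kind can map two joint angles to an end-effector position, the snippet below computes the forward kinematics of a planar five-bar linkage. The link lengths, base spacing, and elbow-up solution are assumptions for illustration, not the Tactile-STAR's actual geometry.

        import math

        def pantograph_fk(theta1, theta2, l_prox=0.06, l_dist=0.08, base_half=0.03):
            """Forward kinematics of a planar five-bar (pantograph) linkage.

            theta1, theta2: servo angles in radians, measured from the +x axis.
            Link lengths (metres) and base half-spacing are illustrative assumptions.
            Returns the (x, y) position where the two distal links meet.
            """
            # Elbow positions driven by the two servos
            ax = -base_half + l_prox * math.cos(theta1)
            ay = l_prox * math.sin(theta1)
            bx = base_half + l_prox * math.cos(theta2)
            by = l_prox * math.sin(theta2)

            # End-effector = intersection of two circles of radius l_dist around the elbows
            dx, dy = bx - ax, by - ay
            d = math.hypot(dx, dy)
            if d == 0 or d > 2 * l_dist:
                raise ValueError("unreachable configuration")
            a = d / 2.0                      # both distal links have equal length
            h = math.sqrt(l_dist**2 - a**2)
            mx, my = ax + dx / 2.0, ay + dy / 2.0
            # Pick the solution on the far side of the base line (elbow-up convention)
            return (mx - h * dy / d, my + h * dx / d)

        print(pantograph_fk(math.radians(120), math.radians(60)))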

    Whisking with robots: from rat vibrissae to biomimetic technology for active touch

    This article summarizes some of the key features of the rat vibrissal system, including the actively controlled sweeping movements of the vibrissae known as whisking, and reviews past and ongoing research aimed at replicating some of this functionality in biomimetic robots.

    A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss.

    Many upper limb amputees experience an incessant, post-amputation "phantom limb pain" and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech "rubber hand" illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion, and new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds, such as the "BairClaw" presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger-object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions, while measurements of temperature, vibration, and skin deformation are provided by a multimodal tactile sensor. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain and increased prosthesis use due to improved functionality and reduced cognitive burden.
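    As a minimal sketch of one quantity a tendon-driven testbed of this kind can expose, net joint torque can be estimated from measured tendon tensions and their moment arms in a simple planar model. The tendon routing and moment-arm values below are hypothetical, not the BairClaw's actual parameters.

        import numpy as np

        # Hypothetical planar finger: 2 joints driven by 3 tendons (flexor, extensor, intrinsic).
        # moment_arms[i, j] is the moment arm (m) of tendon j about joint i; signs encode routing.
        moment_arms = np.array([
            [0.010, -0.009, 0.004],   # MCP joint (assumed values)
            [0.007, -0.006, 0.000],   # PIP joint (assumed values)
        ])

        tendon_tensions = np.array([12.0, 3.0, 5.0])  # measured tensions in newtons (example)

        joint_torques = moment_arms @ tendon_tensions  # net torque at each joint, N*m
        print(dict(zip(["MCP", "PIP"], joint_torques.round(4))))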

    Neuromorphic vibrotactile stimulation of fingertips for encoding object stiffness in telepresence sensory substitution and augmentation applications

    We present a tactile telepresence system for real-time transmission of information about object stiffness to the human fingertips. Experimental tests were performed across two laboratories (Italy and Ireland). In the Italian laboratory, a mechatronic sensing platform indented different rubber samples. Information about rubber stiffness was converted into on-off events using a neuronal spiking model and sent to a vibrotactile glove in the Irish laboratory. Participants discriminated variations in stimulus stiffness according to a two-alternative forced choice protocol. Stiffness discrimination was based on the variation of the temporal pattern of spikes generated during the indentation of the rubber samples. The results suggest that vibrotactile stimulation can effectively simulate surface stiffness when neuronal spiking models are used to trigger vibrations in the haptic interface. Specifically, fractional variations of stiffness down to 0.67 were significantly discriminated with the developed neuromorphic haptic interface. This performance is comparable to, though slightly worse than, the threshold obtained in a benchmark experiment in which the same set of stimuli was evaluated naturally with one's own hand. Our paper presents a bioinspired method for delivering sensory feedback about object properties to human skin based on contingency-mimetic neuronal models, and it can be useful for the design of high-performance haptic devices.
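    A minimal sketch of the general idea of converting an indentation signal into on-off spike events with a neuronal spiking model is given below. A generic leaky integrate-and-fire neuron is used here as a stand-in; the paper's actual spiking model and parameters are not reproduced, and the gain, time constant, and force ramps are illustrative assumptions.

        import numpy as np

        def lif_encode(stimulus, dt=1e-3, tau=0.02, threshold=1.0, gain=100.0):
            """Encode a 1-D stimulus (e.g. indentation force) into spike times with a
            leaky integrate-and-fire neuron. All parameters are illustrative assumptions."""
            v, spikes = 0.0, []
            for k, s in enumerate(stimulus):
                v += dt * (-v / tau + gain * s)   # leaky integration of the input
                if v >= threshold:                # threshold crossing -> emit a spike event
                    spikes.append(k * dt)
                    v = 0.0                       # reset after each spike
            return spikes

        # A stiffer sample makes force rise faster during indentation,
        # which yields an earlier and denser spike train.
        t = np.arange(0.0, 0.5, 1e-3)
        soft = lif_encode(np.minimum(1.0, 2.0 * t))    # slow force ramp (softer rubber)
        stiff = lif_encode(np.minimum(1.0, 6.0 * t))   # fast force ramp (stiffer rubber)
        print(len(soft), len(stiff))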

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and allows users to feel virtual objects by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual world, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are only shown a representation of their hands in VR, floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
    Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
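    A minimal sketch of the passive-haptics correspondence described above: a calibration transform aligning the tracked physical room with the virtual scene is applied to each tracked object pose so its virtual counterpart stays co-located with the real one. The 4x4 matrices and the object used below are illustrative assumptions, not MS2's actual data or API.

        import numpy as np

        # Hypothetical calibration: rigid transform from the tracker's frame to the
        # virtual-world frame, obtained once by aligning the 3D scan with the VR scene.
        real_to_virtual = np.array([
            [0.0, -1.0, 0.0, 1.5],
            [1.0,  0.0, 0.0, 0.2],
            [0.0,  0.0, 1.0, 0.0],
            [0.0,  0.0, 0.0, 1.0],
        ])

        def to_virtual(pose_real):
            """Map a tracked 4x4 object pose from the physical room into the virtual scene."""
            return real_to_virtual @ pose_real

        # Example: a tracked cup sitting 1 m in front of the tracker origin, 0.8 m up.
        cup_pose_real = np.eye(4)
        cup_pose_real[:3, 3] = [1.0, 0.0, 0.8]
        print(to_virtual(cup_pose_real)[:3, 3])   # where to render the virtual cup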