725 research outputs found

    A Review of Non-Invasive Haptic Feedback Stimulation Techniques for Upper Extremity Prostheses

    A sense of touch is essential for amputees to reintegrate into their social and work life. The next generation of prostheses will be designed to effectively convey tactile information between the amputee and the artificial limb. This work reviews non-invasive haptic feedback stimulation techniques for conveying tactile information from the prosthetic hand to the amputee’s brain. Various types of actuators that have been used in previous studies to stimulate the patient’s residual limb in different types of artificial prostheses are reviewed in terms of functionality, effectiveness, wearability, and comfort. Non-invasive hybrid feedback stimulation systems were found to yield better stimulus identification rates among users of haptic prostheses. It can be concluded that integrating a hybrid haptic feedback stimulation system with an upper limb prosthesis improves its acceptance among users.

    Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning

    There are physical Human–Robot Interaction (pHRI) applications where the robot has to grab the human body, such as rescue or assistive robotics. Being able to precisely estimate the grasping location when grabbing a human limb is crucial to perform a safe manipulation of the human. Computer vision methods provide pre-grasp information, but with strong constraints imposed by field environments. Force-based compliant control, after grasping, limits the amount of applied strength. On the other hand, valuable tactile and proprioceptive information can be obtained from the pHRI gripper, which can be used to better characterize the features of the human and the contact state between the human and the robot. This paper presents a novel dataset of tactile and kinesthetic data obtained from a robot gripper that grabs a human forearm. The dataset is collected with a three-fingered gripper with two underactuated fingers and a fixed finger with a high-resolution tactile sensor. A palpation procedure is performed to record the shape of the forearm and to recognize the bones and muscles in different sections. Moreover, an application for the use of the dataset is included. In particular, a fusion approach is used to estimate the actual grasped forearm section using both kinesthetic and tactile information on a regression deep-learning neural network. First, tactile and kinesthetic data are trained separately with Long Short-Term Memory (LSTM) neural networks, considering that the data are sequential. Then, the outputs are fed to a fusion neural network to enhance the estimation. The experiments conducted show good results in training both sources separately, with superior performance when the fusion approach is considered.

    This research was funded by the University of Málaga; the Ministerio de Ciencia, Innovación y Universidades, Gobierno de España (grant RTI2018-093421-B-I00); and the European Commission (grant BES-2016-078237). Partial funding for open access charge: Universidad de Málaga.
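    The late-fusion architecture described in this abstract (two per-modality branches whose outputs feed a fusion network) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the LSTM branches are stood in by fixed linear maps so the fusion step itself stays visible, and all shapes and weights are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-ins for the two per-modality branches. In the paper each branch is
    # an LSTM over a sensor sequence; here each is reduced to a fixed linear
    # map followed by a nonlinearity. Dimensions are illustrative only.
    W_tactile = rng.normal(size=(8, 16))      # 16-dim tactile features -> 8-dim embedding
    W_kinesthetic = rng.normal(size=(8, 6))   # 6-dim kinesthetic features -> 8-dim embedding
    W_fusion = rng.normal(size=(1, 16))       # concatenated embeddings -> scalar estimate

    def estimate_section(tactile_feat, kinesthetic_feat):
        """Late fusion: embed each modality, concatenate, regress one value."""
        z_t = np.tanh(W_tactile @ tactile_feat)
        z_k = np.tanh(W_kinesthetic @ kinesthetic_feat)
        z = np.concatenate([z_t, z_k])        # shape (16,)
        return float(W_fusion @ z)            # regressed forearm section (arbitrary units)

    y = estimate_section(rng.normal(size=16), rng.normal(size=6))
    ```

    The design point being illustrated is that each modality is encoded independently and only the learned embeddings are combined, which is what allows the fusion network to outperform either branch alone.
    
    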

    Intelligent Haptic Perception for Physical Robot Interaction

    Doctorate in Mechatronics Engineering. Thesis submission date: 8 January 2020. Thesis defense date: 30 March 2020.

    The dream of having robots living among us is coming true thanks to the recent advances in Artificial Intelligence (AI). The gap that still exists between that dream and reality will be filled by scientific research, but manifold challenges are yet to be addressed. Handling the complexity and uncertainty of real-world scenarios is still the major challenge in robotics nowadays. In this respect, novel AI methods are giving robots the capability to learn from experience and therefore to cope with real-life situations. Moreover, we live in a physical world in which physical interactions are both vital and natural, so robots that are being developed to live among humans must perform tasks that require physical interaction. Haptic perception, conceived as the idea of feeling and processing tactile and kinesthetic sensations, is essential for making this physical interaction possible. This research is inspired by the dream of having robots among us and therefore addresses the challenge of developing robots with haptic perception capabilities that can operate in real-world scenarios. This PhD thesis tackles the problems related to physical robot interaction by employing machine learning techniques. Three AI solutions are proposed for different physical robot interaction challenges: i) grasping and manipulation of humans’ limbs; ii) tactile object recognition; iii) control of Variable-Stiffness-Link (VSL) manipulators. The ideas behind this research work have potential robotic applications such as search and rescue, healthcare, or rehabilitation. This dissertation is presented as a compendium of previously published scientific articles: its baseline comprises a total of five papers published in prestigious peer-reviewed scientific journals and international robotics conferences.

    Myoelectric forearm prostheses: State of the art from a user-centered perspective

    User acceptance of myoelectric forearm prostheses is currently low. Awkward control, lack of feedback, and difficult training are cited as primary reasons. Recently, researchers have focused on exploiting the new possibilities offered by advancements in prosthetic technology. Alternatively, researchers could focus on prosthesis acceptance by developing functional requirements based on activities users are likely to perform. In this article, we describe the process of determining such requirements and then the application of these requirements to evaluating the state of the art in myoelectric forearm prosthesis research. As part of a needs assessment, a workshop was organized involving clinicians (representing end users), academics, and engineers. The resulting needs included an increased number of functions, lower reaction and execution times, and intuitiveness of both control and feedback systems. Reviewing the state of the art of research in the main prosthetic subsystems (electromyographic [EMG] sensing, control, and feedback) showed that modern research prototypes only partly fulfill the requirements. We found that focus should be on validating EMG-sensing results with patients, improving simultaneous control of wrist movements and grasps, deriving optimal parameters for force and position feedback, and taking into account the psychophysical aspects of feedback, such as intensity perception and spatial acuity.

    The Feeling of Color: A Haptic Feedback Device for the Visually Disabled

    Tapson J, Gurari N, Diaz J, et al. The Feeling of Color: A Haptic Feedback Device for the Visually Disabled. Presented at the Biomedical Circuits and Systems Conference (BIOCAS), Baltimore, MD.

    We describe a sensory augmentation system designed to provide the visually disabled with a sense of color. Our system consists of a glove with short-range optical color sensors mounted on its fingertips, and a torso-worn belt on which tactors (haptic feedback actuators) are mounted. Each fingertip sensor detects the observed object’s color. This information is encoded to the tactors through vibrations in respective locations and varying modulations. Early results suggest that detection of primary colors is possible with near 100% accuracy and moderate latency, with a minimum amount of training.
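    The color-to-vibration encoding described above (sensor reading mapped to a tactor location and a modulation) can be sketched as a simple lookup. This is a hypothetical encoding for illustration only; the tactor names, the dominant-channel rule, and the amplitude scaling are assumptions, not details from the paper.

    ```python
    # Hypothetical scheme: the dominant primary channel of the fingertip RGB
    # reading selects which belt tactor vibrates, and that channel's strength
    # sets the vibration amplitude.
    TACTOR_FOR_CHANNEL = {0: "left", 1: "center", 2: "right"}  # R, G, B

    def encode_color(rgb):
        """Map an (r, g, b) reading in [0, 255] to (tactor name, amplitude in [0, 1])."""
        channel = max(range(3), key=lambda i: rgb[i])   # dominant primary
        amplitude = rgb[channel] / 255.0                # stronger color -> stronger vibration
        return TACTOR_FOR_CHANNEL[channel], amplitude

    tactor, amp = encode_color((200, 30, 10))           # a mostly red reading
    ```

    A scheme of this shape keeps the mapping learnable quickly, which is consistent with the abstract's report of high primary-color accuracy after minimal training.
    
    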

    Force-Aware Interface via Electromyography for Natural VR/AR Interaction

    While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.

    Comment: ACM Transactions on Graphics (SIGGRAPH Asia 2022)
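    A common first step when decoding forces from surface EMG, as in the interface described above, is computing a sliding-window root-mean-square (RMS) envelope of the raw signal before feeding it to a learned model. The sketch below shows only that classic preprocessing step, not the paper's neural decoder; the window length is an illustrative assumption.

    ```python
    import numpy as np

    def rms_envelope(emg, window=64):
        """Sliding-window RMS envelope of a raw EMG trace.

        The envelope tracks muscle activation intensity and is a standard
        feature fed to force-regression models. `window` (in samples) is an
        illustrative choice, not a value from the paper.
        """
        emg = np.asarray(emg, dtype=float)
        kernel = np.ones(window) / window               # moving-average kernel
        return np.sqrt(np.convolve(emg ** 2, kernel, mode="valid"))

    envelope = rms_envelope(np.sin(np.linspace(0.0, 40.0, 400)))
    ```

    For a constant-amplitude input the envelope recovers that amplitude, which makes the function easy to sanity-check before plugging in real sensor data.
    
    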

    Distance Feedback Travel Aid Haptic Display Design
