
    Viia-hand: a Reach-and-grasp Restoration System Integrating Voice interaction, Computer vision and Auditory feedback for Blind Amputees

    In prosthesis control, visual feedback plays a crucial role in how amputees complete grasping. However, for blind and visually impaired (BVI) amputees, the loss of both visual and grasping abilities turns the "easy" reach-and-grasp task into a formidable challenge. In this paper, we propose a novel multi-sensory prosthesis system that helps BVI amputees with sensing, navigation and grasp operations. It combines modules for voice interaction, environmental perception, grasp guidance, collaborative control, and auditory/tactile feedback. The voice interaction module receives user instructions and invokes the other functional modules accordingly. The environmental perception and grasp guidance module obtains environmental information through computer vision and feeds it back to the user through auditory feedback (voice prompts and spatial sound sources) and tactile feedback (vibration stimulation). The prosthesis collaborative control module obtains context information from the grasp guidance process and, in conjunction with the user's control intention, controls the grasp gesture and wrist angle of the prosthesis to achieve a stable grasp of various objects. This paper details a prototype design (named viia-hand) and presents its preliminary experimental verification on healthy subjects completing specific reach-and-grasp tasks. Our results showed that, with the help of the new design, subjects were able to achieve a precise reach and a reliable grasp of target objects in a relatively cluttered environment. Additionally, the system is highly user-friendly, as users can adapt to it quickly with minimal training.
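
    To make the module interplay described above concrete, the sketch below shows how a voice-interaction front end could parse a spoken command and dispatch it to perception, guidance, and collaborative-control modules. This is an illustrative Python sketch only; the class, method, and field names (GraspContext, locate, guide_to, execute_grasp, say) are hypothetical and not taken from the Viia-hand implementation.

        # Illustrative sketch, not the authors' code: a voice command triggers
        # perception, guidance and collaborative control in sequence.
        from dataclasses import dataclass

        @dataclass
        class GraspContext:
            target_label: str          # object name parsed from the voice command
            distance_m: float = 0.0    # estimated distance to the target (from vision)
            bearing_deg: float = 0.0   # horizontal angle to the target (for spatial audio)

        class VoiceInteractionModule:
            """Receives user instructions and invokes the other functional modules."""

            def __init__(self, perception, guidance, controller):
                self.perception = perception   # computer-vision environment perception
                self.guidance = guidance       # auditory/tactile grasp guidance
                self.controller = controller   # prosthesis collaborative control

            def handle_command(self, utterance: str) -> None:
                # A real system would use speech recognition; here we parse plain text.
                if utterance.startswith("grasp "):
                    ctx = GraspContext(target_label=utterance.removeprefix("grasp "))
                    ctx.distance_m, ctx.bearing_deg = self.perception.locate(ctx.target_label)
                    self.guidance.guide_to(ctx)         # voice prompts, spatial sound, vibration
                    self.controller.execute_grasp(ctx)  # grasp gesture and wrist-angle co-control
                else:
                    self.guidance.say("Command not recognized.")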

    Sensory Integration of Electrotactile Stimulation as Supplementary Feedback for Human-Machine Interface


    Sensors for Robotic Hands: A Survey of State of the Art

    Recent decades have seen significant progress in the field of artificial hands. Most surveys that try to capture the latest developments in this field have focused on the actuation and control systems of these devices. In this paper, our goal is to provide a comprehensive survey of the sensors for artificial hands. To present the evolution of the field, we cover five-year periods starting at the turn of the millennium. For each period, we present the robot hands with a focus on their sensor systems, dividing them into categories such as prosthetics, research devices, and industrial end-effectors. We also cover the sensors developed for robot-hand use in each era. Finally, the period between 2010 and 2015 introduces the reader to the state of the art and also hints at future directions in sensor development for artificial hands.

    Distributed Sensing and Stimulation Systems Towards Sense of Touch Restoration in Prosthetics

    Modern prostheses aim at restoring the functional and aesthetic characteristics of the lost limb. To foster prosthesis embodiment and functionality, it is necessary to restore both volitional control and sensory feedback. Contemporary feedback interfaces presented in research use few sensors and stimulation units and feed back at most two discrete variables (e.g. grasping force and aperture), whereas the human sense of touch relies on a distributed network of mechanoreceptors providing high-fidelity spatial information. To provide this type of feedback in prosthetics, it is necessary to sense tactile information from artificial skin placed on the prosthesis and to transmit tactile feedback above the amputation in order to map the interaction between the prosthesis and the environment. This thesis proposes the integration of distributed sensing systems (e-skin) to acquire tactile sensation, together with non-invasive multichannel electrotactile feedback and virtual reality, to deliver high-bandwidth information to the user. Its core focus is the development and testing of a closed-loop sensory feedback human-machine interface, based on the latest distributed sensing and stimulation techniques, for restoring the sense of touch in prosthetics. To this end, the thesis comprises two introductory chapters that describe the state of the art in the field, the objectives, the methodology used and the contributions, as well as three studies spanning the stimulation-system and sensing-system levels. The first study presents the development of a closed-loop compensatory tracking system to evaluate the usability and effectiveness of electrotactile sensory feedback in enabling real-time closed-loop control in prosthetics. It examines and compares the subjects' adaptive performance and tolerance to random latencies while performing a dynamic control task (i.e. position control) and simultaneously receiving either visual or electrotactile feedback communicating the momentary tracking error. Moreover, it reports the minimum time delay that causes an abrupt impairment of users' performance. The experimental results showed that performance with electrotactile feedback is less prone to change with longer delays, whereas performance with visual feedback drops faster as the time delay increases. This is a good indication of the effectiveness of electrotactile feedback in enabling closed-loop control in prosthetics, since some delays are inevitable. The second study describes the development of a novel non-invasive compact multichannel interface for electrotactile feedback, containing a 24-pad electrode matrix with a fully programmable stimulation unit, and investigates the ability of able-bodied human subjects to localize the electrotactile stimulus delivered through the electrode matrix. Furthermore, it introduces a novel dual-parameter modulation (interleaved frequency and intensity) and compares it to conventional stimulation (same frequency for all pads). In addition, and for the first time, it compares electrotactile stimulation to mechanical stimulation. It also describes the integration of a virtual prosthesis with the developed system in order to achieve a better user experience and object manipulation, by mapping the tactile data collected in real time and feeding it back simultaneously to the user. The experimental results demonstrated that the proposed interleaved coding substantially improved spatial localization compared to same-frequency stimulation. Furthermore, they showed that same-frequency stimulation was equivalent to mechanical stimulation, whereas performance with dual-parameter modulation was significantly better. The third study presents the realization of a novel, flexible, screen-printed e-skin based on P(VDF-TrFE) piezoelectric polymers that covers the fingertips and the palm of the prosthetic hand (particularly the Michelangelo hand by Ottobock) and an assistive sensorized glove for stroke patients. Moreover, it develops a new validation methodology to examine the sensors' behavior while being solicited. The characterization results showed agreement between the expected (modeled) electrical response of each sensor and the mechanical (normal) force measured at the skin surface, which in turn proved that the combination of the fabrication and assembly processes was successful. This paves the way to defining a practical, simplified and reproducible characterization protocol for e-skin patches. In conclusion, by adopting innovative methodologies in sensing and stimulation systems, this thesis advances the overall development of the closed-loop sensory feedback human-machine interface used for the restoration of the sense of touch in prosthetics. Moreover, this research could lead to high-bandwidth, high-fidelity transmission of tactile information for modern dexterous prostheses that could improve the end-user experience and facilitate its acceptance in daily life.
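
    The interleaved dual-parameter coding described in the second study can be pictured with a minimal sketch: neighbouring pads of the 24-pad matrix are assigned different pulse frequencies, while the contact pressure measured by the matching e-skin taxel sets the pulse amplitude. The 4x6 layout, the two frequencies, and the current range below are illustrative assumptions, not values from the thesis.

        # Minimal sketch (assumed parameters, not the thesis implementation) of an
        # interleaved dual-parameter code for a 24-pad electrotactile matrix.
        import numpy as np

        ROWS, COLS = 4, 6                  # assumed 24-pad layout
        FREQS_HZ = (30.0, 60.0)            # two interleaved stimulation frequencies (assumed)
        I_MIN_MA, I_MAX_MA = 0.5, 4.0      # assumed per-pad current range

        def pad_frequency(row: int, col: int) -> float:
            """Checkerboard interleaving: adjacent pads never share a frequency."""
            return FREQS_HZ[(row + col) % 2]

        def pad_amplitude(pressure_norm: float) -> float:
            """Map normalized e-skin pressure [0, 1] linearly onto the current range."""
            p = float(np.clip(pressure_norm, 0.0, 1.0))
            return I_MIN_MA + p * (I_MAX_MA - I_MIN_MA)

        def encode_frame(pressure_map: np.ndarray) -> list:
            """Turn one e-skin frame (ROWS x COLS) into per-pad stimulation parameters."""
            return [{"pad": r * COLS + c,
                     "freq_hz": pad_frequency(r, c),
                     "amp_ma": pad_amplitude(pressure_map[r, c])}
                    for r in range(ROWS) for c in range(COLS)]

        if __name__ == "__main__":
            frame = np.zeros((ROWS, COLS))
            frame[1, 2] = 0.8              # simulated contact under one taxel
            print(encode_frame(frame)[1 * COLS + 2])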

    Psychometric characterization of incidental feedback sources during grasping with a hand prosthesis

    Background: A prosthetic system should ideally reinstate the bidirectional communication between the user's brain and its end effector by restoring both the motor and the sensory functions lost after an amputation. However, current commercial prostheses generally do not incorporate somatosensory feedback. Even without explicit feedback, grasping with a prosthesis partly relies on sensory information: the operation of the prosthesis is characterized by visual and sound cues that the user can exploit to estimate the prosthesis state. However, the quality of this incidental feedback has not been objectively evaluated. Methods: In this study, the psychometric properties of the auditory and visual feedback of prosthesis motion were assessed and compared to those of a vibrotactile interface. Twelve able-bodied subjects passively observed the prosthesis closing and grasping an object, and were asked to discriminate (experiment I) or estimate (experiment II) the closing velocity of the prosthesis using visual (VIS), acoustic (SND), or combined (VIS + SND) feedback. In experiment II, the subjects also performed the task with a vibrotactile stimulus (VIB) delivered through a single tactor. The outcome measures for the discrimination and estimation experiments were the just noticeable difference (JND) and the median absolute estimation error (MAE), respectively. Results: The results demonstrated that the incidental sources provided remarkably good discrimination and estimation of the closing velocity, significantly outperforming the vibrotactile feedback. Using incidental sources, the subjects could discriminate almost the minimum possible increment/decrement in velocity that could be commanded to the prosthesis (median JND < 2% for SND and VIS + SND). Similarly, the median MAE in estimating the prosthesis velocity, randomly commanded from the full working range, was also low, approximately 5% for SND and VIS + SND. Conclusions: Since the closing velocity is proportional to the grasping force in state-of-the-art myoelectric prostheses, these results imply that incidental feedback, when available, could be usefully exploited for grasping force control. Therefore, the impact of incidental feedback needs to be considered when designing a feedback interface in prosthetics, especially since the quality of estimation using supplemental sources (e.g., vibration) can be worse than that of the intrinsic cues.
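
    For readers unfamiliar with the two outcome measures, the sketch below shows one common way to compute them: the JND as the 50%-75% spread of a fitted logistic psychometric curve, and the MAE as the median absolute error of the velocity estimates. The fitting convention and the toy numbers are assumptions, not the paper's analysis code.

        # Assumed analysis sketch (not the paper's code): JND from a fitted logistic
        # psychometric curve and median absolute estimation error (MAE).
        import numpy as np
        from scipy.optimize import curve_fit

        def psychometric(delta, threshold, slope):
            """Probability of judging the comparison as faster, vs. velocity difference."""
            return 1.0 / (1.0 + np.exp(-(delta - threshold) / slope))

        def estimate_jnd(deltas, p_faster):
            """Fit the curve and take the 50%-75% spread as the JND (one common convention)."""
            (thr, slope), _ = curve_fit(psychometric, deltas, p_faster, p0=[0.0, 1.0])
            d75 = thr + slope * np.log(0.75 / 0.25)   # delta giving 75% "faster" responses
            return d75 - thr

        def median_abs_estimation_error(commanded, estimated):
            """Median absolute error between estimated and commanded velocity (same units)."""
            err = np.abs(np.asarray(estimated) - np.asarray(commanded))
            return float(np.median(err))

        if __name__ == "__main__":
            deltas = np.array([-6, -4, -2, 0, 2, 4, 6], dtype=float)      # % velocity difference
            p_faster = np.array([0.05, 0.15, 0.35, 0.5, 0.7, 0.9, 0.97])  # toy response rates
            print("JND ~", round(estimate_jnd(deltas, p_faster), 2), "%")
            cmd = np.array([12.0, 30.0, 50.0, 75.0, 90.0])                # toy % of full range
            est = np.array([10.0, 32.0, 48.0, 71.0, 93.0])
            print("MAE ~", median_abs_estimation_error(cmd, est), "%")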

    Electronic systems for the restoration of the sense of touch in upper limb prosthetics

    In the last few years, research on active prostheses for upper limbs has focused on improving functionality and control. New methods have been proposed for measuring the user's muscle activity and translating it into prosthesis control commands. Developing the feed-forward interface so that the prosthesis better follows the intention of the user is an important step towards improving the quality of life of people with limb amputation. However, prosthesis users can neither feel whether something or someone is touching them over the prosthesis nor perceive the temperature or roughness of objects. They rely on looking at an object: sight gives them most of the available information, and they cannot detect anything otherwise. Therefore, to foster prosthesis embodiment and utility, it is necessary to have a prosthetic system that not only responds to the control signals provided by the user, but also transmits back to the user information about the current state of the prosthesis. This thesis presents an electronic skin system to close the loop in prostheses towards the restoration of the sense of touch in prosthesis users. The proposed electronic skin system includes advanced distributed sensing (electronic skin), a system for (i) signal conditioning, (ii) data acquisition, and (iii) data processing, and a stimulation system. The idea is to integrate all these components into a myoelectric prosthesis. Embedding the electronic system and the sensing materials is a critical issue in the development of new prostheses. In particular, processing the data originating from the electronic skin into low- or high-level information is the key issue to be addressed by the embedded electronic system. Recently, machine learning has proved to be a promising approach for processing tactile sensor information, and many studies have shown its effectiveness in classifying input touch modalities. More specifically, this thesis focuses on the stimulation system, which communicates a mechanical interaction from the electronic skin to prosthesis users, and on the dedicated implementation of algorithms for processing the tactile data originating from the electronic skin. At the system level, the thesis provides the design of the experimental setup, the experimental protocol, and the algorithms to process tactile data. At the architectural level, the thesis proposes a design flow for the implementation of digital circuits for both FPGAs and integrated circuits, and techniques for the power management of embedded systems running machine learning algorithms.
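
    As a concrete illustration of the kind of tactile-data processing referred to above, the sketch below trains a small classifier to label touch modalities from e-skin recordings. The feature choices, array shapes, class labels, and use of scikit-learn are illustrative assumptions, not the embedded implementation described in the thesis.

        # Illustrative sketch (assumed pipeline, not the thesis implementation):
        # classify touch modalities (tap / slide / press) from e-skin recordings.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def extract_features(frames: np.ndarray) -> np.ndarray:
            """Per-sample features from a (samples, taxels, time) tactile recording."""
            peak = frames.max(axis=(1, 2))                      # strongest contact pressure
            area = (frames.max(axis=2) > 0.1).sum(axis=1)       # number of activated taxels
            duration = (frames.max(axis=1) > 0.1).sum(axis=1)   # time steps above threshold
            return np.column_stack([peak, area, duration])

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            X_raw = rng.random((60, 24, 50))     # 60 toy samples, 24 taxels, 50 time steps
            y = rng.integers(0, 3, size=60)      # toy labels: 0=tap, 1=slide, 2=press
            clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
            clf.fit(extract_features(X_raw), y)
            # With random toy data the score only shows that the pipeline runs end to end.
            print("training accuracy:", clf.score(extract_features(X_raw), y))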

    Reproducing tactile and proprioception based on the human-in-the-closed-loop conceptual approach

    Prosthetic limb embodiment remains a significant challenge for many amputees because traditional designs lack sensory feedback. To address this challenge, the effectiveness of non-invasive neuromuscular electrical stimulation (NMES) controlled by a hybrid proportional-differential (PD)-Fuzzy logic system was evaluated for providing real-time proprioception and tactile feedback. The study used a human-in-the-closed-loop approach with ten participants: five upper limb amputees and five non-disabled individuals as the control group. The applied force, the joint angle of a finger of the prosthetic hand, and surface electromyography signals generated by the biceps muscle all regulate the intensity of the sensory feedback. Additionally, the C6 and C7 myotomes were selected as elicitation sites. The average detection threshold for motion and force was around 21° and 1.524 N, respectively. The participants successfully reproduced desired joint angles within the range of 0°-110° at five separate intervals. In the weight recognition experiment, the minimum number of false predictions by an amputee participant was four. The highest accuracy achieved in detecting object size and stiffness was 80.66%. Additionally, unpaired t-tests were performed on the means of the experimental results to test for statistically significant differences between the groups. The results suggest that stimulation of myotomes by NMES is an effective non-invasive method for delivering rich multimodal sensory information to individuals with disabilities, including upper limb amputees, without the need for visual or auditory cues. These findings contribute to the development of non-invasive sensory substitution in prostheses.
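
    To give a rough idea of how a hybrid PD/fuzzy scheme can set stimulation intensity, the sketch below combines a PD term on the joint-angle error with a piecewise-linear force gain standing in for the fuzzy rule base. The gains, membership breakpoints, and the 0-100% intensity range are illustrative assumptions, not the controller reported in the study.

        # Hypothetical sketch, not the authors' controller: a PD term on the joint-angle
        # error, scaled by a piecewise-linear gain on grip force (a coarse stand-in for
        # the fuzzy rule base), clipped to a 0-100% stimulation intensity range.
        KP, KD = 0.8, 0.1      # assumed gains (% intensity per degree, per degree/s)

        def force_gain(force_n: float) -> float:
            """Light / medium / firm contact mapped to a gain between 0.5 and 1.5."""
            if force_n <= 1.0:
                return 0.5 + 0.5 * force_n             # light: ramps 0.5 -> 1.0
            if force_n <= 5.0:
                return 1.0 + 0.125 * (force_n - 1.0)   # medium: ramps 1.0 -> 1.5
            return 1.5                                 # firm: saturate

        def nmes_intensity(angle_err_deg: float, angle_rate_dps: float, force_n: float) -> float:
            """PD tracking term scaled by the force gain, clipped to 0-100%."""
            pd_term = KP * angle_err_deg + KD * angle_rate_dps
            return max(0.0, min(100.0, force_gain(force_n) * pd_term))

        if __name__ == "__main__":
            # Example: 30 deg of remaining flexion, slowing down, 1.5 N at the fingertip.
            print(round(nmes_intensity(30.0, -5.0, 1.5), 1), "% stimulation intensity")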