    A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss.

    Many upper limb amputees experience an incessant, post-amputation "phantom limb pain" and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech "rubber hand" illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion, and new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds, such as the "BairClaw" presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger-object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions, while temperature, vibration, and skin deformation are provided by a multimodal tactile sensor. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain and increased prosthesis use due to improved functionality and reduced cognitive burden.
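
    The abstract above does not specify how the BairClaw's sensing channels are exposed in software; the sketch below is only a minimal Python illustration of how the modalities it names (joint angles, tendon tensions, temperature, vibration, skin deformation) might be bundled into timestamped samples and checked for spatiotemporal consistency against a commanded motion. All names, fields, and tolerances are hypothetical, not the BairClaw API.

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class FingerSample:
            """One timestamped multimodal sample (hypothetical structure)."""
            t: float                      # timestamp in seconds
            joint_angles: List[float]     # artificial proprioception: joint angles (rad)
            tendon_tensions: List[float]  # tendon tensions (N)
            temperature: float            # multimodal tactile channels
            vibration: float
            skin_deformation: float

        def consistent_with_command(sample: FingerSample,
                                    commanded_angles: List[float],
                                    commanded_t: float,
                                    max_lag_s: float = 0.05,
                                    max_err_rad: float = 0.1) -> bool:
            """Illustrative notion of 'spatiotemporally consistent' feedback: the sample
            arrives within max_lag_s of the command and tracks it within max_err_rad."""
            lag_ok = abs(sample.t - commanded_t) <= max_lag_s
            err_ok = all(abs(a - c) <= max_err_rad
                         for a, c in zip(sample.joint_angles, commanded_angles))
            return lag_ok and err_ok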

    Sensors for Robotic Hands: A Survey of State of the Art

    Recent decades have seen significant progress in the field of artificial hands. Most of the surveys that try to capture the latest developments in this field have focused on the actuation and control systems of these devices. In this paper, our goal is to provide a comprehensive survey of the sensors for artificial hands. In order to present the evolution of the field, we cover five-year periods starting at the turn of the millennium. For each period, we present the robot hands with a focus on their sensor systems, dividing them into categories such as prosthetics, research devices, and industrial end-effectors. We also cover the sensors developed for robot hand usage in each era. Finally, the period between 2010 and 2015 introduces the reader to the state of the art and also hints at future directions in sensor development for artificial hands.

    Body-Borne Computers as Extensions of Self

    The opportunities for wearable technologies go well beyond always-available information displays or health sensing devices. The concept of the cyborg introduced by Clynes and Kline, along with works in various fields of research and the arts, offers a vision of what technology integrated with the body can offer. This paper identifies different categories of research aimed at augmenting humans. The paper specifically focuses on three areas of augmentation of the human body and its sensorimotor capabilities: physical morphology, skin display, and somatosensory extension. We discuss how such digital extensions relate to the malleable nature of our self-image. We argue that body-borne devices are no longer simply functional apparatus, but offer a direct interplay with the mind. Finally, we also showcase some of our own projects in this area and shed light on future challenges.

    Anthropomorphism Index of Mobility for Artificial Hands

    The increasing development of anthropomorphic artificial hands calls for quick metrics that assess their anthropomorphism. In this study, a human grasp experiment on the most important grasp types was undertaken in order to obtain an Anthropomorphism Index of Mobility (AIM) for artificial hands. The AIM evaluates the topology of the whole hand, its joints and degrees of freedom (DoFs), and the possibility of controlling these DoFs independently. It uses a set of weighting factors, obtained from the analysis of human grasping, that depend on the relevance of the different groups of DoFs of the hand. The computation of the index is straightforward, making it a useful tool for analyzing new artificial hands in early stages of the design process and for grading the human-likeness of existing artificial hands. Thirteen artificial hands, both prosthetic and robotic, were evaluated and compared using the AIM, highlighting the reasons behind their differences. The AIM was also compared with other, more computationally cumbersome indexes in the literature, and it ranked the artificial hands in the same order. As the index was primarily proposed for prosthetic hands, normally used as nondominant hands by unilateral amputees, the grasp types selected for the human grasp experiment were those most relevant for the human nondominant hand in reinforcing bimanual grasping during activities of daily living. However, it was shown that the effect of using the grasping information from the dominant hand is small, indicating that the index is also valid for evaluating an artificial hand used as the dominant hand, and therefore for bilateral amputees or robotic hands.
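
    The abstract does not reproduce the AIM formula itself; as a rough illustration of the kind of computation it describes (weighting factors applied to groups of DoFs, with credit for independently controllable joints), here is a minimal Python sketch. The groups, weights, and scoring rule are assumptions for illustration only, not the published index.

        # Minimal sketch of a weighted-sum mobility index in the spirit of the AIM.
        # The grouping, weights, and scoring below are illustrative assumptions,
        # not the published formulation.

        # Hypothetical weighting factors per DoF group (in the paper these come
        # from the human-grasping analysis); they sum to 1.
        WEIGHTS = {
            "thumb_flexion": 0.25,
            "thumb_abduction": 0.15,
            "finger_flexion": 0.40,
            "finger_abduction": 0.10,
            "palm_arch": 0.10,
        }

        def mobility_score(dofs_present: int, dofs_independent: int, dofs_human: int) -> float:
            """Score a DoF group: full credit for independently controlled DoFs,
            partial credit for coupled (underactuated) ones."""
            if dofs_human == 0:
                return 0.0
            coupled = dofs_present - dofs_independent
            return min(1.0, (dofs_independent + 0.5 * coupled) / dofs_human)

        def aim(hand: dict) -> float:
            """Weighted sum of group scores; 1.0 would match the human-hand reference."""
            return sum(
                w * mobility_score(*hand.get(group, (0, 0, 1)))
                for group, w in WEIGHTS.items()
            )

        # Example: coupled finger flexion, independently driven thumb.
        example_hand = {
            "thumb_flexion": (2, 2, 2),    # (DoFs present, independently controlled, human reference)
            "thumb_abduction": (1, 1, 1),
            "finger_flexion": (8, 2, 8),
            "finger_abduction": (0, 0, 3),
            "palm_arch": (0, 0, 2),
        }
        print(f"AIM (illustrative): {aim(example_hand):.2f}")

    A real evaluation would substitute the weighting factors derived from the human grasp experiment described in the abstract.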

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees' quality of life using upper limb prostheses. Still, since artificial proprioception is essential for perceiving prosthesis movement without constant visual attention, a good control framework alone may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has recently been introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree of Freedom (DoF) system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system implementing haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation that connects the user with the external world. This closed-loop control between EMG and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that can strongly impact amputees' daily lives. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1-score (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several tested methods for Pattern Recognition, the Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1-score (99%, robustness), while the offline analyses determined that a minimum of four electrodes is needed for it to function. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed subjects to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). Such results demonstrated an improvement in the controllability of the system with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the implemented closed-loop approach improves the robustness and efficiency of prosthetic control. Bidirectional communication between the user and the prosthesis can restore lost sensory functionality, with promising implications for direct translation into clinical practice.
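
    The thesis abstract names the ingredients of the control pipeline (multi-channel surface EMG, a Non-Linear Logistic Regression classifier, a 300 Hz embedded implementation, F1-score evaluation) without detailing them. The following Python sketch shows one plausible offline version of such a pipeline using scikit-learn; the window length, time-domain features, and polynomial feature expansion standing in for the "non-linear" part are assumptions, and the data are synthetic placeholders rather than recorded EMG.

        # Illustrative EMG pattern-recognition pipeline (not the Hannes implementation):
        # windowed multi-channel EMG -> time-domain features ->
        # a non-linear logistic-regression-style classifier scored with F1.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.preprocessing import PolynomialFeatures, StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.metrics import f1_score

        FS = 300    # sampling frequency (Hz), as reported for the embedded controller
        WIN = 60    # assumed 200 ms analysis window at 300 Hz
        N_CH = 4    # minimum electrode count reported in the offline analyses

        def td_features(window: np.ndarray) -> np.ndarray:
            """Classic per-channel time-domain features: mean absolute value,
            waveform length, and variance (window shape: [WIN, N_CH])."""
            mav = np.mean(np.abs(window), axis=0)
            wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
            var = np.var(window, axis=0)
            return np.concatenate([mav, wl, var])

        # Synthetic stand-in data: 500 windows, 5 movement classes.
        rng = np.random.default_rng(0)
        labels = rng.integers(0, 5, size=500)
        X = np.stack([td_features(rng.normal(size=(WIN, N_CH)) * (1 + c)) for c in labels])

        # Logistic regression on polynomially expanded features gives a simple
        # non-linear decision boundary (one plausible reading of "NLR").
        clf = make_pipeline(StandardScaler(), PolynomialFeatures(degree=2),
                            LogisticRegression(max_iter=2000))
        clf.fit(X[:400], labels[:400])
        print("macro F1:", f1_score(labels[400:], clf.predict(X[400:]), average="macro"))

    An embedded deployment like the one reported (300 Hz on a microcontroller) would only export the fitted coefficients for online inference; the training shown here would happen offline.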

    A Human-Robot Interaction Perspective on Assistive and Rehabilitation Robotics

    Assistive and rehabilitation devices are a promising and challenging field of recent robotics research. Motivated by societal needs such as aging populations, such devices can support motor functionality and subject training. The design, control, sensing, and assessment of the devices become more sophisticated due to a human in the loop. This paper gives a human–robot interaction perspective on current issues and opportunities in the field. On the topic of control and machine learning, approaches that support but do not distract subjects are reviewed. Options to provide sensory user feedback that are currently missing from robotic devices are outlined. Parallels between device acceptance and affective computing are made. Furthermore, requirements for functional assessment protocols that relate to real-world tasks are discussed. In all topic areas, the design of human-oriented frameworks and methods is dominated by challenges related to the close interaction between the human and robotic device. This paper discusses the aforementioned aspects in order to open up new perspectives for future robotic solutions.

    Design and experimental evaluation of a new modular underactuated multi-fingered robot hand

    In this paper, a modular underactuated multi-fingered robot hand is proposed. The robot hand can be freely configured with different numbers and arrangements of modular fingers according to the needs of the task. Driving motion is achieved by a rigid screw and connecting-rod structure. A finger-connecting mechanism is designed on the palm of the robot hand to meet the needs of modular finger installation, drive, rotation, and sensor connection. The fingertips are made of hollow rubber to enhance the stability of grasping. Details of the robot hand design and an analysis of the robot's kinematics and grasping process are described. Finally, a prototype is developed and a grasping test is carried out. Experimental results demonstrate that the structure of the proposed modular robot hand is reasonable and that its adaptability and flexibility meet the requirements of various grasping modes in practice.

    Crossmodal representation of a functional robotic hand arises after extensive training in healthy participants

    The way in which humans represent their own bodies is critical in guiding their interactions with the environment. To achieve successful body-space interactions, the body representation is strictly connected with that of the space immediately surrounding it through efficient visuo-tactile crossmodal integration. Such a body-space integrated representation is not fixed, but can be dynamically modulated by the use of external tools. Our study aims to explore the effect of using a complex tool, namely a functional prosthesis, on crossmodal visuo-tactile spatial interactions in healthy participants. By using the crossmodal visuo-tactile congruency paradigm, we found that prolonged training with a mechanical hand capable of distal hand movements and providing sensory feedback induces a pattern of interference between visual stimuli close to the prosthesis and touches on the body, which is not observed after brief training. These results suggest that after extensive, but not short, training, the functional prosthesis acquires a visuo-tactile crossmodal representation akin to real limbs. This finding adds to previous evidence for the embodiment of functional prostheses in amputees, and shows that their use may also improve the crossmodal combination of somatosensory feedback delivered by the prosthesis with visual stimuli in the space around it, thus effectively augmenting the patients' visuomotor abilities.
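
    For readers unfamiliar with the paradigm mentioned above, interference in crossmodal congruency studies is usually summarized by the crossmodal congruency effect (CCE): the reaction-time cost of incongruent relative to congruent visuo-tactile trials. The short Python sketch below computes this measure on invented numbers purely for illustration; they are not data from the study.

        # CCE = mean RT on incongruent trials - mean RT on congruent trials.
        # All reaction times below are invented placeholders.
        import statistics

        def cce(rt_incongruent_ms, rt_congruent_ms):
            """Crossmodal congruency effect in milliseconds."""
            return statistics.mean(rt_incongruent_ms) - statistics.mean(rt_congruent_ms)

        # Hypothetical trials near the prosthesis, before vs. after extensive training.
        before = cce([620, 605, 640, 615], [610, 600, 635, 612])
        after = cce([700, 690, 710, 695], [612, 605, 630, 608])
        print(f"CCE before training: {before:.0f} ms, after training: {after:.0f} ms")
        # A larger CCE after training reflects the interference pattern reported above:
        # visual stimuli near the prosthesis now compete with touches on the body.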