
    A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss.

    Many upper limb amputees experience incessant post-amputation "phantom limb pain" and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech "rubber hand" illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion, and new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds, such as the "BairClaw" presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger-object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions, while a multimodal tactile sensor provides temperature, vibration, and skin deformation data. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain, and increased prosthesis use due to improved functionality and reduced cognitive burden.

    Sensors for Robotic Hands: A Survey of State of the Art

    Recent decades have seen significant progress in the field of artificial hands. Most of the surveys that try to capture the latest developments in this field have focused on the actuation and control systems of these devices. In this paper, our goal is to provide a comprehensive survey of the sensors for artificial hands. To present the evolution of the field, we cover five-year periods starting at the turn of the millennium. For each period, we present the robot hands with a focus on their sensor systems, dividing them into categories such as prosthetics, research devices, and industrial end-effectors. We also cover the sensors developed for robot hand use in each era. Finally, the period between 2010 and 2015 introduces the reader to the state of the art and hints at future directions in sensor development for artificial hands.

    Body-Borne Computers as Extensions of Self

    The opportunities for wearable technologies go well beyond always-available information displays or health sensing devices. The concept of the cyborg introduced by Clynes and Kline, along with works in various fields of research and the arts, offers a vision of what technology integrated with the body can offer. This paper identifies different categories of research aimed at augmenting humans. The paper specifically focuses on three areas of augmentation of the human body and its sensorimotor capabilities: physical morphology, skin display, and somatosensory extension. We discuss how such digital extensions relate to the malleable nature of our self-image. We argue that body-borne devices are no longer simply functional apparatus, but offer a direct interplay with the mind. Finally, we also showcase some of our own projects in this area and shed light on future challenges.

    Anthropomorphism Index of Mobility for Artificial Hands

    The increasing development of anthropomorphic artificial hands calls for quick metrics to analyze their anthropomorphism. In this study, a human grasp experiment on the most important grasp types was undertaken in order to obtain an Anthropomorphism Index of Mobility (AIM) for artificial hands. The AIM evaluates the topology of the whole hand, the joints and degrees of freedom (DoFs), and the possibility of controlling these DoFs independently. It uses a set of weighting factors, obtained from an analysis of human grasping, that depend on the relevance of the different groups of DoFs of the hand. The computation of the index is straightforward, making it a useful tool for analyzing new artificial hands in early stages of the design process and for grading the human-likeness of existing artificial hands. Thirteen artificial hands, both prosthetic and robotic, were evaluated and compared using the AIM, highlighting the reasons behind their differences. The AIM was also compared with other indexes in the literature that require more cumbersome computation, and it produced the same ranking of the artificial hands. As the index was primarily proposed for prosthetic hands, normally used as nondominant hands by unilateral amputees, the grasp types selected for the human grasp experiment were those most relevant for the human nondominant hand in reinforcing bimanual grasping during activities of daily living. However, the effect of using the grasping information from the dominant hand was shown to be small, indicating that the index is also valid for evaluating an artificial hand used as the dominant hand, and therefore for bilateral amputees or robotic hands.
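
    The weighted-DoF idea behind an AIM-style index can be illustrated with a minimal Python sketch. The group names, weights, and reference DoF counts below are illustrative assumptions for the sake of the example, not the published AIM values.

    ```python
    # Hypothetical AIM-style score: a weighted sum over groups of degrees of
    # freedom (DoFs), where each group's weight reflects its assumed relevance
    # in human grasping. All numbers below are illustrative, not published values.

    WEIGHTS = {            # relevance weight per DoF group (sums to 1.0)
        "thumb": 0.4,
        "index": 0.25,
        "middle": 0.15,
        "ring_little": 0.1,
        "palm_arch": 0.1,
    }

    HUMAN_DOFS = {         # assumed reference DoF counts of the human hand
        "thumb": 5, "index": 4, "middle": 4, "ring_little": 8, "palm_arch": 2,
    }

    def aim_score(hand_dofs, independent_dofs):
        """Score in [0, 1]: weighted fraction of human-like DoFs, rewarding both
        the presence of DoFs and their independent controllability.
        `hand_dofs` / `independent_dofs` map group -> DoFs present / independently
        actuated in the artificial hand."""
        score = 0.0
        for group, w in WEIGHTS.items():
            present = min(hand_dofs.get(group, 0), HUMAN_DOFS[group])
            indep = min(independent_dofs.get(group, 0), present)
            score += w * (0.5 * present + 0.5 * indep) / HUMAN_DOFS[group]
        return score

    # A hand matching the human DoF topology with fully independent control
    # scores 1.0; underactuated or simplified hands score lower.
    full_score = aim_score(HUMAN_DOFS, HUMAN_DOFS)
    ```

    A real index of this kind would derive the weights from grasp-experiment data, as the abstract describes; the point of the sketch is only that the computation reduces to a straightforward weighted sum, which is what makes such a metric quick to apply during design.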

    The role of morphology of the thumb in anthropomorphic grasping: a review

    The unique musculoskeletal structure of the human hand provides wider dexterous capabilities for grasping and manipulating a repertoire of objects than that of non-human primates. It is widely accepted that the orientation and position of the thumb play an important role in this characteristic behavior. There have been numerous attempts to develop anthropomorphic robotic hands, with varying levels of success. Nevertheless, the manipulation ability of those hands still needs improvement even though they can grasp objects successfully. An appropriate model of the thumb is important for manipulating objects against the fingers and maintaining grasp stability. Modeling these complex interactions about the mechanical axes of the joints, and deciding how to incorporate such joints into robotic thumbs, is a challenging task. This article presents a review of the biomechanics of the human thumb and of robotic thumb designs to identify opportunities for future anthropomorphic robotic hands.

    Development of Modular Compliant Anthropomorphic Robot Hand

    The chapter presents the development of a modular compliant robotic hand characterized by an anthropomorphic structure and functionality. The prototype builds on experience in the development of contemporary advanced artificial hands and takes into account the complementary aspects of human biomechanics. The robot hand, developed at the Mihailo Pupin Institute, is called the "Pupin hand". It is intended for research purposes as well as for use with service and medical robots as an advanced robot end-effector. The chapter considers mechanical design, system identification, modeling and simulation, and the acquisition of the biological skill of grasping from humans. The mechanical structure of the tendon-driven, multi-finger, 23-degree-of-freedom compliant robot hand is presented. The hand is modeled as a multi-body rigid system with complementary structural elasticity inserted between the individual finger modules. Characteristic simulation results are given in order to validate the chosen design concept. For motion capture of human grasping skill, an experimental setup was prepared that includes an infrared Kinect camera combining visual and depth information about objects in the environment. The aim of using the Kinect sensor is to acquire human grasping skill and to map this natural motion to the robotic device. The novelties of the robot hand prototype beyond the state of the art are highlighted in the conclusion.
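
    For a tendon-driven hand like the one described, mapping captured human joint angles to actuator commands is commonly done with the standard linear moment-arm model, in which tendon excursion is the sum of joint rotations times per-joint moment arms. A minimal Python sketch follows; the moment-arm values and the single-tendon simplification are illustrative assumptions, not the Pupin hand's actual parameters.

    ```python
    # Hypothetical mapping from captured finger joint angles (e.g. from a
    # Kinect-based tracker) to the flexor tendon displacement of one finger,
    # using the linear moment-arm model: excursion = sum(r_i * theta_i).
    import math

    # assumed moment arms (metres) of the flexor tendon at the MCP, PIP, DIP joints
    MOMENT_ARMS = [0.010, 0.007, 0.005]

    def tendon_excursion(joint_angles_deg):
        """Tendon displacement (metres) required to realize the given joint
        angles, one angle per joint from proximal to distal."""
        return sum(r * math.radians(q)
                   for r, q in zip(MOMENT_ARMS, joint_angles_deg))

    # e.g. a fully flexed posture of 90 degrees at every joint:
    pull = tendon_excursion([90, 90, 90])
    ```

    In a full controller this per-finger mapping would feed a tension or position loop on the remote actuators; the sketch shows only the kinematic step of translating captured human motion into tendon space.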

    Unsupervised decoding of long-term, naturalistic human neural recordings with automated video and audio annotations

    Fully automated decoding of human activities and intentions from direct neural recordings is a tantalizing challenge in brain-computer interfacing. Most ongoing efforts have focused on training decoders on specific, stereotyped tasks in laboratory settings. Implementing brain-computer interfaces (BCIs) in natural settings requires adaptive strategies and scalable algorithms that require minimal supervision. Here we propose an unsupervised approach to decoding neural states from human brain recordings acquired in a naturalistic context. We demonstrate our approach on continuous long-term electrocorticographic (ECoG) data recorded over many days from the brain surface of subjects in a hospital room, with simultaneous audio and video recordings. We first discovered clusters in high-dimensional ECoG recordings and then annotated coherent clusters using speech and movement labels extracted automatically from audio and video recordings. To our knowledge, this represents the first time techniques from computer vision and speech processing have been used for natural ECoG decoding. Our results show that our unsupervised approach can discover distinct behaviors from ECoG data, including moving, speaking and resting. We verify the accuracy of our approach by comparing it to manual annotations. By projecting the discovered cluster centers back onto the brain, this technique opens the door to automated functional brain mapping in natural settings.
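
    The two-stage pipeline the abstract describes can be sketched in a few lines of Python: first cluster unlabeled feature vectors (a small k-means stands in here for whatever clustering the authors used), then name each discovered cluster after the most frequent automated audio/video label among its samples. ECoG feature extraction and the automatic labeling are assumed to happen upstream and are not shown.

    ```python
    # Hypothetical sketch: unsupervised clustering of neural feature vectors,
    # followed by annotation of each cluster via majority vote over
    # automatically extracted behavior labels.
    import math
    import random
    from collections import Counter

    def _dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def kmeans(points, k, iters=50, seed=0):
        """Plain k-means; returns a cluster index for every point."""
        rng = random.Random(seed)
        centers = rng.sample(points, k)
        assign = [0] * len(points)
        for _ in range(iters):
            # assignment step: nearest center for each point
            assign = [min(range(k), key=lambda j: _dist(p, centers[j]))
                      for p in points]
            # update step: move each center to the mean of its members
            for j in range(k):
                members = [p for p, a in zip(points, assign) if a == j]
                if members:
                    centers[j] = [sum(c) / len(members) for c in zip(*members)]
        return assign

    def annotate_clusters(assign, auto_labels):
        """Name each cluster after the majority automated label of its samples."""
        return {c: Counter(l for a, l in zip(assign, auto_labels)
                           if a == c).most_common(1)[0][0]
                for c in set(assign)}
    ```

    The key property this illustrates is that no labels are used during clustering; the automated annotations enter only afterwards, to attach human-readable behavior names (e.g. "speaking", "resting") to clusters that were discovered from the neural data alone.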

    A Human-Robot Interaction Perspective on Assistive and Rehabilitation Robotics

    Assistive and rehabilitation devices are a promising and challenging field of recent robotics research. Motivated by societal needs such as aging populations, such devices can support motor functionality and subject training. The design, control, sensing, and assessment of the devices become more sophisticated due to a human in the loop. This paper gives a human–robot interaction perspective on current issues and opportunities in the field. On the topic of control and machine learning, approaches that support but do not distract subjects are reviewed. Options to provide sensory user feedback that are currently missing from robotic devices are outlined. Parallels between device acceptance and affective computing are made. Furthermore, requirements for functional assessment protocols that relate to real-world tasks are discussed. In all topic areas, the design of human-oriented frameworks and methods is dominated by challenges related to the close interaction between the human and the robotic device. This paper discusses the aforementioned aspects in order to open up new perspectives for future robotic solutions.