
    Human-centered Electric Prosthetic (HELP) Hand

    Through a partnership with the Indian non-profit Bhagwan Mahaveer Viklang Sahayata Samiti, we designed a functional, robust, and low-cost electrically powered prosthetic hand that communicates with unilateral, transradial, urban Indian amputees through a biointerface. The device combines compliant tendon actuation, a small linear servo, and a wearable garment outfitted with flex sensors to produce a device that, once placed inside a prosthetic glove, is anthropomorphic in both look and feel. The prosthesis was developed so that future groups can design for manufacturing and distribution in India.
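The flex-sensor-to-actuator pipeline described above reduces, at its core, to mapping a bend reading onto a servo command. The sketch below illustrates one such mapping; the ADC range, servo limits, and function name are illustrative assumptions, not values from the paper:

```python
def flex_to_servo(raw, raw_min=200, raw_max=800, servo_min=0.0, servo_max=180.0):
    """Map a raw flex-sensor ADC reading onto a servo angle.

    raw_min/raw_max are assumed calibration endpoints (relaxed vs. fully
    bent sensor); readings outside the range are clamped before scaling.
    """
    raw = max(raw_min, min(raw_max, raw))  # clamp noisy readings
    span = raw_max - raw_min
    return servo_min + (raw - raw_min) * (servo_max - servo_min) / span
```

A relaxed sensor (reading 200) would command the open position (0 degrees), a fully bent one (800) full closure (180 degrees), with proportional positions in between.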

    On Neuromechanical Approaches for the Study of Biological Grasp and Manipulation

    Biological and robotic grasp and manipulation are undeniably similar at the level of mechanical task performance. However, their underlying fundamental biological vs. engineering mechanisms are, by definition, dramatically different and can even be antithetical. Even our approach to each is diametrically opposite: inductive science for the study of biological systems vs. engineering synthesis for the design and construction of robotic systems. The past 20 years have seen several conceptual advances in both fields and the quest to unify them. Chief among them is the reluctant recognition that their underlying fundamental mechanisms may actually share limited common ground, while exhibiting many fundamental differences. This recognition is particularly liberating because it allows us to resolve and move beyond multiple paradoxes and contradictions that arose from the initial reasonable assumption of a large common ground. Here, we begin by introducing the perspective of neuromechanics, which emphasizes that real-world behavior emerges from the intimate interactions among the physical structure of the system, the mechanical requirements of a task, the feasible neural control actions to produce it, and the ability of the neuromuscular system to adapt through interactions with the environment. This allows us to articulate a succinct overview of a few salient conceptual paradoxes and contradictions regarding under-determined vs. over-determined mechanics, under- vs. over-actuated control, prescribed vs. emergent function, learning vs. implementation vs. adaptation, prescriptive vs. descriptive synergies, and optimal vs. habitual performance. We conclude by presenting open questions and suggesting directions for future research. We hope this frank assessment of the state of the art will encourage and guide these communities to continue to interact and make progress in these important areas.

    A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss.

    Many upper limb amputees experience an incessant, post-amputation "phantom limb pain" and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech "rubber hand" illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion, and new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds, such as the "BairClaw" presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger-object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions, while temperature, vibration, and skin deformation are measured by a multimodal tactile sensor. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain and increased prosthesis use due to improved functionality and reduced cognitive burden.

    Anthropomorphic transradial myoelectric hand using tendon-spring mechanism

    In developing countries, the need for prosthetic hands is increasing. In general, transradial amputees use passive or body-powered prostheses. This research proposes a low-cost myoelectric prosthetic hand based on 3D-printing technology. Hand and finger dimensions were designed based on the average size of human hands in Indonesia. The proposed myoelectric hand employs a linear actuator combined with a tendon-spring mechanism. The hand was developed with five grip patterns to perform various object-grasping tasks in activities of daily living. A control strategy was developed for controlling flexion and extension of the hand and for saving the energy consumed by the actuators. The control strategy was developed in the MATLAB/Simulink environment and embedded on an Arduino Nano V3 using the Simulink Support Package for Arduino Hardware. A surface electromyography (EMG) sensor was used to read the muscle activity of the wearer. The proposed myoelectric hand was tested in an object-grasping test and was implemented on a study participant with a transradial amputation.
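An EMG-driven flexion/extension controller of the kind described above is typically built from an envelope estimate plus thresholding with hysteresis, which also avoids actuator chatter and saves energy. The sketch below assumes that structure; the RMS window, threshold values, and function names are hypothetical, and the original work runs as a Simulink model deployed to an Arduino Nano rather than as Python:

```python
import math

def emg_envelope(samples, window=50):
    """Moving-RMS envelope of a raw EMG sample buffer (assumed window size)."""
    out = []
    for i in range(len(samples)):
        w = samples[max(0, i - window + 1): i + 1]
        out.append(math.sqrt(sum(x * x for x in w) / len(w)))
    return out

def grip_command(envelope_value, close_thresh=0.30, open_thresh=0.10):
    """Hysteresis thresholding: flex above one level, extend below another,
    hold in between so the actuator does not chatter (saving energy)."""
    if envelope_value > close_thresh:
        return "flex"
    if envelope_value < open_thresh:
        return "extend"
    return "hold"
```

Switching among the five grip patterns could then be layered on top, e.g. by counting brief co-contraction pulses, but that logic is not shown here.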

    Autonomy Infused Teleoperation with Application to BCI Manipulation

    Robot teleoperation systems face a common set of challenges including latency, low-dimensional user commands, and asymmetric control inputs. User control with Brain-Computer Interfaces (BCIs) exacerbates these problems through especially noisy and erratic low-dimensional motion commands due to the difficulty in decoding neural activity. We introduce a general framework to address these challenges through a combination of computer vision, user intent inference, and arbitration between the human input and autonomous control schemes. Adjustable levels of assistance allow the system to balance the operator's capabilities and feelings of comfort and control while compensating for a task's difficulty. We present experimental results demonstrating significant performance improvement using the shared-control assistance framework on adapted rehabilitation benchmarks with two subjects implanted with intracortical brain-computer interfaces controlling a seven degree-of-freedom robotic manipulator as a prosthetic. Our results further indicate that shared assistance mitigates perceived user difficulty and even enables successful performance on previously infeasible tasks. We showcase the extensibility of our architecture with applications to quality-of-life tasks such as opening a door, pouring liquids from containers, and manipulation with novel objects in densely cluttered environments.
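Arbitration between human input and an autonomous policy, with an adjustable assistance level, is commonly realized as a linear blend of velocity commands. The following is a minimal sketch of that idea only; the function name, vector layout, and blending rule are illustrative assumptions, not the paper's actual arbitration scheme:

```python
def arbitrate(user_cmd, auto_cmd, assistance):
    """Blend user and autonomous velocity commands elementwise.

    assistance in [0, 1]: 0 = pure teleoperation, 1 = full autonomy.
    Values outside the range are clamped.
    """
    a = max(0.0, min(1.0, assistance))
    return [a * ua + (1.0 - a) * uu for uu, ua in zip(user_cmd, auto_cmd)]

user = [1.0, 0.0, 0.0]  # noisy BCI-decoded velocity (hypothetical)
auto = [0.0, 1.0, 0.0]  # intent-inferred velocity toward the goal (hypothetical)
blended = arbitrate(user, auto, 0.5)  # equal weighting of the two commands
```

Raising the assistance level pulls the executed motion toward the autonomous estimate, which is how such systems compensate for noisy, erratic BCI commands on difficult tasks.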

    Biosignal‐based human–machine interfaces for assistance and rehabilitation: a survey

    As a definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey aims to review the large literature of the last two decades regarding biosignal‐based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were further screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application by considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever‐growing number of publications has been observed over the last years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade. In contrast, studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has experienced a considerable rise, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance. However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.

    Human-activity-centered measurement system: challenges from laboratory to the real environment in assistive gait wearable robotics

    Assistive gait wearable robots (AGWR) represent a great advancement in developing intelligent devices to assist humans in their activities of daily living (ADLs). Rapid technological progress in sensing, actuators, materials, and computational intelligence has sped up this development towards more practical and smart AGWR. However, most assistive gait wearable robots are still controlled and assessed indoors, within laboratory environments, limiting their potential to provide the real assistance and rehabilitation that users need in real environments. Gait assessment parameters play an important role not only in evaluating patient progress and assistive-device performance but also in controlling smart, self-adaptable AGWR in real time. Self-adaptable wearable robots must interactively conform to changing environments and to different users in order to provide optimal functionality and comfort. This paper discusses the performance parameters, such as comfort, safety, adaptability, and energy consumption, which are required for the development of an intelligent AGWR for outdoor environments. The challenges of measuring these parameters with current data-collection and analysis systems based on vision capture and wearable sensors are presented and discussed.

    The "Federica" hand: a simple, very efficient prothesis

    Hand prostheses partially restore hand appearance and functionalities. Not everyone can afford expensive prostheses, and many low-cost prostheses have been proposed. In particular, 3D printers have provided great opportunities by simplifying the manufacturing process and reducing costs. Generally, active prostheses use multiple motors for finger movement and are controlled by electromyographic (EMG) signals. The "Federica" hand is a single-motor prosthesis, equipped with an adaptive grasp and controlled by a force-myographic signal. The "Federica" hand is 3D printed and has an anthropomorphic morphology with five fingers, each consisting of three phalanges. The movement generated by a single servomotor is transmitted to the fingers by inextensible tendons that form a closed chain; practically, no springs are used for passive hand opening. A differential mechanical system simultaneously distributes the motor force in predefined portions to each finger, regardless of their actual positions. Proportional control of hand closure is achieved by measuring the contraction of residual limb muscles by means of a force sensor, replacing the EMG. The electrical current of the servomotor is monitored to provide the user with sensory feedback of the grip force, through a small vibration motor. A simple Arduino board was adopted as the processing unit. The differential mechanism guarantees an efficient transfer of mechanical energy from the motor to the fingers and a secure grasp of any object, regardless of its shape and deformability. The force sensor, being extremely thin, can be easily embedded into the prosthesis socket and positioned on both muscles and tendons; it offers some advantages over the EMG as it does not require any electrical contact or signal processing to extract information about the muscle contraction intensity. The grip speed is high enough to allow the user to grab objects on the fly: from the muscle trigger to complete hand closure, "Federica" takes about half a second. The cost of the device is about US$100. Preliminary tests carried out on a patient with a transcarpal amputation showed high performance in controlling the prosthesis after a very rapid training session. The "Federica" hand turned out to be a lightweight, low-cost, and extremely efficient prosthesis. The project is intended to be open-source: all the information needed to produce the prosthesis (e.g. CAD files, circuit schematics, software) can be downloaded from a public repository, allowing everyone to use the "Federica" hand and customize or improve it.
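The two control ideas above, proportional closure from a force-myographic sensor and vibration feedback scaled from servo current, can be sketched as two small mappings. Sensor ranges, current limits, and function names are assumptions for illustration; the actual firmware runs on an Arduino and its exact constants are not given in the abstract:

```python
def servo_angle(fmg_reading, rest=0.05, full=0.80, open_deg=0.0, closed_deg=170.0):
    """Proportional closure: map a normalized force-sensor contraction level
    onto a servo angle, clamped between fully open and fully closed."""
    level = (fmg_reading - rest) / (full - rest)
    level = max(0.0, min(1.0, level))
    return open_deg + level * (closed_deg - open_deg)

def vibration_duty(motor_current_a, max_current_a=2.0):
    """Grip-force feedback: scale measured servo current into a
    vibration-motor duty cycle in [0, 1]."""
    return max(0.0, min(1.0, motor_current_a / max_current_a))
```

A harder grasp draws more servo current, so the wearer feels stronger vibration, closing the sensory loop without any EMG electrodes.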

    On the development of a cybernetic prosthetic hand

    The human hand is the end organ of the upper limb, which in humans serves the important function of prehension, as well as being an important organ for sensation and communication. It is a marvellous example of how a complex mechanism can be implemented, capable of realizing very complex and useful tasks using a very effective combination of mechanisms, sensing, actuation, and control functions. In this thesis, the road towards the realization of a cybernetic hand is presented. After a detailed analysis of the model (the human hand), an in-depth review of the state of the art of artificial hands was carried out. In particular, the performance of prosthetic hands used in clinical practice was compared with that of research prototypes, both for prosthetic and for robotic applications. By following a biomechatronic approach, i.e. by comparing the characteristics of these hands with the natural model, the limitations of current artificial devices are highlighted, thus outlining the design goals for a new cybernetic device. Three hand prototypes with a high number of degrees of freedom were realized and tested: the first uses microactuators embedded inside the structure of the fingers, while the second and third prototypes exploit the concept of microactuation to increase the dexterity of the hand while keeping the control simple. In particular, a framework for the definition and realization of closed-loop electromyographic control of these devices has been presented and implemented. The results were quite promising, suggesting that, in the future, there could be two different approaches to the realization of artificial devices. On one side there could be EMG-controlled hands, with compliant fingers but only one active degree of freedom. On the other side, higher-performance artificial hands could be directly interfaced with the peripheral nervous system, thus establishing bi-directional communication with the human brain.