68 research outputs found
Biosignal-based human-machine interfaces for assistance and rehabilitation: a survey
As a definition, a Human-Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey reviews the large literature of the last two decades on biosignal-based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were further screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotentials, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application by considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over the last years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade. In contrast, studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance.
However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.
Advancing the Functionality and Wearability of Robotic Hand Orthoses Towards Activities of Daily Living in Stroke Patients
Post-stroke rehabilitation is effective when a large number of motor repetitions are provided to patients. However, conventional physical therapy and traditional desktop-size robot-aided rehabilitation do not provide a sufficient number of repetitions due to cost and logistical barriers. Our vision is to realize a wearable and functional hand orthosis that could be used outside of controlled, clinical settings, thus allowing for more training repetitions. Furthermore, if such a device can prove effective for Activities of Daily Living (ADLs) while actively worn, this can incentivize patients to increase its use, further enhancing rehabilitative effects. However, in order to provide such clinical benefits, the device must be completely wearable without obtrusive features, and intuitive to control even for non-experts. In this thesis, we thus focus on wearability, functionality, and intuitive intent detection technology for a novel hand robot, and assess its performance when used both as a rehabilitative device and as an assistive tool.
A fully wearable device must deliver meaningful manipulation capability in a small and lightweight package. In this context, we investigate the capability of single-actuator devices to assist whole hand movement patterns through a network of exotendons. Our prototypes combine a single linear actuator (mounted on a forearm splint) with a network of exotendons (routed on the surface of a soft glove). We investigate two possible tendon network configurations: one that produces full finger extension (overcoming flexor spasticity) and one that combines proximal flexion with distal extension at each finger. In experiments with stroke survivors, we measure the force levels needed to overcome various levels of spasticity and to open the hand for grasping using the first of these configurations, and qualitatively demonstrate the ability to execute fingertip grasps using the second. Our results support the feasibility of developing future wearable devices able to assist a range of manipulation tasks.
In order to further improve the wearability of the device, we propose two designs that provide effective force transmission by increasing moment arms around finger joints. We evaluate the designs with geometric models and experiments using a 3D-printed artificial finger to find the force and joint angle characteristics of the suggested structures. We also perform clinical tests with stroke patients to demonstrate the feasibility of the designs. The testing supports the hypothesis that the proposed designs efficiently elicit extension of the digits in patients with spasticity as compared to existing baselines. With the suggested transmission designs, the device can deliver sufficient extension force even when the users have increased muscle tone due to fatigue.
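The geometric reasoning behind such transmission designs can be illustrated with a toy calculation: a tendon routed at some perpendicular offset from a finger joint's axis produces an extension torque equal to tendon tension times moment arm, so enlarging the moment arm lowers the tension the actuator must supply. The functions and numbers below are hypothetical illustrations, not measurements or models from the thesis.

```python
# Toy geometric model of tendon force transmission at a finger joint.
# All numbers are hypothetical illustrations, not measurements from the thesis.

def extension_torque(tension_n, moment_arm_m):
    """Joint torque (N*m) from a tendon pulling with tension_n newtons
    at perpendicular distance moment_arm_m meters from the joint axis."""
    return tension_n * moment_arm_m

def tension_required(target_torque_nm, moment_arm_m):
    """Tendon tension (N) needed to produce a target extension torque."""
    return target_torque_nm / moment_arm_m

# Doubling a hypothetical 10 mm moment arm halves the tension the
# actuator must supply for the same 0.5 N*m extension torque:
print(tension_required(0.5, 0.010))  # 50.0
print(tension_required(0.5, 0.020))  # 25.0
```

The same logic explains why a larger moment arm helps against spasticity: the actuator reaches a given extension torque at lower tendon tension, leaving headroom when muscle tone increases.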
The vision of an orthotic device used for ADLs can only be realized if the patients are able to operate the device themselves. However, the field is generally lacking effective methods by which the user can operate the device: such controls must be effective, intuitive, and robust to the wide range of possible impairment patterns. The variety of encountered upper limb impairment patterns in stroke patients means that a single sensing modality, such as electromyography, might not be sufficient to enable controls for a broad range of users. To address this significant gap, we introduce a multimodal sensing and interaction paradigm for an active hand orthosis. In our proof-of-concept implementation, EMG is complemented by other sensing modalities, such as finger bend and contact pressure sensors. We propose multimodal interaction methods that utilize this sensory data as input, and show they can enable tasks for stroke survivors who exhibit different impairment patterns.
We then assess the performance of the robotic orthosis in two possible roles: as a therapeutic tool that facilitates device-mediated hand exercises to recover neuromuscular function, or as an assistive device for use in everyday activities to aid functional use of the hand. Eleven chronic stroke patients (> 2 years post-stroke) with moderate muscle tone (Modified Ashworth Scale ≤ 2 in the upper extremity) engage in a month-long training protocol using the orthosis. Individuals are evaluated using standardized outcome measures, both with and without orthosis assistance. The results highlight the potential for wearable and user-driven robotic hand orthoses to extend the use and training of the affected upper limb after stroke.
The advances proposed in this thesis have the potential to enable robot-based hand rehabilitation during daily activities (as opposed to isolated hand exercises with limited upper limb engagement) and over extended periods of time, even in a patient's home environment. Numerous challenges must still be overcome in order to achieve this vision, related to design (compact devices with easier donning/doffing), control (robust yet intuitive intent inferral), and effectiveness (improved functionality in a wider range of metrics). However, if these challenges can be addressed, wearable robotic devices have the potential to greatly extend the use and training of the affected upper limb after stroke, and help improve the quality of life for a large patient population.
Soft Gloves: A Review on Recent Developments in Actuation, Sensing, Control and Applications
Interest in soft gloves, both robotic and haptic, has grown enormously over the past decade, due to their inherent compliance, which makes them particularly suitable for direct interaction with the human hand. Robotic soft gloves have been developed for hand rehabilitation, for ADL assistance, or sometimes for both. Haptic soft gloves may be applied in virtual reality (VR) applications, to give sensory feedback in combination with prostheses, or to control robots. This paper presents an updated review of the state of the art of soft gloves, with a particular focus on actuation, sensing, and control, combined with a detailed analysis of the devices according to their application field. The review is organized on two levels: a prospective review highlights the main trends in soft glove development and applications, and an analytical review performs an in-depth analysis of the technical solutions developed and implemented in the reviewed scientific research. Additional minor evaluations integrate the analysis, such as a synthetic investigation of the main results of the clinical studies and trials reported in the literature that involve soft gloves.
A Discrete-Time Algorithm for Stiffness Extraction from sEMG and Its Application in Antidisturbance Teleoperation
© 2016 Peidong Liang et al. We have developed a new discrete-time algorithm for stiffness extraction from muscle surface electromyography (sEMG) collected from a human operator's arms, and have applied it to antidisturbance control in robot teleoperation. The variation of arm stiffness is estimated from sEMG signals and transferred to a telerobot under variable impedance control to imitate human motor control behaviours, particularly for disturbance attenuation. In comparison to existing approaches to estimating stiffness from sEMG, the proposed algorithm reduces the nonlinear residual error effect, enhances robustness, and simplifies stiffness calibration. In order to extract a smooth stiffness envelope from the sEMG signals, two enveloping methods are employed in this paper: fast linear enveloping based on low-pass filtering and moving averaging, and an amplitude-modulation and frequency-modulation (AM-FM) method. Both methods have been incorporated into the proposed stiffness variation estimation algorithm and extensively tested. The test results show that stiffness variation extraction based on the two methods is sensitive and robust in attenuating disturbances. It could potentially be applied to teleoperation in hazardous surroundings or in human-robot physical cooperation scenarios.
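The "fast linear enveloping" step described above (rectification, low-pass filtering, and moving averaging) can be sketched in a few lines. The cutoff frequency and window length below are illustrative defaults, not the paper's parameters, and the first-order filter stands in for whatever filter the authors used.

```python
import math

def linear_envelope(emg, fs, cutoff_hz=2.0, window_s=0.1):
    """Return a smooth amplitude envelope of a raw sEMG signal.

    emg: list of raw samples; fs: sampling rate in Hz.
    cutoff_hz and window_s are illustrative defaults, not the paper's values.
    """
    # 1. Full-wave rectification
    rectified = [abs(x) for x in emg]

    # 2. First-order low-pass filter (discrete RC approximation)
    dt = 1.0 / fs
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    lowpassed = []
    y = rectified[0]
    for x in rectified:
        y += alpha * (x - y)
        lowpassed.append(y)

    # 3. Moving average over a short window
    n = max(1, int(window_s * fs))
    envelope, acc = [], 0.0
    for i, x in enumerate(lowpassed):
        acc += x
        if i >= n:
            acc -= lowpassed[i - n]
        envelope.append(acc / min(i + 1, n))
    return envelope
```

In practice one would likely use a zero-phase filter (e.g. `scipy.signal.filtfilt` with a Butterworth design) instead of this causal first-order sketch, at the cost of not being usable sample-by-sample in real time.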
Intuitive Human-Machine Interfaces for Non-Anthropomorphic Robotic Hands
As robots become more prevalent in our everyday lives, both in our workplaces and in our homes, it becomes increasingly likely that people who are not experts in robotics will be asked to interface with robotic devices. It is therefore important to develop robotic controls that are intuitive and easy for novices to use. Robotic hands, in particular, are very useful, but their high dimensionality makes creating intuitive human-machine interfaces for them complex. In this dissertation, we study the control of non-anthropomorphic robotic hands by non-roboticists in two contexts: collaborative manipulation and assistive robotics.
In the field of collaborative manipulation, the human and the robot work side by side as independent agents. Teleoperation allows the human to assist the robot when autonomous grasping is not able to deal sufficiently well with corner cases or cannot operate fast enough. Using the teleoperator's hand as an input device can provide an intuitive control method, but finding a mapping between a human hand and a non-anthropomorphic robot hand can be difficult, due to the hands' dissimilar kinematics. In this dissertation, we seek to create a mapping between the human hand and a fully actuated, non-anthropomorphic robot hand that is intuitive enough to enable effective real-time teleoperation, even for novice users.
We propose a low-dimensional and continuous teleoperation subspace which can be used as an intermediary for mapping between different hand pose spaces. We first propose the general concept of the subspace, its properties and the variables needed to map from the human hand to a robot hand. We then propose three ways to populate the teleoperation subspace mapping. Two of our mappings use a dataglove to harvest information about the user's hand. We define the mapping between joint space and teleoperation subspace with an empirical definition, which requires a person to define hand motions in an intuitive, hand-specific way, and with an algorithmic definition, which is kinematically independent, and uses objects to define the subspace. Our third mapping for the teleoperation subspace uses forearm electromyography (EMG) as a control input.
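As a heavily simplified illustration of routing through a shared low-dimensional subspace, the sketch below projects a hypothetical 4-joint human hand pose into a 2-D subspace and lifts that point into a 3-joint robot hand pose. The dimensions and matrix entries are invented for illustration; the dissertation defines the actual subspace empirically, algorithmically, or from EMG input.

```python
# Heavily simplified illustration of mapping through a shared low-dimensional
# teleoperation subspace. Dimensions and matrix entries are invented for
# illustration; the dissertation defines the actual subspace empirically
# or algorithmically.

def matvec(matrix, vec):
    return [sum(a * b for a, b in zip(row, vec)) for row in matrix]

# Hypothetical projection: human hand pose (4 joint angles) -> 2-D subspace.
HUMAN_TO_SUBSPACE = [
    [0.5, 0.5, 0.0, 0.0],  # e.g. an "aperture"-like coordinate
    [0.0, 0.0, 0.5, 0.5],  # e.g. a "curl"-like coordinate
]

# Hypothetical lift: 2-D subspace point -> robot hand pose (3 joint angles).
SUBSPACE_TO_ROBOT = [
    [1.0, 0.0],
    [0.0, 1.0],
    [0.5, 0.5],
]

def human_to_robot(human_pose):
    """Map a human hand pose to a robot pose via the shared subspace."""
    subspace_point = matvec(HUMAN_TO_SUBSPACE, human_pose)
    return matvec(SUBSPACE_TO_ROBOT, subspace_point)
```

The key property this sketch preserves is that the subspace acts as an intermediary: any input device (dataglove, EMG) that can produce a subspace point can drive any robot hand for which a lift is defined.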
Assistive orthotics is another area of robotics where human-machine interfaces are critical, since, in this field, the robot is attached to the hand of the human user. In this case, the goal is for the robot to assist the human with movements they would not otherwise be able to achieve. Orthotics can improve the quality of life of people who do not have full use of their hands. Human-machine interfaces for assistive hand orthotics that use EMG signals from the affected forearm as input are intuitive, and repeated use can strengthen the muscles of the user's affected arm. In this dissertation, we seek to create an EMG-based control for an orthotic device used by people who have had a stroke. We would like our control to enable functional motions when used in conjunction with an orthosis, and to be robust to changes in the input signal.
We propose a control for a wearable hand orthosis which uses an easy-to-don, commodity forearm EMG band. We develop a supervised algorithm to detect a user's intent to open and close their hand, and pair this algorithm with a training protocol that makes our intent detection robust to changes in the input signal. We show that this algorithm, when used in conjunction with an orthosis over several weeks, can improve distal function in users. Additionally, we propose two semi-supervised intent detection algorithms designed to keep our control robust to changes in the input data while reducing the length and frequency of our training protocol.
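A minimal stand-in for the supervised intent-detection idea: per-channel mean-absolute-value (MAV) features computed over a window of forearm EMG, classified with a nearest-centroid rule into "open" vs. "close". Both the feature and the classifier are common EMG baselines chosen here for illustration, not the dissertation's exact algorithm.

```python
# Minimal stand-in for supervised intent detection from forearm EMG:
# per-channel mean absolute value (MAV) features classified with a
# nearest-centroid rule. These are common EMG baselines chosen for
# illustration, not the dissertation's exact algorithm.

def mav_features(window):
    """window: list of per-sample channel tuples -> per-channel MAV."""
    n_channels = len(window[0])
    return [sum(abs(sample[c]) for sample in window) / len(window)
            for c in range(n_channels)]

def train_centroids(labeled_windows):
    """labeled_windows: dict mapping an intent label to a list of windows."""
    centroids = {}
    for label, windows in labeled_windows.items():
        feats = [mav_features(w) for w in windows]
        centroids[label] = [sum(f[c] for f in feats) / len(feats)
                            for c in range(len(feats[0]))]
    return centroids

def classify(window, centroids):
    """Return the label whose centroid is nearest to the window's features."""
    feat = mav_features(window)
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(feat, centroid))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))
```

Retraining the centroids on freshly labeled windows is one simple way to cope with the signal drift the abstract mentions; the semi-supervised variants it proposes aim to reduce how often such labeled retraining data is needed.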