
    Autonomy Infused Teleoperation with Application to BCI Manipulation

    Robot teleoperation systems face a common set of challenges including latency, low-dimensional user commands, and asymmetric control inputs. User control with Brain-Computer Interfaces (BCIs) exacerbates these problems through especially noisy and erratic low-dimensional motion commands due to the difficulty in decoding neural activity. We introduce a general framework to address these challenges through a combination of computer vision, user intent inference, and arbitration between the human input and autonomous control schemes. Adjustable levels of assistance allow the system to balance the operator's capabilities and feelings of comfort and control while compensating for a task's difficulty. We present experimental results demonstrating significant performance improvement using the shared-control assistance framework on adapted rehabilitation benchmarks with two subjects implanted with intracortical brain-computer interfaces controlling a seven degree-of-freedom robotic manipulator as a prosthetic. Our results further indicate that shared assistance mitigates perceived user difficulty and even enables successful performance on previously infeasible tasks. We showcase the extensibility of our architecture with applications to quality-of-life tasks such as opening a door, pouring liquids from containers, and manipulation with novel objects in densely cluttered environments
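
    The arbitration between human input and autonomous control described above can be illustrated with a minimal sketch. The snippet below assumes a simple linear blending policy with an adjustable assistance level; the function and variable names are illustrative and not taken from the paper's actual framework.

```python
import numpy as np

def arbitrate(user_cmd, auto_cmd, assistance_level):
    """Blend a noisy, low-dimensional user command with an autonomous policy
    command. assistance_level in [0, 1]: 0 = full user control, 1 = full autonomy."""
    alpha = float(np.clip(assistance_level, 0.0, 1.0))
    return (1.0 - alpha) * np.asarray(user_cmd) + alpha * np.asarray(auto_cmd)

# Example: a 3-D end-effector velocity decoded from the BCI, blended with a
# goal-directed velocity computed by an autonomous planner toward an inferred target.
user_velocity = np.array([0.05, -0.02, 0.00])   # decoded command (m/s), typically noisy
auto_velocity = np.array([0.04,  0.01, 0.03])   # planner output toward the target (m/s)
blended = arbitrate(user_velocity, auto_velocity, assistance_level=0.6)
```

    Raising the assistance level shifts authority toward the autonomous policy, which is how such a scheme can trade off the operator's feeling of control against task difficulty.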

    Assessment of Myoelectric Controller Performance and Kinematic Behavior of a Novel Soft Synergy-Inspired Robotic Hand for Prosthetic Applications

    Myoelectric artificial limbs can significantly advance the state of the art in prosthetics, since they can be used to control mechatronic devices through muscular activity in a way that mimics how the subjects used to activate their muscles before limb loss. However, surveys indicate that dissatisfaction with the functionality of terminal devices underlies the widespread abandonment of prostheses. We believe that one key factor to improve acceptability of prosthetic devices is to attain human likeness of prosthesis movements, a goal which is being pursued by research on social and human-robot interactions. Therefore, to reduce early abandonment of terminal devices, we propose that controllers should be designed so as to ensure effective task accomplishment in a natural fashion. In this work, we have analyzed and compared the performance of three types of myoelectric controller algorithms based on surface electromyography to control an underactuated and multi-degrees of freedom prosthetic hand, the SoftHand Pro. The goal of the present study was to identify the myoelectric algorithm that best mimics the native hand movements. As a preliminary step, we first quantified the repeatability of the SoftHand Pro finger movements and identified the electromyographic recording sites for able-bodied individuals with the highest signal-to-noise ratio from two pairs of muscles, i.e., flexor digitorum superficialis/extensor digitorum communis, and flexor carpi radialis/extensor carpi ulnaris. Able-bodied volunteers were then asked to execute reach-to-grasp movements, while electromyography signals were recorded from flexor digitorum superficialis/extensor digitorum communis as this was identified as the muscle pair characterized by high signal-to-noise ratio and intuitive control. Subsequently, we tested three myoelectric controllers that mapped electromyography signals to position of the SoftHand Pro. We found that a differential electromyography-to-position mapping ensured the highest coherence with hand movements. Our results represent a first step toward a more effective and intuitive control of myoelectric hand prostheses
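
    A rough idea of the winning controller can be given in code. The sketch below assumes that a "differential electromyography-to-position mapping" integrates the difference between the flexor and extensor EMG envelopes into a hand-closure command (a form of velocity control); the exact mapping used in the study may differ, and all names and numbers are illustrative.

```python
import numpy as np

def differential_emg_to_position(flexor_env, extensor_env, gain=0.5, dt=0.01,
                                 q_min=0.0, q_max=1.0, q0=0.0):
    """Integrate the difference of two antagonist EMG envelopes (normalized 0-1)
    into a hand-closure position bounded to [q_min, q_max]."""
    q = q0
    trajectory = []
    for f, e in zip(flexor_env, extensor_env):
        q += gain * (f - e) * dt          # flexor activity closes, extensor opens
        q = float(np.clip(q, q_min, q_max))
        trajectory.append(q)
    return np.array(trajectory)

# Synthetic demo: a flexor burst closes the hand, a later extensor burst reopens it.
t = np.arange(0.0, 4.0, 0.01)
flexor = np.where((t > 0.5) & (t < 1.5), 0.8, 0.05)
extensor = np.where((t > 2.5) & (t < 3.5), 0.8, 0.05)
closure = differential_emg_to_position(flexor, extensor)
```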

    Robotic Platforms for Assistance to People with Disabilities

    People with congenital and/or acquired disabilities constitute a great number of dependents today. Robotic platforms to help people with disabilities are being developed with the aim of providing both rehabilitation treatment and assistance to improve their quality of life. A high demand for robotic platforms that provide assistance during rehabilitation is expected because of the health status of the world due to the COVID-19 pandemic. The pandemic has resulted in countries facing major challenges to ensure the health and autonomy of their disabled population. Robotic platforms are necessary to ensure assistance and rehabilitation for disabled people in the current global situation. The capacity of robotic platforms in this area must be continuously improved to benefit the healthcare sector in terms of chronic disease prevention, assistance, and autonomy. For this reason, research about human–robot interaction in these robotic assistance environments must grow and advance because this topic demands sensitive and intelligent robotic platforms that are equipped with complex sensory systems, high handling functionalities, safe control strategies, and intelligent computer vision algorithms. This Special Issue has published eight papers covering recent advances in the field of robotic platforms to assist disabled people in daily or clinical environments. The papers address innovative solutions in this field, including affordable assistive robotics devices, new techniques in computer vision for intelligent and safe human–robot interaction, and advances in mobile manipulators for assistive tasks

    Decoding social intentions in human prehensile actions: Insights from a combined kinematics-fMRI study

    Consistent evidence suggests that the way we reach and grasp an object is modulated not only by object properties (e.g., size, shape, texture, fragility and weight), but also by the types of intention driving the action, among which the intention to interact with another agent (i.e., social intention). Action observation studies ascribe the neural substrate of this 'intentional' component to the putative mirror neuron system (pMNS) and the mentalizing system (MS). How social intentions are translated into executed actions, however, has yet to be addressed. We conducted a kinematic and a functional Magnetic Resonance Imaging (fMRI) study considering a reach-to-grasp movement performed towards the same object positioned at the same location but with different intentions: passing it to another person (social condition) or putting it on a concave base (individual condition). Kinematics showed that individual and social intentions are characterized by different profiles, with a slower movement at the level of both the reaching (i.e., arm movement) and the grasping (i.e., hand aperture) components. fMRI results showed that: (i) distinct voxel pattern activity for the social and the individual conditions is present within the pMNS and the MS during action execution; (ii) decoding accuracies of regions belonging to the pMNS and the MS are correlated, suggesting that these two systems could interact for the generation of appropriate motor commands. Results are discussed in terms of motor simulation and inferential processes as part of a hierarchical generative model for action intention understanding and generation of appropriate motor commands.
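
    The pattern-decoding results in (i) and (ii) can be sketched in code. The snippet below is a generic multivoxel pattern analysis (MVPA) example, not the authors' pipeline: a cross-validated linear classifier decodes the social vs. individual condition from single-trial voxel patterns of one region, and such per-region accuracies can then be correlated across subjects.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

def roi_decoding_accuracy(patterns, labels, folds=5):
    """Cross-validated accuracy of decoding the intention condition (social vs.
    individual) from single-trial voxel patterns of one region of interest.
    patterns: (n_trials, n_voxels); labels: (n_trials,) with values 0/1."""
    clf = LinearSVC(C=1.0, max_iter=10000)
    return cross_val_score(clf, patterns, labels, cv=folds).mean()

# Synthetic demo: 40 trials x 200 voxels per region, with a small condition effect.
rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 20)
pmns_patterns = rng.normal(size=(40, 200)) + labels[:, None] * 0.3
ms_patterns = rng.normal(size=(40, 200)) + labels[:, None] * 0.2
acc_pmns = roi_decoding_accuracy(pmns_patterns, labels)
acc_ms = roi_decoding_accuracy(ms_patterns, labels)
# Collected over subjects, such per-region accuracies can be correlated
# (e.g., with scipy.stats.pearsonr) to test whether pMNS and MS decoding co-vary.
```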

    Tongue Control of Upper-Limb Exoskeletons For Individuals With Tetraplegia


    Proprioceptive identification of joint position versus kinaesthetic movement reproduction

    Regarding our voluntary control of movement, whether identification of joint position, which is independent of the starting condition, is stronger than kinaesthetic movement reproduction, which requires knowledge of the starting position and of the movement's length for accuracy, is still a matter of debate in motor control theories and neuroscience. In the present study, we examined the mechanisms that individuals seem to prefer/adopt when they locate spatial positions and code the amplitude of movements. We implemented a joint position matching task on a wrist robotic device: this task consists of replicating (i.e., matching) a reference joint angle in the absence of vision, and proprioceptive acuity is given by the goodness of such matching. Two experiments were carried out, implementing two different versions of the task, performed by two groups of 15 healthy participants. In the first experiment, blindfolded subjects were asked to perform matching movements towards a fixed target position, experienced with passive movements that started from different positions and had different lengths. In the second experiment, blindfolded subjects were requested to accurately match target positions that had a different location in space but were passively shown through movements of the same length. We found clear evidence of higher performance in terms of accuracy (0.42 ± 0.01 1/°) and precision (0.43 ± 0.01 1/°) in the first experiment, i.e., when matching positions, than in the second, where accuracy and precision were lower (0.36 ± 0.01 1/° and 0.35 ± 0.01 1/°, respectively). These results suggest a preference for proprioceptive identification of joint position over kinaesthetic movement reproduction.
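
    Since the abstract reports accuracy and precision in units of 1/°, a plausible reading is that they are the reciprocals of the constant and variable matching errors. The sketch below computes the metrics under that assumption; the study's exact definitions may differ, and the trial values are illustrative.

```python
import numpy as np

def matching_metrics(target_deg, matched_deg):
    """Joint position matching metrics in 1/deg, assuming:
    accuracy  = 1 / |mean signed error|   (reciprocal of the constant error)
    precision = 1 / std of the errors     (reciprocal of the variable error)"""
    errors = np.asarray(matched_deg, dtype=float) - np.asarray(target_deg, dtype=float)
    accuracy = 1.0 / abs(errors.mean())
    precision = 1.0 / errors.std(ddof=1)
    return accuracy, precision

# Illustrative trials: a blindfolded subject repeatedly matches a 20 deg wrist target.
targets = np.full(10, 20.0)
matches = np.array([22.1, 18.9, 21.5, 19.7, 23.0, 20.8, 18.5, 21.9, 19.2, 22.4])
acc, prec = matching_metrics(targets, matches)
```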

    How virtual and mechanical coupling impact bimanual tracking.

    Bilateral training systems aim to promote use of the paretic hand in individuals with hemiplegia. Although this is normally achieved using mechanical coupling (i.e., a physical connection between the hands), a virtual reality system relying on virtual coupling (i.e., through a shared virtual object) would be simpler to use and would prevent slacking. However, it is not clear whether different coupling modes differently impact task performance and effort distribution between the hands. We explored how 18 healthy right-handed participants changed their motor behaviors in response to the uninstructed addition of mechanical coupling, and of virtual coupling using a shared cursor mapped to the average of the hands' positions. In a second experiment, we then studied the impact of connection stiffness on performance, perception, and effort imbalance. The results indicated that both coupling types can induce the hands to actively contribute to the task. However, the task asymmetry introduced by using a cursor mapped to either the left or right hand only modulated the hands' contribution when they were not mechanically coupled. The tracking performance was similar for all coupling types, independent of the connection stiffness, although the mechanical coupling was preferred and induced the hands to move with greater correlation. These findings suggest that virtual coupling can induce the hands to actively contribute to a task in healthy participants without hindering their performance. Further investigation of the coupling types' impact on the performance and the hands' effort distribution in patients with hemiplegia could allow for the design of simpler training systems that promote the affected hand's use. NEW & NOTEWORTHY: We showed that the uninstructed addition of a virtual and/or a mechanical coupling can induce both hands to actively contribute in a continuous redundant bimanual tracking task without impacting performance. In addition, we showed that the task asymmetry can only alter the effort distribution when the hands are not connected, independent of the connection stiffness. Our findings suggest that virtual coupling could be used in the development of simpler VR-based training devices.
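
    The two coupling modes compared above can be summarized with a small sketch. It assumes the virtual coupling simply maps a shared cursor to the mean of the two hand positions and models the mechanical coupling as a linear spring of adjustable stiffness between the hands; the actual apparatus and rendering in the study may differ.

```python
import numpy as np

def virtual_cursor(left_pos, right_pos):
    """Virtual coupling: one shared cursor at the average of the two hand positions."""
    return 0.5 * (np.asarray(left_pos, dtype=float) + np.asarray(right_pos, dtype=float))

def mechanical_coupling_forces(left_pos, right_pos, stiffness, rest_length=0.0):
    """Mechanical coupling modeled as a linear spring between the hands; returns
    the force applied to each hand (equal magnitude, opposite direction)."""
    delta = np.asarray(right_pos, dtype=float) - np.asarray(left_pos, dtype=float)
    dist = np.linalg.norm(delta)
    if dist == 0.0:
        return np.zeros_like(delta), np.zeros_like(delta)
    direction = delta / dist
    magnitude = stiffness * (dist - rest_length)
    return magnitude * direction, -magnitude * direction   # on left hand, on right hand

# Example: hands 4 cm apart in 1-D, connected by a 200 N/m virtual spring.
left, right = np.array([0.00]), np.array([0.04])
cursor = virtual_cursor(left, right)                       # shared cursor at 0.02 m
f_left, f_right = mechanical_coupling_forces(left, right, stiffness=200.0)
```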

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees' quality of life when using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has been recently introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree of Freedom (DoF) system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system to implement haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation able to connect the user with the external world. This closed-loop control between EMG and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that can strongly impact amputees' daily lives. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected, in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1Score parameter (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results.
Among the several tested methods for Pattern Recognition, the Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1Score (99%, robustness), while the minimum number of electrodes needed for its functioning was determined to be 4 in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed the subject to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). Such results demonstrated an improvement in the controllability of the system with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the robustness and efficiency of prosthetic control can be improved thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis is capable of restoring the lost sensory functionality, with promising implications for direct translation into clinical practice.
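
    As a rough illustration of the pattern-recognition step, the sketch below realizes a "non-linear logistic regression" as logistic regression over a polynomial expansion of classic time-domain EMG features, evaluated offline with the F1 score; the thesis may implement NLR differently, and all data here are synthetic.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

def time_domain_features(window):
    """Per-channel time-domain EMG features: mean absolute value, waveform length,
    and variance. window: (n_samples, n_channels)."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    var = np.var(window, axis=0)
    return np.concatenate([mav, wl, var])

# One way to obtain a non-linear decision boundary with a logistic model:
# expand the features with second-order polynomial terms before the classifier.
nlr = make_pipeline(StandardScaler(),
                    PolynomialFeatures(degree=2, include_bias=False),
                    LogisticRegression(max_iter=2000))

# Synthetic demo: 6-channel EMG windows labeled with 4 hand/wrist movement classes.
rng = np.random.default_rng(0)
X = np.vstack([time_domain_features(rng.normal(scale=1.0 + c, size=(60, 6))) + c
               for c in range(4) for _ in range(50)])
y = np.repeat(np.arange(4), 50)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
nlr.fit(X_tr, y_tr)
offline_f1 = f1_score(y_te, nlr.predict(X_te), average="macro")
```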