491 research outputs found

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees' quality of life using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has recently been introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree of Freedom (DoF) system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce a more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Next, I developed a vibrotactile system to implement haptic feedback that restores proprioception and creates a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation connecting the user with the external world. This closed-loop control between EMG and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that strongly impacts amputees' daily life. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1Score parameter (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. 
Among the several tested methods for Pattern Recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1Score (99%, robustness), while the minimum number of electrodes needed for its functioning was determined to be four in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed the subject to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). Such results demonstrated an improvement in the controllability of the system with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the robustness and efficiency of prosthetic control can be improved thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis is capable of restoring the lost sensory functionality, with promising implications for direct translation into clinical practice.
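The offline pipeline described in this abstract (multi-channel EMG, a pattern-recognition classifier, F1Score evaluation) can be illustrated with a minimal, hypothetical sketch. The windowed RMS features and scikit-learn's plain logistic regression, used here as a stand-in for the thesis's NLR, are assumptions for illustration, not the thesis code.

```python
# Minimal sketch of an EMG pattern-recognition pipeline in the spirit of the
# abstract above: windowed features from up to six EMG channels, a classifier,
# and offline macro-F1 evaluation. Feature choice (RMS) and the logistic
# regression stand-in for the thesis's NLR are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

FS = 300          # sampling frequency mentioned in the abstract (Hz)
WIN = 60          # 200 ms analysis window at 300 Hz (assumed window length)
N_CHANNELS = 6    # array of up to six EMG sensors

def rms_features(emg_windows):
    """Root-mean-square of each channel in each window -> (n_windows, n_channels)."""
    return np.sqrt(np.mean(emg_windows ** 2, axis=-1))

def offline_evaluation(emg_windows, labels):
    """emg_windows: (n_windows, N_CHANNELS, WIN) raw EMG; labels: movement class per window."""
    X = rms_features(emg_windows)
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, stratify=labels)
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_tr, y_tr)
    # Macro-averaged F1 score, as in the offline analyses reported above
    return f1_score(y_te, clf.predict(X_te), average="macro")
```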

    On the development of a cybernetic prosthetic hand

    The human hand is the end organ of the upper limb, which in humans serves the important function of prehension as well as being an important organ for sensation and communication. It is a marvellous example of how a complex mechanism can be implemented, capable of carrying out very complex and useful tasks through an effective combination of mechanisms, sensing, actuation, and control functions. In this thesis, the road towards the realization of a cybernetic hand has been presented. After a detailed analysis of the model, the human hand, an in-depth review of the state of the art of artificial hands has been carried out. In particular, the performance of prosthetic hands used in clinical practice has been compared with that of research prototypes, both for prosthetic and for robotic applications. By following a biomechatronic approach, i.e. by comparing the characteristics of these hands with the natural model, the human hand, the limitations of current artificial devices have been highlighted, thus outlining the design goals for a new cybernetic device. Three hand prototypes with a high number of degrees of freedom have been realized and tested: the first one uses microactuators embedded inside the structure of the fingers, while the second and third prototypes exploit the concept of microactuation in order to increase the dexterity of the hand while keeping the control simple. In particular, a framework for the definition and realization of closed-loop electromyographic control of these devices has been presented and implemented. The results were quite promising, suggesting that, in the future, there could be two different approaches to the realization of artificial devices. On one side, there could be EMG-controlled hands with compliant fingers but only one active degree of freedom; on the other, higher-performance artificial hands could be directly interfaced with the peripheral nervous system, thus establishing bidirectional communication with the human brain.

    JNER at 15 years: analysis of the state of neuroengineering and rehabilitation.

    On JNER's 15th anniversary, this editorial analyzes the state of the field of neuroengineering and rehabilitation. I first discuss some ways in which the nature of neurorehabilitation research has evolved in the past 15 years, based on my perspective as editor-in-chief of JNER and a researcher in the field. I highlight the increasing reliance on advanced technologies, the improved rigor and openness of research, and three related new paradigms (wearable devices, the Cybathlon competition, and human augmentation studies), all indicators that neurorehabilitation is squarely in the age of wearability. Then, I briefly speculate on how the field might make progress going forward, highlighting the need for new models of training and learning driven by big data, better personalization and targeting, and an increase in the quantity and quality of usability and uptake studies to improve translation.

    The SmartHand transradial prosthesis

    Background. Prosthetic components and control interfaces for upper limb amputees have barely changed in the past 40 years. Many transradial prostheses have been developed in the past; nonetheless, most of them would be inappropriate if/when a large-bandwidth human-machine interface for control and perception became available, due to either their limited (or nonexistent) sensorization or limited dexterity. SmartHand tackles this issue, as it is meant to be clinically tested in amputees employing different neuro-interfaces, in order to investigate their effectiveness. This paper presents the design and on-bench evaluation of the SmartHand. Methods. The SmartHand design was bio-inspired in terms of its physical appearance, kinematics, sensorization, and multilevel control system. Underactuated fingers and differential mechanisms were designed and exploited in order to fit all mechatronic components within the size and weight of a natural human hand. Its sensory system was designed with the aim of delivering significant afferent information to the user through adequate interfaces. Results. The SmartHand is a five-fingered, self-contained robotic hand with 16 degrees of freedom, actuated by 4 motors. It integrates a bio-inspired sensory system composed of 40 proprioceptive and exteroceptive sensors and a customized embedded controller, both employed for implementing automatic grasp control and for potentially delivering sensory feedback to the amputee. It is able to perform everyday grasps, count, and independently point the index finger. Its weight (530 g) and speed (closing time: 1.5 s) are comparable to current commercial prostheses. It is able to lift a 10 kg suitcase; slippage tests showed that, within particular friction and geometric conditions, the hand is able to stably grasp cylindrical objects of up to 3.6 kg. Conclusions. Due to its unique embedded features and human size, the SmartHand holds promise to be experimentally fitted on transradial amputees and employed as a bi-directional instrument for investigating, during realistic experiments, different interfaces, control, and feedback strategies in neuro-engineering studies.
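As a rough illustration of the kind of automatic grasp control the abstract attributes to the embedded controller, the hypothetical sketch below closes a simple loop between tactile sensing and grip force: it holds a target fingertip force and raises that target when slip is detected. The sensor/actuator callables (read_force, read_slip, set_motor_current) and all gains are invented placeholders, not the SmartHand firmware.

```python
# Hypothetical automatic grasp-control loop: regulate fingertip force around a
# target and increase the target whenever slip is detected by the tactile sensors.
import time

TARGET_FORCE_N = 5.0   # initial fingertip force target (assumed value)
SLIP_GAIN = 1.2        # multiplicative force increase on each detected slip
MAX_FORCE_N = 30.0     # safety limit on the force target

def grasp_loop(read_force, read_slip, set_motor_current, period_s=0.01):
    target = TARGET_FORCE_N
    while True:
        force = read_force()                  # aggregate fingertip force [N]
        if read_slip():                       # e.g. vibration detected on the tactile sensors
            target = min(target * SLIP_GAIN, MAX_FORCE_N)
        error = target - force
        set_motor_current(0.05 * error)       # simple proportional regulation (gain is arbitrary)
        time.sleep(period_s)
```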

    Design of a cybernetic hand for perception and action

    Strong motivation for developing new prosthetic hand devices is provided by the fact that low functionality and controllability—in addition to poor cosmetic appearance—are the most important reasons why amputees do not regularly use their prosthetic hands. This paper presents the design of the CyberHand, a cybernetic anthropomorphic hand intended to provide amputees with functional hand replacement. Its design was bio-inspired in terms of its modular architecture, its physical appearance, kinematics, sensorization, and actuation, and its multilevel control system. Its underactuated mechanisms allow separate control of each digit as well as thumb–finger opposition and, accordingly, can generate a multitude of grasps. Its sensory system was designed to provide proprioceptive information as well as to emulate fundamental functional properties of human tactile mechanoreceptors of specific importance for grasp-and-hold tasks. The CyberHand control system presumes just a few efferent and afferent channels and was divided in two main layers: a high-level control that interprets the user’s intention (grasp selection and required force level) and can provide pertinent sensory feedback and a low-level control responsible for actuating specific grasps and applying the desired total force by taking advantage of the intelligent mechanics. The grasps made available by the high-level controller include those fundamental for activities of daily living: cylindrical, spherical, tridigital (tripod), and lateral grasps. The modular and flexible design of the CyberHand makes it suitable for incremental development of sensorization, interfacing, and control strategies and, as such, it will be a useful tool not only for clinical research but also for addressing neuroscientific hypotheses regarding sensorimotor control
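The two-layer split described above (a high-level controller interpreting the user's intention from a few efferent channels, and a low-level controller actuating specific grasps and applying the desired total force) can be sketched as follows. Class and method names, the decoding rule, and the hand interface are illustrative assumptions; the paper does not specify an API.

```python
# Illustrative sketch of a two-layer control split of the kind described above.
from dataclasses import dataclass
from enum import Enum, auto

class Grasp(Enum):
    CYLINDRICAL = auto()
    SPHERICAL = auto()
    TRIDIGITAL = auto()   # tripod
    LATERAL = auto()

@dataclass
class Intention:
    grasp: Grasp
    force_level: float    # desired total grasp force, normalised 0..1

class HighLevelController:
    """Interprets the user's few efferent channels as a grasp selection and force level."""
    def decode(self, efferent_signals: dict) -> Intention:
        # Placeholder decoding: one channel selects the grasp, another sets the force.
        grasp = Grasp.CYLINDRICAL if efferent_signals["select"] < 0.5 else Grasp.LATERAL
        return Intention(grasp=grasp, force_level=float(efferent_signals["force"]))

class LowLevelController:
    """Actuates the selected grasp and applies the desired total force."""
    def execute(self, intention: Intention, hand) -> None:
        hand.preshape(intention.grasp)                    # shape the underactuated digits
        hand.regulate_total_force(intention.force_level)  # close until the force target is met
```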

    Intrinsic somatosensory feedback supports motor control and learning to operate artificial body parts

    Objective. Considerable resources are being invested to enhance the control and usability of artificial limbs through the delivery of unnatural forms of somatosensory feedback. Here, we investigated whether intrinsic somatosensory information from the body part(s) remotely controlling an artificial limb can be leveraged by the motor system to support control and skill learning. Approach. In a placebo-controlled design, we used local anaesthetic to attenuate somatosensory inputs to the big toes while participants learned to operate, through pressure sensors, a toe-controlled, hand-worn robotic extra finger. Motor learning outcomes were compared against a control group who received sham anaesthetic and were quantified in three different task scenarios: while operating in isolation from, in synchronous coordination with, and in collaboration with, the biological fingers. Main results. Both groups were able to learn to operate the robotic extra finger, presumably due to the abundance of visual feedback and other relevant sensory cues. Importantly, the availability of displaced somatosensory cues from the distal bodily controllers facilitated the acquisition of isolated robotic finger movements, the retention and transfer of synchronous hand-robot coordination skills, and performance under cognitive load. Motor performance was not impaired by toe anaesthesia when tasks involved close collaboration with the biological fingers, indicating that the motor system can close the sensory feedback gap by dynamically integrating task-intrinsic somatosensory signals from multiple, and even distal, body parts. Significance. Together, our findings demonstrate that there are multiple natural avenues to provide intrinsic surrogate somatosensory information to support motor control of an artificial body part, beyond artificial stimulation.

    Electrocutaneous stimulation to close the loop in myoelectric prosthesis control

    Current commercially available prosthetic systems still lack sensory feedback, and amputees are forced to maintain eye contact with the prosthesis when interacting with their environment. Electrocutaneous stimulation is a promising approach to convey sensory feedback via the skin. However, when discussed in the context of prosthetic applications, it is often rejected due to its supposed incompatibility with myocontrol. This dissertation addresses electrocutaneous stimulation as a means to provide sensory feedback to prosthesis users, its implications for myoelectric control, its possible use for improved or accelerated mastery of prosthesis control through closing of the control loop, and its potential to aid the embodiment of prosthetic components. First, a comparison of different paradigms for encoding sensory feedback variables in electrocutaneous stimulation patterns was carried out. For this, subjects' ability to employ spatially and intensity-coded electrocutaneous feedback in a simulated closed-loop control task was evaluated. The task was to stabilise an invisible virtual inverted pendulum under ideal feedforward control conditions (joystick). Pendulum inclination was either presented spatially (12 stimulation sites), encoded by stimulation strength (≥ 2 stimulation sites), or by a combination of the two. The tests indicated that spatial encoding was perceived as more intuitive, but intensity encoding yielded better performance and lower energy expenditure. The second study investigated the detrimental influence of stimulation artefacts on myoelectric control of prostheses for a wide range of stimulation parameters and two prosthesis control approaches (pattern recognition of eight motion primitives, direct proportional control). Artefact blanking is introduced and discussed as a practical approach to handle stimulation artefacts and restore control performance back to the baseline. This was shown with virtual and applied artefact blanking (pattern recognition on six electromyographic channels), as well as in a practical task-related test with a real prosthesis (proportional control). Another study investigated the information transfer of sensory feedback necessary to master a routine grasping task using electromyographic control of a prosthesis. Subjects controlled a real prosthesis to repeatedly grasp a dummy object, which emulated two different objects with previously unknown slip and fragility properties. Three feedback conditions (basic feedback on grasp success, visual grasp force feedback, tactile grasp force feedback) were compared with regard to their influence on subjects' task performance and variability in exerted grasp force. It was found that online force feedback via a visual or tactile channel did not add significant advantages, and that basic feedback was sufficient and was employed by subjects to improve both performance and force variability over time. Importantly, there was no adverse effect of the additional feedback either. This has important implications for other, non-functional applications of sensory feedback, such as facilitation of embodiment of prosthetic devices. The final study investigated the impact of electrocutaneous stimulation on embodiment of an artificial limb. For this purpose, a sensor finger was employed in a rubber-hand-illusion-like experiment. Two independent groups (test, control) were compared with regard to two objective measures of embodiment: proprioceptive drift and change in skin temperature. 
Though proprioceptive drift measures did not reveal differences between conditions, they indicated trends generally associated with a successful illusion. Additionally, significant differences in skin temperature between the test and control groups indicated that embodiment of the artificial digit could be induced by providing sensory substitution feedback on the forearm. In conclusion, it has been shown that humans can employ electrocutaneous stimulation feedback in challenging closed-loop control tasks. It was found that the transition from simple, intuitive encodings (spatial) to those providing better resolution (intensity) further improves feedback exploitation. Blanking and segmentation approaches facilitate the simultaneous application of electrocutaneous stimulation and electromyographic control of prostheses, using both pattern recognition and classic proportional approaches. While it was found that force feedback may not aid in the mastering of routine grasping, the presence of the feedback was also found not to impede user performance. This is an important implication for the application of feedback for non-functional purposes, such as facilitation of embodiment. Regarding this, it was shown that providing sensory feedback via electrocutaneous stimulation did indeed promote embodiment of an artificial finger, even when the feedback was applied to the forearm. Based on the results of this work, the next step should be the integration of sensory feedback into commercial devices, so that all amputees can benefit from its advantages. Electrocutaneous stimulation has been shown to be an ideal means for realising this, and hitherto existing concerns about the compatibility of electrocutaneous stimulation and myocontrol can be resolved with the appropriate methods presented here for dealing with stimulation artefacts.
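Artefact blanking, named above as the practical way to reconcile electrocutaneous stimulation with myoelectric control, can be sketched in a few lines, assuming the stimulation pulse times are known to the controller. The window length and the sample-and-hold strategy below are assumptions for illustration, not the dissertation's implementation.

```python
# Minimal sketch of artefact blanking: EMG samples inside a short window after
# each known stimulation pulse are replaced (held at the last clean value)
# before feature extraction and myoelectric decoding.
import numpy as np

def blank_artifacts(emg, pulse_indices, fs=1000, blank_ms=5.0):
    """emg: (n_samples,) or (n_samples, n_channels); pulse_indices: sample indices of stimuli."""
    emg = np.array(emg, dtype=float, copy=True)
    blank_len = max(1, int(round(blank_ms * 1e-3 * fs)))
    for p in pulse_indices:
        start = max(p, 1)                    # keep at least one clean sample to hold
        stop = min(p + blank_len, len(emg))
        emg[start:stop] = emg[start - 1]     # sample-and-hold over the blanked span
    return emg
```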