15 research outputs found

    MUNDUS project : MUltimodal neuroprosthesis for daily upper limb support

    Background: MUNDUS is an assistive framework for recovering direct interaction capability of severely motor-impaired people based on arm reaching and hand functions. It aims at personalization, modularity, and maximization of the user’s direct involvement in assistive systems. To this end, MUNDUS exploits any residual control of the end-user and can be adapted to the level of severity or to the progression of the disease, allowing the user to voluntarily interact with the environment. MUNDUS target pathologies are high-level spinal cord injury (SCI) and neurodegenerative and genetic neuromuscular diseases, such as amyotrophic lateral sclerosis, Friedreich ataxia, and multiple sclerosis (MS). The system can alternatively be driven by residual voluntary muscular activation, head/eye motion, or brain signals. MUNDUS modularly combines an antigravity, lightweight and non-cumbersome exoskeleton, closed-loop controlled Neuromuscular Electrical Stimulation for arm and hand motion, and potentially a motorized hand orthosis for grasping interactive objects. Methods: The requirements and the interaction tasks were defined through a focus group with experts and a questionnaire with 36 potential end-users. Five end-users (3 SCI and 2 MS) tested the system in the configuration suited to their specific level of impairment. They performed two exemplary tasks: reaching different points in the working volume and drinking. Three experts rated the execution of each assisted sub-action on a 3-level score (from 0, unsuccessful, to 2, completely functional). Results: The functionality of all modules was successfully demonstrated. User intention was detected with 100% success. Averaging over all subjects and tasks, the lowest score was 1.13 ± 0.99, obtained for the release of the handle during the drinking task, while all other sub-actions achieved a mean value above 1.6. All users but one subjectively perceived the usefulness of the assistance and could easily control the system. Donning time ranged from 6 to 65 minutes, scaling with the configuration complexity. Conclusions: The MUNDUS platform provides functional assistance for daily life activities; the integration of modules depends on the user’s needs, the functionality of the system has been demonstrated for all possible configurations, and a preliminary assessment of usability and acceptance is promising.
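    As an illustration only, the sketch below shows how such a modular configuration step might be expressed in code; the module names, user-profile fields, and selection logic are hypothetical assumptions and are not taken from the MUNDUS implementation.

```python
# Illustrative sketch only: module and signal names are hypothetical,
# not the actual MUNDUS software interfaces described in the paper.
from dataclasses import dataclass
from enum import Enum, auto


class ControlSource(Enum):
    RESIDUAL_EMG = auto()   # residual voluntary muscular activation
    HEAD_EYE = auto()       # head/eye motion tracking
    BRAIN = auto()          # brain signals (BCI)


@dataclass
class UserProfile:
    has_residual_emg: bool
    has_head_eye_control: bool
    can_grasp: bool


def configure_platform(user: UserProfile) -> dict:
    """Pick a control source and actuation modules for a user's impairment level."""
    # Prefer the most direct residual capability the user still has.
    if user.has_residual_emg:
        source = ControlSource.RESIDUAL_EMG
    elif user.has_head_eye_control:
        source = ControlSource.HEAD_EYE
    else:
        source = ControlSource.BRAIN

    modules = ["antigravity_exoskeleton", "closed_loop_NMES_arm"]
    if not user.can_grasp:
        # Hand function assisted by NMES and/or a motorized hand orthosis.
        modules.append("hand_orthosis")
    return {"control_source": source, "modules": modules}


if __name__ == "__main__":
    print(configure_platform(UserProfile(False, True, False)))
```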

    A review on design of upper limb exoskeletons


    Down-Conditioning of Soleus Reflex Activity using Mechanical Stimuli and EMG Biofeedback

    Spasticity is a common syndrome caused by various brain and neural injuries, which can severely impair walking ability and functional independence. To improve functional independence, conditioning protocols are available that aim to reduce spasticity by facilitating spinal neuroplasticity. This down-conditioning can be performed using different types of stimuli, electrical or mechanical, and different reflex activity measures, EMG or impedance, used as the biofeedback variable. Still, current results on the effectiveness of these conditioning protocols are incomplete, making comparisons difficult. We aimed to show the within-session task-dependent and across-session long-term adaptation of a conditioning protocol based on mechanical stimuli and EMG biofeedback. However, in contrast to the literature, preliminary results show that subjects were unable to successfully obtain task-dependent modulation of their soleus short-latency stretch reflex magnitude.
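    For illustration, a minimal sketch of the EMG-biofeedback logic such a protocol relies on is given below; the short-latency window bounds, the 0.9 reward criterion, and the function names are assumed placeholders, not the parameters used in this study.

```python
# Minimal sketch of EMG-based down-conditioning feedback, assuming a soleus EMG
# trace sampled at `fs` Hz; window bounds and the criterion are illustrative.
import numpy as np


def reflex_magnitude(emg: np.ndarray, stretch_onset_s: float, fs: float,
                     win_start_s: float = 0.030, win_end_s: float = 0.060) -> float:
    """Mean rectified EMG in a short-latency window after stretch onset."""
    i0 = int((stretch_onset_s + win_start_s) * fs)
    i1 = int((stretch_onset_s + win_end_s) * fs)
    return float(np.mean(np.abs(emg[i0:i1])))


def feedback(trial_magnitude: float, baseline_magnitude: float,
             criterion: float = 0.9) -> bool:
    """Down-conditioning feedback: reward trials below a fraction of baseline."""
    return trial_magnitude < criterion * baseline_magnitude
```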

    Mobile Mechatronic/Robotic Orthotic Devices to Assist–Rehabilitate Neuromotor Impairments in the Upper Limb: A Systematic and Synthetic Review

    This paper overviews the state of the art in upper limb robot-supported approaches, focusing on advancements in the related mechatronic devices for patients' rehabilitation and/or assistance. Addressing the technical and methodological aspects as well as the overall effectiveness and improvement in this inter-disciplinary field of research, it includes information beyond the therapy administered in clinical settings, but with no diminished safety requirements. Our systematic review, based on PRISMA guidelines, searched articles published between January 2001 and November 2017 in the following databases: Cochrane, Medline/PubMed, PMC, Elsevier, PEDro, and ISI Web of Knowledge/Science. We then applied a novel PEDro-inspired technique to classify the relevant articles. The article focuses on the main indications, current technologies, categories of intervention, and outcome assessment modalities. It also includes, in tabular form, the main characteristics of the most relevant mobile (wearable and/or portable) mechatronic/robotic orthosis/exoskeleton prototype devices used to assist and rehabilitate neuromotor impairments in the upper limb.

    Fused mechanomyography and inertial measurement for human-robot interface

    Human-Machine Interfaces (HMI) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator as the supervisor with a machine as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators may function with complete autonomy, and therefore some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain activity sensors, electromyographic (EMG) muscular activity sensors, and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments due to issues such as lack of portability and robustness and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for capture of human intent/command. EMG-based gesture recognition systems in particular have received significant attention in recent literature. As wearable pervasive devices, they offer benefits over camera or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement). As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau since EMG suffers from interference from environmental factors that cause signal degradation over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods of time.
    This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography sensors (MMGs). The modular system permits numerous configurations of IMUs to derive body kinematics in real time and uses these to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe muscular contractions in the forearm which are generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several methods of pattern recognition were implemented to provide accurate decoding of the mechanomyographic information, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6% respectively were achieved for 12-gesture classification. In real-time tests, an accuracy of 95.6% was achieved for 5-gesture classification.
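    The pattern-recognition step described above could, for example, be prototyped as sketched below with scikit-learn; the synthetic feature matrix, window features, and train/test split are placeholders and do not reproduce the thesis pipeline.

```python
# Sketch of the pattern-recognition step, assuming a feature matrix X built
# from windowed MMG/IMU signals and integer gesture labels y (placeholders).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 24))      # e.g. 4 features x 6 MMG channels per window
y = rng.integers(0, 12, size=600)   # 12 gesture classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf")))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))
```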
It has previously been noted that MMG sensors are susceptible to motion-induced interference; this thesis additionally established that arm pose changes the measured signal. It introduces a new method of fusing IMU and MMG data to provide classification that is robust to both sources of interference. Additionally, an improvement in orientation estimation and a new orientation estimation algorithm are proposed. These improvements to the robustness of the system provide the first solution able to reliably track both motion and muscle activity for extended periods of time for HMI outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. In an ideal situation, the sensor suite should be capable of detecting the muscular activity in the residual limb that is naturally indicative of intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture control systems into prosthetic devices; however, mechanomyography sensors are unaffected by such issues. There is huge potential for a system like this to be utilised as a controller as ubiquitous computing systems become more prevalent and as the desire for a simple, universal interface increases. Such systems have the potential to significantly improve the quality of life of prosthetic users and others.
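    For context, the sketch below shows a generic complementary filter for single-axis IMU orientation, a common baseline for gyro/accelerometer fusion; it is offered only as background under that assumption and is not the orientation-estimation algorithm proposed in the thesis.

```python
# Generic complementary-filter sketch for IMU pitch about one axis; illustrates
# gyro/accelerometer fusion only, not the thesis's proposed algorithm.
import numpy as np


def complementary_pitch(gyro_y, acc_x, acc_z, dt: float, alpha: float = 0.98):
    """Blend integrated gyro rate with the accelerometer gravity angle each step."""
    pitch = np.arctan2(acc_x[0], acc_z[0])      # initialise from gravity
    out = []
    for gy, ax, az in zip(gyro_y, acc_x, acc_z):
        gyro_est = pitch + gy * dt              # integrate angular rate
        acc_est = np.arctan2(ax, az)            # gravity-referenced angle
        pitch = alpha * gyro_est + (1 - alpha) * acc_est
        out.append(pitch)
    return np.array(out)
```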