
    A human-assisting manipulator teleoperated by EMG signals and arm motions

    This paper proposes a human-assisting manipulator teleoperated by electromyographic (EMG) signals and arm motions. The proposed method realizes a new master-slave manipulator system that uses no mechanical master controller, so a person whose forearm has been amputated can use the manipulator as a personal assistant for desktop work. The control system consists of a hand and wrist control part and an arm control part. The hand and wrist control part selects an active joint in the manipulator's end-effector and controls it based on EMG pattern discrimination. The arm control part measures the position of the operator's wrist joint or residual limb with a three-dimensional position sensor, and the joint angles of the manipulator's arm, excluding the end-effector, are controlled so that the manipulator follows this position. Together, these control parts let the operator command the manipulator intuitively. The distinctive feature of the system is a novel statistical neural network for EMG pattern discrimination, which adapts to changes in the EMG patterns caused by differences among individuals, different electrode locations, and time variation due to fatigue or sweat. Experiments showed that the developed system could learn and estimate the operator's intended motions with high accuracy from the EMG signals, and that the manipulator could be controlled smoothly. We also confirmed that the system could assist an amputee in performing desktop work.
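    The EMG-pattern-discrimination step can be illustrated with a minimal sketch: extract classic time-domain features from one signal window, then classify the feature vector. The feature choices and the nearest-centroid classifier below are illustrative assumptions only; the paper itself uses a statistical neural network for this step.

```python
import numpy as np

def emg_features(window):
    """Two classic time-domain EMG features for one channel window:
    mean absolute value (MAV) and zero-crossing count (ZC)."""
    window = np.asarray(window, dtype=float)
    mav = np.mean(np.abs(window))
    zc = np.sum(np.signbit(window[:-1]) != np.signbit(window[1:]))
    return np.array([mav, zc])

class CentroidDiscriminator:
    """Stand-in pattern discriminator: assign a feature vector to the class
    with the nearest per-class mean. (Illustrative substitute for the
    paper's statistical neural network.)"""
    def fit(self, X, y):
        self.classes_ = sorted(set(y))
        self.centroids_ = {c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
                           for c in self.classes_}
        return self

    def predict(self, x):
        return min(self.classes_,
                   key=lambda c: np.linalg.norm(x - self.centroids_[c]))
```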

    Human-machine interfaces based on EMG and EEG applied to robotic systems

    Background: Two different Human-Machine Interfaces (HMIs) were developed, both based on electro-biological signals: one on the EMG signal and the other on the EEG signal. Two major features of these interfaces are their relatively simple data acquisition and processing systems, which require few hardware and software resources, making them computationally and financially low-cost solutions. Both interfaces were applied to robotic systems, and their performances are analyzed here. The EMG-based HMI was tested on a mobile robot, while the EEG-based HMI was tested on both a mobile robot and a robotic manipulator. Results: Experiments with the EMG-based HMI were carried out by eight individuals, each asked to perform ten blinks with each eye in order to test the eye-blink detection algorithm. An average detection accuracy of about 95% for individuals able to blink both eyes indicates that the system can be used to command devices. Experiments with the EEG-based HMI involved 25 people (some of whom had suffered from meningitis or epilepsy). All of them managed to operate the HMI in a single training session, and most learned to use it in less than 15 minutes; the minimum and maximum training times observed were 3 and 50 minutes, respectively. Conclusion: These works form the initial parts of a system to help people with neuromotor diseases, including those with severe dysfunctions. The next steps are to convert a commercial wheelchair into an autonomous mobile vehicle, to implement the HMI onboard that wheelchair to assist people with motor diseases, and to explore the potential of EEG signals, making the EEG-based HMI more robust and faster so that it can help individuals with severe motor dysfunctions.
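    The abstract does not state the blink-detection rule itself. A common minimal approach, sketched below purely as an illustration, is amplitude thresholding on the facial-EMG trace with a refractory period so that one blink is not counted twice; the threshold and refractory values are assumptions, not the paper's.

```python
import numpy as np

def detect_blinks(signal, fs, threshold, refractory_s=0.3):
    """Count blink events: a sample whose magnitude exceeds `threshold`
    starts a blink, and further crossings within `refractory_s` seconds
    are treated as the same blink."""
    refractory = int(refractory_s * fs)
    blinks, last = 0, -refractory
    for i, v in enumerate(np.abs(signal)):
        if v > threshold and i - last >= refractory:
            blinks += 1
            last = i
    return blinks
```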

    A myoelectric prosthetic arm controlled by a sensor-actuator loop

    This paper describes new methods and systems designed for application in upper-extremity prostheses. An artificial upper limb with this system is a robot arm controlled by EMG signals and a set of sensors. The new multi-sensor system is based on ultrasonic sensors, infrared sensors, Hall-effect sensors, a CO2 sensor and a relative-humidity sensor. The multi-sensor system is used to update a 3D map of objects in the robot's environment, or it sends information about the environment directly to the control system of the myoelectric arm. Occupancy grid mapping is used to build the 3D map of the robot's environment. The multi-sensor system can measure the distance of objects in 3D space, and this information is used in the 3D map to identify potential collisions or a potentially dangerous environment that could damage the prosthesis or injure the user. Information from the sensors and from the 3D map is evaluated by a fuzzy expert system, and on this basis the control system of the myoelectric prosthetic arm can choose an appropriate reaction. The systems and methods were designed and verified in MATLAB/Simulink and are intended for use as assistive technology for people with disabilities.
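    The occupancy-grid idea can be sketched in a few lines: sensor hits raise the occupancy evidence of a cell, and the map can then answer how far the nearest obstacle is, which is the quantity a collision check needs. The grid shape, cell size and log-odds increment below are assumptions for illustration, not values from the paper.

```python
import numpy as np

class OccupancyGrid3D:
    """Minimal 3D occupancy grid in log-odds form (illustrative sketch)."""
    def __init__(self, shape=(20, 20, 20), cell=0.05):
        self.logodds = np.zeros(shape)   # 0 = unknown
        self.cell = cell                 # metres per cell

    def mark_occupied(self, idx, l_occ=0.85):
        # A range-sensor hit raises the occupancy evidence of one cell.
        self.logodds[idx] += l_occ

    def nearest_obstacle(self, origin_idx):
        """Distance in metres from `origin_idx` to the closest occupied
        cell, or None if nothing has been observed as occupied yet."""
        occ = np.argwhere(self.logodds > 0)
        if occ.size == 0:
            return None
        d = np.linalg.norm((occ - np.array(origin_idx)) * self.cell, axis=1)
        return float(np.min(d))
```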

    Comparison of machine learning algorithms for EMG signal classification

    The use of muscle-activation signals in the control loop of biomechatronic systems is extremely important for effective and stable control. One method used for this purpose is motion classification based on electromyography (EMG) signals, which reflect muscle activation. Classifying these signals, whose amplitude and frequency vary, is a difficult process; moreover, EMG signal characteristics change over time depending on the person, the task and its duration. Various artificial-intelligence-based methods are used for movement classification, one of which is machine learning. In this study, a total of 24 models from 6 main machine learning algorithm families were used for motion classification. With these models, 7 different wrist movements (rest, grip, flexion, extension, radial deviation, ulnar deviation, expanded palm) were classified. Tests were carried out on 8 channels of EMG data taken from 4 subjects, and classification performance was compared in terms of classification accuracy and training time. According to the results, the Bagged Trees ensemble model showed the highest classification performance, with an average classification accuracy of 98.55%.
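    The winning bagging idea is simple to sketch: train each base learner on a bootstrap resample of the training set and let the ensemble vote by majority. The toy below uses a 1-nearest-neighbour base learner on hand-made feature vectors purely for illustration; the study itself used MATLAB's Bagged Trees model on real EMG features.

```python
import numpy as np

def bagged_predict(X_train, y_train, x, n_trees=25, rng=None):
    """Toy bagged ensemble: each base learner is a 1-nearest-neighbour
    classifier fit on a bootstrap resample; the ensemble takes a
    majority vote over the base predictions."""
    rng = np.random.default_rng(rng)
    n = len(X_train)
    votes = []
    for _ in range(n_trees):
        idx = rng.integers(0, n, size=n)           # bootstrap resample
        Xb = X_train[idx]
        yb = [y_train[i] for i in idx]
        nearest = np.argmin(np.linalg.norm(Xb - x, axis=1))
        votes.append(yb[nearest])
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]                # majority vote
```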

    Exploration of muscle fatigue effects in bioinspired robot learning from sEMG signals

    © 2018 Ning Wang et al. To investigate the effects of muscle fatigue on bioinspired robot learning quality in teaching-by-demonstration (TbD) tasks, we propose to first identify emerging muscle fatigue in the human demonstrator by analyzing his/her surface electromyography (sEMG) recordings, and then to guide the robot learning curve with this knowledge in mind. The time-varying amplitude and frequency sequences determining the subband sEMG signals are estimated, and their dominant values over short time intervals are explored as fatigue-indicating features. These features are found to carry muscle-fatigue cues of the human demonstrator in the course of robot manipulation. In robot learning tasks requiring multiple demonstrations, the fatigue status of the human demonstrator can be acquired by tracking changes in the proposed features over time. To model data from multiple demonstrations, Gaussian mixture models (GMMs) are employed. According to the identified muscle-fatigue factor, a weight is assigned to each demonstration trial in the training stage, yielding what we term the weighted-GMMs (W-GMMs) algorithm. Six groups of data with various fatigue statuses, along with their corresponding weights, are taken as input to obtain the adapted W-GMM parameters. Gaussian mixture regression (GMR) is then applied to regenerate the movement trajectory for the robot. TbD experiments on a Baxter robot with 30 human demonstration trials show that the robot can successfully accomplish the taught task, with a generated trajectory much closer to that of the desirable condition in which little fatigue exists.
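    The core weighting idea can be shown in a drastically simplified form: map each trial's fatigue factor to a normalized weight so fresher demonstrations count more, then blend the demonstrated trajectories by those weights. The linear weighting rule and the plain weighted average below are assumptions for illustration; the paper fits W-GMMs and regenerates the trajectory with GMR.

```python
import numpy as np

def fatigue_weights(fatigue_levels):
    """Map per-demonstration fatigue factors (0 = fresh, 1 = exhausted)
    to normalized trial weights; fresher trials get larger weight.
    (The mapping is an assumption, not the paper's exact rule.)"""
    f = np.asarray(fatigue_levels, dtype=float)
    w = 1.0 - f
    return w / w.sum()

def weighted_reference_trajectory(demos, weights):
    """Blend demonstration trajectories (n_demos x n_steps) by trial
    weight: a stand-in for the W-GMM fit followed by GMR regression."""
    demos = np.asarray(demos, dtype=float)
    return np.average(demos, axis=0, weights=weights)
```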

    Active interaction control applied to a lower limb rehabilitation robot by using EMG recognition and impedance model

    Purpose – The purpose of this paper is to propose a seamless active interaction control method integrating electromyography (EMG)-triggered assistance and an adaptive impedance control scheme for parallel robot-assisted lower limb rehabilitation and training. Design/methodology/approach – An active interaction control strategy based on EMG motion recognition and an adaptive impedance model is implemented on a six-degrees-of-freedom parallel robot for lower limb rehabilitation. The autoregressive coefficients of the EMG signals, combined with a support vector machine classifier, are used to predict the movement intention and trigger the robot assistance. An adaptive impedance controller adjusts the robot velocity during the exercise; meanwhile, the user's muscle activity level is evaluated online and the robot impedance is adapted to the recovery conditions. Findings – Experiments on healthy subjects demonstrated that the proposed method was able to drive the robot according to the user's intention and that the robot impedance can be updated with the muscle conditions. Within the movement sessions, there was a distinct increase in the muscle activity levels of all subjects in the active mode compared with the EMG-triggered mode. Originality/value – Both the user's movement intention and voluntary participation are considered: the robot is not only triggered when the user attempts to move but also changes its movement in accordance with the user's efforts. The impedance model here responds directly to velocity changes and thus allows the exercise to follow a physiological trajectory. Moreover, the muscle activity level depends on both the normalized EMG signals and the weight coefficients of the involved muscles.
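    A velocity-responding impedance model of the kind described can be sketched as a first-order relation inertia * dv/dt + damping * v = f_ext, integrated one Euler step at a time, with the damping adapted to the measured muscle activity. The parameter values and the linear adaptation rule below are assumptions for illustration, not the paper's controller.

```python
def impedance_velocity_step(v, f_ext, damping, inertia, dt):
    """One Euler step of the velocity-based impedance model
    inertia * dv/dt + damping * v = f_ext: larger interaction force
    speeds the robot up, larger damping resists motion."""
    dv = (f_ext - damping * v) / inertia
    return v + dv * dt

def adapt_damping(activity, base=10.0, gain=20.0):
    """Assumed adaptation rule: as the user's online muscle activity
    level rises, raise the damping so the robot assists less and the
    user contributes more voluntary effort."""
    return base + gain * activity
```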