137 research outputs found

    Using Artificial Intelligence To Improve The Control Of Prosthetic Legs

    Get PDF
    For as long as people have survived limb-threatening injuries, prostheses have been created. Modern lower-limb prostheses are primarily controlled by adjusting the amount of damping in the knee so that it bends in a manner suitable for walking and running. Often the choice between the walking and running states has to be made manually by pressing a button. While this simple tuning strategy can work for many users, it can be limiting: controlling the leg tends not to be intuitive, and the wearer has to learn how to use the leg. This thesis examines how this control can be improved using Artificial Intelligence (AI) to allow the system to be tuned for each individual. A wearable gait lab was developed, consisting of a number of sensors attached to the limbs of eight volunteers. The signals from the sensors were analysed, features were extracted from them, and these features were passed through two separate Artificial Neural Networks (ANNs). One network attempted to classify whether the wearer was standing still, walking or running; the other attempted to estimate the wearer's movement speed. A Genetic Algorithm (GA) was used to tune the ANNs' parameters for each individual. The results showed that each individual needed different parameters to tune the features presented to the ANN, and that different features were needed for each of the two problems presented to the ANN. Two new features are presented which identify the movement state (standing, walking or running) and the movement speed of the volunteer. The results suggest that the control of the prosthetic limb can be improved.
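
    The thesis does not include its code here, but the general pattern it describes (windowed features from wearable sensors feeding a small neural network that labels standing, walking or running) can be sketched as follows. The feature choices, synthetic signals and network size are illustrative assumptions rather than the author's pipeline, and the per-individual GA tuning stage is omitted for brevity.

# Minimal sketch (not the thesis pipeline): window-based features from a wearable
# accelerometer stream feeding a small neural network that labels each window as
# standing, walking, or running. Features, data, and network size are illustrative.
import numpy as np
from sklearn.neural_network import MLPClassifier

def window_features(accel, window=128, step=64):
    """Simple per-window features: mean, standard deviation, and dominant frequency bin."""
    feats = []
    for start in range(0, len(accel) - window + 1, step):
        w = accel[start:start + window]
        spectrum = np.abs(np.fft.rfft(w - w.mean()))
        feats.append([w.mean(), w.std(), np.argmax(spectrum)])
    return np.array(feats)

# Synthetic stand-ins for labelled sensor data (real data would come from the wearable gait lab)
rng = np.random.default_rng(1)
standing = rng.normal(0.0, 0.05, 4096)
walking = np.sin(np.linspace(0, 120, 4096)) + rng.normal(0, 0.2, 4096)
running = 2.0 * np.sin(np.linspace(0, 300, 4096)) + rng.normal(0, 0.4, 4096)

X = np.vstack([window_features(s) for s in (standing, walking, running)])
y = np.repeat([0, 1, 2], len(X) // 3)          # 0 = stand, 1 = walk, 2 = run

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))                          # training accuracy only, for illustration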

    Fused mechanomyography and inertial measurement for human-robot interface

    Get PDF
    Human-Machine Interfaces (HMI) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator, as the supervisor, with a machine, as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators may function with complete autonomy, and therefore some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain activity sensors, electromyographic (EMG) muscular activity sensors and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments due to issues such as lack of portability and robustness and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for capture of human intent/command. EMG-based gesture recognition systems in particular have received significant attention in recent literature. As wearable pervasive devices, they offer benefits over camera or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement). As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau, since EMG suffers from interference from environmental factors which cause signal degradation over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods of time. This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography sensors (MMGs). The modular system permits numerous configurations of IMUs to derive body kinematics in real time and uses this to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe muscular contractions in the forearm generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several methods of pattern recognition were implemented to provide accurate decoding of the mechanomyographic information, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6% respectively were achieved for 12-gesture classification. In real-time tests, an accuracy of 95.6% was achieved in 5-gesture classification.
It has previously been noted that MMG sensors are susceptible to motion-induced interference. This thesis also establishes that arm pose changes the measured signal. The thesis therefore introduces a new method of fusing IMU and MMG data to provide a classification that is robust to both of these sources of interference. Additionally, an improvement to orientation estimation and a new orientation estimation algorithm are proposed. These improvements to the robustness of the system provide the first solution that is able to reliably track both motion and muscle activity for extended periods of time for HMI outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. In an ideal situation, the sensor suite should be capable of detecting the muscular activity in the residual limb which is naturally indicative of intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture control systems into prosthetic devices; however, mechanomyography sensors are unaffected by such issues. There is huge potential for a system like this to be utilised as a controller as ubiquitous computing systems become more prevalent, and as the desire for a simple, universal interface increases. Such systems have the potential to impact significantly on the quality of life of prosthetic users and others.
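
    As a rough illustration of the fusion idea reported above (per-channel MMG features combined with IMU-derived kinematics, then classified with Linear Discriminant Analysis and a Support Vector Machine), the sketch below uses scikit-learn on synthetic data. The feature set, trial counts and class structure are invented for the example and do not reproduce the thesis implementation or its reported accuracies.

# Illustrative sketch (not the thesis implementation): six MMG channel features are
# concatenated with IMU orientation features and classified with LDA and an SVM.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_mmg, n_imu, n_gestures = 240, 6, 4, 12       # hypothetical sizes

# Balanced hypothetical labels and features: MMG RMS per channel + IMU orientation values
labels = np.repeat(np.arange(n_gestures), n_trials // n_gestures)
mmg_rms = rng.random((n_trials, n_mmg)) + labels[:, None] * 0.05   # weak class structure
imu_feats = rng.random((n_trials, n_imu))
X = np.hstack([mmg_rms, imu_feats])

for name, model in [("LDA", LinearDiscriminantAnalysis()),
                    ("SVM", SVC(kernel="rbf", C=1.0))]:
    scores = cross_val_score(model, X, labels, cv=5)
    print(f"{name}: {scores.mean():.2f} cross-validated accuracy")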

    The Use of Skeletal Muscle to Amplify Action Potentials in Transected Peripheral Nerves

    Get PDF
    Upper limb amputees suffer from problems associated with the control and attachment of prostheses. Skin-surface electrodes placed over the stump, which detect myoelectric signals, are traditionally used to control hand movements. However, this method is unintuitive, the electrodes lift off, and signal selectivity can be an issue. One solution to these limitations is to implant electrodes directly on muscles. Another approach is to implant electrodes directly into the nerves that innervate the muscles. A significant challenge with both solutions is the reliable transmission of biosignals across the skin barrier. In this thesis, I investigated the use of implantable muscle electrodes in an ovine model, using myoelectrodes in combination with a bone-anchor acting as a conduit for signal transmission. High-quality readings were obtained which were significantly better than skin-surface electrode readings. I further investigated the effect of electrode configuration on signal quality. For direct recording from nerves, I tested the effect of adsorbed endoneural basement membrane proteins on nerve regeneration in vivo using microchannel neural interfaces implanted in rat sciatic nerves. Muscle and nerve signal recordings were obtained and improvements in sciatic nerve function were observed. Direct skeletal fixation of a prosthesis to the amputation stump using a bone-anchor has been proposed as a solution to the skin problems associated with traditional socket-type prostheses. However, there remains a concern about the risk of infection at the interface between the implant and skin. Achieving a durable seal at this interface is therefore crucial, and this formed the final part of the thesis. Bone-anchors were optimised for surface pore size and coatings to facilitate binding of human dermal fibroblasts and improve the skin-implant seal in an ovine model. Implants silanised with Arginine-Glycine-Aspartic Acid showed significantly increased dermal tissue infiltration. This approach may therefore improve the soft tissue seal, and thus the success, of bone-anchored implants. By addressing both the way prostheses are attached to the amputation stump, through direct skeletal fixation, and the provision of high-fidelity biosignals for high-level intuitive prosthetic control, I aim to further the field of limb loss rehabilitation.

    Pattern identification of electromyographic (EMG) signals in the lower arm

    Get PDF

    Moving approximate entropy and its application to the electromyographic control of an artificial hand

    No full text
    A multiple-degree-of-freedom artificial hand has been developed at the University of Southampton with the aim of including control philosophies to form a highly functional prosthetic hand. Using electromyographic signals is an established technique for the control of a hand. In its simplest form, the signals allow for opening a hand and subsequently closing it to grasp an object. This thesis describes the work carried out in the development of an electromyographic control system, with the aim of producing a simple and robust method. A model of the control system was developed to differentiate grip postures using two surface electromyographic signals. A new method, moving approximate entropy, was employed to investigate whether any significant patterns can be observed in the structure of the electromyographic signals. An investigation, using moving approximate entropy, of twenty healthy participants' wrist muscles (flexor carpi ulnaris and extensor carpi radialis) during wrist flexion, wrist extension and co-contraction at different speeds has shown repeatable and distinct patterns at three states of contraction: start, middle and end. An analysis of the results also showed differences at different speeds of contraction. There is low variation of the approximate entropy values between participants; this result, if used in the control of an artificial hand, would eliminate any training requirement. Other features (mean absolute value, number of zero crossings, sample entropy, standard deviation, skewness and kurtosis) were also determined from the signals. Of these, mean absolute value and kurtosis were selected for information extraction. These three methods (moving approximate entropy, mean absolute value and kurtosis) are used in the feature-extraction process of the control system. A fuzzy logic system is used to classify the extracted information in discriminating the final grip posture. The results demonstrate the ability of the system to classify the information related to different grip postures.
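
    A minimal sketch of the headline technique, moving approximate entropy, is given below: Pincus's approximate entropy (ApEn) is computed over a sliding window of the signal. The window length, tolerance heuristic and synthetic test signal are assumptions for illustration, not the parameters used in the thesis.

# Sliding-window ("moving") approximate entropy of a 1-D signal.
import numpy as np

def approximate_entropy(u, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D signal (Pincus, 1991)."""
    u = np.asarray(u, dtype=float)
    n = len(u)
    if r is None:
        r = 0.2 * np.std(u)               # common heuristic: r = 0.2 * standard deviation

    def phi(m):
        # Embed the signal: each row is a length-m template vector
        x = np.array([u[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        c = np.mean(d <= r, axis=1)       # fraction of templates within tolerance r
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

def moving_apen(signal, window=256, step=64, m=2):
    """Approximate entropy profile over successive windows of an EMG record."""
    return np.array([
        approximate_entropy(signal[s:s + window], m=m)
        for s in range(0, len(signal) - window + 1, step)
    ])

# Synthetic example: entropy drops where the signal becomes more regular
rng = np.random.default_rng(0)
emg = np.concatenate([rng.normal(0, 1, 1000),                 # irregular, noise-like segment
                      np.sin(np.linspace(0, 60, 1000))])      # regular, structured segment
print(moving_apen(emg))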

    The SmartHand transradial prosthesis

    Get PDF
    Background: Prosthetic components and control interfaces for upper limb amputees have barely changed in the past 40 years. Many transradial prostheses have been developed in the past; nonetheless, most of them would be inappropriate if/when a large-bandwidth human-machine interface for control and perception became available, due to either their limited (or nonexistent) sensorization or their limited dexterity. SmartHand tackles this issue, as it is meant to be clinically tested in amputees employing different neuro-interfaces, in order to investigate their effectiveness. This paper presents the design and bench evaluation of the SmartHand. Methods: The SmartHand design was bio-inspired in terms of its physical appearance, kinematics, sensorization, and its multilevel control system. Underactuated fingers and differential mechanisms were designed and exploited in order to fit all mechatronic components within the size and weight of a natural human hand. Its sensory system was designed with the aim of delivering significant afferent information to the user through adequate interfaces. Results: The SmartHand is a five-fingered, self-contained robotic hand with 16 degrees of freedom, actuated by 4 motors. It integrates a bio-inspired sensory system composed of 40 proprioceptive and exteroceptive sensors and a customized embedded controller, both employed for implementing automatic grasp control and for potentially delivering sensory feedback to the amputee. It is able to perform everyday grasps, count, and independently point the index finger. Its weight (530 g) and speed (closing time: 1.5 seconds) are comparable to current commercial prostheses. It is able to lift a 10 kg suitcase; slippage tests showed that, within particular friction and geometric conditions, the hand is able to stably grasp cylindrical objects of up to 3.6 kg. Conclusions: Due to its unique embedded features and human size, the SmartHand holds the promise of being experimentally fitted on transradial amputees and employed as a bi-directional instrument for investigating, during realistic experiments, different interfaces, control and feedback strategies in neuro-engineering studies.
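
    Purely as a hypothetical illustration of an automatic grasp-control loop of the kind described (the paper does not publish the SmartHand firmware), the sketch below tightens the grip when slip is sensed and otherwise tracks a nominal grip force. All names, gains and units are invented placeholders.

def grasp_control_step(slip_detected: bool, load_cell_force: float,
                       motor_effort: float, target_force: float = 5.0) -> float:
    """Return an updated motor effort (arbitrary units) for one control tick."""
    if slip_detected:
        return motor_effort * 1.2             # tighten quickly when slip is sensed
    error = target_force - load_cell_force    # otherwise track a nominal grip force
    return max(0.0, motor_effort + 0.1 * error)

# One simulated tick: slip detected, so effort ramps up
print(grasp_control_step(slip_detected=True, load_cell_force=3.0, motor_effort=1.0))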

    Application of wearable sensors in actuation and control of powered ankle exoskeletons: a Comprehensive Review

    Get PDF
    Powered ankle exoskeletons (PAEs) are robotic devices developed for gait assistance, rehabilitation, and augmentation. To fulfil their purposes, PAEs rely heavily on their sensor systems. Human–machine interface sensors collect the biomechanical signals from the human user to inform the higher level of the control hierarchy about the user's locomotion intention and requirements, whereas machine–machine interface sensors monitor the output of the actuation unit to ensure precise tracking of the high-level control commands via the low-level control scheme. The current article aims to provide a comprehensive review of how wearable sensor technology has contributed to the actuation and control of the PAEs developed over the past two decades. The control schemes and actuation principles employed in the reviewed PAEs, as well as their interaction with the integrated sensor systems, are investigated in this review. Further, the role of wearable sensors in overcoming the main challenges in developing fully autonomous portable PAEs is discussed. Finally, a brief discussion of how recent technology advancements in wearable sensors, including environment–machine interface sensors, could promote the future generation of fully autonomous portable PAEs is provided.
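
    The two-layer hierarchy described above can be made concrete with a schematic sketch: a human–machine interface signal sets a high-level assistance command, and a low-level loop tracks that command using actuator-side sensing. The mappings, gains and signal names below are illustrative assumptions rather than any specific controller from the reviewed PAEs.

def high_level_command(emg_envelope: float, gain: float = 0.5) -> float:
    """Map a normalized muscle-activity envelope (0..1) to a desired ankle torque in Nm."""
    return gain * emg_envelope * 100.0

def low_level_torque_control(desired_torque: float, measured_torque: float,
                             kp: float = 2.0) -> float:
    """Proportional tracking of the high-level torque command using actuator-side sensing."""
    return kp * (desired_torque - measured_torque)

# One control tick: user intent (EMG envelope) -> torque setpoint -> motor command
setpoint = high_level_command(emg_envelope=0.4)
motor_cmd = low_level_torque_control(setpoint, measured_torque=15.0)
print(setpoint, motor_cmd)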

    Study and development of sensorimotor interfaces for robotic human augmentation

    Get PDF
    This thesis presents my research contribution to robotics and haptics in the context of human augmentation. In particular, in this document we are interested in bodily or sensorimotor augmentation, that is, the augmentation of humans by supernumerary robotic limbs (SRLs). The field of sensorimotor augmentation is new in robotics and, thanks to its combination with neuroscience, great leaps forward have already been made in the past 10 years. All of the research work I produced during my Ph.D. focused on the development and study of a fundamental technology for human augmentation by robotics: the sensorimotor interface. This new concept denotes a wearable device with two main purposes: the first is to extract the input generated by the movement of the user's body, and the second is to provide the user's somatosensory system with haptic feedback. This thesis starts with an exploratory study of integration between robotic and haptic devices, intending to combine state-of-the-art devices. This allowed us to realize that we still need to understand how to improve the interface that will allow us to feel agency when using an augmentative robot. At this point, the path of this thesis forks into two alternative ways that have been adopted to improve the interaction between the human and the robot. The first path tackles two aspects concerning the haptic feedback of sensorimotor interfaces: the choice of positioning and the effectiveness of discrete haptic feedback. Along the second path, we attempted to lighten a supernumerary robotic finger, focusing on agility of use and the lightness of the device. One of the main findings of this thesis is that haptic feedback is considered helpful by stroke patients, but this does not mitigate the fact that the cumbersomeness of the devices is a deterrent to their use. The preliminary results presented here show that both of the paths we chose to improve sensorimotor augmentation worked: the presence of haptic feedback improves the performance of sensorimotor interfaces, the co-positioning of haptic feedback and the input taken from the human body can improve the effectiveness of these interfaces, and creating a lightweight version of an SRL is a viable solution for recovering grasping function.

    Distributed Sensing and Stimulation Systems Towards Sense of Touch Restoration in Prosthetics

    Get PDF
    Modern prostheses aim at restoring the functional and aesthetic characteristics of the lost limb. To foster prosthesis embodiment and functionality, it is necessary to restore both volitional control and sensory feedback. Contemporary feedback interfaces presented in research use few sensors and stimulation units to feed back at most two discrete variables (e.g. grasping force and aperture), whereas the human sense of touch relies on a distributed network of mechanoreceptors providing high-fidelity spatial information. To provide this type of feedback in prosthetics, it is necessary to sense tactile information from artificial skin placed on the prosthesis and to transmit tactile feedback above the amputation in order to map the interaction between the prosthesis and the environment. This thesis proposes the integration of distributed sensing systems (e-skin) to acquire tactile sensation, with non-invasive multichannel electrotactile feedback and virtual reality to deliver high-bandwidth information to the user. Its core focus is the development and testing of a closed-loop sensory feedback human-machine interface, based on the latest distributed sensing and stimulation techniques, for restoring the sense of touch in prosthetics. To this end, the thesis comprises two introductory chapters that describe the state of the art in the field, the objectives, the methodology used and the contributions, as well as three studies distributed over the stimulation-system and sensing-system levels. The first study presents the development of a closed-loop compensatory tracking system to evaluate the usability and effectiveness of electrotactile sensory feedback in enabling real-time closed-loop control in prosthetics. It examines and compares the subjects' adaptive performance and tolerance to random latencies while performing a dynamic control task (i.e. position control) and simultaneously receiving either visual or electrotactile feedback communicating the momentary tracking error. Moreover, it reports the minimum time delay that produces an abrupt impairment of users' performance. The experimental results showed that performance with electrotactile feedback is less prone to change with longer delays, whereas performance with visual feedback drops faster as time delays increase. This is a good indication of the effectiveness of electrotactile feedback in enabling closed-loop control in prosthetics, since some delays are inevitable. The second study describes the development of a novel non-invasive compact multichannel interface for electrotactile feedback, containing a 24-pad electrode matrix with a fully programmable stimulation unit, and investigates the ability of able-bodied human subjects to localize the electrotactile stimulus delivered through the electrode matrix. Furthermore, it introduces a novel dual-parameter modulation (interleaved frequency and intensity) and compares it to conventional stimulation (the same frequency for all pads). In addition, and for the first time, it compares electrotactile stimulation to mechanical stimulation. Moreover, it describes the integration of a virtual prosthesis with the developed system in order to achieve a better user experience and object manipulation, by mapping the tactile data collected in real time and feeding it back simultaneously to the user. The experimental results demonstrated that the proposed interleaved coding substantially improved spatial localization compared to same-frequency stimulation.
Furthermore, it showed that same-frequency stimulation was equivalent to mechanical stimulation, whereas performance with dual-parameter modulation was significantly better. The third study presents the realization of a novel, flexible, screen-printed e-skin based on P(VDF-TrFE) piezoelectric polymers, designed to cover the fingertips and the palm of the prosthetic hand (particularly the Michelangelo hand by Ottobock) and an assistive sensorized glove for stroke patients. Moreover, it develops a new validation methodology to examine the sensors' behavior while being solicited. The characterization results showed that the measured electrical response of each sensor to mechanical (normal) force at the skin surface matched the expected (modeled) behavior, which in turn proved that the combination of the fabrication and assembly processes was successful. This paves the way to defining a practical, simplified and reproducible characterization protocol for e-skin patches. In conclusion, by adopting innovative methodologies in sensing and stimulation systems, this thesis advances the overall development of closed-loop sensory feedback human-machine interfaces used for restoration of the sense of touch in prosthetics. Moreover, this research could lead to high-bandwidth, high-fidelity transmission of tactile information for modern dexterous prostheses, which could improve the end-user experience and facilitate their acceptance in daily life.
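
    As a hedged sketch of the dual-parameter (interleaved frequency and intensity) coding described in the second study, the snippet below assigns each pad of a hypothetical 24-pad matrix its own base frequency and lets the pressure on the matching e-skin taxel modulate the pulse amplitude. The ranges and the linear mapping are assumptions, not the stimulation parameters used in the study.

import numpy as np

N_PADS = 24
BASE_FREQS_HZ = np.linspace(20, 80, N_PADS)        # interleaved coding: distinct frequency per pad

def encode_stimulation(pressures, amp_range_ma=(0.5, 3.0)):
    """Map normalized taxel pressures (0..1) to per-pad stimulation parameters."""
    lo, hi = amp_range_ma
    return [
        {"pad": i,
         "frequency_hz": float(BASE_FREQS_HZ[i]),
         "amplitude_ma": float(lo + p * (hi - lo)) if p > 0 else 0.0}
        for i, p in enumerate(np.clip(pressures, 0.0, 1.0))
    ]

# Example: contact concentrated on pads 3 and 4 of the matrix
pressures = np.zeros(N_PADS)
pressures[3], pressures[4] = 0.8, 0.3
print(encode_stimulation(pressures)[3])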