109 research outputs found

    Down-Conditioning of Soleus Reflex Activity using Mechanical Stimuli and EMG Biofeedback

    Spasticity is a common syndrome caused by various brain and neural injuries, which can severely impair walking ability and functional independence. To improve functional independence, conditioning protocols are available that aim to reduce spasticity by facilitating spinal neuroplasticity. This down-conditioning can be performed using different types of stimuli, electrical or mechanical, with reflex activity measured by EMG or impedance and used as the biofeedback variable. Still, current results on the effectiveness of these conditioning protocols are incomplete, making comparisons difficult. We aimed to show the within-session task-dependent and across-session long-term adaptation of a conditioning protocol based on mechanical stimuli and EMG biofeedback. However, in contrast to the literature, preliminary results show that subjects were unable to successfully achieve task-dependent modulation of their soleus short-latency stretch reflex magnitude.
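
    As an illustration of the kind of biofeedback computation such a protocol relies on, the sketch below estimates the soleus short-latency stretch reflex magnitude from rectified EMG following a mechanical stimulus and compares it against a baseline value for down-conditioning feedback. The function names, the 30-60 ms reflex window, and the 80% success criterion are assumptions for illustration, not details taken from the abstract.

```python
import numpy as np

def reflex_magnitude(emg, stim_onset, fs, window=(0.030, 0.060)):
    """Mean rectified soleus EMG in an assumed short-latency window
    (here 30-60 ms) after the mechanical stretch onset (sample index)."""
    start = stim_onset + int(window[0] * fs)
    stop = stim_onset + int(window[1] * fs)
    return np.mean(np.abs(emg[start:stop]))

def biofeedback_score(trial_magnitude, baseline_magnitude, criterion=0.8):
    """Down-conditioning feedback: the trial counts as successful when the
    reflex magnitude falls below a criterion fraction of the baseline."""
    ratio = trial_magnitude / baseline_magnitude
    return ratio, ratio < criterion
```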

    Human Gait Model Development for Objective Analysis of Pre/Post Gait Characteristics Following Lumbar Spine Surgery

    Although multiple advanced tools and methods are available for gait analysis, gait and its related disorders are usually assessed by visual inspection in the clinical environment. This thesis aims to introduce a gait analysis system that provides an objective method for gait evaluation in clinics and overcomes the limitations of current gait analysis systems. Early identification of foot drop, a common gait disorder, would become possible using the proposed methodology.
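
    The abstract does not describe the detection method itself, but a minimal sketch of one plausible indicator, reduced peak ankle dorsiflexion during the swing phase, is given below. The function name, sign convention (positive angles = dorsiflexion), and threshold are illustrative assumptions only.

```python
import numpy as np

def foot_drop_indicator(ankle_angle_deg, swing_mask, threshold_deg=0.0):
    """Flag possible foot drop when peak dorsiflexion during swing stays
    below a (hypothetical) threshold; swing_mask marks swing-phase samples."""
    peak_dorsiflexion = np.max(ankle_angle_deg[swing_mask])
    return peak_dorsiflexion, peak_dorsiflexion < threshold_deg
```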

    Body sensor networks: smart monitoring solutions after reconstructive surgery

    Advances in reconstructive surgery are providing treatment options in the face of major trauma and cancer. Body Sensor Networks (BSN) have the potential to offer smart solutions to a range of clinical challenges. The aim of this thesis was to review the current state-of-the-art devices, then develop and apply bespoke technologies developed by the Hamlyn Centre BSN engineering team, supported by the EPSRC ESPRIT programme, to deliver post-operative monitoring options for patients undergoing reconstructive surgery. A wireless optical sensor was developed to provide a continuous monitoring solution for free tissue transplants (free flaps). By recording backscattered light from 2 different source wavelengths, we were able to estimate the oxygenation of the superficial microvasculature. In a custom-made upper limb pressure cuff model, forearm deoxygenation measured by our sensor and gold-standard equipment showed strong correlations, with incremental reductions in response to increased cuff inflation durations. Such a device might allow early detection of flap failure, optimising the likelihood of flap salvage. An ear-worn activity recognition sensor was utilised to provide a platform capable of facilitating objective assessment of functional mobility. This work evolved from an initial feasibility study in a knee replacement cohort to a larger clinical trial designed to establish a novel mobility score in patients recovering from open tibial fractures (OTF). The Hamlyn Mobility Score (HMS) assesses mobility over 3 activities of daily living: walking, stair climbing, and standing from a chair. Sensor-derived parameters, including variation in both temporal and force aspects of gait, were validated to measure differences in performance in line with fracture severity, which also matched questionnaire-based assessments. Monitoring the OTF cohort over 12 months with the HMS allowed functional recovery to be profiled in great detail. Further, a novel finding of continued improvements in walking quality after a plateau in walking quantity was demonstrated objectively. The methods described in this thesis provide an opportunity to revamp the recovery paradigm through continuous, objective patient monitoring along with self-directed, personalised rehabilitation strategies, which has the potential to improve both the quality and cost-effectiveness of reconstructive surgery services.
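
    As context for the two-wavelength optical measurement, the sketch below shows a textbook estimate of tissue oxygen saturation from attenuation at two wavelengths using a simplified Beer-Lambert model with a shared, unknown pathlength and no scattering term. The chosen wavelengths, extinction coefficients, and function names are illustrative assumptions and are not taken from the thesis.

```python
import numpy as np

# Illustrative extinction coefficients (cm^-1 / M) for oxy- and deoxy-
# haemoglobin at two typical source wavelengths; a real design would use
# tabulated values for the actual LEDs in the sensor.
EXTINCTION = {
    660: {"HbO2": 320.0, "Hb": 3226.0},   # red: Hb absorbs more
    880: {"HbO2": 1100.0, "Hb": 740.0},   # near-infrared: HbO2 absorbs more
}

def tissue_oxygenation(i_660, i0_660, i_880, i0_880):
    """Crude two-wavelength oxygen saturation estimate: attenuation at each
    wavelength is modelled as a linear mix of HbO2 and Hb absorption, the
    2x2 system is solved for relative concentrations, and saturation is the
    oxygenated fraction."""
    attenuation = np.array([-np.log(i_660 / i0_660),
                            -np.log(i_880 / i0_880)])
    e = np.array([[EXTINCTION[660]["HbO2"], EXTINCTION[660]["Hb"]],
                  [EXTINCTION[880]["HbO2"], EXTINCTION[880]["Hb"]]])
    c_hbo2, c_hb = np.linalg.solve(e, attenuation)  # concentration x pathlength
    return c_hbo2 / (c_hbo2 + c_hb)
```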

    Wearables for Movement Analysis in Healthcare

    Quantitative movement analysis is widely used in clinical practice and research to investigate movement disorders objectively and in a complete way. Conventionally, body segment kinematic and kinetic parameters are measured in gait laboratories using marker-based optoelectronic systems, force plates, and electromyographic systems. Although such movement analyses are considered accurate, the need for specialised laboratories, high costs, and dependency on trained users sometimes limit their use in clinical practice. A variety of compact wearable sensors are available today and have allowed researchers and clinicians to pursue applications in which individuals are monitored in their homes and in community settings within different fields of study, such as movement analysis. Wearable sensors may thus contribute to the implementation of quantitative movement analysis even during out-patient use, to reduce evaluation times and to provide objective, quantifiable data on patients' capabilities, unobtrusively and continuously, for clinical purposes.
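
    As a minimal example of the kind of out-patient gait metric a wearable sensor can provide, the sketch below estimates stride times and their variability from the sagittal-plane angular velocity of a shank-worn gyroscope, using the fact that mid-swing peaks recur once per stride. Sensor placement, peak-detection parameters, and function names are assumptions for illustration.

```python
import numpy as np
from scipy.signal import find_peaks

def stride_times(gyro_z, fs, min_stride_s=0.6):
    """Estimate stride durations as the intervals between mid-swing peaks
    detected in the shank angular-velocity signal (rad/s), sampled at fs Hz."""
    peaks, _ = find_peaks(gyro_z, distance=int(min_stride_s * fs),
                          prominence=np.std(gyro_z))
    return np.diff(peaks) / fs

def gait_summary(gyro_z, fs):
    """Simple out-patient summary: mean stride time and its variability."""
    st = stride_times(gyro_z, fs)
    return {"stride_time_mean_s": float(np.mean(st)),
            "stride_time_cv": float(np.std(st) / np.mean(st))}
```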

    Human Activity Recognition and Control of Wearable Robots

    Wearable robotics has gained huge popularity in recent years due to its wide applications in rehabilitation, military, and industrial fields. Weakness of the skeletal muscles in the aging population, and neurological injuries such as stroke and spinal cord injury, seriously limit the ability of these individuals to perform daily activities. Therefore, there is increasing attention on the development of wearable robots to assist the elderly and patients with disabilities for motion assistance and rehabilitation. In military and industrial sectors, wearable robots can increase the productivity of workers and soldiers. It is important for wearable robots to maintain smooth interaction with the user while evolving in complex environments with minimum effort from the user. Therefore, recognizing the user's activities, such as walking or jogging, in real time becomes essential to provide appropriate assistance based on the activity. This dissertation proposes two real-time human activity recognition algorithms, the intelligent fuzzy inference (IFI) algorithm and the Amplitude Omega (Aω) algorithm, to identify human activities, i.e., stationary and locomotion activities. The IFI algorithm uses knee angle and ground contact force (GCF) measurements from four inertial measurement units (IMUs) and a pair of smart shoes, whereas the Aω algorithm is based on thigh angle measurements from a single IMU. This dissertation also attempts to address the problem of online tuning of virtual impedance for an assistive robot based on real-time gait and activity measurement data to personalize the assistance for different users. An automatic impedance tuning (AIT) approach is presented for a knee assistive device (KAD) in which the IFI algorithm is used for real-time activity measurements. This dissertation also proposes an adaptive oscillator method, the amplitude omega adaptive oscillator (AωAO) method, for HeSA (hip exoskeleton for superior augmentation) to provide bilateral hip assistance during human locomotion activities. The Aω algorithm is integrated into the adaptive oscillator method to make the approach robust across different locomotion activities. Experiments are performed on healthy subjects to validate the efficacy of the human activity recognition algorithms and control strategies proposed in this dissertation. Both activity recognition algorithms exhibited higher classification accuracy and shorter update times. The results of AIT demonstrated that the KAD assistive torque was smoother and the EMG signal of the vastus medialis was reduced, compared to constant impedance and finite state machine approaches. The AωAO method showed real-time learning of the locomotion activity signals for three healthy subjects while wearing HeSA. To understand the influence of the assistive devices on the inherent dynamic gait stability of the human, a stability analysis is performed; stability metrics derived from dynamical systems theory are used to evaluate unilateral knee assistance applied to healthy participants.
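
    For readers unfamiliar with adaptive oscillators, the sketch below shows a generic adaptive Hopf (frequency-learning) oscillator that locks its phase and frequency onto a periodic input such as a thigh-angle signal. It illustrates the general principle behind oscillator-based assistance only; it is not the dissertation's AωAO controller, and all gains and parameter values are arbitrary assumptions.

```python
import numpy as np

def adaptive_oscillator(signal, fs, omega0=2 * np.pi, nu=2.0, gamma=8.0, mu=1.0):
    """Euler integration of an adaptive Hopf oscillator: the state (x, y)
    synchronises with the periodic input and omega adapts toward its
    dominant angular frequency."""
    dt = 1.0 / fs
    x, y, omega = 1.0, 0.0, omega0
    omegas = np.empty(len(signal))
    for k, s in enumerate(signal):
        e = s - x                                  # teaching signal: tracking error
        r2 = x * x + y * y
        dx = gamma * (mu - r2) * x - omega * y + nu * e
        dy = gamma * (mu - r2) * y + omega * x
        domega = -nu * e * y / max(np.sqrt(r2), 1e-9)
        x, y, omega = x + dx * dt, y + dy * dt, omega + domega * dt
        omegas[k] = omega
    return omegas                                  # learned frequency over time
```

    Because the learned frequency converges toward the dominant frequency of the input, an assistive controller built on such an oscillator can keep its assistance timing synchronised as the wearer changes locomotion activity.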

    Fused mechanomyography and inertial measurement for human-robot interface

    Human-Machine Interfaces (HMI) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator, as the supervisor, with a machine, as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators may function with complete autonomy, and therefore some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain activity sensors, electromyographic (EMG) muscular activity sensors, and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments due to issues such as a lack of portability and robustness and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for capture of human intent/command. EMG-based gesture recognition systems in particular have received significant attention in recent literature. As wearable pervasive devices, they offer benefits over camera or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement). As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau, since EMG suffers from interference from environmental factors which cause signal degradation over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods of time. This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography sensors (MMGs). The modular system permits numerous configurations of IMUs to derive body kinematics in real time and uses this to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe muscular contractions in the forearm which are generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several methods of pattern recognition were implemented to provide accurate decoding of the mechanomyographic information, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6% respectively were achieved for 12-gesture classification. In real-time tests, an accuracy of 95.6% was achieved for 5-gesture classification.
It has previously been noted that MMG sensors are susceptible to motion-induced interference. This thesis additionally established that arm pose changes the measured signal, and it introduces a new method of fusing IMU and MMG data to provide classification that is robust to both of these sources of interference. An improvement in orientation estimation and a new orientation estimation algorithm are also proposed. These improvements to the robustness of the system provide the first solution able to reliably track both motion and muscle activity for extended periods of time for HMI use outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. In an ideal situation, the sensor suite should be capable of detecting the muscular activity in the residual limb which is naturally indicative of intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture control systems into prosthetic devices; however, mechanomyography sensors are unaffected by such issues. There is huge potential for a system like this to be utilised as a controller as ubiquitous computing systems become more prevalent, and as the desire for a simple, universal interface increases. Such systems have the potential to impact significantly on the quality of life of prosthetic users and others.
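
    As an illustration of the pattern-recognition step described above, the sketch below trains the two classifiers mentioned (Linear Discriminant Analysis and a Support Vector Machine) on simple time-domain features extracted from windows of six-channel MMG data, using scikit-learn. The feature set, windowing, and hyperparameters are assumptions for illustration and do not reproduce the thesis pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def window_features(mmg_window):
    """Per-channel time-domain features for one analysis window of shape
    (samples, 6 channels); the feature choice is illustrative."""
    rms = np.sqrt(np.mean(mmg_window ** 2, axis=0))
    mav = np.mean(np.abs(mmg_window), axis=0)
    zero_crossings = np.sum(np.diff(np.sign(mmg_window), axis=0) != 0, axis=0)
    return np.concatenate([rms, mav, zero_crossings])

def evaluate(windows, labels):
    """Compare LDA and an RBF-kernel SVM with 5-fold cross-validation."""
    X = np.array([window_features(w) for w in windows])
    models = {
        "LDA": make_pipeline(StandardScaler(), LinearDiscriminantAnalysis()),
        "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    }
    return {name: cross_val_score(model, X, labels, cv=5).mean()
            for name, model in models.items()}
```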

    XXII International Conference on Mechanics in Medicine and Biology - Abstracts Book

    This book contains the abstracts presented at the XXII ICMMB, held in Bologna in September 2022. The abstracts are grouped according to the sessions scheduled during the conference.