
    Rehabilitation Engineering

    Population ageing has major consequences and implications for all areas of daily life, as well as for economic growth, savings, investment and consumption, labour markets, pensions, property, and the transfer of care from one generation to another. Health and related care, family composition and lifestyle, housing, and migration are also affected. Given the rapid ageing of the population and the further increase expected in the coming years, an important problem to be faced is the corresponding rise in chronic illness, disability, and loss of functional independence among the elderly (WHO 2008). For this reason, novel methods of rehabilitation and care management are urgently needed. This book covers rehabilitation support systems and robots developed for the upper limbs, the lower limbs, and the visually impaired. Beyond the upper limb, lower-limb work is also discussed, such as a motorized footrest for an electric powered wheelchair and a standing assistance device.

    Toward Standardizing the Classification of Robotic Gait Rehabilitation Systems


    Impact of Ear Occlusion on In-Ear Sounds Generated by Intra-oral Behaviors

    We conducted a case study with one volunteer and a recording setup to detect sounds induced by the following actions: jaw clenching, tooth grinding, reading, eating, and drinking. The setup consisted of two in-ear microphones; the left ear was semi-occluded with a commercially available earpiece and the right ear was occluded with a mouldable silicone earpiece. Investigations in the time and frequency domains demonstrated that sounds from behaviors such as eating, tooth grinding, and reading could be recorded with both sensors. For jaw clenching, however, occluding the ear with the mouldable earpiece was necessary to enable detection. This can be attributed to the mouldable earpiece sealing the ear canal and isolating it from the environment, resulting in a detectable change in pressure. In conclusion, our work suggests that behaviors such as eating, grinding, and reading can be detected with a semi-occluded ear, whereas behaviors such as clenching require complete occlusion of the ear to be detected easily. The latter approach, however, may limit real-world applicability because it hinders hearing.
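
    As a minimal, hypothetical sketch of the kind of time- and frequency-domain comparison described above (the file name, sample rate, and channel layout are assumptions, not details from the study), one could compare signal energy and low-frequency power between the semi-occluded and fully occluded channels:

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

# Hypothetical stereo recording: channel 0 = semi-occluded (left) in-ear mic,
# channel 1 = fully occluded (right) in-ear mic.
fs, audio = wavfile.read("in_ear_recording.wav")
audio = audio.astype(np.float64)

for name, channel in [("semi-occluded", audio[:, 0]), ("occluded", audio[:, 1])]:
    rms = np.sqrt(np.mean(channel ** 2))              # time-domain energy
    freqs, psd = welch(channel, fs=fs, nperseg=4096)  # frequency-domain content
    low = psd[freqs < 100].sum() / psd.sum()          # share of power below 100 Hz
    print(f"{name}: RMS = {rms:.1f}, low-frequency power fraction = {low:.2f}")
```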

    System Identification of Bipedal Locomotion in Robots and Humans

    The ability to perform a healthy walking gait can be altered in numerous cases by gait-disorder-related pathologies. These can lead to partial or complete loss of mobility, which affects patients' quality of life. Wearable exoskeletons and active prostheses have been considered a key component in remedying this mobility loss, but the control of such devices poses numerous challenges that are yet to be addressed. As opposed to fixed-trajectory control, real-time adaptive reference generation is likely to give the wearer more intentional control over the powered device. We propose a novel gait pattern generator for the control of such devices that takes advantage of inter-joint coordination in the human gait. Our proposed method puts the user in the control loop, as it maps the motion of the healthy limbs to that of the affected one. To design such a control strategy, it is critical to understand the dynamics behind bipedal walking. We begin by studying the simple compass-gait walker and examine the well-known virtual constraints method for controlling bipedal robots in the image of the compass gait. In addition, we provide both the mechanical and the control design of an affordable research platform for bipedal dynamic walking. We then extend the concept of virtual constraints to human locomotion, investigating the accuracy of predicting the angular positions and velocities of the lower-limb joints from the motion of the other limbs. Data from nine healthy subjects performing specific locomotion tasks were collected and are made available online. Successful prediction of the hip, knee, and ankle joints was achieved in different scenarios. We also found that the motion of a cane alone carries sufficient information to predict good lower-limb trajectories in stairs ascent, with better estimates obtained using additional information from the arm joints. Finally, we explored the prediction of knee and ankle trajectories from the motion of the hip joints.
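
    The abstract does not specify the estimator used for the inter-joint mapping; as a minimal sketch of the general idea under stated assumptions (synthetic stand-in joint signals, a simple linear ridge regression rather than the thesis's own method), one could predict an affected-side joint angle from the other joints as follows:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Hypothetical gait data: rows are time samples, columns are joint angles (rad).
# Predictors could be sound-side hip/knee/ankle or arm-swing angles; the target
# is, for example, the affected-side knee angle.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)                      # 10 s of walking at 100 Hz
predictors = np.column_stack([
    np.sin(2 * np.pi * 1.0 * t),                      # stand-in for hip angle
    np.sin(2 * np.pi * 1.0 * t + 0.4),                # stand-in for ankle angle
    np.sin(2 * np.pi * 1.0 * t + 1.0),                # stand-in for arm swing
])
target = 0.8 * np.sin(2 * np.pi * 1.0 * t + 0.7) + 0.02 * rng.standard_normal(t.size)

X_train, X_test, y_train, y_test = train_test_split(
    predictors, target, test_size=0.25, shuffle=False)

model = Ridge(alpha=1.0).fit(X_train, y_train)        # linear inter-joint mapping
print("R^2 on held-out gait data:", model.score(X_test, y_test))
```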

    Upper-limb Kinematic Analysis and Artificial Intelligent Techniques for Neurorehabilitation and Assistive Environments

    Stroke, one of the leading causes of death and disability around the world, usually affects the motor cortex, causing weakness or paralysis in the limbs on one side of the body. Research efforts in neurorehabilitation technology have focused on the development of robotic devices to restore motor and cognitive function in impaired individuals, with the potential to deliver high-intensity, motivating therapy. End-effector-based devices have become a usual tool in upper-limb neurorehabilitation because they are easy to adapt to patients; however, they are unable to measure the joint movements during exercise. The first part of this thesis therefore focuses on the development of a kinematic reconstruction algorithm that can be used in a real rehabilitation environment without disturbing the normal patient-clinician interaction. Building on an algorithm from the literature that presents some instabilities, a new algorithm is developed. The proposed algorithm is the first able to estimate online not only the upper-limb joints but also trunk compensation, using only two non-invasive wearable devices placed on the shoulder and upper arm of the patient. This new tool allows the therapist to perform a comprehensive assessment combining range of movement with clinical assessment scales. Knowing that the intensity of therapy improves neurorehabilitation outcomes, a 'self-managed' rehabilitation system can allow patients to continue rehabilitation at home. This thesis proposes a system that measures a set of upper-limb rehabilitation gestures online and intelligently evaluates the quality of the exercise performed by the patient. The assessment considers both the performed movement as a whole and each joint independently. The first results are promising and suggest that this system can become a new tool to complement clinical therapy at home and improve rehabilitation outcomes. Finally, severe motor impairment can remain after the rehabilitation process, so a technological solution is proposed for these patients and for people with severe motor disabilities. An intelligent environmental control interface is developed that adapts its scan control to the residual capabilities of the user. Furthermore, the system estimates the user's intention from environmental information and user behaviour, helping with navigation through the interface and improving the user's independence at home.
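
    The abstract does not detail the reconstruction algorithm itself; as a hypothetical illustration of the kind of relative-orientation computation such two-sensor wearable pipelines rely on (quaternion values, mounting frames, and the shoulder-elevation interpretation are all assumptions), one could compute the arm orientation relative to the trunk as follows:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Hypothetical orientation quaternions (x, y, z, w) streamed by the two wearable
# sensors, one on the shoulder/trunk and one on the upper arm.
q_shoulder = np.array([0.0, 0.0, 0.0, 1.0])                               # trunk sensor at rest
q_upper_arm = np.array([0.0, np.sin(np.pi / 8), 0.0, np.cos(np.pi / 8)])  # ~45 deg about y

# Orientation of the upper arm expressed in the shoulder sensor's frame.
r_rel = R.from_quat(q_shoulder).inv() * R.from_quat(q_upper_arm)

# Decompose into intrinsic z-y-x angles; with the assumed mounting, the y angle
# approximates shoulder elevation.
yaw, elevation, roll = r_rel.as_euler("zyx", degrees=True)
print(f"estimated shoulder elevation: {elevation:.1f} deg")
```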

    Virtual reality application for rehabilitation

    This project aims to develop a functional Virtual Reality application for the rehabilitation and well-being of older adults in nursing homes. It intends to complement the patients' rehabilitation treatment and engage them in it by means of Virtual Reality games; to play these games, the patient must execute repetitive movements while moving around in an immersive natural environment. Sustainable Development Goals: 3 - Good Health and Well-being.

    Fused mechanomyography and inertial measurement for human-robot interface

    Human-Machine Interfaces (HMIs) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator, as the supervisor, with a machine, as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators can function with complete autonomy, so some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion-tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain-activity sensors, electromyographic (EMG) muscular-activity sensors, and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions, for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments, owing to issues such as a lack of portability and robustness and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for capturing human intent and commands, and EMG-based gesture recognition systems in particular have received significant attention in recent literature. As wearable, pervasive devices, they offer benefits over camera or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement). As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau, since EMG suffers from interference from environmental factors that degrade the signal over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods. This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography (MMG) sensors. The modular system permits numerous IMU configurations to derive body kinematics in real time and uses this to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe the forearm muscular contractions generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several pattern recognition methods were implemented to decode the mechanomyographic information accurately, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6%, respectively, were achieved for 12-gesture classification, and an accuracy of 95.6% was achieved in real-time tests of 5-gesture classification.
    It has previously been noted that MMG sensors are susceptible to motion-induced interference; this thesis establishes that arm pose also changes the measured signal. A new method of fusing IMU and MMG data is introduced to provide a classification that is robust to both of these sources of interference. Additionally, an improvement in orientation estimation and a new orientation estimation algorithm are proposed. These improvements to the robustness of the system provide the first solution able to reliably track both motion and muscle activity for extended periods of time for HMI use outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. Ideally, the sensor suite should be capable of detecting the muscular activity in the residual limb that naturally indicates intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture-control systems into prosthetic devices; mechanomyography sensors, however, are unaffected by such issues. There is huge potential for a system like this to be used as a controller as ubiquitous computing systems become more prevalent and the desire for a simple, universal interface increases. Such systems have the potential to significantly improve the quality of life of prosthetic users and others.
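
    The thesis reports LDA and SVM accuracies but its feature pipeline is not reproduced here; the following hypothetical sketch shows one common way to classify windowed features from six MMG channels with those two classifiers (the feature choice, window length, and synthetic data are assumptions, not the thesis's actual pipeline):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical dataset: 600 MMG windows, 6 channels x 200 samples per window,
# with one of 12 gesture labels per window (synthetic stand-in data).
n_windows, n_channels, win_len, n_gestures = 600, 6, 200, 12
windows = rng.standard_normal((n_windows, n_channels, win_len))
labels = rng.integers(0, n_gestures, size=n_windows)

# Simple per-channel features: RMS and waveform length, concatenated per window.
rms = np.sqrt(np.mean(windows ** 2, axis=2))
waveform_length = np.sum(np.abs(np.diff(windows, axis=2)), axis=2)
features = np.hstack([rms, waveform_length])          # shape: (600, 12)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", SVC(kernel="rbf", C=1.0))]:
    scores = cross_val_score(clf, features, labels, cv=5)
    print(f"{name}: mean cross-validated accuracy = {scores.mean():.3f}")
```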

    Augmented visual, auditory, haptic, and multimodal feedback in motor learning: A review

    It is generally accepted that augmented feedback, provided by a human expert or a technical display, effectively enhances motor learning. However, how to provide augmented feedback most effectively remains a matter of controversy. Related studies have focused primarily on simple or artificial tasks enhanced by visual feedback. Recently, technical advances have made it possible to investigate more complex, realistic motor tasks and to implement not only visual but also auditory, haptic, or multimodal augmented feedback. The aim of this review is to address the potential of augmented unimodal and multimodal feedback within the framework of motor learning theories. The review addresses the reasons for the different impacts of feedback strategies within or between the visual, auditory, and haptic modalities, and the challenges that must be overcome to provide appropriate feedback in these modalities, either in isolation or in combination. Accordingly, design criteria for successful visual, auditory, haptic, and multimodal feedback are elaborated.