29 research outputs found

    Action recognition using instrumented objects for stroke rehabilitation

    Assisting patients to perform activities of daily living (ADLs) is a challenging task for both humans and machines. Hence, developing a computer-based rehabilitation system to re-train patients to carry out daily activities is an essential step towards facilitating rehabilitation of stroke patients with apraxia and action disorganization syndrome (AADS). This thesis presents a real-time Hidden Markov Model (HMM) based human activity recognizer and proposes a technique to reduce the time delay incurred during the decoding stage. Results are reported for complete tea-making trials. In this study, the input features are recorded using sensors attached to the objects involved in the tea-making task, plus hand coordinate data captured using a Kinect sensor. A coaster of sensors, comprising an accelerometer and three force-sensitive resistors, is packaged in a unit which can easily be attached to the base of an object. A parallel asynchronous set of detectors, each responsible for detecting one sub-goal in the tea-making task, is used to address challenges arising from overlaps between human actions. In this work, HMMs are used to exploit temporal dependencies between actions, and the emission distributions are modelled by two techniques, one generative and one discriminative: Gaussian Mixture Models (GMMs) and Deep Neural Networks (DNNs). Our experimental results show that the HMM-DNN based systems outperform the GMM-HMM based systems by 18%. The proposed activity recognition system with the modified HMM topology provides a practical solution to the action recognition problem and reduces the time delay by 64% with no loss in accuracy.
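The decoding stage this abstract refers to is typically Viterbi decoding over the HMM states. As a minimal sketch of that idea, the following toy example finds the most likely action-state sequence from per-frame emission scores; the two states, transition probabilities and emission scores are invented for illustration and are not taken from the thesis.

```python
# Toy Viterbi decoding for an HMM-based action recognizer.
# State names and all probability values below are invented examples.
import numpy as np

def viterbi(log_trans, log_emit, log_init):
    """Most likely state path given log transitions (N x N),
    per-frame log emissions (T x N) and log initial probs (N,)."""
    T, N = log_emit.shape
    score = np.full((T, N), -np.inf)
    back = np.zeros((T, N), dtype=int)
    score[0] = log_init + log_emit[0]
    for t in range(1, T):
        cand = score[t - 1][:, None] + log_trans      # N x N candidates
        back[t] = np.argmax(cand, axis=0)             # best predecessor
        score[t] = cand[back[t], np.arange(N)] + log_emit[t]
    path = [int(np.argmax(score[-1]))]
    for t in range(T - 1, 0, -1):                     # backtrack
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Two hypothetical states ("idle", "pour") with sticky transitions.
log_trans = np.log(np.array([[0.9, 0.1], [0.1, 0.9]]))
log_init = np.log(np.array([0.5, 0.5]))
log_emit = np.log(np.array([[0.8, 0.2], [0.7, 0.3], [0.2, 0.8], [0.1, 0.9]]))
print(viterbi(log_trans, log_emit, log_init))  # → [0, 0, 1, 1]
```

Standard Viterbi must see the whole observation sequence before backtracking, which is one source of the decoding delay the thesis targets with its modified topology.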

    Redesign of the user interface of the Us'em mobile application

    Mestrado em Comunicação Multimédia (Master's in Multimedia Communication). The research presented here aims to design a feasible and adequate mobile application (app) user interface. This mobile app is part of the Us'em system, designed to promote self-rehabilitation after stroke. The system is based on wearable, mobile, tracking-sensor technology. The app works as a feedback tool, informing Us'em system users about the frequency of their upper-limb movements and about their recovery process. The mobile app aims to increase patients' motivation to use their arm and hand throughout the day and to improve their rehabilitation through self-training at home with continuous feedback. The design of its user interface is of great relevance, because it determines whether post-stroke patients can use the Us'em system. The empirical part begins with interviews, questionnaires and observation of post-stroke patients and physical therapists from Portugal and the Netherlands. It provides a better understanding of the post-stroke rehabilitation process and of stroke survivors' characteristics and requirements regarding rehabilitation and mobile device interaction. The gathered information contributed to the development of a prototype that materializes the defined Us'em app. The prototyping process ran through iterative cycles of design, implementation and evaluation to ascertain the adequacy of the Us'em app user interface. The final prototype is the final product of this research project, and it was evaluated through usability tests with post-stroke patients from both countries mentioned above. Tests of the final prototype show it may be difficult to design a single solution for all users due to the wide range of their requirements. However, the core requirement of the Us'em mobile app is simplicity: the number of user interface elements, the amount of information, and the complexity of interactions and functionalities of this app should be kept as low as possible.
The research also supports the conclusion that the user interface designed meets most of the users' requirements and has a significant impact on the motivation of post-stroke patients to move their impaired arm and hand autonomously.

    Context-Aware Scenarios : Course on Context-Aware Computing 2003

    According to a recent definition by Dey and Abowd, context is any information that can be used to characterize the situation of an entity. An entity is a person, place or object that is considered relevant to the interaction between a user and an application, including the user and the application themselves. A system is context-aware if it uses context to provide relevant information and/or services to the user, where relevancy depends on the user's task. The following report contains ten scenarios about how context-aware applications could affect ordinary people in fifteen years' time. The scenarios were written by students participating in a course on context-aware computing in the autumn of 2003. Peer reviewed.
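Dey and Abowd's definition can be made concrete with a small sketch: a context record capturing the situation of an entity, and a rule that uses it to decide what is relevant. All fields, rules and strings below are invented for illustration only.

```python
# Toy illustration of context-awareness per Dey and Abowd's definition:
# context characterizes the user's situation, and the system uses it
# to select relevant output. Fields and rules are hypothetical.
from dataclasses import dataclass

@dataclass
class Context:
    location: str   # entity: the place the user is in
    activity: str   # the user's current task
    hour: int       # time of day, 0-23

def relevant_notification(ctx: Context) -> str:
    """Use context to decide which information is relevant right now."""
    if ctx.activity == "driving":
        return "defer non-urgent notifications"
    if ctx.location == "kitchen" and 6 <= ctx.hour <= 10:
        return "show breakfast recipe suggestions"
    return "show default feed"

print(relevant_notification(Context("kitchen", "cooking", 8)))
# → show breakfast recipe suggestions
```

The point of the definition is visible here: relevancy depends on the user's task, so the same request yields different output under different contexts.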

    Biomechatronics: Harmonizing Mechatronic Systems with Human Beings

    This eBook provides a comprehensive treatise on modern biomechatronic systems centred around human applications. A particular emphasis is given to exoskeleton designs for assistance and training with advanced interfaces in human-machine interaction. Some of these designs are validated with experimental results which the reader will find very informative as building blocks for designing such systems. This eBook will be ideally suited to those researching in the biomechatronics area with bio-feedback applications or those who are involved in high-end research on man-machine interfaces. It may also serve as a textbook for biomechatronic design at the postgraduate level.

    An fMRI-investigation on the neural correlates of tool use in young and elderly adults


    Dynamic Calibration of EMG Signals for Control of a Wearable Elbow Brace

    Musculoskeletal injuries can severely inhibit the performance of activities of daily living, and rehabilitation is often required to regain function. Assistive devices for use in rehabilitation are one avenue explored to increase arm mobility by guiding therapeutic exercises or assisting with motion. Electromyography (EMG) signals, which measure muscle activity, may be able to provide an intuitive interface between the patient and the device if appropriate classification models allow smart systems to relate these signals to the desired device motion. Unfortunately, pattern recognition models that classify motion accurately in constrained laboratory environments show large reductions in accuracy when used to detect dynamic unconstrained movements. An understanding of how combinations of motion factors (limb positions, forces, velocities) in dynamic movements affect EMG, and of ways to use information about these motion factors in control systems, is lacking. The objectives of this thesis were to quantify how various motion factors affect arm muscle activations during dynamic motion, and to use these motion factors and EMG signals to detect interaction forces between the person and the environment during motion. To address these objectives, software was developed and implemented to collect a unique dataset of EMG signals while healthy individuals performed unconstrained arm motions with combinations of arm positions, interaction forces with the environment, velocities, and types of motion. An analysis of the EMG signals and their use in training classification models to predict characteristics (arm positions, force levels, and velocities) of intended motion was completed. The results quantify how EMG features change significantly with variations in arm position, interaction force, and motion velocity.
The results also show that pattern recognition models, usually used to detect movements, were able to detect intended characteristics of motion based solely on EMG signals, even during complex activities of daily living. Arm position during elbow flexion--extension was predicted with 83.02 % accuracy by a support vector machine model using EMG signal inputs. Prediction of force, the motion characteristic that cannot be measured without impeding motion, was improved from 76.85 % to 79.17 % accuracy during elbow flexion--extension by providing measurable arm position and velocity information as additional inputs to a linear discriminant analysis model. The accuracy of force prediction during an activity of daily living was improved by 5.2 percentage points (from 59.38 % to 64.58 %) when motion speeds were included as an input to a linear discriminant analysis model in addition to EMG signals. Future work should expand on using motion characteristics and EMG signals to identify interactions between a person and the environment, in order to guide high-level tuning of control models working towards controlling wearable elbow braces during dynamic movements.
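The key design move in this abstract is augmenting EMG features with measurable motion factors (arm position, velocity) before classification. The following minimal sketch illustrates that feature augmentation on entirely synthetic data, using a simple nearest-class-mean classifier rather than the SVM/LDA models of the thesis; every number and feature is invented.

```python
# Illustrative sketch (not the thesis code): concatenating EMG features
# with arm position and velocity before classifying interaction-force
# level. Data are synthetic and the classifier is a toy.
import numpy as np

rng = np.random.default_rng(0)

def make_trials(force_level, n=30):
    """Synthetic trials: 4 EMG RMS channels plus position and velocity."""
    emg = rng.normal(loc=0.3 + 0.2 * force_level, scale=0.05, size=(n, 4))
    pos = rng.normal(loc=45.0 * force_level, scale=5.0, size=(n, 1))
    vel = rng.normal(loc=10.0 + 5.0 * force_level, scale=2.0, size=(n, 1))
    return np.hstack([emg, pos, vel])      # augmented feature vector

X = np.vstack([make_trials(0), make_trials(1), make_trials(2)])
y = np.repeat([0, 1, 2], 30)               # three interaction-force levels

# Nearest-class-mean classification on standardized features.
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
means = np.stack([Z[y == c].mean(axis=0) for c in range(3)])
pred = np.argmin(((Z[:, None, :] - means) ** 2).sum(axis=2), axis=1)
print(f"training accuracy: {(pred == y).mean():.2f}")
```

In the thesis, the analogous augmentation (adding position and velocity inputs to an LDA model) is what lifted force prediction from 76.85 % to 79.17 % during elbow flexion--extension.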

    Virtual and Mixed Reality Support for Activities of Daily Living

    Rehabilitation and training are extremely important processes that help people who have suffered some form of trauma to regain their ability to live independently and successfully complete activities of daily living. Virtual reality (VR) and mixed reality (MR) have been used in rehabilitation and training, with examples in a range of areas such as physical and cognitive rehabilitation and medical training. However, previous research has mainly used non-immersive VR, such as video games on a computer monitor or television. Immersive VR head-mounted displays (HMDs) were first developed in 1965, but the devices were usually large, bulky and expensive. In 2016, the release of low-cost VR HMDs allowed for wider adoption of VR technology. This thesis investigates the impact of these devices in supporting activities of daily living through three novel applications: training powered-wheelchair driving skills in VR and in MR, and using VR to help with the cognitive rehabilitation of stroke patients. Results from the acceptability study for VR in cognitive rehabilitation showed that patients would be likely to accept VR as a method of rehabilitation; however, factors such as visual issues need to be taken into consideration. The validation study for the Wheelchair-VR project showed promising results in terms of user improvement after the VR training session, but the majority of the users experienced symptoms of cybersickness. Wheelchair-MR did not show statistically significant improvements but did show a mean improvement compared to the control group, and the effects of cybersickness were greatly reduced compared to VR. We conclude that VR and MR can be used in conjunction with modern games engines to develop virtual environments that can be adapted to accelerate the rehabilitation and training of patients coping with different aspects of daily life.

    Accessibility of Health Data Representations for Older Adults: Challenges and Opportunities for Design

    Health data from consumer off-the-shelf wearable devices are often conveyed to users through visual data representations and analyses. However, these are not always accessible to people with disabilities or to older people, owing to low vision, cognitive impairments or literacy issues. Because of trade-offs between aesthetic predominance and information overload, real-time user feedback may not be conveyed easily from sensor devices through visual cues such as graphs and text. These difficulties may hinder critical data understanding. Additional auditory and tactile feedback can provide immediate and accessible cues from these wearable devices, but it is first necessary to understand the limitations of existing data representations. To avoid higher cognitive and visual overload, auditory and haptic cues can be designed to complement, replace or reinforce visual cues. In this paper, we outline the challenges in existing data representation and the evidence needed to enhance the accessibility of health information from personal sensing devices used to monitor health parameters such as blood pressure, sleep, activity, heart rate and more. With innovative and inclusive user feedback, users will be more likely to engage and interact with new devices and their own data.
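The idea of complementing a visual chart with auditory or haptic cues can be sketched as a simple mapping from a sensor reading to a coarse, non-visual cue level. The thresholds and zone names below are invented examples for illustration, not clinical guidance or anything proposed in the paper.

```python
# Hypothetical sketch: mapping a heart-rate reading to a coarse cue
# level that could drive an audio or haptic pattern instead of (or
# alongside) a visual graph. Thresholds are illustrative only.
def heart_rate_cue(bpm: float) -> str:
    """Map a heart-rate reading (beats per minute) to a cue level."""
    if bpm < 60:
        return "low"       # e.g. slow, soft pulse pattern
    if bpm <= 100:
        return "normal"    # e.g. gentle confirmation tick, no alert
    return "high"          # e.g. faster, more salient pulse pattern

readings = [55, 72, 110]
print([heart_rate_cue(b) for b in readings])  # → ['low', 'normal', 'high']
```

Reducing a continuous stream to a few discrete cue levels is one way to keep auditory and haptic feedback immediate without adding the cognitive load of a detailed display.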

    The Murray Ledger and Times, September 30, 1987
