429 research outputs found
Reachable Workspace and Proximal Function Measures for Quantifying Upper Limb Motion.
There is a lack of quantitative measures for clinically assessing upper limb function. Conventional biomechanical performance measures are restricted to specialist labs due to hardware cost and complexity, and the resulting measurements require specialists for analysis. Depth cameras are low-cost, portable systems that can track surrogate joint positions. However, these motions may not be biologically consistent, which can result in noisy, inaccurate movements. This paper introduces a rigid body modelling method to enforce biological feasibility of the recovered motions. This method is evaluated on an existing depth camera assessment: the reachable workspace (RW) measure for assessing gross shoulder function. As a rigid body model is used, position estimates of new proximal targets can be added, resulting in a proximal function (PF) measure for assessing a subject's ability to touch specific body landmarks. The accuracy and repeatability of these measures are assessed on ten asymptomatic subjects, with and without rigid body constraints. This analysis is performed both on a low-cost depth camera system and a gold-standard active motion capture system. The addition of rigid body constraints was found to improve accuracy and concordance of the depth camera system, particularly in lateral reaching movements. Both RW and PF measures were found to be feasible candidates for clinical assessment, with future analysis needed to determine their ability to detect changes within specific patient populations.
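The rigid body idea can be illustrated with a minimal sketch (a hypothetical illustration, not the paper's implementation): noisy depth-camera joint estimates are re-projected along a kinematic chain so that each segment keeps a known, fixed length.

```python
import numpy as np

def enforce_segment_lengths(joints, lengths):
    """Project a noisy kinematic chain onto fixed segment lengths.

    joints  -- (N, 3) array of estimated joint positions, root first
    lengths -- (N-1,) sequence of known segment lengths, root to tip
    Returns a corrected (N, 3) array in which each joint sits at the
    prescribed distance from its (already corrected) parent joint.
    """
    corrected = np.array(joints, dtype=float)
    for i in range(1, len(corrected)):
        direction = corrected[i] - corrected[i - 1]
        norm = np.linalg.norm(direction)
        if norm < 1e-9:  # degenerate estimate: fall back to a default direction
            direction = np.array([1.0, 0.0, 0.0])
            norm = 1.0
        corrected[i] = corrected[i - 1] + direction / norm * lengths[i - 1]
    return corrected

# Shoulder-elbow-wrist chain (metres) with a noisy wrist estimate;
# the upper arm (0.3 m) and forearm (0.25 m) lengths are restored.
chain = enforce_segment_lengths(
    [[0, 0, 0], [0.3, 0, 0], [0.62, 0.05, 0]],
    [0.3, 0.25],
)
```

A full implementation would fit the segment lengths per subject and filter over time, but the projection step above captures the core constraint.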
Biosignal-based human-machine interfaces for assistance and rehabilitation: a survey
By definition, a Human-Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey aims to review the large literature of the last two decades regarding biosignal-based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application by considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over the last years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade. In contrast, studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has experienced a considerable rise, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance. However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.
A real-time algorithm for the detection of compensatory movements during reaching
Introduction: Interactive game systems can motivate stroke survivors to engage with their rehabilitation exercises. However, it is crucial that systems are in place to detect whether exercises are performed correctly, as stroke survivors often perform compensatory movements which can be detrimental to recovery. Very few game systems integrate motion tracking algorithms to monitor performance and detect such movements. This paper describes the development of algorithms which monitor for compensatory movements during upper limb reaching movements in real time and provide quantitative metrics for health professionals to monitor performance and progress over time. Methods: An algorithm was developed to analyse reaching motions in real time using a low-cost depth camera. The algorithm segments cyclical reaching motions into component parts, including compensatory movement, and provides a graphical representation of task performance. Healthy participants (n = 10) performed reaching motions facing the camera. The real-time accuracy of the algorithm was assessed by comparing offline analysis to real-time collection of data. Results: The algorithm's ability to segment cyclical reaching motions and detect the component parts in real time was assessed. Results show that movement types can be detected in real time accurately, with a maximum error of 1.71%. Conclusions: Using the methods outlined, the real-time detection and quantification of compensatory movements is feasible for integration within home-based, repetitive task practice game systems for people with stroke.
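As a rough sketch of the two ingredients such an algorithm needs (the paper's exact segmentation rules and thresholds are not given here, so the speed and displacement thresholds below are illustrative assumptions): segmenting a reaching cycle into movement and hold phases, and flagging trunk displacement as a compensatory pattern.

```python
import numpy as np

def segment_reach_phases(hand_speed, threshold=0.05):
    """Label each frame of a reaching cycle as 'move' or 'hold'
    using a simple hand-speed threshold (m/s)."""
    return ['move' if s > threshold else 'hold' for s in hand_speed]

def flag_trunk_compensation(trunk_pos, rest_pos, max_disp=0.1):
    """Flag frames where the trunk has moved more than max_disp (m)
    from its resting position -- a common compensatory pattern
    during reaching after stroke."""
    disp = np.linalg.norm(np.asarray(trunk_pos, dtype=float)
                          - np.asarray(rest_pos, dtype=float), axis=1)
    return disp > max_disp

# Per-frame hand speeds (m/s) over one reach cycle
labels = segment_reach_phases([0.0, 0.2, 0.3, 0.01])
# Trunk positions (m) relative to a recorded resting pose
flags = flag_trunk_compensation([[0, 0, 0], [0.15, 0, 0]], [0, 0, 0])
```

In practice the thresholds would be calibrated per user, and the phase labels would then be aggregated into the quantitative metrics reported to clinicians.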
Master of Science thesis
Stroke is a leading cause of death and adult disability in the United States. Survivors lose abilities that were controlled by the affected area of the brain. Rehabilitation therapy is administered to help survivors regain control of lost functional abilities. The number of sessions that stroke survivors attend is limited by the availability of a clinic close to their residence and the amount of time friends and family can devote to helping them commute, as most are incapable of driving. Home-based therapy using virtual reality and computer games has the potential to solve these issues, increasing the amount of independent therapy performed by patients. This thesis presents the design, development and testing of a low-cost system, potentially suitable for use in the home environment. The system is designed for rehabilitation of the impaired upper limb of stroke survivors. A Microsoft Kinect was used to track the position of the patient's hand, and the game requires the user to move the arm over increasingly large areas by sliding the arm on a support. Studies were performed with six stroke survivors and five control subjects to determine the feasibility of the system. Patients played the game for 6 to 10 days, and their game scores, range of motion and Fugl-Meyer scores were recorded for analysis. Statistically significant (p < 0.05) differences were found between the game scores of the first and last day of the study. Furthermore, acceptability surveys revealed that patients enjoyed playing the game, found this kind of therapy more enjoyable than conventional therapy, and were willing to use the system in the home environment. Future work on the system will focus on larger studies, improving the comfort of patients while playing the game, and developing new games that address cognitive issues and integrate art and therapy.
Dynamic Calibration of EMG Signals for Control of a Wearable Elbow Brace
Musculoskeletal injuries can severely inhibit performance of activities of daily living. In order to regain function, rehabilitation is often required. Assistive devices for use in rehabilitation are one avenue explored to increase arm mobility, either by guiding therapeutic exercises or by assisting with motion. Electromyography (EMG) signals, which reflect muscle activity, may be able to provide an intuitive interface between the patient and the device if appropriate classification models allow smart systems to relate these signals to the desired device motion.
Unfortunately, there is a gap between the accuracy of pattern recognition models classifying motion in constrained laboratory environments and the large reductions in accuracy observed when they are used to detect dynamic, unconstrained movements. An understanding of how combinations of motion factors (limb positions, forces, velocities) in dynamic movements affect EMG, and of ways to use information about these motion factors in control systems, is lacking.
The objectives of this thesis were to quantify how various motion factors affect arm muscle activations during dynamic motion, and to use these motion factors and EMG signals for detecting interaction forces between the person and the environment during motion.
To address these objectives, software was developed and implemented to collect a unique dataset of EMG signals while healthy individuals performed unconstrained arm motions with combinations of arm positions, interaction forces with the environment, velocities, and types of motion. An analysis of the EMG signals and their use in training classification models to predict characteristics (arm positions, force levels, and velocities) of intended motion was completed.
The results quantify how EMG features change significantly with variations in arm positions, interaction forces, and motion velocities. The results also show that pattern recognition models, usually used to detect movements, were able to detect intended characteristics of motion based solely on EMG signals, even during complex activities of daily living. Arm position during elbow flexion--extension was predicted with 83.02 % accuracy by a support vector machine model using EMG signal inputs. Prediction of force, the motion characteristic that cannot be measured without impeding motion, was improved from 76.85 % to 79.17 % during elbow flexion--extension by providing measurable arm position and velocity information as additional inputs to a linear discriminant analysis model. The accuracy of force prediction was improved by 5.2 percentage points (from 59.38 % to 64.58 %) during an activity of daily living when motion speeds were included as an input to a linear discriminant analysis model in addition to EMG signals.
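The thesis does not list its feature set in this abstract; as an illustration of the kind of inputs such classifiers typically use, here is a minimal sketch of three classic time-domain EMG features (mean absolute value, waveform length, zero crossings) computed over one analysis window.

```python
import numpy as np

def emg_time_domain_features(window):
    """Classic time-domain EMG features for one analysis window:
    mean absolute value (MAV), waveform length (WL, the summed
    absolute sample-to-sample change) and zero crossings (ZC)."""
    w = np.asarray(window, dtype=float)
    mav = float(np.mean(np.abs(w)))
    wl = float(np.sum(np.abs(np.diff(w))))
    signs = np.sign(w)
    zc = int(np.sum((signs[:-1] * signs[1:]) < 0))
    return {'MAV': mav, 'WL': wl, 'ZC': zc}

# A toy alternating window: MAV = 1, WL = 6, ZC = 3
feats = emg_time_domain_features([1.0, -1.0, 1.0, -1.0])
```

Feature vectors like this, optionally augmented with measured arm position and velocity as the thesis describes, would then be fed to a classifier such as an LDA or SVM model.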
Future work should expand on using motion characteristics and EMG signals to identify interactions between a person and the environment, in order to guide high level tuning of control models working towards controlling wearable elbow braces during dynamic movements
Attention Enhancement and Motion Assistance for Virtual Reality-Mediated Upper-Limb Rehabilitation
Dysfunctions of the upper limbs caused by diseases such as stroke result in difficulties conducting day-to-day activities. Studies show that rehabilitation training using virtual reality games helps patients restore arm function. It has been found that ensuring that patients actively participate and devote effort to the process is very important for obtaining better training results. This article introduces a method to help patients increase their engagement and to provide motion assistance in virtual reality-mediated upper-limb rehabilitation training. Attention enhancement and motion assistance are achieved through an illusion of virtual forces, created by altering the drag speed between the cursor and the object presented on a screen to the patient as the only feedback. We present two game forms using the proposed method: a target-approaching game and a maze-following game. The results of evaluation experiments with human participants showed that the proposed method could provide path guidance that significantly improved users' path-following performance and required more involvement of the users compared to playing the game without attention enhancement and motion assistance.
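One simple way to realise such a drag-speed illusion (a sketch under assumed gains, not the article's actual control law) is to scale the cursor's displacement anisotropically: the component of hand motion directed toward the target is amplified and the perpendicular component is attenuated, which feels like a gentle force pulling the cursor onto the path.

```python
import numpy as np

def assisted_cursor_step(hand_step, to_target, gain_along=1.3, gain_across=0.7):
    """Map a raw hand displacement to a cursor displacement that
    creates a 'virtual force' illusion: motion along the direction
    toward the target is amplified, motion across it is attenuated.

    hand_step -- 2D hand displacement this frame
    to_target -- vector from cursor to target (need not be unit length)
    """
    d = np.asarray(to_target, dtype=float)
    d = d / np.linalg.norm(d)            # unit vector toward the target
    step = np.asarray(hand_step, dtype=float)
    along = np.dot(step, d) * d          # component toward the target
    across = step - along                # perpendicular component
    return gain_along * along + gain_across * across

# Hand moves diagonally while the target lies along +x:
# the x component is boosted, the y component damped.
step = assisted_cursor_step([1.0, 1.0], [1.0, 0.0])
```

With `gain_along = gain_across = 1` the mapping is transparent; increasing their ratio strengthens the perceived guidance without any haptic hardware.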
Kinematic assessment for stroke patients in a stroke game and a daily activity recognition and assessment system
Stroke is the leading cause of serious long-term disability, among which deficits in motor abilities of the arms or legs are most common. Those who suffer a stroke can recover through effective rehabilitation that is carefully personalized. To achieve the best personalization, it is essential for clinicians to monitor patients' health status and recovery progress accurately and consistently. Traditionally, rehabilitation involves patients performing exercises in clinics, where clinicians oversee the procedure and evaluate patients' recovery progress. Following the in-clinic visits, additional home practices are tailored and assigned to patients. The in-clinic visits are important for evaluating recovery progress, and the information collected can help clinicians customize home practices for stroke patients. However, as the number of in-clinic sessions is limited by insurance policies, the recovery information collected in clinic is often insufficient. Meanwhile, home practice programs report low adherence rates based on historic data; given that clinicians rely on patients to self-report adherence, the actual adherence rate could be even lower. The limited feedback clinicians do receive is also measured subjectively: in practice, classic clinical scales are mostly used for assessing the quality of movements and the recovery status of patients, but these scales are scored subjectively with only moderate inter-rater and intra-rater reliabilities. Taken together, clinicians lack a method to get sufficient and accurate feedback from patients, which limits the extent to which they can personalize treatment plans. This work aims to solve this problem. To help clinicians obtain abundant health information regarding patients' recovery in an objective manner, I developed a novel kinematic assessment toolchain that consists of two parts.
The first part is a tool to evaluate stroke patients' motions collected in a rehabilitation game setting. This kinematic assessment tool utilizes body tracking in a rehabilitation game. Specifically, a set of upper body assessment measures was proposed and calculated for assessing the movements using skeletal joint data, and statistical analysis was applied to evaluate the quality of upper body motions using the assessment outcomes. Second, to classify and quantify home activities for stroke patients objectively and accurately, I developed DARAS, a daily activity recognition and assessment system that evaluates daily motions in a home setting. DARAS consists of three main components: a daily action logger, an action recognition component, and an assessment component. The logger is implemented with a Foresite system to record daily activities using depth and skeletal joint data. Daily activity data in a realistic environment were collected from sixteen post-stroke participants; the collection period for each participant lasted three months. An ensemble network for activity recognition and temporal localization was developed to detect and segment the clinically relevant actions from the recorded data. The ensemble network fuses the prediction outputs from a customized 3D Convolutional-De-Convolutional network, a customized Region Convolutional 3D network, and a proposed Region Hierarchical Co-occurrence network which learns rich spatial-temporal features from either depth data or joint data. The per-frame precision and the per-action precision were 0.819 and 0.838, respectively, on the validation set. For the recognized actions, kinematic assessments were performed using the skeletal joint data, as well as longitudinal assessments. The results showed that, compared with non-stroke participants, stroke participants had slower hand movements, were less active, and tended to perform fewer hand manipulation actions.
The assessment outcomes from the proposed toolchain help clinicians provide more personalized rehabilitation plans that benefit patients.
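Measures such as "slower hand movements" and "less active" can be computed directly from skeletal joint trajectories; the sketch below shows two such assessment measures (the names, frame rate and activity threshold are illustrative assumptions, not the toolchain's actual definitions).

```python
import numpy as np

def hand_activity_metrics(hand_xyz, fps=30.0, active_speed=0.05):
    """Simple assessment measures from a recorded hand trajectory:
    mean hand speed (m/s) and the fraction of frames spent moving.

    hand_xyz     -- (T, 3) array of hand joint positions in metres
    fps          -- capture frame rate of the skeletal tracker
    active_speed -- speed (m/s) above which a frame counts as active
    """
    xyz = np.asarray(hand_xyz, dtype=float)
    # Per-frame speed from consecutive position differences
    speeds = np.linalg.norm(np.diff(xyz, axis=0), axis=1) * fps
    return {
        'mean_speed': float(np.mean(speeds)),
        'active_fraction': float(np.mean(speeds > active_speed)),
    }

# Three frames at 30 fps: one 1 cm step, then no motion
metrics = hand_activity_metrics([[0, 0, 0], [0.01, 0, 0], [0.01, 0, 0]])
```

Aggregating such per-action metrics over the three-month recording period is what enables the longitudinal comparisons between stroke and non-stroke participants described above.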
Towards Assistive Feeding with a General-Purpose Mobile Manipulator
General-purpose mobile manipulators have the potential to serve as a versatile form of assistive technology. However, their complexity creates challenges, including the risk of being too difficult to use. We present a proof-of-concept robotic system for assistive feeding that consists of a Willow Garage PR2, a high-level web-based interface, and specialized autonomous behaviors for scooping and feeding yogurt. As a step towards use by people with disabilities, we evaluated our system with 5 able-bodied participants. All 5 successfully ate yogurt using the system and reported high rates of success for the system's autonomous behaviors. Also, Henry Evans, a person with severe quadriplegia, operated the system remotely to feed an able-bodied person. In general, people who operated the system reported that it was easy to use, including Henry. The feeding system also incorporates corrective actions designed to be triggered either autonomously or by the user. In an offline evaluation using data collected with the feeding system, a new version of our multimodal anomaly detection system outperformed prior versions.
Comment: This short 4-page paper was accepted and presented as a poster on May 16, 2016 at the ICRA 2016 workshop on 'Human-Robot Interfaces for Enhanced Physical Interactions' organized by Arash Ajoudani, Barkan Ugurlu, Panagiotis Artemiadis, and Jun Morimoto. It was peer reviewed by one reviewer.
Markerless Analysis of Upper Extremity Kinematics during Standardized Pediatric Assessment
Children with hemiplegic cerebral palsy experience reduced motor performance in the affected upper extremity and are typically evaluated on degree of functional impairment using activity-based assessments such as the Shriners Hospitals for Children Upper Extremity Evaluation (SHUEE), a validated clinical measure, to describe performance prior to and following rehabilitative or surgical interventions. These evaluations rely on subjective therapist scoring techniques and lack sensitivity to detect change. Objective clinical motion analysis systems are an available but time-consuming and cost-intensive alternative that requires the uncomfortable application of markers to the patient. There is currently no available markerless, low-cost system that quantitatively assesses upper extremity kinematics to improve the sensitivity of evaluation during standardized task performance. A motion analysis system was developed using Microsoft Kinect hardware to track motion during broad arm movements and subtle hand and finger movements. Algorithms detected and recorded skeletal position and calculated angular kinematics. A lab-developed articulating hand model and elbow fixation devices were used to evaluate the accuracy, intra-trial reliability, and inter-trial reliability of the Kinect platform. Results of the technical evaluation indicate reasonably accurate detection of, and differentiation between, hand and arm positions. Twelve typically developing adolescent subjects were tested to characterize and evaluate performance scores obtained from the SHUEE and the Kinect motion analysis system. Feasibility of the platform was determined in terms of kinematics and as an enhancement of quantitative kinematic reporting for the SHUEE, and a population mean of typically developing subject kinematics was obtained for future development of performance scoring algorithms. The system was observed to be easily operable and clinically effective in subject testing.
The Kinect motion analysis platform developed to quantify upper extremity motion during standardized tasks is a low-cost, portable, accurate, and reliable system for kinematic reporting. It demonstrated quality results both in the technical evaluation of the system and in a study of its applicability to standardized task-based evaluation, though it has hardware and software limitations which will be resolved in future improvements. The SHUEE benefits from the improved quantitative data, and the Kinect system provides enhanced sensitivity in clinical upper extremity analysis for children with hemiplegic cerebral palsy.
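The angular kinematics such a system reports reduce to a standard computation on tracked joints; as a minimal sketch (not the platform's actual code), a joint angle such as elbow flexion can be recovered from three 3D joint positions via the dot product.

```python
import numpy as np

def joint_angle_deg(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c,
    e.g. the elbow angle from tracked shoulder (a), elbow (b) and
    wrist (c) positions."""
    u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against rounding errors slightly outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# Perpendicular upper arm and forearm give a 90 degree elbow angle
angle = joint_angle_deg([0, 1, 0], [0, 0, 0], [1, 0, 0])
```

Applying this per frame over a recorded task yields the angular trajectories from which range-of-motion and other kinematic summary measures can be scored.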