3 research outputs found

    Recognizing upper limb movements with wrist worn inertial sensors using k-means clustering classification

    In this paper we present a methodology for recognizing three fundamental movements of the human forearm (extension, flexion and rotation) using pattern recognition applied to the data from a single wrist-worn inertial sensor. We propose that this technique could be used as a clinical tool to assess rehabilitation progress in neurodegenerative pathologies such as stroke or cerebral palsy by tracking the number of times a patient performs specific arm movements (e.g. prescribed exercises) with their paretic arm throughout the day. We demonstrate this with healthy subjects and stroke patients in a simple proof-of-concept study in which these arm movements are detected during an archetypal activity of daily living (ADL) – ‘making-a-cup-of-tea’. Data is collected from a tri-axial accelerometer and a tri-axial gyroscope located proximal to the wrist. In a training phase, movements are initially performed in a controlled environment and are represented by a ranked set of 30 time-domain features. Using a sequential forward selection technique, three clusters are formed for each feature combination using k-means clustering, followed by 10 runs of 10-fold cross-validation on the training data to determine the best feature combinations. In the testing phase, movements performed during the ADL are assigned to a cluster label using a minimum-distance classifier in a multi-dimensional feature space composed of the best-ranked features, with Euclidean or Mahalanobis distance as the metric. Experiments were performed with four healthy subjects and four stroke survivors, and our results show that the proposed methodology can detect the three movements performed during the ADL with an overall average accuracy of 88% using the accelerometer data and 83% using the gyroscope data across all healthy subjects and arm movement types. The average accuracy across all stroke survivors was 70% using accelerometer data and 66% using gyroscope data. We also use a Linear Discriminant Analysis (LDA) classifier and a Support Vector Machine (SVM) classifier with the same set of features to detect the three arm movements and compare the results to demonstrate the effectiveness of our proposed methodology.
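
    As a rough illustration of the pipeline the abstract describes, the sketch below trains a 3-cluster k-means model on time-domain features extracted from wrist accelerometer windows and then labels new windows with a minimum-distance classifier using Euclidean or Mahalanobis distance. The feature subset (per-axis mean, standard deviation and range) and all function names are illustrative assumptions; the paper itself uses a ranked set of 30 features chosen by sequential forward selection.

```python
# Hedged sketch of a k-means + minimum-distance movement classifier;
# feature set and helper names are assumptions, not the authors' exact pipeline.
import numpy as np
from sklearn.cluster import KMeans

def time_domain_features(window):
    """window: (n_samples, 3) tri-axial accelerometer segment.
    Returns a small, illustrative subset of time-domain features."""
    return np.concatenate([
        window.mean(axis=0),      # per-axis mean
        window.std(axis=0),       # per-axis standard deviation
        np.ptp(window, axis=0),   # per-axis range (peak-to-peak)
    ])

def train_clusters(train_windows, n_movements=3):
    """Cluster training movements (extension, flexion, rotation)."""
    X = np.vstack([time_domain_features(w) for w in train_windows])
    km = KMeans(n_clusters=n_movements, n_init=10, random_state=0).fit(X)
    return km, X

def classify(window, km, X_train, metric="euclidean"):
    """Assign an ADL window to the nearest cluster centroid."""
    x = time_domain_features(window)
    if metric == "mahalanobis":
        VI = np.linalg.pinv(np.cov(X_train, rowvar=False))
        d = [np.sqrt((x - c) @ VI @ (x - c)) for c in km.cluster_centers_]
    else:
        d = [np.linalg.norm(x - c) for c in km.cluster_centers_]
    return int(np.argmin(d))
```

    In practice the cluster labels would be mapped to movement names using the controlled training recordings before classifying ADL data.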

    Rehab-Net: Deep Learning framework for Arm Movement Classification using Wearable Sensors for Stroke Rehabilitation

    In this paper, we present a deep learning framework, 'Rehab-Net', for effectively classifying three upper limb movements of the human arm, involving extension, flexion and rotation of the forearm, which over time could provide a measure of rehabilitation progress. The proposed Rehab-Net framework is formulated as a personalized, lightweight, low-complexity, customized CNN model, using two convolutional neural network (CNN) layers interleaved with pooling layers, followed by a fully-connected layer that classifies the three movements from tri-axial acceleration input data collected from the wrist. The proposed Rehab-Net framework was validated on sensor data collected in two situations: (a) a semi-naturalistic environment involving an archetypal activity of 'making-tea' with 4 stroke survivors, and (b) a natural environment, where 10 stroke survivors were free to perform any desired arm movement for a duration of 120 minutes. We achieve an overall accuracy of 97.89% on semi-naturalistic data and 88.87% on naturalistic data, exceeding state-of-the-art learning algorithms, namely Linear Discriminant Analysis, Support Vector Machines, and k-means clustering, which achieved average accuracies of 48.89%, 44.14% and 27.64%, respectively. Subsequently, a computational complexity analysis of the proposed model is discussed with an eye towards hardware implementation. The clinical significance of this study is to accurately monitor the clinical progress of rehabilitated subjects under ambulatory settings.
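
    The abstract describes a CNN with two convolutional layers interleaved with pooling and a fully-connected output layer over tri-axial acceleration windows. Below is a minimal PyTorch sketch of that topology; the filter counts, kernel sizes and the 100-sample window length are assumptions for illustration, not the published Rehab-Net hyper-parameters.

```python
# Hedged sketch of a Rehab-Net-style model: 2 conv layers + pooling + FC classifier.
# Layer sizes and window length are illustrative assumptions.
import torch
import torch.nn as nn

class RehabNetSketch(nn.Module):
    def __init__(self, window_len=100, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 8, kernel_size=5, padding=2),   # tri-axial input -> 8 feature maps
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(8, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # After two 2x poolings the time axis is window_len // 4 samples long.
        self.classifier = nn.Linear(16 * (window_len // 4), n_classes)

    def forward(self, x):                # x: (batch, 3, window_len)
        z = self.features(x)
        return self.classifier(z.flatten(1))

# Example: score one 100-sample tri-axial acceleration window
model = RehabNetSketch()
window = torch.randn(1, 3, 100)
logits = model(window)                   # shape (1, 3): extension / flexion / rotation
```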