2,234 research outputs found

    Recognition of elementary arm movements using orientation of a tri-axial accelerometer located near the wrist

    No full text
    In this paper we present a method for recognising three fundamental movements of the human arm (reach and retrieve, lift cup to mouth, rotation of the arm) by determining the orientation of a tri-axial accelerometer located near the wrist. Our objective is to detect the occurrence of such movements performed with the impaired arm of a stroke patient during normal daily activities as a means of assessing their rehabilitation. The method relies on accurately mapping transitions between predefined, standard orientations of the accelerometer and the corresponding elementary arm movements. To evaluate the technique, kinematic data were collected from four healthy subjects and four stroke patients as they performed a number of tasks involved in a representative activity of daily living, 'making-a-cup-of-tea'. Our experimental results show that the proposed method can independently recognise all three of the elementary upper limb movements investigated, with accuracies in the range 91–99% for healthy subjects and 70–85% for stroke patients.
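
As an illustration of the general idea (not the authors' exact mapping), the sketch below estimates the quasi-static orientation of a wrist-worn tri-axial accelerometer from the gravity vector, quantizes it into a few standard orientation states, and flags transitions between states as candidate elementary movements. The orientation labels, thresholds, and transition table are assumptions made for the example.

```python
import numpy as np

def pitch_roll(ax, ay, az):
    """Tilt angles (degrees) of the sensor relative to gravity,
    assuming the sample is dominated by the gravity component."""
    pitch = np.degrees(np.arctan2(-ax, np.sqrt(ay**2 + az**2)))
    roll = np.degrees(np.arctan2(ay, az))
    return pitch, roll

def orientation_state(pitch, roll):
    """Quantize tilt into a small set of standard orientations (illustrative)."""
    if pitch > 45:
        return "forearm_raised"
    if pitch < -45:
        return "forearm_lowered"
    return "palm_down" if abs(roll) < 45 else "palm_rotated"

# Hypothetical transition -> movement mapping, not the one used in the paper.
TRANSITIONS = {
    ("forearm_lowered", "forearm_raised"): "lift cup to mouth",
    ("forearm_raised", "forearm_lowered"): "reach and retrieve",
    ("palm_down", "palm_rotated"): "rotation of the arm",
}

def detect_movements(samples):
    """samples: iterable of (ax, ay, az) in g; yields detected movements."""
    prev = None
    for ax, ay, az in samples:
        state = orientation_state(*pitch_roll(ax, ay, az))
        if prev is not None and (prev, state) in TRANSITIONS:
            yield TRANSITIONS[(prev, state)]
        prev = state
```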

    Recognition of elementary upper limb movements in an activity of daily living using data from wrist mounted accelerometers

    No full text
    In this paper we present a methodology, as a proof of concept, for recognizing fundamental movements of the human arm (extension, flexion and rotation of the forearm) involved in ‘making-a-cup-of-tea’, a typical activity of daily living (ADL). The movements are initially performed in a controlled environment as part of a training phase, and the data are grouped into three clusters using k-means clustering. Movements performed during the ADL, forming part of the testing phase, are associated with a cluster label using a minimum distance classifier in a multi-dimensional feature space, comprising features selected from a ranked set of 30 features, with Euclidean and Mahalanobis distance as the metrics. Experiments were performed with four healthy subjects, and our results show that the proposed methodology can detect the three movements with an overall average accuracy of 88% across all subjects and arm movement types using the Euclidean distance classifier.
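
A minimal sketch of the train/test pipeline described above, assuming pre-computed feature vectors (the ranked 30-feature set is not reproduced here): training windows are grouped with k-means, and test windows are assigned to the nearest cluster centre with either the Euclidean or the Mahalanobis metric.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.spatial.distance import euclidean, mahalanobis

def train_clusters(X_train, n_clusters=3, seed=0):
    """Group training feature vectors into movement clusters (one per movement type)."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(X_train)
    return km.cluster_centers_

def classify_euclidean(x, centres):
    """Minimum-distance classifier with the Euclidean metric."""
    return int(np.argmin([euclidean(x, c) for c in centres]))

def classify_mahalanobis(x, centres, X_train):
    """Minimum-distance classifier with the Mahalanobis metric,
    using the covariance of the training features."""
    VI = np.linalg.pinv(np.cov(X_train, rowvar=False))
    return int(np.argmin([mahalanobis(x, c, VI) for c in centres]))
```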

    Fall Prediction and Prevention Systems: Recent Trends, Challenges, and Future Research Directions.

    Get PDF
    Fall prediction is a multifaceted problem that involves complex interactions between physiological, behavioral, and environmental factors. Existing fall detection and prediction systems mainly focus on physiological factors such as gait, vision, and cognition, and do not address the multifactorial nature of falls. In addition, these systems lack efficient user interfaces and feedback for preventing future falls. Recent advances in internet of things (IoT) and mobile technologies offer ample opportunities for integrating contextual information about patient behavior and environment along with physiological health data for predicting falls. This article reviews the state of the art in fall detection and prediction systems. It also describes the challenges, limitations, and future directions in the design and implementation of effective fall prediction and prevention systems.

    Computational Approaches for Remote Monitoring of Symptoms and Activities

    Get PDF
    We now have a unique phenomenon in which significant computational power, storage, connectivity, and built-in sensors are carried willingly by many people as part of their lifestyle; two billion people now use smartphones. Unique and innovative solutions using smartphones are motivated by rising health care costs in both the developed and developing worlds. In this work, the development of a methodology for building a remote symptom monitoring system for rural people in developing countries is explored. The design, development, deployment, and evaluation of e-ESAS are described, and the system’s performance was studied by analyzing feedback from users. A smartphone-based prototype activity detection system that can detect basic human activities for monitoring by remote observers was also developed and explored. The majority voting fusion technique, along with decision tree learners, was used to classify eight activities in a multi-sensor framework. This multimodal approach was examined in detail and evaluated for both single- and multi-subject cases. Time-delay embedding with expectation-maximization for a Gaussian Mixture Model was explored as a way of developing an activity detection system using a reduced number of sensors, leading to a lower computational cost. The systems and algorithms developed in this work focus on means for remote monitoring using smartphones: the smartphone-based remote symptom monitoring system, e-ESAS, serves as a working tool for doctors to monitor essential symptoms of patients with breast cancer, while the activity detection system allows a remote observer to monitor basic human activities. For the activity detection system, the majority voting fusion technique in the multi-sensor architecture is evaluated for eight activities in both single- and multiple-subject cases, and time-delay embedding with the expectation-maximization algorithm for a Gaussian Mixture Model is studied using data from multiple single-sensor cases.
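
The multi-sensor fusion step can be illustrated with a short sketch: one decision tree is trained per sensor, and their per-sample predictions are fused by majority vote. The sensor feature matrices and activity labels are placeholders; this is a generic illustration of the architecture, not the thesis code.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_per_sensor(sensor_feature_sets, labels):
    """One decision tree per sensor; each X has shape (n_samples, n_features)."""
    return [DecisionTreeClassifier(random_state=0).fit(X, labels)
            for X in sensor_feature_sets]

def majority_vote(trees, sensor_feature_sets):
    """Fuse per-sensor predictions by taking the most common label per sample."""
    votes = np.stack([t.predict(X) for t, X in zip(trees, sensor_feature_sets)])
    fused = []
    for column in votes.T:                        # one column per sample
        values, counts = np.unique(column, return_counts=True)
        fused.append(values[np.argmax(counts)])
    return np.array(fused)
```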

    Comparison and Characterization of Android-Based Fall Detection Systems

    Get PDF
    Falls are a foremost source of injuries and hospitalization for seniors. The adoption of automatic fall detection mechanisms can noticeably reduce the response time of medical staff or caregivers when a fall takes place. Smartphones are increasingly being proposed as wearable, cost-effective and non-intrusive systems for fall detection. Exploiting the potential of smartphones (and, in particular, the Android operating system) can benefit from the widespread adoption, growing computational capabilities, and diversity of communication interfaces and embedded sensors of these personal devices. After reviewing the state of the art on this matter, this study develops an experimental testbed to assess the performance of different fall detection algorithms that base their decisions on the analysis of the inertial data registered by the accelerometer of the smartphone. Results obtained in a real testbed with diverse individuals indicate that the accuracy of accelerometry-based techniques in identifying falls depends strongly on the fall pattern. The tests also show the difficulty of setting detection acceleration thresholds that achieve a good trade-off between false negatives (falls that remain unnoticed) and false positives (conventional movements that are erroneously classified as falls). In any case, the study of battery drain reveals that the extra power consumption introduced by the Android monitoring applications cannot be neglected when evaluating the autonomy and even the viability of fall detection systems.
    Funding: Ministerio de Economía y Competitividad TEC2009-13763-C02-0
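
The threshold trade-off mentioned above can be made concrete with the simplest family of detectors evaluated in such testbeds: a free-fall dip in the acceleration magnitude followed shortly by an impact spike. The thresholds and window length below are illustrative assumptions, and tuning them is exactly the false-negative / false-positive compromise the study describes.

```python
import math

FREE_FALL_G = 0.6   # magnitude below this suggests free fall (in g) - assumed value
IMPACT_G = 2.5      # magnitude above this suggests impact (in g) - assumed value
MAX_GAP = 50        # max samples allowed between the dip and the spike - assumed

def detect_fall(samples):
    """samples: iterable of (ax, ay, az) in g; True if a fall-like pattern is seen."""
    last_free_fall = None
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude < FREE_FALL_G:
            last_free_fall = i
        elif (magnitude > IMPACT_G and last_free_fall is not None
              and i - last_free_fall <= MAX_GAP):
            return True
    return False
```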

    A 'one-size-fits-most' walking recognition method for smartphones, smartwatches, and wearable accelerometers

    Full text link
    The ubiquity of personal digital devices offers unprecedented opportunities to study human behavior. Current state-of-the-art methods quantify physical activity using 'activity counts,' a measure which overlooks specific types of physical activities. We proposed a walking recognition method for sub-second tri-axial accelerometer data, in which activity classification is based on the inherent features of walking: intensity, periodicity, and duration. We validated our method against 20 publicly available, annotated datasets on walking activity data collected at various body locations (thigh, waist, chest, arm, wrist). We demonstrated that our method can estimate walking periods with high sensitivity and specificity: average sensitivity ranged between 0.92 and 0.97 across various body locations, and average specificity for common daily activities was typically above 0.95. We also assessed the method's algorithmic fairness to demographic and anthropometric variables and measurement contexts (body location, environment). Finally, we have released our method as open-source software in MATLAB and Python.
    Comment: 39 pages, 4 figures (incl. 1 supplementary), and 5 tables (incl. 2 supplementary)
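
The released implementation is available in MATLAB and Python; the sketch below is not that code, but illustrates the three cues named in the abstract on fixed-length magnitude windows: intensity (enough movement energy), periodicity (a dominant autocorrelation peak in a plausible step-cadence band), and duration (several consecutive positive windows). The sampling rate and all thresholds are assumptions.

```python
import numpy as np

FS = 50                  # sampling rate in Hz (assumed)
MIN_STD = 0.05           # minimum std of magnitude (g) to count as movement
CADENCE = (1.4, 2.5)     # plausible step frequencies in Hz
MIN_PEAK = 0.4           # minimum normalized autocorrelation peak
MIN_WINDOWS = 3          # consecutive positive windows required (duration cue)

def window_is_walking(mag):
    """mag: 1-D array of acceleration magnitude for one window (in g)."""
    if np.std(mag) < MIN_STD:                       # intensity cue
        return False
    x = mag - np.mean(mag)
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac = ac / (ac[0] + 1e-12)                       # normalized autocorrelation
    lo, hi = int(FS / CADENCE[1]), int(FS / CADENCE[0])
    return ac[lo:hi + 1].max() >= MIN_PEAK          # periodicity cue

def walking_mask(windows):
    """Apply the duration cue over a sequence of consecutive windows."""
    flags = [window_is_walking(w) for w in windows]
    mask, run = [False] * len(flags), 0
    for i, flag in enumerate(flags):
        run = run + 1 if flag else 0
        if run >= MIN_WINDOWS:
            for j in range(i - run + 1, i + 1):
                mask[j] = True
    return mask
```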