4 research outputs found

    Elderly Fall Detection and Fall Direction Detection via Various Machine Learning Algorithms Using Wearable Sensors

    The world population is aging rapidly. Many elderly people live alone, and even those who live with their families often have to stay home alone, especially during the working hours of the adult family members. A fall while alone at home often results in serious injury, and even death, for elderly individuals. Fall detection systems detect falls and summon emergency healthcare services quickly. In this study, a two-stage fall detection and fall direction detection system was developed using a public dataset and a comparative evaluation of five machine learning algorithms. If a fall is detected in the first stage, the second stage is triggered and the direction of the fall is determined. The detected fall direction can be used in future research to enable countermeasures such as deploying an airbag in the direction of the fall. Thus, a staged fall detection and fall direction detection system was developed by determining the best-performing classifiers. As a result, the Ensemble Subspace k-NN classifier was found to classify slightly more successfully than the other classifiers. Classification on the test data, corresponding to 30% of the total data and never used during the training phase, achieved 99.4% accuracy for fall detection and 97.2% accuracy for fall direction detection.
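As an illustration of the two-stage scheme described in this abstract, here is a minimal scikit-learn sketch. The features and labels are synthetic stand-ins (not the paper's dataset), and the Ensemble Subspace k-NN is approximated as bagged k-NN where each member sees a random subset of the features (the random-subspace method):

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for windowed accelerometer features (not the paper's data).
X = rng.normal(size=(600, 12))
is_fall = rng.integers(0, 2, size=600)    # stage-1 label: fall vs. ADL
direction = rng.integers(0, 4, size=600)  # stage-2 label: e.g. front/back/left/right

X_tr, X_te, f_tr, f_te, d_tr, d_te = train_test_split(
    X, is_fall, direction, test_size=0.3, random_state=0)

def subspace_knn():
    # "Subspace k-NN" approximated as bagged k-NN, each ensemble member
    # trained on a random half of the features.
    return BaggingClassifier(
        KNeighborsClassifier(n_neighbors=5),
        n_estimators=30, max_features=0.5,
        bootstrap=False, bootstrap_features=False, random_state=0)

stage1 = subspace_knn().fit(X_tr, f_tr)                        # fall / no-fall
stage2 = subspace_knn().fit(X_tr[f_tr == 1], d_tr[f_tr == 1])  # direction, falls only

pred_fall = stage1.predict(X_te)
# Only windows flagged as falls reach the direction classifier (stage 2).
pred_dir = stage2.predict(X_te[pred_fall == 1])
```

Training stage 2 only on fall samples mirrors the gradual design: the direction classifier never has to model non-fall activity.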

    Hips do lie! A position-aware mobile fall detection system

    Ambient Assisted Living using mobile device sensors is an active area of research in pervasive computing. Multiple approaches have shown that wearable sensors perform very well and reliably distinguish falls from Activities of Daily Living. However, these systems are tested in controlled environments and are optimized for a given set of sensor types, sensor positions, and subjects. In this work, we propose a self-adaptive pervasive fall detection approach that is robust to the heterogeneity of real-life situations. To this end, we combine sensor data from four publicly available datasets, covering about 100 subjects, 5 devices, and 3 sensor placements. In a comprehensive evaluation, we show that our system is not only robust across the different dimensions of heterogeneity, but also adapts autonomously to spontaneous changes in the sensor's position at runtime.
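One way a position-aware system of this kind can be structured is as one activity model per sensor placement plus a classifier that infers the placement itself from the same features, so that a change in the sensor's position changes which model is consulted. The sketch below illustrates that idea only; the features, placements, and labels are synthetic and this is not the paper's actual pipeline:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)

# Synthetic stand-ins: 8 features per window, 3 hypothetical placements.
positions = ["wrist", "hip", "pocket"]
X = rng.normal(size=(300, 8))
pos_labels = rng.integers(0, 3, size=300)  # which placement produced the window
act_labels = rng.integers(0, 2, size=300)  # fall vs. ADL

# One classifier infers the placement; one activity model per placement.
position_clf = KNeighborsClassifier(n_neighbors=5).fit(X, pos_labels)
activity_clfs = {
    p: KNeighborsClassifier(n_neighbors=5).fit(
        X[pos_labels == p], act_labels[pos_labels == p])
    for p in range(3)
}

def classify(x):
    """Infer the current placement, then dispatch to its activity model."""
    p = position_clf.predict(x.reshape(1, -1))[0]
    return positions[p], activity_clfs[p].predict(x.reshape(1, -1))[0]
```

Because the placement is re-inferred per window, the dispatch follows a sensor that moves (e.g. from hip to pocket) at runtime without retraining.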

    Sensor-based human activity recognition: Overcoming issues in a real world setting

    The rapid aging of the population in industrialized societies calls for advanced tools to continuously monitor people's activities. The goals of such tools are usually to support active and healthy ageing and to detect possible health issues early, enabling a long and independent life. Recent advancements in sensor miniaturization and wireless communications have paved the way for unobtrusive activity recognition systems. Hence, many pervasive healthcare systems have been proposed that monitor activities through unobtrusive sensors and machine learning or artificial intelligence methods. Unfortunately, while those systems are effective in controlled environments, their actual effectiveness outside the lab is still limited due to various shortcomings of existing approaches. In this work, we explore such systems and aim to overcome their limitations. Focusing on physical movements and crucial activities, our goal is to develop robust activity recognition methods based on external and wearable sensors that generate high-quality results in a real-world setting. Under laboratory conditions, existing research has already shown that wearable sensors are suitable for recognizing physical activities, while external sensors are promising for more complex activities. Consequently, we investigate the problems that emerge when moving out of the lab. These include handling the position of wearable devices, the need for large, expensive labeled datasets, the requirement to recognize activities in near real time, the necessity to adapt deployed systems online to changes in user behavior, the variability in how an activity is executed, and the use of data and models across people. As a result, we present feasible solutions for these problems and provide useful insights for implementing the corresponding techniques.
    Further, we introduce approaches and novel methods for both external and wearable sensors, clarifying the limitations and capabilities of the respective sensor types. We investigate both types separately to clarify their contribution and applicability for recognizing different types of activities in a real-world scenario. Overall, our comprehensive experiments and discussions show, on the one hand, the feasibility of recognizing physical activities as well as complex activities in a real-world scenario. Comparing our techniques and results with existing work and state-of-the-art techniques also provides evidence of the reliability and quality of the proposed techniques. On the other hand, we identify promising research directions and highlight that combining external and wearable sensors seems to be the next step to go beyond activity recognition. In other words, our results and discussions show that combining external and wearable sensors would compensate for the weaknesses of the individual sensors with respect to certain activity types and scenarios. Therefore, by addressing the outlined problems, we pave the way for a hybrid approach. Along with our presented solutions, we conclude our work with a high-level multi-tier activity recognition architecture showing that aspects like physical activity, (emotional) condition, used objects, and environmental features are critical for reliably recognizing complex activities.
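A building block common to wearable activity recognition pipelines like the one this abstract describes is sliding-window feature extraction over the raw sensor stream. The window length, step, and feature set below are illustrative choices, not the thesis's actual configuration:

```python
import numpy as np

def sliding_windows(signal, win, step):
    """Split a 1-D sensor stream into overlapping windows."""
    starts = range(0, len(signal) - win + 1, step)
    return np.stack([signal[s:s + win] for s in starts])

def window_features(w):
    """Typical hand-crafted per-window features (an illustrative choice)."""
    return np.array([w.mean(), w.std(), w.min(), w.max(),
                     np.abs(np.diff(w)).mean()])

# Synthetic accelerometer-like stream: slow oscillation plus noise.
acc = np.sin(np.linspace(0, 20, 500)) + 0.1 * np.random.default_rng(1).normal(size=500)

windows = sliding_windows(acc, win=100, step=50)           # 50% overlap
feats = np.stack([window_features(w) for w in windows])
print(feats.shape)  # (9, 5): 9 windows, 5 features each
```

The window step directly trades recognition latency against compute, which is why near-real-time operation constrains this choice in deployed systems.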

    A framework for engineering reusable self-adaptive systems

    The increasing complexity and size of information systems result in an increasing maintenance effort. Additionally, the miniaturization of devices leads to mobility and the need for context adaptation. Self-adaptive Systems (SASs) can adapt to changes in their environment or in the system itself. So far, however, the development of SASs is frequently tailored to the requirements of individual use cases. Research on reusable elements (for implementations as well as design processes) is often neglected. Integrating reusable processes and implementation artifacts into a framework and offering a tool suite to developers would make the development of SASs faster and less error-prone. This thesis presents the Framework for Engineering Self-adaptive Systems (FESAS). It offers a reusable implementation of a reference system, tools for implementation and design, as well as a middleware for controlling system deployment. As a second contribution, this thesis introduces a new approach for self-improvement of SASs which complements the SAS with meta-adaptation.
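Self-adaptive systems of the kind FESAS targets are commonly structured around the MAPE-K reference loop (Monitor, Analyze, Plan, Execute over shared Knowledge). The sketch below is a generic illustration of that loop with a made-up load threshold and action, not FESAS itself:

```python
class MapeK:
    """Minimal MAPE-K control loop (illustrative; thresholds/actions are made up)."""

    def __init__(self, threshold):
        self.knowledge = {"threshold": threshold}  # shared knowledge base

    def monitor(self, sensor_value):
        # Collect raw observations into symptoms.
        return {"load": sensor_value}

    def analyze(self, symptoms):
        # Decide whether the symptoms violate a goal in the knowledge base.
        return symptoms["load"] > self.knowledge["threshold"]

    def plan(self, needs_adaptation):
        # Choose adaptation actions; here a single hypothetical action.
        return ["scale_up"] if needs_adaptation else []

    def execute(self, actions):
        # In a real system this would drive effectors; here it just reports.
        return actions

    def step(self, sensor_value):
        return self.execute(self.plan(self.analyze(self.monitor(sensor_value))))

loop = MapeK(threshold=0.8)
print(loop.step(0.9))  # ['scale_up']
print(loop.step(0.5))  # []
```

Separating the four phases behind stable interfaces is what makes the loop reusable across use cases, which is the gap the thesis's framework addresses; meta-adaptation would correspond to a second loop adapting `MapeK` itself (e.g. tuning its threshold).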