72 research outputs found

    Multisensor Data Fusion for Human Activities Classification and Fall Detection

    Significant research exists on the use of wearable sensors for activity recognition and fall detection in assisted living, whereas radar sensors have only recently been studied in this domain. This paper addresses the performance limitations of individual sensors, especially for the classification of similar activities, by fusing features extracted from experimental data collected by different sensors, namely a tri-axial accelerometer, a micro-Doppler radar, and a depth camera. Preliminary results confirm that combining information from heterogeneous sensors improves the overall performance of the system. The classification accuracy attained with this fusion approach improves by 11.2% compared with radar-only use, and by 16.9% compared with the accelerometer alone. Furthermore, when features extracted from an RGB-D Kinect sensor are added, the overall classification accuracy increases to 91.3%.
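The feature-level fusion described in this abstract can be sketched as concatenating per-sensor feature vectors before classification. The following is a minimal illustration with synthetic data; the feature names, dimensions, and classifier choice are assumptions for demonstration, not the paper's actual pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples = 200

# Hypothetical per-sample feature matrices, one per sensor.
accel_feats = rng.normal(size=(n_samples, 12))  # e.g. time-domain statistics
radar_feats = rng.normal(size=(n_samples, 20))  # e.g. micro-Doppler features
depth_feats = rng.normal(size=(n_samples, 8))   # e.g. depth-camera features
labels = rng.integers(0, 4, size=n_samples)     # four placeholder activities

# Feature-level fusion: concatenate per-sensor features into one vector.
fused = np.hstack([accel_feats, radar_feats, depth_feats])

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.3, random_state=0)

# A single classifier is then trained on the fused representation.
clf = make_pipeline(StandardScaler(), SVC())
clf.fit(X_train, y_train)
print(fused.shape)  # (200, 40)
```

One design point worth noting: concatenation lets a single classifier exploit cross-sensor correlations, which is why fusion can outperform any individual sensor on activities that look similar to one modality alone.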

    Toward an Automatic Road Accessibility Information Collecting and Sharing Based on Human Behavior Sensing Technologies of Wheelchair Users

    Abstract: This research proposes a methodology for digitizing street-level accessibility through human sensing of wheelchair users. The digitization of street-level accessibility is essential for developing accessibility maps or personalizing routes according to accessibility. However, current digitization methodologies are insufficient because they require substantial manpower, and therefore money and time. The proposed method makes it possible to digitize accessibility semi-automatically. In this research, a three-axis accelerometer embedded in an iPod touch sensed the actions of nine wheelchair users across a range of disabilities and age groups in Tokyo for approximately 9 hours. This paper reports our attempts to estimate both environmental factors (the condition of the street) and subjective factors (the driver's fatigue) from human sensing data using machine learning.
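A common first step in pipelines like this is to segment the raw tri-axial accelerometer stream into fixed windows and compute simple statistics per window as classifier features. The sketch below illustrates that step with synthetic data; the window length and feature set are assumptions, not the paper's actual method.

```python
import numpy as np

def window_features(acc, window=50):
    """Split an (N, 3) accelerometer stream into fixed-length windows
    and compute mean and standard deviation per axis for each window."""
    n_windows = len(acc) // window
    feats = []
    for i in range(n_windows):
        w = acc[i * window:(i + 1) * window]
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0)]))
    return np.array(feats)

rng = np.random.default_rng(1)
stream = rng.normal(size=(500, 3))  # 500 samples of (x, y, z) acceleration
features = window_features(stream)
print(features.shape)  # (10, 6): 10 windows, 6 statistics each
```

Each row of `features` would then be labeled (e.g. smooth pavement vs. rough surface) and fed to a supervised learner to estimate the environmental and subjective factors the abstract describes.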

    The Big Data Obstacle of Lifelogging

    Living in the digital age has resulted in a data-rich society where it is now possible to log every moment of our lives. This chronicle is known as a human digital memory: a heterogeneous record of our lives that grows alongside its human counterpart. Managing a lifetime of data results in these sets of big data growing to enormous proportions; as these records increase in size, the problem of effectively managing them becomes more difficult. This paper explores the challenges of searching such big data sets of human digital memory data and posits a new approach that treats the searching of human digital memory data as a machine learning problem.

    Clustering of Physical Activities for Quantified Self and mHealth Applications


    A multi-sensory approach for remote health monitoring of older people

    Growing life expectancy and the increasing incidence of multiple chronic health conditions are significant societal challenges. Different technologies have been proposed to address these issues, detect critical events such as strokes or falls, and automatically monitor human activities for health condition inference and anomaly detection. This paper investigates two types of sensing technologies proposed for assisted living: wearable and radar sensors. First, different feature selection methods are validated and compared in terms of accuracy and computational load. Then, information fusion is applied to enhance activity classification accuracy by combining the two sensors. Improvements in classification accuracy of approximately 12% using feature-level fusion are achieved with both support vector machine (SVM) and k-nearest neighbor (KNN) classifiers. Decision-level fusion schemes are also investigated, yielding classification accuracy on the order of 97%-98%.
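The decision-level fusion mentioned here can be sketched as training one classifier per sensor and combining their class-probability outputs, for example by averaging (soft voting). The example below uses synthetic data; the feature dimensions, number of classes, and voting rule are assumptions for illustration, not the paper's reported scheme.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n = 300
wearable_X = rng.normal(size=(n, 10))  # hypothetical wearable features
radar_X = rng.normal(size=(n, 15))     # hypothetical radar features
y = rng.integers(0, 3, size=n)         # three placeholder activity classes

# One classifier per sensor modality.
svm = SVC(probability=True, random_state=0).fit(wearable_X, y)
knn = KNeighborsClassifier().fit(radar_X, y)

# Decision-level fusion: average per-class probabilities, then argmax.
proba = (svm.predict_proba(wearable_X) + knn.predict_proba(radar_X)) / 2
fused_pred = proba.argmax(axis=1)
print(fused_pred.shape)  # (300,)
```

Unlike feature-level fusion, this scheme keeps each sensor's pipeline independent, so a sensor can drop out without retraining the other classifier; the trade-off is that cross-sensor feature interactions are no longer visible to any single model.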