
    Radar and RGB-depth sensors for fall detection: a review

    This paper reviews recent works in the literature on the use of systems based on radar and RGB-Depth (RGB-D) sensors for fall detection, and discusses outstanding research challenges and trends in this field. Systems that can reliably detect fall events and promptly alert carers and first responders have gained significant interest in recent years, addressing the societal issue of a growing number of elderly people living alone, with the associated risk of falls and their consequences in terms of health treatments, reduced well-being, and costs. The interest in radar and RGB-D sensors stems from their capability to enable contactless and non-intrusive monitoring, an advantage for practical deployment and for users’ acceptance and compliance compared with other sensor technologies such as video cameras or wearables. Furthermore, combining and fusing information from these heterogeneous types of sensors is expected to improve the overall performance of practical fall detection systems. Researchers from different fields can benefit from the multidisciplinary knowledge and awareness of the latest developments in radar and RGB-D sensors that this paper provides.

    Magnetic and radar sensing for multimodal remote health monitoring

    With increased life expectancy and the rise in health conditions related to aging, there is a need for new technologies that can routinely monitor vulnerable people, identify their daily patterns of activities, and detect any anomaly or critical event such as a fall. This paper evaluates magnetic and radar sensors as suitable technologies for remote health monitoring, both individually and by fusing their information. After experiments collecting data from 20 volunteers, numerical features have been extracted in both the time and frequency domains. To analyse and validate the fusion method across different classifiers, a Support Vector Machine with a quadratic kernel and an Artificial Neural Network with one and multiple hidden layers have been implemented. Furthermore, for both classifiers, feature selection has been performed to obtain salient features. Using this technique together with fusion, both classifiers can detect 10 different activities with an accuracy of approximately 96%. In cases where the user is unknown to the classifier, an accuracy of approximately 92% is maintained.
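    The time- and frequency-domain feature extraction and feature-level fusion described above can be sketched as follows. The window length and the four statistics are illustrative assumptions, not the paper's actual feature set, and synthetic noise stands in for real sensor data.

```python
import numpy as np

def extract_features(window):
    """Toy time- and frequency-domain features for one sensor window.
    The paper's real feature set is richer; these four are placeholders."""
    spectrum = np.abs(np.fft.rfft(window))
    return np.array([
        window.mean(),              # time domain: mean level
        window.std(),               # time domain: spread
        float(spectrum.argmax()),   # frequency domain: dominant bin index
        spectrum.max(),             # frequency domain: dominant magnitude
    ])

rng = np.random.default_rng(0)
magnetic = rng.normal(size=256)  # one window of magnetic-sensor samples
radar = rng.normal(size=256)     # one window of radar samples

# Feature-level fusion: concatenate the per-sensor feature vectors
fused = np.concatenate([extract_features(magnetic), extract_features(radar)])
print(fused.shape)  # (8,)
```

    A fused vector like this would then feed a classifier such as the quadratic-kernel SVM or the neural network mentioned in the abstract, possibly after feature selection.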

    Radar for Assisted Living in the Context of Internet of Things for Health and Beyond

    This paper discusses the place of radar for assisted living in the context of IoT for Health and beyond. First, the context of assisted living and the urgency of addressing the problem are described. The second part gives a literature review of existing sensing modalities for assisted living and explains why radar is emerging as a preferred modality to address this issue. The third section presents developments in machine learning that help improve classification performance, especially deep learning, with a reflection on lessons learned from it. The fourth section introduces recently published work from our research group in this area, which shows promise with multimodal sensor fusion for classification and with long short-term memory networks applied to early stages of the radar signal processing chain. Finally, we conclude with open challenges still to be addressed in the area and with openings to future research directions in animal welfare.

    Multisensor Data Fusion for Human Activities Classification and Fall Detection

    Significant research exists on the use of wearable sensors for activity recognition and fall detection in assisted living, whereas radar sensors have been studied only recently in this domain. This paper addresses the performance limitations of individual sensors, especially for the classification of similar activities, by fusing features extracted from experimental data collected by different sensors, namely a tri-axial accelerometer, a micro-Doppler radar, and a depth camera. Preliminary results confirm that combining information from heterogeneous sensors improves the overall performance of the system. The classification accuracy attained by this fusion approach improves by 11.2% compared with using the radar alone, and by 16.9% compared with the accelerometer. Furthermore, adding features extracted from an RGB-D Kinect sensor increases the overall classification accuracy to 91.3%.
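    As a rough illustration of the feature-level fusion described above, the sketch below concatenates per-sensor feature vectors and classifies the fused vectors with a simple nearest-centroid rule. The feature dimensions, the synthetic class-separated data, and the nearest-centroid classifier are illustrative stand-ins, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, n_classes = 120, 30, 3

def make_feats(n, labels, dim):
    """Synthetic per-sensor features with a class-dependent offset so the
    toy data is separable; real features would come from signal processing."""
    return rng.normal(size=(n, dim)) + 2.0 * labels[:, None]

y_train = rng.integers(0, n_classes, n_train)
y_test = rng.integers(0, n_classes, n_test)
accel_tr, accel_te = make_feats(n_train, y_train, 6), make_feats(n_test, y_test, 6)
radar_tr, radar_te = make_feats(n_train, y_train, 10), make_feats(n_test, y_test, 10)
depth_tr, depth_te = make_feats(n_train, y_train, 4), make_feats(n_test, y_test, 4)

# Feature-level fusion: concatenate the per-sensor vectors for each sample
X_train = np.hstack([accel_tr, radar_tr, depth_tr])
X_test = np.hstack([accel_te, radar_te, depth_te])

# Minimal nearest-centroid classifier on the fused feature vectors
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in range(n_classes)])
pred = np.argmin(np.linalg.norm(X_test[:, None] - centroids, axis=2), axis=1)
accuracy = (pred == y_test).mean()
print(accuracy)
```

    Comparing the same classifier trained on one sensor's columns alone against the fused matrix is the kind of single-sensor versus fusion comparison the accuracy figures above refer to.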

    WiHAR: From Wi-Fi Channel State Information to Unobtrusive Human Activity Recognition

    Author's accepted manuscript. © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

    Respiration and Activity Detection based on Passive Radio Sensing in Home Environments

    The pervasive deployment of connected devices in modern society has significantly changed the nature of the wireless landscape, especially in the license-free industrial, scientific, and medical (ISM) bands. This paper introduces a deep-learning-enabled passive radio sensing method that can monitor human respiration and daily activities by leveraging the unplanned and ever-present wireless bursts in the ISM band, and that can serve as an additional data input for healthcare informatics. Wireless connected biomedical sensors (Medical Things) rely on coding and modulating the sensor data onto wireless (radio) bursts that comply with specific physical-layer standards such as 802.11, 802.15.1, or 802.15.4. The increasing use of these unplanned connected sensors has led to a pell-mell of radio bursts that limits the capacity and robustness of communication channels to deliver data, while also increasing inter-system interference. This paper presents a novel methodology to disentangle these chaotic bursts in congested radio environments in order to provide healthcare informatics. The radio bursts are treated as pseudo-noise waveforms, which eliminates the requirement to extract embedded information through signal demodulation or decoding. Instead, we leverage the phase and frequency components of these radio bursts in conjunction with cross ambiguity function (CAF) processing and a Deep Transfer Network (DTN). We use 2.4 GHz 802.11 (WiFi) signals to demonstrate experimentally the capability of this technique for human respiration detection (including through-the-wall) and for classifying everyday but complex human motions such as standing, sitting, and falling.
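    The cross ambiguity function step can be sketched as below: a reference copy of a burst is correlated against a surveillance channel over a grid of delays and Doppler shifts, and motion shows up as the location of the correlation peak. The synthetic pseudo-noise burst, the injected delay and Doppler, and the naive double loop (real CAF processing is typically FFT-accelerated) are all illustrative assumptions.

```python
import numpy as np

def cross_ambiguity(ref, surv, max_delay, doppler_bins):
    """Naive cross ambiguity surface |CAF(f, d)| between a reference burst
    and a surveillance channel: correlate over delay at each Doppler shift."""
    n = len(ref)
    t = np.arange(n)
    freqs = np.linspace(-0.05, 0.05, doppler_bins)  # normalised Doppler grid
    caf = np.zeros((doppler_bins, max_delay), dtype=complex)
    for i, f in enumerate(freqs):
        shifted = surv * np.exp(-2j * np.pi * f * t)  # undo a trial Doppler
        for d in range(max_delay):
            caf[i, d] = np.vdot(ref[: n - d], shifted[d:n])
    return np.abs(caf)

rng = np.random.default_rng(1)
burst = rng.normal(size=512) + 1j * rng.normal(size=512)  # pseudo-noise burst
# Simulated echo: the burst delayed by 3 samples and Doppler-shifted by 0.01
echo = np.roll(burst, 3) * np.exp(2j * np.pi * 0.01 * np.arange(512))

surface = cross_ambiguity(burst, echo, max_delay=8, doppler_bins=21)
_, delay_idx = np.unravel_index(surface.argmax(), surface.shape)
print(delay_idx)  # peak recovered at the injected delay of 3
```

    In the approach described above, a delay-Doppler surface of this kind, rather than demodulated payload data, is what feeds the Deep Transfer Network.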