
    Sound Event Detection and Recognition in Autonomous Robot Navigation

    Many elders in our communities live alone or lack the constant support of a person watching over them. An artificially intelligent navigator robot can patrol the living environment to provide consistent activity detection and recognition, alert the user to threats or dangers, call for emergency support if needed, and be customized to assist with, or give reminders for, the user's daily routines and activities. The focus of this presentation is the process of applying various engineering tools, in a multidisciplinary approach, to sound activity detection and recognition as the aural layer of a multi-modal AI-based sensing system developed for elderly care. How sound behaves in a physical and natural sense, and how its signals are captured through the tools of technology, is discussed first. How AI systems use these precisely quantifiable behaviors of sound to detect and recognize various sound events is then presented, building the foundation for the more hands-on side of the research. The Oculus Prime Navigator Robot, equipped with sensors such as RGB and depth cameras as well as a directional microphone array, allows active real-time detection of human activities in the living environment, which can also be demoed to the audience. Faculty Supervisor: Dr. Shahram Payandeh, School of Engineering Science, Simon Fraser University
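    To make the detection step concrete, the sketch below shows one common way an aural layer can flag sound events: extract log-mel features from the microphone signal, then mark frames whose energy rises above the noise floor. This is a minimal illustration, not the system described in the abstract; the function name, 16 kHz sample rate, and threshold are assumptions, and a deployed recognizer would replace the threshold with a trained classifier.

```python
# Minimal sketch of frame-level sound event detection on a mono audio
# recording: log-mel features plus an adaptive energy threshold.
# All parameters here are illustrative assumptions.
import numpy as np
import librosa

def detect_sound_events(path, hop_s=0.032, threshold_db=15.0):
    y, sr = librosa.load(path, sr=16000, mono=True)
    hop = int(hop_s * sr)
    # Log-mel spectrogram: the quantifiable representation most
    # sound event detectors operate on.
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64, hop_length=hop)
    log_mel = librosa.power_to_db(mel, ref=np.max)
    # Rough per-frame energy, with the median over time as a noise floor.
    frame_energy = log_mel.mean(axis=0)
    noise_floor = np.median(frame_energy)
    active = frame_energy > noise_floor + threshold_db
    # Group consecutive active frames into (start, end) events in seconds.
    events, start = [], None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i
        elif not is_active and start is not None:
            events.append((start * hop / sr, i * hop / sr))
            start = None
    if start is not None:
        events.append((start * hop / sr, len(active) * hop / sr))
    return events
```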

    Radar and RGB-depth sensors for fall detection: a review

    This paper reviews recent work in the literature on systems based on radar and RGB-Depth (RGB-D) sensors for fall detection, and discusses outstanding research challenges and trends in this field. Systems that reliably detect fall events and promptly alert carers and first responders have gained significant interest in the past few years, as they address the societal issue of an increasing number of elderly people living alone, with the associated risk of falls and their consequences in terms of health treatments, reduced well-being, and costs. The interest in radar and RGB-D sensors stems from their capability to enable contactless and non-intrusive monitoring, an advantage for practical deployment and for users' acceptance and compliance compared with other sensor technologies such as video cameras or wearables. Furthermore, combining and fusing information from these heterogeneous types of sensors is expected to improve the overall performance of practical fall detection systems. Researchers from different fields can benefit from the multidisciplinary knowledge and awareness of the latest developments in radar and RGB-D sensors discussed in this paper.
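    One simple form the fusion mentioned above can take is decision-level (late) fusion: each sensor runs its own fall detector, and their per-window probabilities are combined before an alarm is raised. The sketch below illustrates that idea only; the function, the equal default weighting, and the alarm threshold are assumptions, and the survey covers many concrete alternatives (feature-level fusion, learned combiners, etc.).

```python
# Hedged sketch of decision-level (late) fusion of two independent fall
# detectors, one radar-based and one RGB-D-based. Weights and threshold
# are placeholders, not values from the paper.
import numpy as np

def fuse_fall_scores(p_radar, p_rgbd, w_radar=0.5, alarm_threshold=0.7):
    """Combine per-window fall probabilities from two sensors.

    p_radar, p_rgbd: probabilities in [0, 1], one per time window.
    Returns the fused probabilities and a boolean alarm per window.
    """
    p_radar = np.asarray(p_radar, dtype=float)
    p_rgbd = np.asarray(p_rgbd, dtype=float)
    fused = w_radar * p_radar + (1.0 - w_radar) * p_rgbd
    return fused, fused >= alarm_threshold

# Example: radar is confident while RGB-D is less so (e.g., the person
# is partially occluded from the camera); the fused score still alarms.
fused, alarm = fuse_fall_scores([0.9, 0.2], [0.6, 0.1])
```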

    Improving activity recognition using a wearable barometric pressure sensor in mobility-impaired stroke patients.

    © 2015 Massé et al.
    Background: Stroke survivors often suffer from mobility deficits. Current clinical evaluation methods, including questionnaires and motor function tests, cannot provide an objective measure of a patient's mobility in daily life. Physical activity performance in daily life can be assessed using unobtrusive monitoring, for example with a single sensor module fixed on the trunk. Existing approaches based on inertial sensors have limited performance, particularly in detecting transitions between different activities and postures, due to the inherent inter-patient variability of kinematic patterns. To overcome these limitations, one possibility is to use additional information from a barometric pressure (BP) sensor.
    Methods: Our study aims at integrating BP and inertial sensor data into an activity classifier in order to improve recognition of activities (sitting, standing, walking, lying) and of the corresponding body elevation (during stair climbing or when taking an elevator). Taking into account the trunk elevation changes during postural transitions (sit-to-stand, stand-to-sit), we devised an event-driven activity classifier based on fuzzy logic. Data were acquired from 12 stroke patients with impaired mobility, using a trunk-worn inertial and BP sensor. Events, including walking and lying periods and potential postural transitions, were first extracted. These events were then fed into a double-stage hierarchical Fuzzy Inference System (H-FIS). The first stage processed the events to infer activities, and the second stage improved activity recognition by applying behavioral constraints. Finally, the body elevation was estimated using a pattern-enhancing algorithm applied to BP. The patients were videotaped for reference. The performance of the algorithm was estimated using the Correct Classification Rate (CCR) and F-score. The BP-based classification approach was benchmarked against a previously published fuzzy-logic classifier (FIS-IMU) and a conventional epoch-based classifier (EPOCH).
    Results: The algorithm's CCR for posture/activity detection was 90.4 %, a 3.3 % and 5.6 % improvement over FIS-IMU and EPOCH, respectively. The proposed classifier essentially benefits from better recognition of standing activity (70.3 % versus 61.5 % [FIS-IMU] and 42.5 % [EPOCH]), with 98.2 % CCR for body elevation estimation.
    Conclusion: The monitoring and recognition of daily activities in mobility-impaired stroke patients can be significantly improved using a trunk-fixed sensor that integrates BP, inertial sensors, and an event-based activity classifier.
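    To show why a BP sensor helps here, the sketch below converts a trunk-worn pressure signal into relative elevation using the standard international barometric formula (roughly 0.12 hPa per metre near sea level), after light smoothing. It is a minimal illustration, not the paper's pattern-enhancing algorithm; the function name, sampling rate, and smoothing window are assumptions.

```python
# Minimal sketch: relative trunk elevation from a barometric pressure
# (BP) signal, the extra cue that helps separate postural transitions
# and stair/elevator use. Smoothing and referencing are illustrative
# assumptions, not the method from the paper.
import numpy as np

def pressure_to_relative_elevation(pressure_hpa, fs=25.0, smooth_s=2.0):
    p = np.asarray(pressure_hpa, dtype=float)
    # Moving-average low-pass: raw BP is noisy relative to the
    # ~0.12 hPa change produced by 1 m of elevation at sea level.
    win = max(1, int(smooth_s * fs))
    p_smooth = np.convolve(p, np.ones(win) / win, mode="same")
    # International barometric formula, referenced to the first sample
    # so the output is elevation change in metres, not absolute altitude.
    p0 = p_smooth[0]
    return 44330.0 * (1.0 - (p_smooth / p0) ** (1.0 / 5.255))
```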