5,031 research outputs found
Towards a Practical Pedestrian Distraction Detection Framework using Wearables
Pedestrian safety continues to be a significant concern in urban communities
and pedestrian distraction is emerging as one of the main causes of grave and
fatal accidents involving pedestrians. The advent of sophisticated mobile and
wearable devices, equipped with high-precision on-board sensors capable of
measuring fine-grained user movements and context, provides a tremendous
opportunity for designing effective pedestrian safety systems and applications.
Accurate and efficient recognition of pedestrian distractions in real-time
given the memory, computation and communication limitations of these devices,
however, remains the key technical challenge in the design of such systems.
Earlier research efforts in pedestrian distraction detection using data
available from mobile and wearable devices have primarily focused only on
achieving high detection accuracy, resulting in designs that are either
resource intensive and unsuitable for implementation on mainstream mobile
devices, or computationally slow and not useful for real-time pedestrian safety
applications, or require specialized hardware and are less likely to be adopted by
most users. In the quest for a pedestrian safety system that achieves a
favorable balance between computational efficiency, detection accuracy, and
energy consumption, this paper makes the following main contributions: (i)
design of a novel complex activity recognition framework which employs motion
data available from users' mobile and wearable devices and a lightweight
frequency matching approach to accurately and efficiently recognize complex
distraction related activities, and (ii) a comprehensive comparative evaluation
of the proposed framework with well-known complex activity recognition
techniques in the literature with the help of data collected from human subject
pedestrians and prototype implementations on commercially-available mobile and
wearable devices
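The abstract does not detail the lightweight frequency-matching approach itself. As a rough illustration of the general idea only, here is a minimal sketch, assuming windowed accelerometer samples and hypothetical activity templates built offline from labelled recordings (all names, signals, and the distance threshold are illustrative, not the paper's method):

```python
import numpy as np

def frequency_signature(window, n_peaks=3):
    """Indices of the strongest FFT bins of one accelerometer window."""
    spectrum = np.abs(np.fft.rfft(window - window.mean()))
    top = np.argsort(spectrum)[-n_peaks:]   # bins with largest magnitude
    return np.sort(top)                     # sorted frequency-bin signature

def match_activity(window, templates, tol=1):
    """Label the window with the template whose signature is closest."""
    sig = frequency_signature(window)
    best, best_dist = "unknown", np.inf
    for label, tmpl_sig in templates.items():
        dist = np.abs(sig - tmpl_sig).sum()
        if dist < best_dist:
            best, best_dist = label, dist
    return best if best_dist <= tol * len(sig) else "unknown"

# Hypothetical templates: a ~2 Hz gait oscillation vs. a slower wrist motion.
t = np.linspace(0, 2, 100)
walking = np.sin(2 * np.pi * 2.0 * t)
texting = np.sin(2 * np.pi * 0.5 * t)
templates = {
    "walking": frequency_signature(walking),
    "texting-while-walking": frequency_signature(texting),
}
rng = np.random.default_rng(0)
print(match_activity(walking + 0.05 * rng.standard_normal(100), templates))
```

Matching a handful of dominant frequency bins rather than running a full classifier is what makes this style of recognition cheap enough for resource-limited wearables, at the cost of relying on activities having distinguishable spectral signatures.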
A survey on wireless body area networks for eHealthcare systems in residential environments
The progress in wearable and implanted health monitoring technologies has strong potential to alter the future of healthcare services by enabling ubiquitous monitoring of patients. A typical health monitoring system consists of a network of wearable or implanted sensors that constantly monitor physiological parameters. Collected data are relayed using existing wireless communication protocols to the base station for additional processing. This article provides researchers with information to compare the existing low-power communication technologies that can potentially support the rapid development and deployment of WBAN systems, and mainly focuses on remote monitoring of elderly or chronically ill patients in residential environments
TinyML based Deep Learning Model for Activity Detection
Our physical and emotional well-being are directly impacted by our body positions. In addition to promoting a confident, upright image, maintaining good body posture during various activities also ensures that our musculoskeletal system is properly aligned. On the other hand, bad posture can result in a number of musculoskeletal conditions, discomfort, and reduced productivity. Accurate systems that can detect posture and activity in real time are required due to the rising use of wearable technology and the growing interest in health and fitness tracking. The goal of this project is to create a TinyML model for wearable activity detection that will allow users to assess their posture and make necessary corrections in order to improve their health and general well-being. The project intends to contribute to the creation of useful posture detection technologies that can be quickly implemented on wearable devices for widespread usage by leveraging machine learning algorithms and wearable sensor data. For reliable posture categorization, the model architecture combines deep neural networks (DNN) and LSTM layers. The TinyML implementation achieved a significant decrease in the model's power consumption, memory footprint, and latency without any compromise in accuracy. This work can be used in the fields of health, wellness, rehabilitation, corporate life, sports and fitness to keep track of calories burned, activity duration, distance traveled, posture analysis, and real-time tracking
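The abstract reports shrinking the model's memory and power cost without losing accuracy; a standard TinyML ingredient for this is post-training integer quantization. The sketch below is not the project's actual pipeline, just a minimal NumPy illustration of affine float32-to-int8 quantization of one hypothetical dense layer's weights, showing the 4x memory reduction:

```python
import numpy as np

def quantize_int8(weights):
    """Affine post-training quantization of float32 weights to int8."""
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 255.0
    if scale == 0.0:
        scale = 1.0                      # degenerate case: constant weights
    zero_point = np.round(-w_min / scale) - 128
    q = np.clip(np.round(weights / scale) + zero_point, -128, 127)
    return q.astype(np.int8), scale, zero_point

def dequantize(q, scale, zero_point):
    """Map int8 codes back to approximate float32 values."""
    return (q.astype(np.float32) - zero_point) * scale

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(64, 32)).astype(np.float32)  # one dense layer
q, scale, zp = quantize_int8(w)
err = np.abs(dequantize(q, scale, zp) - w).max()
print(f"float32: {w.nbytes} bytes, int8: {q.nbytes} bytes")
# → float32: 8192 bytes, int8: 2048 bytes
```

The per-weight error stays within roughly one quantization step (`scale`), which is why accuracy is often preserved; deployment frameworks additionally quantize activations and fuse operations, which this sketch omits.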
Radar and RGB-depth sensors for fall detection: a review
This paper reviews recent works in the literature on the use of systems based on radar and RGB-Depth (RGB-D) sensors for fall detection, and discusses outstanding research challenges and trends related to this research field. Systems that reliably detect fall events and promptly alert carers and first responders have gained significant interest in the past few years in order to address the societal issue of an increasing number of elderly people living alone, with the associated risk of them falling and the consequences in terms of health treatments, reduced well-being, and costs. The interest in radar and RGB-D sensors is related to their capability to enable contactless and non-intrusive monitoring, which is an advantage for practical deployment and users' acceptance and compliance, compared with other sensor technologies, such as video-cameras, or wearables. Furthermore, the possibility of combining and fusing information from these heterogeneous types of sensors is expected to improve the overall performance of practical fall detection systems. Researchers from different fields can benefit from multidisciplinary knowledge and awareness of the latest developments in radar and RGB-D sensors discussed in this paper
Wireless body sensor networks for health-monitoring applications
This is an author-created, un-copyedited version of an article accepted for publication in Physiological Measurement. The publisher is not responsible for any errors or omissions in this version of the manuscript or any version derived from it. The Version of Record is available online at http://dx.doi.org/10.1088/0967-3334/29/11/R01