513 research outputs found
Review of Wearable Devices and Data Collection Considerations for Connected Health
Wearable sensor technology has gradually extended into a wide range of applications. Wearable sensors can typically assess and quantify the wearer’s physiology and are commonly employed for human activity detection and quantified self-assessment. They are increasingly utilised to monitor patient health, rapidly assist with disease diagnosis, and help predict and often improve patient outcomes. Clinicians use various self-report questionnaires and well-known tests to record patient symptoms and assess functional ability. These assessments are time-consuming and costly and depend on subjective patient recall; moreover, their measurements may not accurately reflect the patient’s functional ability whilst at home. Wearable sensors can be used to detect and quantify specific movements in different applications, and the volume of data collected during long-term assessment of ambulatory movement can become immense. This paper discusses current techniques used to track and record various human body movements, as well as techniques used to measure activity and sleep from long-term data collected by wearable technology devices.
Embedded neural network for real-time animal behavior classification
Recent biological studies have focused on understanding animal interactions and welfare. To help biologists obtain information on animal behavior, resources such as wireless sensor networks are needed. Moreover, large amounts of collected data have to be processed off-line in order to classify different behaviors. Recent research projects have focused on designing monitoring systems capable of measuring some animal parameters in order to recognize and monitor their gaits or behaviors. However, network unreliability and high power consumption have limited their applicability. In this work, we present an animal behavior recognition, classification and monitoring system based on a wireless sensor network and a smart collar device, equipped with inertial sensors and an embedded multi-layer perceptron-based feed-forward neural network, to classify the different gaits or behaviors based on the collected information. In similar works, classification mechanisms are implemented in a server (or base station). The main novelty of this work is the full implementation of a reconfigurable neural network embedded in the animal’s collar, which allows real-time behavior classification and enables local storage in SD memory. Moreover, this approach reduces the amount of data transmitted to the base station (and its periodicity), achieving a significantly improved battery life. The system has been simulated and tested in a real scenario for three different horse gaits, using different heuristics and sensors to improve the accuracy of behavior recognition, achieving a maximum of 81%.
Junta de Andalucía P12-TIC-130
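The collar described above embeds a multi-layer perceptron-based feed-forward network to classify gaits on-device. As an illustration only, here is a minimal pure-Python sketch of such a forward pass; the feature vector, weights, biases and the three-gait label set are all made up for the example and do not come from the paper.

```python
import math

def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    """One forward pass of a small feed-forward MLP; returns the index
    of the highest-scoring output class."""
    # Hidden layer: weighted sum per neuron, sigmoid activation
    h = [1.0 / (1.0 + math.exp(-(sum(w * xi for w, xi in zip(row, x)) + b)))
         for row, b in zip(w_hidden, b_hidden)]
    # Output layer: weighted sum per class, then argmax
    scores = [sum(w * hi for w, hi in zip(row, h)) + b
              for row, b in zip(w_out, b_out)]
    return scores.index(max(scores))

# Hypothetical 3-feature input (e.g. mean acceleration per axis) and
# made-up weights for three example gait classes.
GAITS = ["walk", "trot", "gallop"]
x = [0.2, 0.8, 0.1]
w_hidden = [[0.5, -0.3, 0.8], [-0.6, 0.9, 0.1]]
b_hidden = [0.0, 0.1]
w_out = [[1.0, -1.0], [-0.5, 1.5], [0.2, 0.2]]
b_out = [0.0, 0.0, -0.1]
print(GAITS[mlp_forward(x, w_hidden, b_hidden, w_out, b_out)])  # prints "trot"
```

A network this small fits comfortably in the memory of a collar microcontroller, which is what makes on-device classification (rather than streaming raw samples to a base station) attractive for battery life.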
Semi-wildlife gait patterns classification using Statistical Methods and Artificial Neural Networks
Several studies have focused on classifying behavioral patterns in wildlife and captive species to monitor their activities, and thus to understand animal interactions and control their welfare, for biological research or commercial purposes. Pattern recognition techniques, statistical methods and Overall Dynamic Body Acceleration (ODBA) are well known for animal behavior recognition tasks. The reconfigurability and scalability of these methods are not trivial, since a new study has to be done whenever any configuration parameter changes. In recent years, the use of Artificial Neural Networks (ANNs) has increased for this purpose, because they can be easily adapted when new animals or patterns are required. In this context, a comparative study is presented between a theoretical approach, in which statistical and spectral analyses were performed, and an embedded implementation of an ANN on a smart collar device placed on semi-wild animals. This system is part of a project whose main aim is to monitor wildlife in real time using a wireless sensor network infrastructure. Different classifiers were tested and compared for three different horse gaits. Experimental results in a real-time scenario achieved an accuracy of up to 90.7%, proving the efficiency of the embedded ANN implementation.
Junta de Andalucía P12-TIC-1300
Ministerio de Economía y Competitividad TEC2016-77785-
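ODBA, named in the abstract above, is conventionally computed by estimating the static (gravity) component of each accelerometer axis with a running mean, then summing the absolute dynamic residuals across the three axes. A minimal sketch of that computation, assuming tri-axial samples at a fixed rate and a hypothetical smoothing-window length (the paper's actual parameters are not given here):

```python
def odba(samples, window=5):
    """Overall Dynamic Body Acceleration for tri-axial accelerometer data.

    samples: list of (ax, ay, az) tuples at a fixed sample rate.
    Per axis, the static (gravity) component is estimated with a centred
    running mean over `window` samples; ODBA at each sample is the sum of
    the absolute dynamic components across the three axes.
    """
    n = len(samples)
    result = []
    for i in range(n):
        lo, hi = max(0, i - window // 2), min(n, i + window // 2 + 1)
        total = 0.0
        for axis in range(3):
            static = sum(s[axis] for s in samples[lo:hi]) / (hi - lo)
            total += abs(samples[i][axis] - static)
        result.append(total)
    return result

# A perfectly still sensor (constant 1 g on z) yields zero ODBA everywhere;
# a transient spike on one axis produces a positive ODBA around the spike.
still = odba([(0.0, 0.0, 1.0)] * 10)
spiky = odba([(0.0, 0.0, 1.0)] * 4 + [(0.0, 0.0, 2.0)] + [(0.0, 0.0, 1.0)] * 4)
```

Thresholding or binning such an ODBA trace is what makes it usable as a feature for gait and activity classifiers like those compared in the study.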
Sensing and Signal Processing in Smart Healthcare
In the last decade, we have witnessed the rapid development of electronic technologies that are transforming our daily lives. Such technologies are often integrated with various sensors that facilitate the collection of human motion and physiological data and are equipped with wireless communication modules such as Bluetooth, radio frequency identification, and near-field communication. In smart healthcare applications, designing ergonomic and intuitive human–computer interfaces is crucial, because a system that is not easy to use creates a huge obstacle to adoption and may significantly reduce the efficacy of the solution. Signal and data processing is another important consideration in smart healthcare applications, because it must ensure high accuracy with a high level of confidence in order for the applications to be useful to clinicians making diagnosis and treatment decisions. This Special Issue is a collection of 10 articles selected from a total of 26 contributions. These contributions span the areas of signal processing and smart healthcare systems, mostly contributed by authors from Europe, including Italy, Spain, France, Portugal, Romania, Sweden, and the Netherlands. Authors from China, Korea, Taiwan, Indonesia, and Ecuador are also included.
MirrorGen Wearable Gesture Recognition using Synthetic Videos
In recent years, deep learning systems have outperformed traditional machine learning systems in most domains. There has been a lot of recent research in the field of hand gesture recognition using wearable sensors, due to the numerous advantages these systems have over vision-based ones. However, the lack of extensive datasets and the nature of Inertial Measurement Unit (IMU) data make it difficult to apply deep learning techniques to them. Although many machine learning models achieve good accuracy, most assume that training data is available for every user, while works that do not require user data have lower accuracies. MirrorGen is a technique that uses wearable sensor data to generate synthetic videos of hand movements, mitigating the traditional challenges of vision-based recognition such as occlusion, lighting restrictions, lack of viewpoint variation, and environmental noise. In addition, MirrorGen allows for user-independent recognition with minimal human effort during data collection. It also helps leverage advances in vision-based recognition through techniques such as optical flow extraction and 3D convolution. Projecting the orientation (IMU) information into a video helps recover position information for the hands. To validate these claims, we perform entropy analysis on various configurations: raw data, a stick model, a hand model and real video. The human hand model is found to have an optimal entropy that helps achieve user-independent recognition, and it serves as a more pervasive option than video-based recognition. An average user-independent recognition accuracy of 99.03% was achieved on a sign language dataset with 59 different users and 20 different signs with 20 repetitions each, for a total of 23k training instances. Moreover, synthetic videos can be used to augment real videos to improve recognition accuracy.
Dissertation/Thesis. Masters Thesis, Computer Science, 201
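The entropy analysis mentioned above compares how much information different input representations (raw data, stick model, hand model, real video) carry. As a generic illustration of the underlying measure, here is a Shannon-entropy sketch over a uniformly binned 1-D signal; the binning scheme and parameters are assumptions for the example, not MirrorGen's actual procedure.

```python
import math
from collections import Counter

def shannon_entropy(values, bins=8):
    """Shannon entropy (in bits) of a 1-D signal after uniform binning."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0          # avoid zero width for flat signals
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
    n = len(values)
    # H = -sum(p * log2(p)) over occupied bins
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A constant signal carries no information; an even two-level signal
# carries exactly one bit per sample.
flat = shannon_entropy([1.0] * 16)
two_level = shannon_entropy([0.0, 1.0] * 8)
```

Comparing such entropy values across representations is one way to argue that a rendering (like the hand model) preserves enough signal variability to support user-independent recognition.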
Smart Computing and Sensing Technologies for Animal Welfare: A Systematic Review
Animals play a profoundly important and intricate role in our lives today. Dogs have been human companions for thousands of years, but they now work closely with us to assist the disabled, and in combat and search and rescue situations. Farm animals are a critical part of the global food supply chain, and there is increasing consumer interest in organically fed and humanely raised livestock, and in how it impacts our health and environmental footprint. Wild animals are threatened with extinction by human-induced factors and by shrinking and compromised habitat. This review systematically surveys the existing literature on smart computing and sensing technologies for domestic, farm and wild animal welfare. We use the notion of "animal welfare" in broad terms, to review the technologies for assessing whether animals are healthy, free of pain and suffering, and also positively stimulated in their environment. The notion of "smart computing and sensing" is likewise used in broad terms, to refer to computing and sensing systems that are not isolated but interconnected with communication networks, and capable of remote data collection, processing, exchange and analysis. We review smart technologies for domestic animals, indoor and outdoor animal farming, as well as animals in the wild and zoos. The findings of this review are expected to motivate future research and contribute to data, information and communication management, as well as policy, for animal welfare.
Mobility increases localizability: A survey on wireless indoor localization using inertial sensors
Wireless indoor positioning has been extensively studied for the past two decades and has continuously attracted growing research effort in the mobile computing context. With the integration of multiple inertial sensors (e.g., accelerometer, gyroscope, and magnetometer) into today's smartphones, human-centric mobility sensing is emerging and coming into vogue. Mobility information, as a new dimension in addition to wireless signals, can benefit localization in a number of ways, since location and mobility are by nature related in the physical world. In this article, we survey this new trend of mobility-enhanced smartphone-based indoor localization. Specifically, we first study how to measure human mobility: what types of sensors we can use and what types of mobility information we can acquire. Next, we discuss how mobility assists localization with respect to enhancing location accuracy, decreasing deployment cost, and enriching location context. Moreover, considering the quality and cost of smartphone built-in sensors, handling measurement errors is essential and is accordingly investigated. Combining existing work and our own working experience, we emphasize the principles of, and conduct a comparative study of, the mainstream technologies. Finally, we conclude this survey by addressing future research directions and opportunities in this new and largely open area.
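Smartphone mobility sensing of the kind surveyed above typically begins with step detection on the accelerometer magnitude, which then feeds dead-reckoning or map-matching stages. A deliberately simplified peak-detection sketch follows; the threshold value and the synthetic signal model are assumptions for illustration, not taken from the survey.

```python
import math

def count_steps(accel, threshold=1.2):
    """Naive step counter: count local peaks in acceleration magnitude.

    accel: list of (ax, ay, az) samples in units of g. A step is counted
    at each local maximum of the magnitude that exceeds `threshold`.
    Real pedometers add low-pass filtering and minimum inter-step timing,
    which this sketch omits for brevity.
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in accel]
    steps = 0
    for i in range(1, len(mags) - 1):
        if mags[i] > threshold and mags[i] >= mags[i - 1] and mags[i] > mags[i + 1]:
            steps += 1
    return steps

# Synthetic trace: resting magnitude ~1 g with two impact peaks at 1.5 g.
trace = [(0.0, 0.0, 1.0), (0.0, 0.0, 1.5), (0.0, 0.0, 1.0),
         (0.0, 0.0, 1.0), (0.0, 0.0, 1.5), (0.0, 0.0, 1.0)]
print(count_steps(trace))  # prints 2
```

The noise handling that a sketch like this omits is exactly the measurement-error problem the survey highlights for low-cost smartphone sensors.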