16 research outputs found

    Advanced Signal Processing in Wearable Sensors for Health Monitoring

    Smart wearable devices built at a miniature scale are becoming increasingly widely available, typically in the form of smart watches and other connected devices. Consequently, devices that assist in measurements such as electroencephalography (EEG), electrocardiography (ECG), electromyography (EMG), blood pressure (BP), photoplethysmography (PPG), heart rhythm, respiration rate, apnoea, and motion detection are becoming more common and play a significant role in healthcare monitoring. Industry is placing great emphasis on making these technologies available on smart devices such as phones and watches. Such measurements are clinically and scientifically useful for real-time monitoring, long-term care, diagnosis, and therapy. A persistent issue, however, is that the recorded data are usually noisy, contain many artefacts, and are affected by external factors such as movement and physical condition. To obtain accurate and meaningful indicators, the signal must be processed and conditioned so that the measurements are free from noise and disturbances. In this context, many researchers have used recent advances in wearable sensors and signal processing to develop smart, accurate wearable devices for clinical applications. The processing and analysis of physiological signals is a key issue for these devices, and ongoing work in this field includes research on filtering, quality checking, signal transformation and decomposition, feature extraction and, most recently, machine learning-based methods.
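    The filtering step mentioned above is the most basic of these processing stages. As a minimal sketch, the snippet below band-passes a simulated PPG trace to remove baseline wander and high-frequency noise; the sampling rate, pass band, and filter order are illustrative assumptions rather than values taken from this collection.

```python
# Minimal sketch: band-pass filtering a noisy PPG signal before feature
# extraction. Sampling rate, pass band, and filter order are illustrative
# assumptions, not values taken from the abstract above.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_ppg(signal, fs=50.0, low=0.5, high=8.0, order=3):
    """Remove baseline wander and high-frequency noise from a PPG trace."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)  # zero-phase filtering preserves pulse timing

if __name__ == "__main__":
    fs = 50.0
    t = np.arange(0, 10, 1 / fs)
    pulse = np.sin(2 * np.pi * 1.2 * t)              # ~72 bpm pulse wave
    raw = pulse + 0.5 * np.sin(2 * np.pi * 0.1 * t)  # baseline wander
    raw += 0.2 * np.random.randn(t.size)             # sensor noise
    clean = bandpass_ppg(raw, fs)
    print(clean[:5])
```

    Zero-phase filtering (filtfilt) is used here so that pulse timing, and hence any derived heart-rate estimate, is not shifted by the filter.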

    Embedding a Grid of Load Cells into a Dining Table for Automatic Monitoring and Detection of Eating Events

    This dissertation describes a “smart dining table” that can detect and measure consumption events. The work is motivated by the growing problem of obesity, which is a global problem and an epidemic in the United States and Europe. Chapter 1 gives background on the economic burden of obesity and its comorbidities. For the assessment of obesity, we briefly describe the classic dietary assessment tools, discuss their drawbacks, and argue for more objective, accurate, low-cost, and in-situ automatic dietary assessment tools. We briefly explain various technologies used for automatic dietary assessment, such as acoustic-, motion-, or image-based systems. This is followed by a literature review of prior work on detecting the weights and locations of objects sitting on a table surface. Finally, we state the novelty of this work. In Chapter 2, we describe the construction of a table that uses an embedded grid of load cells to sense the weights and positions of objects. The main challenge is aligning the tops of adjacent load cells to within a tolerance of a few micrometers, which we accomplish using a novel inversion process during construction. Experimental tests found that object weights distributed across 4 to 16 load cells could be measured with 99.97±0.1% accuracy. Testing the surface for flatness at 58 points showed approximately 4.2±0.5 µm deviation among adjacent 2×2 grids of tiles. Through empirical measurements we determined that the table has a signal-to-noise ratio of 40.2 when detecting the smallest expected intake amount (0.5 g) from a normal meal (approximate total weight 560 g), indicating that a tiny amount of intake can be detected well above the noise level of the sensors. In Chapter 3, we describe a pilot experiment that tests the capability of the table to monitor eating. Eleven human subjects were video recorded for ground truth while eating a meal on the table using a plate, bowl, and cup. To detect consumption events, we describe an algorithm that analyzes the grid of weight measurements in the format of an image. The algorithm segments the image into multiple objects, tracks them over time, and uses a set of rules to detect and measure individual bites of food and drinks of liquid. On average, each meal consisted of 62 consumption events. Event detection accuracy was very high, with an F1-score per subject of 0.91 to 1.0, and an F1-score per container of 0.97 for the plate and bowl and 0.99 for the cup. The experiment demonstrates that our device is capable of detecting and measuring individual consumption events during a meal. Chapter 4 compares the capability of our new tool to monitor eating against previous works that have also monitored table surfaces. We completed a literature search and identified three state-of-the-art methods for comparison. The main limitation of all previous methods is that they used only one load cell for monitoring, so only the total surface weight could be analyzed. To simulate their operation, the weights of our grid of load cells were summed, reducing the 2D data to 1D. Data were prepared according to the requirements of each method. Four metrics were used for the comparison: precision, recall, accuracy, and F1-score. Our method scored the highest in recall, accuracy, and F1-score; compared to all other methods, our method scored 13-21% higher for recall, 8-28% higher for accuracy, and 10-18% higher for F1-score. For precision, our method scored 97%, just 1% lower than the highest precision of 98%. In summary, this dissertation describes novel hardware, a pilot experiment, and a comparison against current state-of-the-art tools. We also believe our methods could be used to build a similar surface for other applications besides monitoring consumption.
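    As a minimal sketch of the rule-based detection idea described above, the snippet below treats each frame of the load-cell grid as an image, sums the weight inside one container's region, and flags step decreases of at least 0.5 g as consumption events; the smoothing window, region mask, and example values are illustrative assumptions, not the dissertation's parameters.

```python
# Minimal sketch: flag weight drops within a container's region of a
# load-cell grid as consumption events. Thresholds and the region mask are
# illustrative assumptions, not the dissertation's parameters.
import numpy as np

def detect_bites(frames, mask, min_drop=0.5, settle=5):
    """frames: (T, H, W) load-cell readings in grams; mask: (H, W) boolean
    region covering one container. Returns frame indices where the container's
    weight drops by at least min_drop grams."""
    weight = frames[:, mask].sum(axis=1)                   # per-frame container weight
    # Smooth with edge padding so brief hand-contact spikes are averaged out.
    kernel = np.ones(settle) / settle
    padded = np.pad(weight, settle, mode="edge")
    smooth = np.convolve(padded, kernel, mode="same")[settle:-settle]
    drops = smooth[:-settle] - smooth[settle:]             # change across `settle` frames
    return np.where(drops >= min_drop)[0]                  # adjacent indices = one event

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = 35.0 + 0.01 * rng.standard_normal((100, 4, 4))   # ~560 g resting on a 4x4 grid
    frames[60:] -= 0.8 / 16                                    # one 0.8 g bite taken at frame 60
    print(detect_bites(frames, np.ones((4, 4), dtype=bool)))   # indices near frame 60
```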

    Detecting Periods of Eating in Everyday Life by Tracking Wrist Motion — What is a Meal?

    Eating is one of the most basic activities observed in sentient animals, a behavior so natural that humans often eat without giving the activity a second thought. Unfortunately, this often leads to consuming more calories than are expended, which can cause weight gain, a leading cause of disease and death. This proposal describes research into methods for automatically detecting periods of eating by tracking wrist motion so that calorie consumption can be tracked. We first briefly discuss how obesity is caused by an imbalance between calorie intake and expenditure. Calorie consumption and expenditure can be tracked manually using tools like paper diaries; however, it is well known that human bias can affect the accuracy of such tracking. Researchers in the emerging field of automated dietary monitoring (ADM) are attempting to track diet using electronic methods in an effort to mitigate this bias. We attempt to replicate a previous algorithm that detects eating by tracking wrist motion electronically. The previous algorithm was evaluated on data collected from 43 subjects using an iPhone as the sensor. Periods of time are segmented first and then classified using a naive Bayesian classifier. For replication, we describe the collection of the Clemson all-day data set (CAD), a free-living eating activity dataset containing 4,680 hours of wrist motion collected from 351 participants, the largest of its kind known to us. We learn that while different sensors are available to log wrist acceleration data, no unified convention exists, so these data must be transformed between conventions. We learn that the performance of the eating detection algorithm is affected by changes in the sensors used to track wrist motion, by increased variability in behavior due to a larger participant pool, and by the ratio of eating to non-eating in the dataset. We learn that commercially available acceleration sensors contain noise in their reported readings, which affects wrist tracking in particular because of the low magnitude of wrist acceleration. Commercial accelerometers can have noise of up to 0.06 g, which is acceptable in applications like automobile crash testing or pedestrian indoor navigation, but not in those using wrist motion. We quantify linear acceleration noise in our free-living dataset, explain its sources, describe a method to mitigate it, and evaluate the effect of this noise on the eating detection algorithm. By visualizing periods of eating in the collected dataset we learn that people often conduct secondary activities while eating, such as walking, watching television, working, and doing household chores. These secondary activities cause wrist motions that obscure the wrist motions associated with eating, which increases the difficulty of detecting periods of eating (meals). Subjects reported conducting secondary activities in 72% of meals. Analysis of wrist motion data revealed that the wrist was resting 12.8% of the time during self-reported meals, compared to only 6.8% of the time in a cafeteria dataset. Walking motion was found during 5.5% of meal time in free-living, compared to 0% in the cafeteria. Augmenting an eating detection classifier to include walking and resting detection improved the average per-person accuracy from 74% to 77% on our free-living dataset (t[353]=7.86, p<0.001). This suggests that future data collections for eating activity detection should also collect detailed ground truth on secondary activities conducted during eating.
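    As a minimal sketch of the segment-then-classify approach mentioned above, the snippet below computes summary features over fixed windows of wrist acceleration and labels each window with a Gaussian naive Bayes classifier; the window length, features, sampling rate, and synthetic data are illustrative assumptions, not the replicated algorithm's exact definitions.

```python
# Minimal sketch of segment-then-classify eating detection: summary features
# over fixed windows of wrist acceleration, labeled by a Gaussian naive Bayes
# classifier. Window length, features, and data are illustrative assumptions.
import numpy as np
from sklearn.naive_bayes import GaussianNB

def window_features(accel, fs=15, win_s=60):
    """accel: (N, 3) wrist acceleration in g at fs Hz. Returns (num_windows, 2)."""
    win = int(fs * win_s)
    feats = []
    for start in range(0, len(accel) - win + 1, win):
        mag = np.linalg.norm(accel[start:start + win], axis=1)
        feats.append([mag.mean(), mag.std()])   # average intensity and variability
    return np.array(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    eating = 1.0 + 0.05 * rng.standard_normal((9000, 3))   # low, regular motion
    walking = 1.0 + 0.30 * rng.standard_normal((9000, 3))  # higher variability
    X = np.vstack([window_features(eating), window_features(walking)])
    y = np.array([1] * 10 + [0] * 10)                       # 1 = eating window
    clf = GaussianNB().fit(X, y)
    print(clf.predict(window_features(walking[:2700])))     # expect mostly 0s
```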
    Finally, learning from this data collection, we describe a convolutional neural network (CNN) to detect periods of eating by tracking wrist motion during everyday life. Eating involves hand-to-mouth gestures for ingestion, each of which lasts approximately 1-5 seconds. The novelty of our new approach is that we analyze a much longer window (0.5-15 min) that can contain other gestures related to eating, such as cutting or manipulating food, preparing food for consumption, and resting between ingestion events. The context of these other gestures can improve the detection of periods of eating. We found that accuracy at detecting eating increased by 15% with longer windows compared to shorter ones. Overall results on CAD were 89% detection of meals with 1.7 false positives for every true positive (FP/TP), and a time-weighted accuracy of 80%.
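    As a minimal sketch of the longer-window idea, the snippet below defines a small 1D convolutional network that classifies a multi-minute window of tri-axial wrist acceleration as eating or not eating; the layer sizes, sampling rate, and window length are illustrative assumptions, not the dissertation's network.

```python
# Minimal sketch of a 1D CNN that labels a multi-minute window of tri-axial
# wrist acceleration as eating / not eating. Layer sizes, window length, and
# sampling rate are illustrative assumptions, not the dissertation's network.
import torch
import torch.nn as nn

class EatingCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=9, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=9, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),            # pool over the whole window
        )
        self.classifier = nn.Linear(32, 2)      # eating vs. not eating

    def forward(self, x):                       # x: (batch, 3, samples)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    fs, minutes = 15, 6                         # 6-minute window at 15 Hz
    window = torch.randn(8, 3, fs * 60 * minutes)
    logits = EatingCNN()(window)
    print(logits.shape)                         # torch.Size([8, 2])
```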

    Hand Gesture and Activity Recognition in Assisted Living Through Wearable Sensing and Computing

    With the growth of the elderly population, more seniors live alone as sole occupants of a private dwelling than any other population group. Helping them to live a better life is very important and has great societal benefits. Assisted living systems can provide support to elderly people in their houses or apartments. Since automated recognition of human gestures and activities is indispensable for human-robot interaction (HRI) in assisted living systems, this dissertation focuses on developing a theoretical framework for human gesture recognition, daily activity recognition, and anomaly detection. First, we introduce two prototypes of wearable sensors used for motion data collection in this project. Second, gesture recognition algorithms are developed to recognize explicit human intention. Third, body activity recognition algorithms are presented for different sensor setups. Fourth, complex daily activities, which consist of body activities and hand gestures performed simultaneously, are recognized using a dynamic Bayesian network (DBN). Fifth, a coherent anomaly detection framework is built to detect four types of abnormal behavior in a person's daily life. Our work can be extended in several directions in the future.

    Using topic models to detect behaviour patterns for healthcare monitoring

    Healthcare systems worldwide are facing growing demands on their resources due to an ageing population and an increase in the prevalence of chronic diseases. Innovative residential healthcare monitoring systems, using a variety of sensors, are being developed to help address these needs. Interpreting the vast wealth of data generated is key to fully exploiting the benefits offered by a monitoring system. This thesis presents the application of topic models, a machine learning technique, to detect behaviour patterns in different types of data produced by a monitoring system. Latent Dirichlet Allocation was applied to real-world activity data with corresponding ground truth labels of daily routines. The results from an existing dataset and a novel dataset collected using a custom mobile phone app demonstrated that the patterns found are equivalent to routines. Long-term monitoring can identify changes that could indicate an alteration in health status. Dynamic topic models were applied to simulated long-term activity datasets to detect changes in the structure of daily routines. It was shown that the changes occurring in the simulated data can successfully be detected. This result suggests the potential of dynamic topic models to identify changes in routines that could aid early diagnosis of chronic diseases. Furthermore, chronic conditions such as diabetes and obesity are related to quality of diet. Current research findings on the association between eating behaviours, especially snacking, and their impact on diet quality and health are often conflicting. One problem is the lack of consistent definitions for different types of eating event. The novel application of Latent Dirichlet Allocation to three nutrition datasets is described. The results demonstrated that combinations of food groups representative of eating event types can be detected. Moreover, labels assigned to these combinations showed good agreement with alternative methods for labelling eating event types.
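    As a minimal sketch of the topic-model idea, the snippet below treats each day's sequence of sensor-derived activity labels as a "document" and fits Latent Dirichlet Allocation so that topics correspond to recurring routines; the activity vocabulary and the tiny corpus are invented for illustration and are not the thesis's datasets.

```python
# Minimal sketch: treat each day's activity labels as a "document" and fit
# LDA so that topics correspond to recurring daily routines. The vocabulary
# and the tiny example corpus are invented for illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

days = [
    "kettle toaster fridge kettle tv tv sofa",     # a morning-routine-heavy day
    "kettle fridge toaster kettle radio tv sofa",
    "door door car shop door microwave tv",        # an out-of-the-house day
    "door car door shop fridge microwave tv",
]

counts = CountVectorizer().fit(days)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts.transform(days))

vocab = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[::-1][:4]                # most probable activities per topic
    print(f"routine {k}:", [vocab[i] for i in top])
```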

    SHELDON Smart habitat for the elderly.

    An insightful document concerning active and assisted living from different perspectives: furniture and habitat, ICT solutions, and healthcare.