
    Using Hidden Markov Models to Segment and Classify Wrist Motions Related to Eating Activities

    Advances in body sensing and mobile health technology have created new opportunities for empowering people to take a more active role in managing their health. Measurements of dietary intake are commonly used for the study and treatment of obesity. However, the most widely used tools rely upon self-report and require considerable manual effort, leading to underreporting of consumption, non-compliance, and discontinued use over the long term. We are investigating the use of wrist-worn accelerometers and gyroscopes to automatically recognize eating gestures. In order to improve recognition accuracy, we studied the sequential dependency of actions during eating. In chapter 2 we first undertook the task of finding a set of wrist motion gestures that was small enough yet descriptive enough to model the actions performed by an eater during consumption of a meal. We found a set of four actions: rest, utensiling, bite, and drink; any remaining motion is referred to as the other gesture. The stability of the gesture definitions was evaluated using an inter-rater reliability test. Later, in chapter 3, 25 meals were hand labeled and used to study the existence of sequential dependence between the gestures. To study this, three types of classifiers were built: 1) a K-nearest neighbor (KNN) classifier, which uses no sequential context, 2) a hidden Markov model (HMM) that captures the sequential context of sub-gesture motions, and 3) HMMs that model inter-gesture sequential dependencies. We built first-order to sixth-order HMMs to evaluate the usefulness of increasing amounts of sequential dependence to aid recognition. The first two classifiers were our baselines. We found that adding knowledge of the sequential dependence of gestures achieved an accuracy of 96.5%, an improvement of 20.7% and 12.2% over the KNN and the sub-gesture HMM, respectively. Lastly, in chapter 4, we automatically segmented a continuous wrist motion signal and assessed the classification performance of each of the three classifiers. Again, knowledge of sequential dependence enhanced the recognition of gestures in unsegmented data, achieving 90% accuracy and improving by 30.1% and 18.9% over the KNN and the sub-gesture HMM, respectively.
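
    As an illustration of how inter-gesture sequential dependence can help recognition, below is a minimal sketch (not the thesis code) of a first-order discrete HMM over the gesture labels rest, utensiling, bite, drink, and other, decoded with the Viterbi algorithm; the transition matrix and the per-frame scores are hypothetical placeholders.

```python
# Minimal sketch: first-order HMM over eating-gesture labels, decoded with Viterbi.
# All probabilities below are hypothetical placeholders, not values from the thesis.
import numpy as np

STATES = ["rest", "utensiling", "bite", "drink", "other"]

# Hypothetical transition matrix A[i, j] = P(next gesture j | current gesture i).
A = np.array([
    [0.50, 0.25, 0.10, 0.05, 0.10],
    [0.15, 0.45, 0.30, 0.05, 0.05],
    [0.30, 0.35, 0.20, 0.05, 0.10],
    [0.40, 0.20, 0.10, 0.20, 0.10],
    [0.30, 0.25, 0.20, 0.10, 0.15],
])
pi = np.full(len(STATES), 1.0 / len(STATES))   # uniform initial distribution

def viterbi(emission_probs: np.ndarray) -> list:
    """emission_probs[t, s] = P(observed motion at step t | gesture s),
    e.g. taken from a frame-level classifier's per-class scores."""
    T, S = emission_probs.shape
    delta = np.zeros((T, S))              # best log-probability ending in state s at step t
    back = np.zeros((T, S), dtype=int)    # backpointers for path recovery
    logA = np.log(A)
    delta[0] = np.log(pi) + np.log(emission_probs[0])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logA        # scores[i, j]: come from i, go to j
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + np.log(emission_probs[t])
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(back[t, path[-1]])
    return [STATES[s] for s in reversed(path)]

# Example: fake per-gesture scores for a 4-step sequence.
scores = np.random.dirichlet(np.ones(len(STATES)), size=4)
print(viterbi(scores))
```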

    Detection of Swallowing Events to Quantify Fluid Intake in Older Adults Based on Wearable Sensors

    The percentage of adults aged 65 and over, defined as older adults, is projected to increase as a share of the total population in the coming decades. With such an increase, it is essential that healthcare technologies evolve to cater for the needs of an aging population. One such need is hydration: due to both physiological and psychological reasons, older adults tend to be more prone to developing dehydration, which in turn increases the chances of morbidity and mortality. Currently, there is no gold standard for monitoring hydration, with most methods relying on manually filled forms. Thus, there is an urgent need to develop techniques that can accurately monitor fluid intake and prevent dehydration in older adults, especially those residing in healthcare settings. Several methods, such as smart cups and on-body sensors such as microphones, have been proposed in the literature; however, none of these has been widely investigated, and the reported results were often based on extremely small cohorts. Therefore, the scope of this PhD project is to investigate and develop methods that can detect swallowing events and quantify the volume of ingested fluids by leveraging signals harvested using non-invasive, on-body sensors. Two types of on-body sensors were selected and used throughout this research: surface electromyography (sEMG) sensors and microphones. These sensors were used to collect electrical and sound signals from subjects while swallowing boluses of different viscosities and while performing actions not related to swallowing that could recruit the same muscles or produce similar sounds, such as talking or coughing. Features were then extracted from the collected observations and used to train Machine Learning (ML) and Deep Learning (DL) models to analyse their ability to differentiate between swallowing and non-swallowing actions, to distinguish between different bolus types, and to quantify the volume of fluid ingested. Results showed a precision of 81.55±3.40% in differentiating between swallows and non-swallows and a precision of 81.74±8.01% in distinguishing between bolus types, both obtained with the sEMG sensors. A root mean square error (RMSE) of 3.94±1.31 ml in estimating fluid intake was obtained using the microphone. The significance of the findings presented in this thesis lies in the fact that surface EMG sensors and microphones show significant potential for fluid intake monitoring, and in the concrete possibility of developing a non-invasive, reliable system that could prevent dehydration in older adults living in healthcare settings.
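
    As a rough illustration of the kind of pipeline described above, the sketch below extracts simple time-domain features from windowed sEMG segments and cross-validates a generic classifier on swallow vs. non-swallow labels; the feature set, the random forest model, and the synthetic data are illustrative assumptions rather than the methods used in the thesis.

```python
# Illustrative sketch: classify windowed sEMG segments as swallow vs. non-swallow.
# Feature set, classifier, and synthetic data are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def emg_features(window: np.ndarray) -> np.ndarray:
    """Simple time-domain features for one sEMG window (1-D array of samples)."""
    rms = np.sqrt(np.mean(window ** 2))                 # root mean square amplitude
    mav = np.mean(np.abs(window))                       # mean absolute value
    zc = np.sum(window[:-1] * window[1:] < 0)           # zero-crossing count
    wl = np.sum(np.abs(np.diff(window)))                # waveform length
    return np.array([rms, mav, zc, wl])

# Fake data standing in for real recordings: 200 windows of 1 s at 1 kHz.
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, 1000))
labels = rng.integers(0, 2, size=200)                   # 1 = swallow, 0 = non-swallow

X = np.vstack([emg_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=5, scoring="precision").mean())
```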

    Detecting Periods of Eating in Everyday Life by Tracking Wrist Motion — What is a Meal?

    Eating is one of the most basic activities observed in sentient animals, a behavior so natural that humans often eat without giving the activity a second thought. Unfortunately, this often leads to consuming more calories than are expended, which can cause weight gain, a leading cause of disease and death. This proposal describes research in methods to automatically detect periods of eating by tracking wrist motion so that calorie consumption can be tracked. We first briefly discuss how obesity is caused by an imbalance between calorie intake and expenditure. Calorie consumption and expenditure can be tracked manually using tools like paper diaries; however, it is well known that human bias can affect the accuracy of such tracking. Researchers in the emerging field of automated dietary monitoring (ADM) are attempting to track diet using electronic methods in an effort to mitigate this bias. We attempt to replicate a previous algorithm that detects eating by tracking wrist motion electronically. The previous algorithm was evaluated on data collected from 43 subjects using an iPhone as the sensor. Periods of time are first segmented and then classified using a naive Bayesian classifier. For replication, we describe the collection of the Clemson all-day data set (CAD), a free-living eating activity dataset containing 4,680 hours of wrist motion collected from 351 participants - the largest of its kind known to us. We learn that while different sensors are available to log wrist acceleration data, no unified convention exists, and this data must thus be transformed between conventions. We learn that the performance of the eating detection algorithm is affected by changes in the sensors used to track wrist motion, increased variability in behavior due to a larger participant pool, and the ratio of eating to non-eating in the dataset. We learn that commercially available acceleration sensors contain noise in their reported readings, which affects wrist tracking in particular because of the low magnitude of wrist acceleration. Commercial accelerometers can have noise of up to 0.06 g, which is acceptable in applications like automobile crash testing or pedestrian indoor navigation, but not in ones using wrist motion. We quantify linear acceleration noise in our free-living dataset, explain its sources, describe a method to mitigate it, and evaluate the effect of this noise on the eating detection algorithm. By visualizing periods of eating in the collected dataset we learn that people often conduct secondary activities while eating, such as walking, watching television, working, and doing household chores. These secondary activities cause wrist motions that obscure the wrist motions associated with eating, which increases the difficulty of detecting periods of eating (meals). Subjects reported conducting secondary activities in 72% of meals. Analysis of wrist motion data revealed that the wrist was resting 12.8% of the time during self-reported meals, compared to only 6.8% of the time in a cafeteria dataset. Walking motion was found during 5.5% of meal time in free living, compared to 0% in the cafeteria. Augmenting an eating detection classifier to include walking and resting detection improved the average per-person accuracy from 74% to 77% on our free-living dataset (t[353]=7.86, p<0.001). This suggests that future data collections for eating activity detection should also collect detailed ground truth on secondary activities conducted during eating.
Finally, learning from this data collection, we describe a convolutional neural network (CNN) to detect periods of eating by tracking wrist motion during everyday life. Eating uses hand-to-mouth gestures for ingestion, each of which lasts approximately 1-5 seconds. The novelty of our new approach is that we analyze a much longer window (0.5-15 min) that can contain other gestures related to eating, such as cutting or manipulating food, preparing food for consumption, and resting between ingestion events. The context of these other gestures can improve the detection of periods of eating. We found that accuracy at detecting eating increased by 15% in longer windows compared to shorter windows. Overall results on CAD were 89% detection of meals with 1.7 false positives for every true positive (FP/TP), and a time-weighted accuracy of 80%.
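
To make the long-window CNN idea concrete, here is a minimal PyTorch sketch of a 1-D convolutional classifier applied to a multi-minute window of six-axis wrist motion; the sampling rate, window length, and layer sizes are hypothetical choices, not the architecture used in this work.

```python
# Sketch: 1-D CNN that labels a long wrist-motion window as eating / not eating.
# Channel count, window length, sampling rate, and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

SAMPLE_HZ = 15            # assumed sensor rate
WINDOW_SEC = 6 * 60       # a 6-minute window, long enough to hold several gestures
N_CHANNELS = 6            # 3-axis accelerometer + 3-axis gyroscope

class EatingWindowCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 16, kernel_size=9, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=9, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=9, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                      # pool over the time axis
        )
        self.classifier = nn.Linear(64, 2)                # eating vs. not eating

    def forward(self, x):                                 # x: (batch, channels, time)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)

model = EatingWindowCNN()
window = torch.randn(1, N_CHANNELS, SAMPLE_HZ * WINDOW_SEC)   # one fake window
print(model(window).shape)    # torch.Size([1, 2])
```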

    Embedding a Grid of Load Cells into a Dining Table for Automatic Monitoring and Detection of Eating Events

    This dissertation describes a “smart dining table” that can detect and measure consumption events. This work is motivated by the growing problem of obesity, which is a global problem and an epidemic in the United States and Europe. Chapter 1 gives a background on the economic burden of obesity and its comorbidities. For the assessment of obesity, we briefly describe the classic dietary assessment tools and discuss their drawbacks and the necessity of using more objective, accurate, low-cost, and in-situ automatic dietary assessment tools. We briefly explain various technologies used for automatic dietary assessment, such as acoustic-, motion-, or image-based systems. This is followed by a literature review of prior works related to detecting the weights and locations of objects sitting on a table surface. Finally, we state the novelty of this work. In chapter 2, we describe the construction of a table that uses an embedded grid of load cells to sense the weights and positions of objects. The main challenge is aligning the tops of adjacent load cells to within a tolerance of a few micrometers, which we accomplish using a novel inversion process during construction. Experimental tests found that object weights distributed across 4 to 16 load cells could be measured with 99.97±0.1% accuracy. Testing the surface for flatness at 58 points showed a deviation of approximately 4.2±0.5 µm among adjacent 2x2 grids of tiles. Through empirical measurements we determined that the table has a signal-to-noise ratio of 40.2 when detecting the smallest expected intake amount (0.5 g) from a normal meal (approximate total weight 560 g), indicating that a tiny amount of intake can be detected well above the noise level of the sensors. In chapter 3, we describe a pilot experiment that tests the capability of the table to monitor eating. Eleven human subjects were video recorded for ground truth while eating a meal on the table using a plate, bowl, and cup. To detect consumption events, we describe an algorithm that analyzes the grid of weight measurements in the format of an image. The algorithm segments the image into multiple objects, tracks them over time, and uses a set of rules to detect and measure individual bites of food and drinks of liquid. On average, each meal consisted of 62 consumption events. Event detection accuracy was very high, with an F1-score per subject of 0.91 to 1.0, and an F1-score per container of 0.97 for the plate and bowl, and 0.99 for the cup. The experiment demonstrates that our device is capable of detecting and measuring individual consumption events during a meal. Chapter 4 compares the capability of our new tool to monitor eating against previous works that have also monitored table surfaces. We completed a literature search and identified the three state-of-the-art methods to be used for comparison. The main limitation of all previous methods is that they used only one load cell for monitoring, so only the total surface weight could be analyzed. To simulate their operation, the weights of our grid of load cells were summed to reduce the 2D data to 1D. Data were prepared according to the requirements of each method. Four metrics were used to evaluate the comparison: precision, recall, accuracy, and F1-score. Our method scored the highest in recall, accuracy, and F1-score; compared to all other methods, our method scored 13-21% higher for recall, 8-28% higher for accuracy, and 10-18% higher for F1-score.
For precision, our method scored 97%, just 1% lower than the highest precision of 98%. In summary, this dissertation describes novel hardware, a pilot experiment, and a comparison against current state-of-the-art tools. We also believe our methods could be used to build a similar surface for other applications besides monitoring consumption.
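
As a minimal sketch of the image-style processing described in chapter 3, the code below treats one frame of the load-cell grid as a 2-D weight map and segments it into objects whose total weight and position could then be tracked over time; the grid size, the noise threshold, and the use of scipy.ndimage are illustrative assumptions rather than the dissertation's algorithm.

```python
# Sketch: segment one frame of a load-cell grid ("weight image") into objects
# and report each object's total weight and centroid. Grid size and threshold
# are illustrative assumptions.
import numpy as np
from scipy import ndimage

GRID_SHAPE = (24, 16)      # assumed number of load cells (rows, cols)
NOISE_FLOOR_G = 2.0        # ignore readings below this many grams

def segment_frame(frame: np.ndarray):
    """frame[r, c] = weight (g) sensed by the load cell at grid position (r, c)."""
    mask = frame > NOISE_FLOOR_G
    labels, n_objects = ndimage.label(mask)             # connected-component labelling
    objects = []
    for obj_id in range(1, n_objects + 1):
        sel = labels == obj_id
        weight = frame[sel].sum()                        # total grams on this object
        centroid = ndimage.center_of_mass(frame, labels, obj_id)
        objects.append({"weight_g": float(weight), "centroid_rc": centroid})
    return objects

# Fake frame: a plate-like blob plus a cup-like blob on an otherwise empty grid.
frame = np.zeros(GRID_SHAPE)
frame[4:9, 3:8] += 80.0        # "plate" spread across several cells
frame[15:17, 12:14] += 90.0    # "cup" concentrated on a few cells
print(segment_frame(frame))
```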

    A pilot study to determine whether using a lightweight, wearable micro-camera improves dietary assessment accuracy and offers information on macronutrients and eating rate.

    A major limitation in nutritional science is the lack of understanding of the nutritional intake of free-living people. There is an inverse relationship between the accuracy with which energy intake is reported by all current nutritional methodologies and body weight. In this pilot study we aim to explore whether using a novel lightweight, wearable micro-camera improves the accuracy of dietary intake assessment. Doubly labelled water (DLW) was used to estimate energy expenditure and intake over a 14-d period, during which participants (n 6) completed a food diary and wore a micro-camera on 2 of the days. Comparisons were made between the energy intake estimated from the reported food diary alone and together with the images from the micro-camera recordings. There was an average daily deficit of 3912 kJ using food diaries to estimate energy intake compared with estimated energy expenditure from DLW (P=0·0118), representing an under-reporting rate of 34 %. Analysis of food diaries alone showed a significant deficit in estimated daily energy intake compared with the estimate from food diary analysis combined with images from the micro-camera recordings (405 kJ). Use of the micro-camera images in conjunction with food diaries improves the accuracy of dietary assessment and provides valuable information on macronutrient intake and eating rate. There is a need to develop this recording technique to remove user and assessor bias.

    Neural correlates of taste reactivity in autism spectrum disorder.

    Selective or 'picky' eating habits are common among those with autism spectrum disorder (ASD). These behaviors are often related to aberrant sensory experience in individuals with ASD, including heightened reactivity to food taste and texture. However, very little is known about the neural mechanisms that underlie taste reactivity in ASD. In the present study, food-related neural responses were evaluated in 21 young adult and adolescent males diagnosed with ASD without intellectual disability, and 21 typically-developing (TD) controls. Taste reactivity was assessed using the Adolescent/Adult Sensory Profile, a clinical self-report measure. Functional magnetic resonance imaging was used to evaluate hemodynamic responses to sweet (vs. neutral) tastants and food pictures. Subjects also underwent resting-state functional connectivity scans. The ASD and TD individuals did not differ in their hemodynamic response to gustatory stimuli. However, the ASD subjects, but not the controls, exhibited a positive association between self-reported taste reactivity and the response to sweet tastants within the insular cortex and multiple brain regions associated with gustatory perception and reward. There was a strong interaction between diagnostic group and taste reactivity on tastant response in brain regions associated with ASD pathophysiology, including the bilateral anterior superior temporal sulcus (STS). This interaction of diagnosis and taste reactivity was also observed in the resting-state functional connectivity between the anterior STS and dorsal mid-insula (i.e., gustatory cortex). These results suggest that self-reported heightened taste reactivity in ASD is associated with heightened brain responses to food-related stimuli and atypical functional connectivity of primary gustatory cortex, which may predispose these individuals to maladaptive and unhealthy patterns of selective eating behavior. Trial registration: (clinicaltrials.gov identifier) NCT01031407. Registered: December 14, 2009.

    Advances in the physiological assessment and diagnosis of GERD

    GERD is a common condition worldwide. Key mechanisms of disease include abnormal oesophagogastric junction structure and function, and impaired oesophageal clearance. A therapeutic trial of acid-suppressive PPI therapy is often the initial management, with endoscopy performed in the setting of alarm symptoms and to exclude other conditions. If symptoms persist and endoscopy does not reveal evidence of GERD, oesophageal function tests are performed, including oesophageal manometry and ambulatory reflux monitoring. However, reflux episodes can be physiological, and some findings on endoscopy and manometry can be encountered in asymptomatic individuals without GERD symptoms. The diagnosis of GERD on the basis of functional oesophageal testing has been previously reported, but no updated expert recommendations on the indications for and interpretation of oesophageal function testing in GERD have been made since the Porto consensus over a decade ago. In this Consensus Statement, we aim to describe modern oesophageal physiological tests and their analysis, with an emphasis on establishing indications and consensus on interpretation parameters of oesophageal function testing for the evaluation of GERD in clinical practice. This document reflects the collective conclusions of the international GERD working group, incorporating existing data with expert consensus opinion.