176 research outputs found

    Early diagnosis of frailty: Technological and non-intrusive devices for clinical detection

    This work analyses different concepts for frailty diagnosis based on affordable standard technology such as smartphones and wearable devices. The goal is to provide ideas that go beyond classical diagnostic tools such as magnetic resonance imaging or tomography, changing the paradigm: enabling the detection of frailty without expensive facilities, in a way that is ecological for both patients and medical staff, and even with continuous monitoring. Fried's five-point phenotype model of frailty, along with a trial-based model and several classical physical tests, was used to classify the devices. This work provides a starting point for future researchers, who will have to bridge the gap separating elderly people from technology and medical tests in order to provide feasible, accurate and affordable tools for frailty monitoring for a wide range of users. This work was sponsored by the Spanish Ministry of Science, Innovation and Universities and the European Regional Development Fund (ERDF) through projects RTC-2017-6321-1 AEI/FEDER, UE; TEC2016-76021-C2-2-R AEI/FEDER, UE; and PID2019-107270RB-C21/AEI/10.13039/501100011033, UE.
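Fried's phenotype, mentioned in the abstract above, counts how many of five criteria (unintentional weight loss, exhaustion, weakness, slowness, low physical activity) a person meets: three or more indicates frailty, one or two pre-frailty, and none a robust status. A minimal sketch of that scoring from hypothetical wearable-derived metrics follows; the criterion names come from the phenotype model, but the metric keys and threshold values below are illustrative placeholders, not clinical cut-offs (real cut-offs depend on sex, height, and population).

```python
# Sketch of Fried's five-point frailty scoring from wearable-derived metrics.
# Thresholds are illustrative placeholders, not validated clinical cut-offs.

def fried_frailty_score(metrics: dict) -> str:
    criteria = [
        metrics.get("weight_loss_kg_per_year", 0) >= 4.5,    # unintentional weight loss
        metrics.get("self_reported_exhaustion", False),      # exhaustion (questionnaire)
        metrics.get("grip_strength_kg", 99) < 26,            # weakness (dynamometer)
        metrics.get("gait_speed_m_per_s", 99) < 0.8,         # slowness (phone accelerometer)
        metrics.get("weekly_kcal_expenditure", 9999) < 383,  # low physical activity
    ]
    n_met = sum(criteria)
    if n_met >= 3:
        return "frail"      # three or more criteria met
    elif n_met >= 1:
        return "pre-frail"  # one or two criteria met
    return "robust"         # no criteria met
```

A smartphone or wearable could supply the gait-speed and activity inputs continuously, which is what makes this kind of monitoring feasible outside clinical facilities.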

    Detection of Health-Related Behaviours Using Head-Mounted Devices

    The detection of health-related behaviors is the basis of many mobile-sensing applications for healthcare and can trigger other inquiries or interventions. Wearable sensors have been widely used for mobile sensing due to their ever-decreasing cost, ease of deployment, and ability to provide continuous monitoring. In this dissertation, we develop a generalizable approach to sensing eating-related behavior. First, we developed Auracle, a wearable earpiece that can automatically detect eating episodes. Using an off-the-shelf contact microphone placed behind the ear, Auracle captures the sound of a person chewing as it passes through the head. This audio data is then processed by a custom circuit board. We collected data with 14 participants for 32 hours in free-living conditions and achieved accuracy exceeding 92.8% and F1 score exceeding 77.5% for eating detection with 1-minute resolution. Second, we adapted Auracle for measuring children's eating behavior and improved the accuracy and robustness of the eating-activity detection algorithms. We used this improved prototype in a laboratory study with a sample of 10 children for 60 total sessions and collected 22.3 hours of data in both meal and snack scenarios. Overall, we achieved 95.5% accuracy and 95.7% F1 score for eating detection with 1-minute resolution. Third, we developed a computer-vision approach for eating detection in free-living scenarios. Using a miniature head-mounted camera, we collected data with 10 participants for about 55 hours. The camera was fixed under the brim of a cap, pointing at the mouth of the wearer and continuously recording video (but not audio) throughout their normal daily activity. We evaluated performance for eating detection using four different Convolutional Neural Network (CNN) models. The best model achieved 90.9% accuracy and 78.7% F1 score for eating detection with 1-minute resolution. Finally, we validated the feasibility of deploying the 3D CNN model on wearable or mobile platforms when considering computation, memory, and power constraints.
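The "1-minute resolution" evaluation described above amounts to labeling each minute of data as eating or not eating and scoring the predictions minute by minute. A minimal sketch of that scoring, with made-up labels for illustration:

```python
# Sketch: accuracy and F1 over per-minute binary labels (1 = eating).
# The label sequences here are illustrative, not from the study.

def accuracy_f1(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, f1
```

Because eating minutes are rare in free-living data, accuracy alone can look high even for a poor detector; reporting F1 alongside it, as the abstracts do, accounts for that class imbalance.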

    A review of chewing detection for automated dietary monitoring

    A healthy dietary lifestyle prevents disease and supports good physical condition. Poor dietary habits, such as eating disorders, emotional eating and excessive consumption of unhealthy food, may cause health complications. People's eating habits can be monitored through automated dietary monitoring (ADM) as part of daily life. In this study, articles from the last 5 years indexed in the Google Scholar database were considered; those reporting chewing-activity characteristics and the various wearable sensors used to detect chewing automatically were reviewed. Key challenges, including chew counting, varied food types, food classification and the need for large sample sizes, were identified for further chewing-data analysis. The highest reported classification accuracy for the chewing signal was 99.85%, obtained with a piezoelectric contactless sensor and a multistage linear SVM with a decision tree classifier. The decision tree approach was more robust and its classification accuracy (75%–93.3%) was higher than that of the Viterbi algorithm-based finite-state grammar approach, which yielded 26%–97% classification accuracy. This review serves as a comparative study and a basis for developing efficient ADM systems.

    Detecting Eating Episodes with an Ear-mounted Sensor

    In this paper, we propose Auracle, a wearable earpiece that can automatically recognize eating behavior. More specifically, in free-living conditions, we can recognize when and for how long a person is eating. Using an off-the-shelf contact microphone placed behind the ear, Auracle captures the sound of a person chewing as it passes through the bone and tissue of the head. This audio data is then processed by a custom analog/digital circuit board. To ensure reliable (yet comfortable) contact between microphone and skin, all hardware components are incorporated into a 3D-printed behind-the-head framework. We collected field data with 14 participants for 32 hours in free-living conditions and additional eating data with 10 participants for 2 hours in a laboratory setting. We achieved accuracy exceeding 92.8% and F1 score exceeding 77.5% for eating detection. Moreover, Auracle successfully detected 20–24 eating episodes (depending on the metric) out of 26 in free-living conditions. We demonstrate that our custom device could sense, process, and classify audio data in real time. Additionally, we estimate Auracle can last 28.1 hours with a 110 mAh battery while communicating its observations of eating behavior to a smartphone over Bluetooth.
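The battery-life figure above follows from simple arithmetic: runtime is capacity divided by average current draw. The ~3.9 mA average draw below is back-calculated from the 110 mAh / 28.1 h figures in the abstract, not reported directly:

```python
# Sketch of the battery-life arithmetic: hours = capacity (mAh) / current (mA).
# The average current is inferred from the abstract's figures, not measured here.

def runtime_hours(capacity_mah: float, avg_current_ma: float) -> float:
    return capacity_mah / avg_current_ma

implied_current_ma = 110 / 28.1  # ~3.91 mA average draw for Auracle
```

That single-digit-milliamp budget covers sensing, on-board classification, and periodic Bluetooth reporting, which is why on-device processing (rather than streaming raw audio) matters for wearables.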