44 research outputs found

    Thermal imaging for vehicle occupant monitoring


    Remote Heart Rate Monitoring in Smart Environments from Videos with Self-supervised Pre-training

    Recent advances in deep learning have made it increasingly feasible to estimate heart rate remotely in smart environments by analyzing videos. However, a notable limitation of deep learning methods is their heavy reliance on extensive sets of labeled data for effective training. To address this issue, self-supervised learning has emerged as a promising avenue. Building on this, we introduce a solution that uses self-supervised contrastive learning for remote photoplethysmography (PPG) estimation and heart rate monitoring, thereby reducing the dependence on labeled data and enhancing performance. We propose three spatial and three temporal augmentations for training an encoder through a contrastive framework, followed by using the late-intermediate embeddings of the encoder for remote PPG and heart rate estimation. Our experiments on two publicly available datasets show the improvement of our proposed approach over several related works as well as supervised learning baselines, with results approaching the state-of-the-art. We also perform thorough experiments on the effects of different design choices, such as the video representation learning method and the augmentations used in the pre-training stage, among others. Finally, we demonstrate the robustness of our proposed method compared with supervised learning approaches when the amount of labeled data is reduced. Comment: Accepted in IEEE Internet of Things Journal 202
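    The abstract above describes contrastive pre-training of a video encoder with spatial and temporal augmentations, followed by heart rate estimation from the encoder's late-intermediate embeddings. The sketch below illustrates that general recipe in PyTorch; the encoder architecture, the specific augmentations, and all dimensions are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, not the authors' code: contrastive pre-training of a video
# encoder using two augmented views per clip, then a light head that regresses
# an rPPG signal from the encoder's late-intermediate features. Architecture,
# augmentations, and dimensions below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VideoEncoder(nn.Module):
    """Placeholder 3D-CNN encoder for face-video clips shaped (B, C, T, H, W)."""
    def __init__(self, embed_dim=128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv3d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d((8, 4, 4)),          # late-intermediate feature map
        )
        self.proj = nn.Linear(32 * 8 * 4 * 4, embed_dim)  # projection for the contrastive loss

    def forward(self, clip, return_intermediate=False):
        feat = self.backbone(clip)                    # (B, 32, 8, 4, 4)
        z = self.proj(feat.flatten(1))
        return (z, feat) if return_intermediate else z

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE/NT-Xent-style loss between two augmented views of the same clips."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                # (B, B) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)

def spatial_augment(clip):
    # stand-in for spatial augmentations (crop, flip, color jitter): horizontal flip
    return torch.flip(clip, dims=[-1])

def temporal_augment(clip):
    # stand-in for temporal augmentations (shift, reversal, resampling): reversal
    return torch.flip(clip, dims=[2])

# Self-supervised pre-training step: pull together embeddings of two views of each clip.
encoder = VideoEncoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-4)
clips = torch.randn(4, 3, 32, 64, 64)                 # dummy batch of face-video clips
loss = info_nce(encoder(spatial_augment(clips)), encoder(temporal_augment(clips)))
opt.zero_grad(); loss.backward(); opt.step()

# Downstream: regress a coarse rPPG waveform from the late-intermediate features.
rppg_head = nn.Sequential(nn.Flatten(start_dim=2), nn.Linear(32 * 4 * 4, 1))
with torch.no_grad():
    _, feat = encoder(clips, return_intermediate=True)    # (B, 32, 8, 4, 4)
ppg = rppg_head(feat.permute(0, 2, 1, 3, 4)).squeeze(-1)  # (B, 8) PPG samples over time
```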

    Toward contactless human thermal monitoring: a framework for Machine Learning-based human thermo-physiology modeling augmented with computer vision

    The transition towards a human-centered indoor climate is beneficial both for occupants’ thermal comfort and from an energy-reduction perspective. However, achieving this goal requires knowledge of the thermal state of individuals at the level of body parts. Many current solutions rely on intrusive wearable technologies, which require physical access to individuals and face limitations in scalability. Personalizing the indoor environment demands increased sensing at the individual level, presenting challenges in terms of data collection and privacy protection. To address this challenge, this paper introduces a novel approach to non-intrusive, personalized human thermal sensing that can acquire personal data while minimizing the amount of sensing required. The method investigates multi-modal sensing solutions based on IR and RGB images, and it includes the development of a Machine Learning-based Human Thermo-Physiology Model (ML-HTPM). With the help of computer vision, features important for thermal comfort such as activity level, clothing insulation, posture, age, and sex can be extracted from an RGB image sequence using models such as the SlowFast network and YOLOv7, while a limited set of skin temperatures can be extracted from an IR image using OpenPifPaf for body-part detection. The developed ML-HTPM is trained on data generated from the open-source JOS3 model, using a prediction model based on Long Short-Term Memory (LSTM) networks. The results showed that a machine-learning-based human thermo-physiology model can be trained with an RMSE of less than 0.5°C for most local skin temperatures.
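    As a rough illustration of the LSTM-based ML-HTPM described above, the sketch below trains a small LSTM to map per-timestep features (activity level, clothing insulation, posture, age, sex, a few measured skin temperatures) to local skin temperatures, standing in for targets generated by the JOS3 model. The feature layout, dimensions, and hyperparameters are assumptions, not the paper's implementation.

```python
# Minimal sketch, assuming a feature layout that is not specified in the abstract:
# an LSTM mapping per-timestep inputs (e.g. activity level, clothing insulation,
# posture, age, sex, a few measured skin temperatures) to local skin temperatures
# for several body segments, trained against simulator-generated targets such as
# JOS3 runs. Dimensions and hyperparameters are illustrative.
import torch
import torch.nn as nn

N_FEATURES = 10      # assumed number of input features per timestep
N_BODY_PARTS = 17    # assumed number of body segments (JOS3 uses a multi-segment body model)

class MLHTPM(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(N_FEATURES, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, N_BODY_PARTS)

    def forward(self, x):                  # x: (batch, time, N_FEATURES)
        out, _ = self.lstm(x)
        return self.head(out)              # (batch, time, N_BODY_PARTS), local skin temps in °C

# Training step against simulator-generated targets.
model = MLHTPM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

features = torch.randn(16, 120, N_FEATURES)              # dummy 120-step input sequences
skin_temps = 33.0 + torch.randn(16, 120, N_BODY_PARTS)   # dummy target temperatures (°C)

pred = model(features)
loss = loss_fn(pred, skin_temps)
opt.zero_grad(); loss.backward(); opt.step()
rmse = loss.detach().sqrt().item()        # RMSE in °C, the metric reported in the abstract
```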