
    Effectiveness of an mHealth intervention combining a smartphone app and smart band on body composition in an overweight and obese population: Randomized controlled trial (EVIDENT 3 study)

    Background: Mobile health (mHealth) is currently among the supporting elements that may contribute to an improvement in health markers by helping people adopt healthier lifestyles. mHealth interventions have been widely reported to achieve greater weight loss than other approaches, but their effect on body composition remains unclear. Objective: This study aimed to assess the short-term (3 months) effectiveness of a mobile app and a smart band for losing weight and changing body composition in sedentary Spanish adults who are overweight or obese. Methods: A randomized controlled, multicenter clinical trial was conducted with 440 subjects from primary care centers: 231 in the intervention group (IG; counselling plus smartphone app and smart band) and 209 in the control group (CG; counselling only). Both groups were counselled about healthy diet and physical activity. During the 3-month intervention period, the IG was trained to use a smartphone app that provided self-monitoring and tailored feedback, as well as a smart band that recorded daily physical activity (Mi Band 2, Xiaomi). Body composition was measured using the InBody 230 bioimpedance device (InBody Co., Ltd), and physical activity was measured using the International Physical Activity Questionnaire. Results: At 3 months, the mHealth intervention produced a greater loss of body weight (–1.97 kg, 95% CI –2.39 to –1.54) than standard counselling alone (–1.13 kg, 95% CI –1.56 to –0.69); the IG lost 0.84 kg more than the CG. The IG showed a decrease in body fat mass (BFM; –1.84 kg, 95% CI –2.48 to –1.20), percentage of body fat (PBF; –1.22%, 95% CI –1.82% to –0.62%), and BMI (–0.77 kg/m2, 95% CI –0.96 to –0.57). No significant changes were observed in any of these parameters in men; among women, there was a significant decrease in BMI in the IG compared with the CG.
When subjects were grouped according to baseline BMI, the overweight group experienced a change in BFM of –1.18 kg (95% CI –2.30 to –0.06) and in BMI of –0.47 kg/m2 (95% CI –0.80 to –0.13), whereas the obese group experienced a change only in BMI (–0.53 kg/m2, 95% CI –0.86 to –0.19). When the data were analyzed according to physical activity, the moderate-vigorous physical activity group showed significant changes in BFM of –1.03 kg (95% CI –1.74 to –0.33), PBF of –0.76% (95% CI –1.32% to –0.20%), and BMI of –0.5 kg/m2 (95% CI –0.83 to –0.19). Conclusions: The results of this multicenter, randomized controlled clinical trial show that, compared with standard counselling alone, adding a self-monitoring app and a smart band produced beneficial results in terms of weight loss and reductions in BFM and PBF in female subjects with a BMI less than 30 kg/m2 and a moderate-vigorous physical activity level. Nevertheless, further studies are needed to confirm whether this profile benefits more than others from the intervention and to investigate modifications that would achieve a global effect.

    Exploring Contrastive Learning in Human Activity Recognition for Healthcare

    Human Activity Recognition (HAR) constitutes one of the most important tasks for wearable and mobile sensing, given its implications for human well-being and health monitoring. Motivated by the limitations of labeled datasets in HAR, particularly in healthcare-related applications, this work explores the adoption and adaptation of SimCLR, a contrastive learning technique for visual representations, to HAR. The contrastive learning objective pulls the representations of corresponding views closer together and pushes those of non-corresponding views further apart. After an extensive evaluation exploring 64 combinations of signal transformations for augmenting the data, we observed significant performance differences owing to the order of the transformations and the choice of transformation functions. In particular, preliminary results indicated an improvement over supervised and unsupervised learning methods when using fine-tuning and random rotation for augmentation; however, future work should explore under which conditions SimCLR is beneficial for HAR systems and other healthcare-related applications.
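    As an illustration of the contrastive setup described above, here is a minimal NumPy sketch (not the study's code) of two of its ingredients: a random 3-D rotation augmentation for accelerometer windows, and the NT-Xent objective SimCLR uses to make corresponding views more similar. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def random_rotation(window, rng):
    """Apply a random 3-D rotation to a (T, 3) accelerometer window,
    one of the augmentations the evaluation found effective."""
    u = rng.normal(size=3)
    u /= np.linalg.norm(u)                      # random rotation axis
    theta = rng.uniform(0, 2 * np.pi)           # random rotation angle
    K = np.array([[0, -u[2], u[1]],
                  [u[2], 0, -u[0]],
                  [-u[1], u[0], 0]])
    # Rodrigues' rotation formula
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    return window @ R.T

def nt_xent_loss(z1, z2, tau=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy), the
    SimCLR objective. z1[i] and z2[i] are embeddings of two augmented
    views of the same sensor window."""
    n = len(z1)
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # cosine similarity
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs
    # row-wise log-softmax
    m = sim.max(axis=1, keepdims=True)
    log_prob = sim - (m + np.log(np.exp(sim - m).sum(axis=1, keepdims=True)))
    # positives: row i in z1 pairs with row i in z2, and vice versa
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    return -log_prob[np.arange(2 * n), pos].mean()
```

    In training, each window would pass through the augmentation and an encoder network to produce z1 and z2; minimizing the loss makes matched views more similar than mismatched ones.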

    Self-supervised transfer learning of physiological representations from free-living wearable data.

    Wearable devices such as smartwatches are becoming increasingly popular tools for objectively monitoring physical activity in free-living conditions. To date, research has primarily focused on the purely supervised task of human activity recognition, demonstrating limited success in inferring high-level health outcomes from low-level signals. Here, we present a novel self-supervised representation learning method using activity and heart rate (HR) signals without semantic labels. With a deep neural network, we set HR responses as the supervisory signal for the activity data, leveraging their underlying physiological relationship. In addition, we propose a custom quantile loss function that accounts for the long-tailed HR distribution present in the general population. We evaluate our model on the largest free-living combined-sensing dataset (comprising >280k hours of wrist accelerometer and wearable ECG data). Our contributions are twofold: i) the pre-training task creates a model that can accurately forecast HR based only on cheap activity sensors, and ii) we leverage the information captured through this task by proposing a simple method to aggregate the learnt latent representations (embeddings) from the window level to the user level. Notably, we show that the embeddings can generalize to various downstream tasks through transfer learning with linear classifiers, capturing physiologically meaningful, personalized information. For instance, they can be used to predict variables associated with individuals' health, fitness, and demographic characteristics (AUC >0.70), outperforming unsupervised autoencoders and common biomarkers. Overall, we propose the first multimodal self-supervised method for behavioral and physiological data with implications for large-scale health and lifestyle monitoring. Code: https://github.com/sdimi/Step2heart
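    The abstract does not spell out the custom quantile loss; as a hedged sketch under that caveat, a generic pinball loss averaged over several quantiles, the standard device for handling a long-tailed target such as population HR, could look like this (quantile levels and names are assumptions, not the paper's choices):

```python
import numpy as np

def quantile_loss(y_true, y_pred, quantiles=(0.1, 0.5, 0.9)):
    """Pinball loss averaged over several quantiles. y_pred has one
    column per quantile. Under-prediction of a high quantile is
    penalized by q, over-prediction by (1 - q), so the model learns
    to cover the long right tail of the HR distribution."""
    losses = []
    for i, q in enumerate(quantiles):
        err = y_true - y_pred[:, i]
        losses.append(np.mean(np.maximum(q * err, (q - 1) * err)))
    return float(np.mean(losses))
```

    A forecaster trained with such a loss emits one HR estimate per quantile instead of a single mean, which is what makes it robust to skewed distributions.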

    Longitudinal cardio-respiratory fitness prediction through wearables in free-living environments

    Cardiorespiratory fitness is an established predictor of metabolic disease and mortality. Fitness is directly measured as maximal oxygen consumption (VO2max), or indirectly assessed using heart rate responses to standard exercise tests. However, such testing is costly and burdensome because it requires specialized equipment such as treadmills and oxygen masks, limiting its utility. Modern wearables capture dynamic real-world data that could improve fitness prediction. In this work, we design algorithms and models that convert raw wearable sensor data into cardiorespiratory fitness estimates. We validate these estimates' ability to capture fitness profiles in free-living conditions using the Fenland Study (N=11,059), along with its longitudinal cohort (N=2,675) and an external cohort from the UK Biobank Validation Study (N=181) who underwent maximal VO2max testing, the gold-standard measurement of fitness. Our results show that combining wearables and other biomarkers as inputs to neural networks yields a strong correlation with ground truth in a holdout sample (r = 0.82, 95% CI 0.80-0.83), outperforming other approaches and models, and detects fitness change over time (e.g., after 7 years). We also show how the model's latent space can be used for fitness-aware patient subtyping, paving the way to scalable interventions and personalized trial recruitment. These results demonstrate the value of wearables for estimating a quantity that today can be measured only with laboratory tests.
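    The headline validation metric is a Pearson correlation between predicted and measured VO2max, reported with a 95% CI. A minimal sketch of how such numbers can be computed follows; the percentile bootstrap is an assumed CI method for illustration, not necessarily the one the study used.

```python
import numpy as np

def pearson_r(pred, truth):
    """Pearson correlation coefficient between predictions and
    ground-truth measurements (the study reports r = 0.82)."""
    p = pred - pred.mean()
    t = truth - truth.mean()
    return float((p @ t) / np.sqrt((p @ p) * (t @ t)))

def bootstrap_ci(pred, truth, n_boot=2000, seed=0):
    """Percentile-bootstrap 95% CI for r: resample subjects with
    replacement and take the 2.5th/97.5th percentiles of r."""
    rng = np.random.default_rng(seed)
    n = len(pred)
    rs = [pearson_r(pred[idx], truth[idx])
          for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    return np.percentile(rs, [2.5, 97.5])
```

    On a holdout sample, `pearson_r` gives the point estimate and `bootstrap_ci` an interval of the same form as the reported 0.80-0.83.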

    Recent advances of triboelectric nanogenerator based applications in biomedical systems

    No full text