Learning Individualized Cardiovascular Responses from Large-scale Wearable Sensors Data
We consider the problem of modeling cardiovascular responses to physical
activity and sleep changes captured by wearable sensors in free living
conditions. We use an attentional convolutional neural network to learn
parsimonious signatures of individual cardiovascular response from data
recorded at minute-level resolution over several months on a cohort of 80k
people. We demonstrate internal validity by showing that signatures generated
on an individual's 2017 data generalize to predict minute-level heart rate from
physical activity and sleep for the same individual in 2018, outperforming
several time-series forecasting baselines. We also show external validity by
demonstrating that signatures outperform plain resting heart rate (RHR) in
predicting variables associated with cardiovascular functions, such as age and
Body Mass Index (BMI). We believe that the computed cardiovascular signatures
have utility in monitoring cardiovascular health over time, including detecting
abnormalities and quantifying recovery from acute events.
Comment: Machine Learning for Health (ML4H) Workshop at NeurIPS 2018
arXiv:1811.0721
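The architecture is described only at a high level, but the core idea — a 1-D convolutional network over minute-level activity features with an attention mechanism that pools the sequence into a compact per-person signature used to predict heart rate — can be sketched as below. All names, layer sizes, and the feature count are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class AttentionalCNN(nn.Module):
    """Hypothetical sketch: a 1-D CNN with attention pooling that maps
    minute-level activity/sleep features to a heart-rate prediction and
    a compact "signature" vector."""
    def __init__(self, n_features=4, n_channels=32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, n_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(n_channels, n_channels, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.attn = nn.Linear(n_channels, 1)  # per-minute attention scores
        self.head = nn.Linear(n_channels, 1)  # heart-rate regression head

    def forward(self, x):                      # x: (batch, n_features, minutes)
        h = self.conv(x)                       # (batch, channels, minutes)
        w = torch.softmax(self.attn(h.transpose(1, 2)), dim=1)
        z = (h.transpose(1, 2) * w).sum(dim=1) # attention-weighted signature
        return self.head(z).squeeze(-1), z

model = AttentionalCNN()
hr_pred, signature = model(torch.randn(8, 4, 60))  # 8 windows, 60 minutes each
```

In this reading, the attention-pooled vector `z` plays the role of the parsimonious signature that is later reused to predict age or BMI.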
Learning Generalizable Physiological Representations from Large-scale Wearable Data
To date, research on sensor-equipped mobile devices has primarily focused on
the purely supervised task of human activity recognition (walking, running,
etc.), with limited success in inferring high-level health outcomes
from low-level signals, such as acceleration. Here, we present a novel
self-supervised representation learning method using activity and heart rate
(HR) signals without semantic labels. With a deep neural network, we set HR
responses as the supervisory signal for the activity data, leveraging their
underlying physiological relationship.
We evaluate our model in the largest free-living combined-sensing dataset
(comprising more than 280,000 hours of wrist accelerometer & wearable ECG data)
and show that the resulting embeddings can generalize in various downstream
tasks through transfer learning with linear classifiers, capturing
physiologically meaningful, personalized information. For instance, they can be
used to predict (higher than 70 AUC) variables associated with individuals'
health, fitness and demographic characteristics, outperforming unsupervised
autoencoders and common biomarkers. Overall, we propose the first multimodal
self-supervised method for behavioral and physiological data with implications
for large-scale health and lifestyle monitoring.
Comment: Accepted to the Machine Learning for Mobile Health workshop at
NeurIPS 202
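The self-supervised setup — using heart rate as the supervisory signal for activity data during pretraining, then transferring the frozen embeddings to downstream tasks via linear classifiers — can be sketched as follows. The encoder shape, window sizes, and training details are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: an encoder maps an activity window to an embedding,
# and heart rate (not a semantic label) supervises the pretext task.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 60, 64), nn.ReLU(), nn.Linear(64, 16))
hr_head = nn.Linear(16, 1)  # predicts the window's heart rate (pretext target)

opt = torch.optim.Adam(list(encoder.parameters()) + list(hr_head.parameters()), lr=1e-3)
activity = torch.randn(32, 3, 60)  # 32 windows, 3 accelerometer axes, 60 minutes
hr = torch.randn(32, 1)            # per-window heart-rate targets

for _ in range(5):                 # a few pretext-training steps
    opt.zero_grad()
    loss = nn.functional.mse_loss(hr_head(encoder(activity)), hr)
    loss.backward()
    opt.step()

# Downstream transfer: freeze the encoder and fit a linear probe
# (e.g. logistic regression) on the resulting embeddings.
with torch.no_grad():
    embeddings = encoder(activity)  # (32, 16) physiological representations
```

The key design choice is that no labels are hand-annotated: the physiological coupling between movement and heart rate supplies the learning signal for free.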