12 research outputs found
Building an Open-Source Community to Enhance Autonomic Nervous System Signal Analysis: DBDP-Autonomic
Smartphones and wearable sensors offer an unprecedented ability to collect
peripheral psychophysiological signals across diverse timescales, settings,
populations, and modalities. However, open-source software development has yet
to keep pace with rapid advancements in hardware technology and availability,
creating an analytical barrier that limits the scientific usefulness of
acquired data. We propose a community-driven, open-source peripheral
psychophysiological signal pre-processing and analysis software framework that
could advance biobehavioral health by enabling more robust, transparent, and
reproducible inferences involving autonomic nervous system data.
The 2023 wearable photoplethysmography roadmap
Photoplethysmography is a key sensing technology used in wearable devices such as smartwatches and fitness trackers. Currently, photoplethysmography sensors are used to monitor physiological parameters including heart rate and heart rhythm, and to track activities like sleep and exercise. Yet, wearable photoplethysmography has the potential to provide much more information on health and wellbeing, which could inform clinical decision making. This Roadmap outlines directions for research and development to realise the full potential of wearable photoplethysmography. Experts discuss key topics within the areas of sensor design, signal processing, clinical applications, and research directions. Their perspectives provide valuable guidance to researchers developing wearable photoplethysmography technology.
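As a toy illustration of one parameter the roadmap discusses, heart rate can be estimated from a pulsatile PPG-like waveform by detecting pulse peaks and averaging the inter-beat intervals. The sketch below is not drawn from the roadmap itself; the sampling rate, waveform shape, and amplitude threshold are all illustrative assumptions.

```python
import numpy as np

FS = 100          # sampling frequency in Hz (assumed)
HR_TRUE = 75      # simulated heart rate in beats per minute

t = np.arange(0, 30, 1 / FS)                     # 30 s of samples
ppg = np.sin(2 * np.pi * (HR_TRUE / 60) * t)     # idealized pulsatile waveform

# Simple peak detection: a sample is a peak if it exceeds both neighbours
# and a fixed amplitude threshold.
is_peak = (ppg[1:-1] > ppg[:-2]) & (ppg[1:-1] > ppg[2:]) & (ppg[1:-1] > 0.5)
peak_idx = np.where(is_peak)[0] + 1

# Heart rate from the mean inter-beat interval.
ibi = np.diff(peak_idx) / FS                     # inter-beat intervals in seconds
hr_est = 60.0 / ibi.mean()
print(f"estimated heart rate ≈ {hr_est:.1f} bpm")
```

Real PPG signals are far noisier than this idealized sine, so practical pipelines add band-pass filtering and motion-artifact rejection before peak detection.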
Non-Invasive Cardiovascular Health Monitoring for Patients with Heart Failure using Seismocardiography
Heart failure (HF) is the leading cause of hospitalization and hospital readmission for patients aged 65 and older in the United States, with roughly one in five individuals hospitalized with heart failure being readmitted within 30 days of discharge. HF affects 6.2 million Americans, with health care costs of almost $31 billion per year. Management of HF is a complicated process that requires frequent clinic visits and outpatient management systems for hemodynamic monitoring and patient-reported symptoms. Hemodynamically guided HF management, in which pulmonary congestion is tracked so that proactive care can be taken, has shown efficacy in significantly reducing HF-related readmissions. However, the cost-prohibitive nature of such pulmonary congestion monitoring systems precludes their use in the large patient population affected by HF. For that reason, an inexpensive alternative is necessary to bring hemodynamic monitoring to the large patient population affected by HF, not only in the United States but around the world.
Advancements in biomedical sensor technologies and in signal processing and machine learning algorithms have merit in tracking health parameters unobtrusively. A promising sensing modality is seismocardiography (SCG), defined as the measurement of local chest wall vibrations associated with the cardiac cycle. SCG has shown efficacy in tracking changes in cardiac contractility via the cardiac timing intervals it yields, such as the pre-ejection period (PEP). However, different modalities of SCG acquisition exist, using accelerometer- and gyroscope-based sensors, and the inter-subject variability of the acquired signals has made it challenging to develop a robust hemodynamic monitoring system using SCG. Accordingly, most research in the field of SCG focuses on advancing the understanding and processing of the signal in healthy individuals. Translation of SCG-based hemodynamic monitoring approaches to the actual patient population, for example, patients with HF, is necessary to validate such a system for both inpatient and outpatient HF management.
This work addresses some of these key aspects. First, the two sensing techniques for acquiring SCG, accelerometer and gyroscope sensors, are compared in their ability to track cardiac contractility changes via PEP estimation. Second, general time, frequency, and amplitude features are extracted from the SCG signals and used in a population level machine learning regression algorithm to estimate key cardiovascular features for healthy subjects and patients with HF, by overcoming the inter-subject variability of the signals. Third, the SCG sensing system, along with the signal processing and machine learning algorithm, is verified and validated with two gold-standard clinical procedures: cardiopulmonary exercise test (CPX) and right heart catheterization (RHC). Gas exchange variables from the CPX and changes in pulmonary congestion from the RHC procedures were estimated using features from simultaneously recorded SCG signals to demonstrate the efficacy of such a sensing system and algorithm to track relevant hemodynamic parameters in patients with HF.
The algorithms and methods presented in this work can enable remote cardiovascular health monitoring for patients with HF, supporting personalized titration of care and improved medication adherence in a hemodynamically guided HF management system. The inexpensive wearable sensing technology has the potential to be a viable and ubiquitous alternative to already-proven hemodynamic congestion monitoring systems, which can improve quality of life and outcomes in patients with HF by reducing hospitalizations and overall health care costs.
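The population-level regression approach described above can be caricatured in a few lines: engineered per-beat SCG features are mapped to a cardiac timing interval such as PEP with ordinary least squares. The features, coefficients, and data below are simulated stand-ins, not the dissertation's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

n_beats = 200
# Hypothetical engineered features per beat (all simulated here): e.g.
# RMS amplitude, peak amplitude, and a dominant-frequency proxy.
X = rng.normal(size=(n_beats, 3))
true_w = np.array([5.0, -2.0, 1.5])
pep_ms = 80 + X @ true_w + rng.normal(scale=1.0, size=n_beats)  # target in ms

# Population-level linear regression via ordinary least squares.
A = np.column_stack([np.ones(n_beats), X])       # add intercept column
w, *_ = np.linalg.lstsq(A, pep_ms, rcond=None)

pep_hat = A @ w
rmse = np.sqrt(np.mean((pep_hat - pep_ms) ** 2))
print(f"intercept ≈ {w[0]:.1f} ms, RMSE ≈ {rmse:.2f} ms")
```

Overcoming inter-subject variability, as the work above emphasizes, typically requires per-subject normalization or richer models than this single pooled fit.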
Wearable Patch-Based Estimation of Oxygen Uptake and Assessment of Clinical Status during Cardiopulmonary Exercise Testing in Patients With Heart Failure
Background: To estimate oxygen uptake (VO2) from cardiopulmonary exercise testing (CPX) using simultaneously recorded seismocardiogram (SCG) and electrocardiogram (ECG) signals captured with a small wearable patch. CPX is an important risk stratification tool for patients with heart failure (HF) owing to the prognostic value of the features derived from gas exchange variables such as VO2. However, CPX requires specialized equipment, as well as trained professionals to conduct the study. Methods and results: We conducted a total of 68 CPX tests on 59 patients with HF with reduced ejection fraction (31% women, mean age 55 ± 13 years, ejection fraction 0.27 ± 0.11, 79% stage C). The patients were fitted with a wearable sensing patch and underwent treadmill CPX. We divided the dataset into a training-testing set (n = 44) and a separate validation set (n = 24). We developed globalized (population) regression models to estimate VO2 from the SCG and ECG signals measured continuously with the patch. We further classified the patients as stage D or C using the SCG and ECG features to assess the ability to detect clinical state from the wearable patch measurements alone. We developed the regression and classification models with cross-validation on the training-testing set and validated the models on the validation set. The regression model to estimate VO2 from the wearable features yielded a moderate correlation (R2 of 0.64) with a root mean square error of 2.51 ± 1.12 mL · kg-1 · min-1 on the training-testing set, whereas R2 and root mean square error on the validation set were 0.76 and 2.28 ± 0.93 mL · kg-1 · min-1, respectively.
Furthermore, the classification of clinical state yielded accuracy, sensitivity, specificity, and area under the receiver operating characteristic curve values of 0.84, 0.91, 0.64, and 0.74, respectively, for the training-testing set, and 0.83, 0.86, 0.67, and 0.92, respectively, for the validation set. Conclusions: Wearable SCG and ECG can assess CPX VO2 and thereby classify clinical status for patients with HF. These methods may provide value in the risk stratification of patients with HF by tracking cardiopulmonary parameters and clinical status outside of specialized settings, potentially allowing more frequent assessments during longitudinal monitoring and treatment.
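For reference, the two regression metrics reported above (R² and root mean square error) can be computed as follows; the VO2 values in this sketch are invented for illustration and are not the study's data.

```python
import numpy as np

# Made-up measured vs. estimated VO2 series (mL/kg/min), for illustration only.
vo2_true = np.array([12.0, 15.5, 18.2, 21.0, 24.5, 27.1])
vo2_pred = np.array([13.1, 14.8, 19.0, 20.2, 25.3, 26.4])

# Coefficient of determination: 1 - (residual sum of squares / total sum of squares).
ss_res = np.sum((vo2_true - vo2_pred) ** 2)
ss_tot = np.sum((vo2_true - vo2_true.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Root mean square error in the same units as VO2.
rmse = np.sqrt(np.mean((vo2_true - vo2_pred) ** 2))
print(f"R^2 = {r2:.2f}, RMSE = {rmse:.2f} mL/kg/min")
```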
Assessment of ownership of smart devices and the acceptability of digital health data sharing
Smart portable devices (smartphones and smartwatches) are rapidly being adopted by the general population, which has brought forward an opportunity to use the large volumes of physiological, behavioral, and activity data continuously collected by these devices in naturalistic settings to perform research, monitor health, and track disease. While these data can serve to revolutionize health monitoring in research and clinical care, minimal research has been conducted to understand what motivates people to use these devices and their interest and comfort in sharing the data. In this study, we aimed to characterize the ownership and usage of smart devices among patients from an expansive academic health system in the southeastern US and to understand their willingness to share data collected by the smart devices. We conducted an electronic survey of participants from an online patient advisory group on smart device ownership, usage, and data sharing. Of the 3021 members of the online patient advisory group, 1368 (45%) responded to the survey, including 871 (64%) female participants, 826 (60%) White and 390 (29%) Black participants, and a slight majority (52%) aged 58 and older. Most of the respondents (98%) owned a smartphone and the majority (59%) owned a wearable. In this population, people who identify as female, Hispanic, and Generation Z (aged 18–25), and those with higher education and full-time employment, were the most likely to own a wearable device compared with their demographic counterparts. Of smart device owners, 50% were willing to share and 32% would consider sharing their smart device data for research purposes. The types of activity data they are willing to share vary by gender, age, education, and employment.
Findings from this study can be used to design both equitable and cost-effective digital health studies, leveraging personally owned smartphones and wearables in representative populations, ultimately enabling the development of equitable digital health technologies.
Use of Ballistocardiography to Monitor Cardiovascular Hemodynamics in Preeclampsia.
Objective: Pregnancy requires a complex physiological adaptation of the maternal cardiovascular system, which is disrupted in women with pregnancies complicated by preeclampsia, putting them at higher risk of future cardiovascular events. The measurement of body movements in response to cardiac ejection via ballistocardiogram (BCG) can be used to assess cardiovascular hemodynamics noninvasively in women with preeclampsia. Methods: Using a previously validated, modified weighing scale for assessment of cardiovascular hemodynamics through measurement of BCG and electrocardiogram (ECG) signals, we collected serial measurements throughout pregnancy and postpartum and analyzed data in 30 women with preeclampsia and 23 normotensive controls. Using BCG and ECG signals, we extracted a measure of cardiac output, the J-wave amplitude × heart rate product (J-amp × HR). Mixed-effect models with repeated measures were used to compare J-amp × HRs between groups at different time points in pregnancy and postpartum. Results: In normotensive controls, the J-amp × HR was significantly lower early postpartum (E-PP) compared with the second trimester (T2; p = 0.016) and third trimester (T3; p = 0.001). Women with preeclampsia had a significantly lower J-amp × HR compared with normotensive controls during the first trimester (T1; p = 0.026). In the preeclampsia group, there was a trend toward an increase in J-amp × HR from T1 to T2, then a drop at T3 and a further drop at E-PP. Conclusions: We observe cardiac hemodynamic changes consistent with those reported using well-validated tools. In pregnancies complicated by preeclampsia, the maximal force of contraction is lower, suggesting lower cardiac output and a trend in hemodynamics consistent with the hyperdynamic disease model of preeclampsia.
Demographic Imbalances Resulting From the Bring-Your-Own-Device Study Design
Digital health technologies, such as smartphones and wearable devices, promise to revolutionize disease prevention, detection, and treatment. Recently, there has been a surge of digital health studies where data are collected through a bring-your-own-device (BYOD) approach, in which participants who already own a specific technology may voluntarily sign up for the study and provide their digital health data. The BYOD study design accelerates the collection of data from a larger number of participants than a cohort design; this is possible because researchers are not limited in study population size by the number of devices their budget affords or the number of people familiar with the technology. However, the BYOD study design may not support the collection of data from a representative random sample of the target population where digital health technologies are intended to be deployed. This may result in biased study results and biased downstream technology development, as has occurred in other fields. In this viewpoint paper, we describe demographic imbalances discovered in existing BYOD studies, including our own, and we propose the Demographic Improvement Guideline to address these imbalances.
Recent Academic Research on Clinically Relevant Digital Measures: Systematic Review
Background: Digital clinical measures collected via various digital sensing technologies such as smartphones, smartwatches, wearables, ingestibles, and implantables are increasingly used by individuals and clinicians to capture health outcomes or behavioral and physiological characteristics of individuals. Although academia is taking an active role in evaluating digital sensing products, academic contributions to advancing the safe, effective, ethical, and equitable use of digital clinical measures are poorly characterized.
Objective: We performed a systematic review to characterize the nature of academic research on digital clinical measures and to compare and contrast the types of sensors used and the sources of funding support for specific subareas of this research.
Methods: We conducted a PubMed search using a range of search terms to retrieve peer-reviewed articles reporting US-led academic research on digital clinical measures between January 2019 and February 2021. We screened each publication against specific inclusion and exclusion criteria. We then identified and categorized research studies based on the types of academic research, sensors used, and funding sources. Finally, we compared and contrasted the funding support for these specific subareas of research and sensor types.
Results: The search retrieved 4240 articles of interest. Following the screening, 295 articles remained for data extraction and categorization. The top five research subareas included operations research (research analysis; n=225, 76%), analytical validation (n=173, 59%), usability and utility (data visualization; n=123, 42%), verification (n=93, 32%), and clinical validation (n=83, 28%). The three most underrepresented areas of research into digital clinical measures were ethics (n=0, 0%), security (n=1, 0.5%), and data rights and governance (n=1, 0.5%). Movement and activity trackers were the most commonly studied sensor type, and physiological (mechanical) sensors were the least frequently studied. We found that government agencies are providing the most funding for research on digital clinical measures (n=192, 65%), followed by independent foundations (n=109, 37%) and industries (n=56, 19%), with the remaining 12% (n=36) of these studies completely unfunded.
Conclusions: Specific subareas of academic research related to digital clinical measures are not keeping pace with the rapid expansion and adoption of digital sensing products. An integrated and coordinated effort is required across academia, academic partners, and academic funders to establish the field of digital clinical measures as an evidence-based field worthy of our trust.
A Systematic Review of Time Series Classification Techniques Used in Biomedical Applications
Background: Digital clinical measures collected via various digital sensing technologies such as smartphones, smartwatches, wearables, and ingestible and implantable sensors are increasingly used by individuals and clinicians to capture the health outcomes or behavioral and physiological characteristics of individuals. Time series classification (TSC) is very commonly used for modeling digital clinical measures. While deep learning models for TSC are common and powerful, they face some fundamental challenges. This review presents the non-deep-learning models commonly used for time series classification in biomedical applications that can achieve high performance. Objective: We performed a systematic review to characterize the techniques used in time series classification of digital clinical measures throughout all the stages of data processing and model building. Methods: We conducted a literature search on PubMed, as well as the Institute of Electrical and Electronics Engineers (IEEE), Web of Science, and SCOPUS databases, using a range of search terms to retrieve peer-reviewed articles reporting academic research on digital clinical measures over a five-year period between June 2016 and June 2021. We identified and categorized the research studies based on the types of classification algorithms and sensor input types. Results: We found 452 papers in total across the four databases: PubMed, IEEE, Web of Science, and SCOPUS. After removing duplicates and irrelevant papers, 135 articles remained for detailed review and data extraction. Among these, engineered features computed with time series methods and subsequently fed into widely used machine learning classifiers were the most commonly used technique, and also most frequently achieved the best performance metrics (77 of 135 articles).
Statistical modeling algorithms (24 of 135 articles) were the second most common and second-best-performing classification technique. Conclusions: This review summarizes and categorizes time series classification models and interpretation methods for biomedical applications. While high time series classification performance has been achieved on digital clinical, physiological, and biomedical measures, no standard benchmark datasets, modeling methods, or reporting methodology exist. There is no single widely used method for time series model development or feature interpretation; however, many different methods have proven successful.
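The dominant pattern identified by the review, engineered time series features fed into a conventional classifier, can be sketched as follows. The features, simulated data, and nearest-centroid rule below are illustrative choices standing in for the many classifiers the review surveys, not a method the review endorses.

```python
import numpy as np

rng = np.random.default_rng(1)

def features(ts):
    """Simple engineered features: mean, standard deviation, signal energy."""
    return np.array([ts.mean(), ts.std(), np.mean(ts ** 2)])

# Simulated two-class dataset: low-amplitude vs. high-amplitude series.
X0 = [rng.normal(0, 1.0, 100) for _ in range(30)]   # class 0
X1 = [rng.normal(0, 3.0, 100) for _ in range(30)]   # class 1
F = np.array([features(ts) for ts in X0 + X1])
y = np.array([0] * 30 + [1] * 30)

# Nearest-centroid classification in the engineered feature space.
centroids = np.array([F[y == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(F[:, None, :] - centroids[None], axis=2)
pred = np.argmin(dists, axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy = {accuracy:.2f}")
```

In practice, the feature step is where domain knowledge enters (morphological, spectral, or timing features), while the downstream classifier is often interchangeable, which is consistent with the review's finding that many different methods prove successful.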