12 research outputs found

    The Risk of Bias in Validity and Reliability Studies Testing Physiological Variables using Consumer-Grade Wearable Technology: A Systematic Review and WEAR-BOT Analysis

    INTRODUCTION: Wearable technology is a quickly evolving field, and new devices with new features to measure or estimate physiological variables are released constantly. Despite their widespread use, the validity of these devices is largely unknown to users and researchers, and the quality of the studies that do test validity and reliability varies widely. PURPOSE: Therefore, the purpose of this systematic review was to review the current validity and reliability literature concerning consumer-grade wearable technology measurements/estimates of physiological variables during exercise. Additionally, we sought to perform risk of bias assessments utilizing the novel WEArable technology Risk of Bias and Objectivity Tool (WEAR-BOT). METHODS: This review was conducted following PRISMA guidelines, searching 3 databases: Google Scholar, Scopus, and SPORTDiscus. After screening, 46 papers met the pre-determined criteria. Data were then extracted and risk of bias assessments performed by independent researchers. Descriptive statistics and weighted averages of mean absolute percentage error (MAPE) and Pearson correlations were calculated. Sample size statistics were performed utilizing the lower 95% confidence interval of the weighted correlation average. RESULTS: Of the 46 papers reviewed, 44 performed validity testing, while 9 performed reliability testing. The weighted average MAPE was 12.48% for heart rate (HR) and 30.70% for energy expenditure (EE). The weighted average Pearson correlation was 0.737 for HR and 0.672 for EE. According to the novel WEAR-BOT, 30/44 validity studies were classified as having a “High Risk of Bias” and 14/44 as having “Some Risk of Bias”; none had a “Low Risk of Bias”. For reliability studies, 7/9 were classified as “High Risk of Bias”, 2 as “Some Risk of Bias”, and 0 as “Low Risk of Bias”. CONCLUSION: The risk of bias assessment and descriptive statistics paint a troubling picture of the overall state of validity and reliability studies: statistical analyses, methods, and reporting vary excessively. This review and the associated WEAR-BOT analysis can be used by researchers to help standardize the methodology, analytics, and reporting of validation and reliability studies of consumer-grade wearable technology.
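The pooled statistics described above can be computed as a sample-size-weighted mean of each study's statistic. A minimal sketch in Python, using hypothetical per-study heart-rate MAPE values and sample sizes (not data from the studies reviewed here):

```python
# Sample-size-weighted average of a per-study statistic (here, MAPE).
# The values and sample sizes below are hypothetical illustrations.

def weighted_average(values, weights):
    """Weight each study's statistic by its sample size and pool."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

mape = [8.2, 15.1, 11.9]   # per-study heart-rate MAPE (%)
n = [25, 40, 30]           # per-study sample sizes

pooled_mape = weighted_average(mape, n)
print(round(pooled_mape, 2))  # → 12.27
```

Pearson correlations can be pooled the same way, although r values are often Fisher-z-transformed before averaging; the abstract does not specify which approach was used.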

    Chronotype and Social Jetlag Influence Performance and Injury during Reserve Officers’ Training Corps Physical Training

    Sleep and circadian rhythms are critically important for optimal physical performance and maintaining health during training. Chronotype and altered sleep may modulate the response to exercise training, especially when it is performed at specific times/days, which may contribute to musculoskeletal injury. The purpose of this study was to determine if cadet characteristics (chronotype, sleep duration, and social jetlag) were associated with injury incidence and inflammation during physical training. Reserve Officers’ Training Corps (ROTC) cadets (n = 42) completed the Morningness/Eveningness Questionnaire to determine chronotype, and 1-week sleep logs to determine sleep duration and social jetlag. Salivary IL-6 was measured before and after the first and fourth exercise sessions during training. Prospective injury incidence was monitored over 14 weeks of training, and Army Physical Fitness Test (APFT) scores were recorded at the conclusion. Chronotype, sleep duration, and social jetlag were assessed as independent factors impacting IL-6, injury incidence, and APFT scores using ANOVAs, chi-squared tests, and t-tests where appropriate, with significance accepted at p < 0.05. Evening chronotypes performed worse on the APFT (evening = 103.8 ± 59.8 vs. intermediate = 221.9 ± 40.3 vs. morning = 216.6 ± 43.6; p < 0.05), with no difference in injury incidence. Sleep duration did not significantly impact APFT score or injury incidence. Social jetlag was significantly higher in injured vs. uninjured cadets (2:40 ± 1:03 vs. 1:32 ± 0:55, p < 0.05). Exercise increased salivary IL-6, with no significant effects of chronotype, sleep duration, or social jetlag. Evening chronotypes and cadets with social jetlag display hampered performance during morning APFT. Social jetlag may be a behavioral biomarker for musculoskeletal injury risk, which requires further investigation.
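Social jetlag, as used above, is conventionally computed from sleep logs as the absolute difference between mid-sleep on free days and mid-sleep on work days. A minimal sketch with hypothetical sleep times, not the study's data:

```python
# Social jetlag = |mid-sleep on free days - mid-sleep on work days|.
# Sleep times below are hypothetical; sleep onset may cross midnight.

def midsleep(onset_h, duration_h):
    """Mid-sleep point in hours after midnight on a 24 h clock."""
    return (onset_h + duration_h / 2) % 24

work = midsleep(onset_h=23.5, duration_h=7.0)  # sleeps 23:30 for 7 h -> 03:00
free = midsleep(onset_h=1.5, duration_h=8.5)   # sleeps 01:30 for 8.5 h -> 05:45
social_jetlag_h = abs(free - work)
print(social_jetlag_h)  # → 2.75 (i.e., 2 h 45 min)
```

A fuller implementation would also handle the wraparound case where the two mid-sleep points straddle midnight (e.g., take min(d, 24 - d)).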

    The Effects of Sitting and Walking in Green Space on State Mindfulness and Connectedness to Nature

    People report feeling connected to nature while spending time in green space. The modulators of this relationship are unclear. One modulator may be state mindfulness, which is how mindful someone is in a specific moment. The first step in studying state mindfulness as a potential modulator is describing how state mindfulness and connectedness to nature respond to acute exposure to green space. PURPOSE: This study aimed to determine whether sitting and walking in green space change state mindfulness and connectedness to nature in tandem. METHODS: Participants arrived at one of two green spaces: the Thunderbird Gardens Trailhead in Cedar City, UT, or the Clark County Wetlands Park in Las Vegas, NV. After giving verbal and written consent, the participants completed the State Mindfulness Scale (SMS) and Love and Care of Nature Scale (LCN). The participants then sat alone and undisturbed for 10 minutes near the trailhead and completed the SMS and LCN again. Next, the participants walked alone for 10 minutes on the trail and completed the SMS and LCN once more. The SMS and LCN scores were compared among pre-sit, post-sit, and post-walk via two separate one-way repeated-measures ANOVAs. Population effect sizes were estimated as partial omega squared (ωp²; large effect > 0.14). After each ANOVA, the post hoc pairwise comparisons were dependent-samples t-tests with Bonferroni adjustments. The α-level was 0.05 for all the statistical analyses. RESULTS: Forty-two participants completed the study (22 females, 20 males, 0 intersex; 4 African American/Black, 4 Asian, 19 Caucasian/White, 9 Hispanic/Latino, 1 Mediterranean, 1 Middle Eastern, 3 Multi-Racial, 1 Polynesian; 26 ± 9 years, 170 ± 9 cm, 69 ± 16 kg, 24 ± 4 kg/m²). The SMS scores significantly increased from pre-sit to post-sit (+29 arbitrary units [AU], 95% CI: 20, 38; p < 0.001) but not from post-sit to post-walk (p = 0.23). The LCN scores significantly increased from pre-sit to post-sit (+5 AU, 95% CI: 2, 8; p = 0.003) and from post-sit to post-walk (+4 AU, 95% CI: 1, 6; p = 0.002). CONCLUSION: Sitting for 10 minutes in green space increases state mindfulness and connectedness to nature. Walking for 10 minutes further increases connectedness to nature but not state mindfulness. The next step is determining whether state mindfulness predicts connectedness to nature while in green space.

    Repetition Count Concurrent Validity of Various Garmin Wrist Watches During Light Circuit Resistance Training

    Wearable technology and strength training with free weights are two of the top 5 fitness trends worldwide. However, minimal physiological research has been conducted on the two together, and none has measured the accuracy of device repetition counts across exercises. PURPOSE: The purpose of this study was to determine the concurrent validity of four wrist-worn Garmin devices, Instinct (x2), Fenix 6 Pro, and Vivoactive 3, for recording repetition counts while performing 4 different exercises during circuit resistance training. METHODS: Twenty participants (n=10 female, n=10 male; age: 23.2 ± 7.7 years) completed this study. Participants completed 4 circuits of 4 exercises (front squat, reverse lunge, push-ups, and shoulder press) using dumbbells at a light intensity, with 1 set of 10 repetitions per exercise, 30 seconds rest between exercises, and 1-1.5 min rest between circuits. Mean absolute percent error (MAPE, ≀10%) and Lin’s Concordance Correlation Coefficient (CCC, ρ≄0.7) were used to validate the devices’ repetition counts in all exercises against the criterion reference manual count. Dependent t-tests determined differences (p≀0.05). RESULTS: No device was considered valid (meeting both the MAPE and CCC thresholds) for measuring repetition counts during the front squat (MAPE range: 3.0-18.5%; CCC range: 0.27-0.68; p value range: 0.00-0.94), reverse lunge (MAPE range: 44.5-67.0%; CCC range: 0.19-0.31; p value range: 0.00-0.28), push-up (MAPE range: 12.5-67.5%; CCC range: 0.10-0.34; p value range: 0.07-0.83), or shoulder press (MAPE range: 18.0-51.0%; CCC range: 0.11-0.43; p value range: 0.00-0.79) exercises. CONCLUSION: The wrist-worn devices were not considered accurate for repetition counts, and thus manual counting should be utilized. People who strength train using free weights will need to wait for either improved repetition-counting algorithms or increased device sensitivity before this measure can be obtained with confidence.
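The two validity criteria used above, MAPE ≀10% and Lin's CCC ≄0.7, can be sketched as follows. The repetition counts are hypothetical illustrations, not the study's data:

```python
# MAPE and Lin's concordance correlation coefficient (CCC) against a
# criterion measure; a device must pass both thresholds to be "valid".

def mape(criterion, device):
    """Mean absolute percent error relative to the criterion."""
    return 100 * sum(abs(c - d) / c for c, d in zip(criterion, device)) / len(criterion)

def lins_ccc(x, y):
    """Lin's CCC: penalizes both poor correlation and mean offset."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx2 = sum((v - mx) ** 2 for v in x) / n
    sy2 = sum((v - my) ** 2 for v in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)

manual = [10, 12, 8, 10]   # hypothetical criterion (manual) counts
watch = [9, 12, 8, 11]     # hypothetical device counts

valid = mape(manual, watch) <= 10 and lins_ccc(manual, watch) >= 0.7
print(valid)  # → True for this hypothetical data
```

Unlike a plain Pearson correlation, Lin's CCC drops toward zero when the device is systematically offset from the criterion, which is why it is paired with MAPE here.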

    Concurrent Validity and Reliability of Average Heart Rate and Energy Expenditure of Identical Garmin Instinct Watches During Low Intensity Resistance Training

    Wearable technology and resistance training are two of the top five worldwide fitness trends for 2022, as determined by ACSM. Many devices, such as Garmin’s Instinct, have functions to track various physiological variables during resistance training. However, to our knowledge, independent verification of the validity and reliability of these devices for estimating average heart rate (HR) and energy expenditure (EE) during resistance training is nonexistent. PURPOSE: To determine the concurrent validity and reliability of identical Garmin Instinct watches during resistance training. METHODS: Twenty subjects (n=10 female, n=10 male; age: 23.2±7.7 years; height: 169.7±11.1 cm; weight: 76.3±15.7 kg) completed this study. Two Garmin Instinct watches were evaluated, along with the Polar H10 chest strap and Cosmed K5 portable metabolic unit as the criterion devices for average HR and EE, respectively. Subjects completed 4 circuits of 4 exercises (front squat, reverse lunge, push-ups, and shoulder press) using dumbbells at a light intensity, with 1 set of 10 repetitions per exercise, 30 seconds rest between exercises, and 1-1.5 min rest between circuits. Data were analyzed for validity (Mean Absolute Percent Error [MAPE] and Lin’s Concordance Coefficient [CCC]) and reliability (Coefficient of Variation [CV]), with predetermined thresholds of MAPE ≀10%, CCC ≄0.70, and CV ≀10%. RESULTS: Garmin Instinct 1 and Instinct 2 were significantly (
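The coefficient-of-variation reliability analysis described above can be sketched as a per-participant CV between the two identical watches, then averaged across participants. The paired readings below are hypothetical, not the study's data:

```python
import statistics

# Per-participant CV (%) between two identical devices, averaged across
# participants. The paired average-HR readings below are hypothetical.

def pair_cv(a, b):
    """CV of one participant's paired readings: SD / mean * 100."""
    return 100 * statistics.stdev([a, b]) / statistics.mean([a, b])

instinct_1 = [120, 131, 115]  # average HR (bpm) per participant, watch 1
instinct_2 = [118, 128, 119]  # same participants, watch 2

cvs = [pair_cv(a, b) for a, b in zip(instinct_1, instinct_2)]
mean_cv = statistics.mean(cvs)
print(round(mean_cv, 2))  # → 1.75
```

Note that `statistics.stdev` is the sample standard deviation (n − 1 denominator); some reliability papers instead use the population form, so the exact CV convention should be checked against the methods of the study being replicated.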

    Average Heart Rate and Energy Expenditure Validity of Garmin Vivoactive 3 and Fenix 6 Wrist Watches During Light Circuit Resistance Training

    Our laboratory recently found wrist-worn wearable technology devices to be valid for measuring average heart rate (HR), but not for estimating energy expenditure (EE), compared to criterion devices during steady-state aerobic training (walking, running, biking). However, the validity of wrist-worn devices for HR and EE measures during resistance training is largely unknown. PURPOSE: The purpose of this study was to determine if two wrist-worn devices, the Garmin Vivoactive 3 and Garmin Fenix 6 Pro, record valid measures of average HR and EE while performing circuit resistance training. METHODS: Twenty participants (n=10 female, n=10 male; age: 23.2 ± 7.7 years) completed this study. The Garmin Vivoactive 3 and Garmin Fenix 6 Pro were tested along with the Polar H10 chest strap and Cosmed K5 portable metabolic unit as the criteria for average HR and EE, respectively. Participants completed 4 circuits of 4 exercises (front squat, reverse lunge, push-ups, and shoulder press) using dumbbells at a light intensity, with 1 set of 10 repetitions per exercise, 30 seconds rest between exercises, and 1-1.5 min rest between circuits. Mean absolute percent error (MAPE, ≀10%) and Lin’s Concordance (ρ≄0.7) were used to validate the devices’ average HR (in bpm) and estimated EE (in kcals) against the criterion reference devices. Dependent t-tests determined differences (p≀0.05). RESULTS: Average HR for the Garmin Vivoactive 3 and Fenix 6 Pro was significantly different (p<0.01) from the Polar H10 (115.0±23.9 and 124.5±15.4 vs. 128.9±19.0 bpm, respectively) and was not considered valid (MAPE: 44.8% and 25.1%; Lin’s Concordance: 0.50 and 0.63, respectively). Estimated EE for the Garmin Vivoactive 3 and Fenix 6 Pro was significantly different (p<0.0001) from the Cosmed K5 (31.7±12.3 and 39.7±13.1 vs. 20.3±5.5 kcals, respectively) and was not considered valid (MAPE: 309.7% and 322.1%; Lin’s Concordance: 0.04 and 0.15, respectively). CONCLUSION: Anyone involved in any aspect of resistance training should be aware of the limitations of these wrist-worn devices in measuring average HR and EE.

    Effect of Exercise and Hypoxia on Plasma Telomerase

    Full text link
    Introduction: Telomerase reverse transcriptase (TERT) is the enzyme that adds telomeric sequences to the ends of linear chromosomes. Exercise has been shown to acutely upregulate leukocyte TERT after just 30 minutes of treadmill running at 80% of VO2max (Denham et al., 2016). Hypoxia-inducible factor 1 (HIF-1) is also a mediator of TERT in vitro (Nishi et al., 2004). Moderate acute exposure to hypoxia was associated with substantial increases in plasma TERT in a recent study in rats (Wang et al., 2014). The specific aim of the current study was to identify whether acute hypoxia upregulates plasma TERT in healthy adult humans. We hypothesized that TERT would be increased after cycling exercise and that exercise in hypoxia would elicit a greater increase than exercise alone. Methods: Ten healthy adults (5 male, 5 female; 23.8 ± 4.5 yrs.) volunteered for the study. Each participant visited the lab on three separate occasions, separated by at least 72 hours but no more than 2 weeks. The conditions were defined as normoxia (FiO2 = 20.5%) and normobaric hypoxia (FiO2 = 14.4%), created by an altitude-simulation machine. On the first visit, graded exercise tests (GXT) were performed in each condition to determine the resistance corresponding to 75% of age-predicted maximum heart rate (HRmax) while cycling on a cycle ergometer at 60 revolutions per minute. Exercise trials took place on subsequent visits; conditions were counterbalanced and randomized. Exercise trials consisted of 30 minutes of cycling at 60 RPM at an intensity set to 75% of age-predicted HRmax. Blood samples (600 ”L) were taken from a finger stick immediately before and 30 minutes after completion of the exercise trials. Blood samples were then centrifuged, and plasma was aliquoted and stored at -80°C until all samples were collected for later analysis. Statistical significance was accepted at p < 0.05. Results: ELISA analysis did not detect any TERT in the plasma samples for any of the unknowns. Workload was lower in hypoxia than in normoxia (110.7 ± 34.5 vs. 125.8 ± 49.6 W, p = 0.04), but mean exercise heart rate was not different between conditions (144.4 ± 4.5 vs. 146.9 ± 5.5 BPM, p = 0.065). Discussion: Plasma TERT is not detectable by ELISA analysis in healthy adults. Intensity was matched between conditions and confirmed by mean heart rate. Further research is needed to determine if hypoxia has an effect on TERT in human tissue.
