7 research outputs found

    Validity of energy expenditure estimation methods during 10 days of military training

    Wearable physical activity (PA) monitors have improved the ability to estimate free-living total energy expenditure (TEE), but their application during arduous military training alongside more well-established research methods has not been widely documented. This study aimed to assess the validity of two wrist-worn activity monitors and a PA log against doubly-labelled water (DLW) during British Army Officer Cadet (OC) training. For 10 days of training, twenty (10 male and 10 female) OCs (mean ± SD: age 23 ± 2 years, height 1.74 ± 0.09 m, body mass 77.0 ± 9.3 kg) wore one research-grade accelerometer (GENEActiv, Cambridge, UK) on the dominant wrist, wore one commercially available monitor (Fitbit SURGE, USA) on the non-dominant wrist and completed a self-report PA log. Immediately prior to this 10-day period, participants consumed a bolus of DLW and provided daily urine samples, which were analysed by mass spectrometry to determine TEE. Bivariate correlations and limits of agreement (LoA) were employed to compare TEE from each estimation method to DLW. Average daily TEE from DLW was 4112 ± 652 kcal·day⁻¹, against which the GENEActiv showed near-identical average TEE (mean bias ± LoA: -15 ± 851 kcal·day⁻¹), while the Fitbit tended to underestimate (-656 ± 683 kcal·day⁻¹) and the PA log to substantially overestimate (+1946 ± 1637 kcal·day⁻¹). Wearable physical activity monitors provide a cheaper and more practical method for estimating free-living TEE than DLW in military settings. The GENEActiv accelerometer demonstrated good validity for assessing daily TEE and would appear suitable for use in large-scale, longitudinal military studies.
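
    As a minimal sketch of the comparison described above, the snippet below computes the mean bias, the conventional Bland-Altman 95% limits of agreement (bias ± 1.96 × SD of the differences), and the bivariate correlation for a wearable against DLW. The TEE values are hypothetical illustrations, not the study's data.

        import numpy as np

        # Hypothetical daily TEE values (kcal/day) for illustration only; not the study's data.
        tee_dlw = np.array([3900.0, 4200.0, 4500.0, 3800.0, 4100.0])      # criterion method (DLW)
        tee_monitor = np.array([3850.0, 4300.0, 4400.0, 3700.0, 4150.0])  # wearable estimate

        diff = tee_monitor - tee_dlw                 # per-participant error versus DLW
        mean_bias = diff.mean()                      # systematic over- or under-estimation
        loa = 1.96 * diff.std(ddof=1)                # half-width of the 95% limits of agreement
        r = np.corrcoef(tee_monitor, tee_dlw)[0, 1]  # bivariate correlation

        print(f"mean bias ± LoA: {mean_bias:+.0f} ± {loa:.0f} kcal/day")
        print(f"Pearson r = {r:.2f}")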

    Nutrition and Physical Activity during British Army Officer Cadet Training: Part 1 - Energy Balance and Energy Availability

    Military training is characterised by high daily energy expenditures (EE), which are difficult to match with energy intake (EI), potentially resulting in negative energy balance (EB) and low energy availability (EA). The aim of this study was to quantify EB and EA during British Army Officer Cadet (OC) training. Thirteen OCs (seven women; mean ± SD age 24 ± 3 years) volunteered to participate. EB and EA were estimated from EI (weighing of food and food diaries) and EE (doubly-labelled water) measured in three periods of training: nine days on-camp (CAMP), a five-day field exercise (FEX) and a nine-day mixture of both (MIX). Variables were compared by condition and gender with a repeated-measures ANOVA. Negative EB was greatest during FEX (-2197 ± 455 kcal·d⁻¹) compared with CAMP (-692 ± 506 kcal·d⁻¹; p<0.001) and MIX (-1280 ± 309 kcal·d⁻¹; p<0.001). EA was greatest in CAMP (23 ± 10 kcal·kg FFM⁻¹·d⁻¹) compared with FEX (1 ± 16 kcal·kg FFM⁻¹·d⁻¹; p=0.002) and MIX (10 ± 7 kcal·kg FFM⁻¹·d⁻¹; p=0.003), with no apparent difference between FEX and MIX (p=0.071). Irrespective of condition, there were no apparent differences between genders in EB (p=0.375) or EA (p=0.385). These data can be used to inform evidence-based strategies to manage EA and EB during military training and enhance the health and performance of military personnel.
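
    As a minimal sketch of the quantities reported above: energy balance is intake minus total expenditure, and energy availability is intake minus exercise expenditure, normalised to fat-free mass. The intake, exercise-expenditure, and fat-free-mass values below are assumed for illustration because the abstract does not report them.

        # Minimal sketch of energy balance (EB) and energy availability (EA).
        # All values are hypothetical; exercise energy expenditure and fat-free
        # mass are assumptions, as the abstract does not report them.

        def energy_balance(intake_kcal: float, total_ee_kcal: float) -> float:
            """EB (kcal/day) = energy intake - total energy expenditure."""
            return intake_kcal - total_ee_kcal

        def energy_availability(intake_kcal: float, exercise_ee_kcal: float,
                                fat_free_mass_kg: float) -> float:
            """EA (kcal/kg FFM/day) = (energy intake - exercise EE) / fat-free mass."""
            return (intake_kcal - exercise_ee_kcal) / fat_free_mass_kg

        eb = energy_balance(intake_kcal=2900, total_ee_kcal=5100)   # e.g. a field-exercise day
        ea = energy_availability(intake_kcal=2900, exercise_ee_kcal=2800, fat_free_mass_kg=55)
        print(f"EB = {eb:+.0f} kcal/day, EA = {ea:.0f} kcal/kg FFM/day")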

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
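
    As a minimal sketch of how posterior probabilities of harm such as the 94.9% and 95.4% above can be read from a posterior distribution for an odds ratio, the snippet below assumes a normal approximation on the log-odds scale matched to the reported summary (OR 0.77, 95% CrI 0.58-1.06); it is not the trial's cumulative logistic model.

        import numpy as np

        rng = np.random.default_rng(0)

        # Assumed normal approximation to the posterior on the log-odds-ratio scale,
        # chosen only to match the reported summary; not the trial's actual model.
        log_or_median = np.log(0.77)
        log_or_sd = (np.log(1.06) - np.log(0.58)) / (2 * 1.96)  # from the 95% CrI width

        posterior_or = np.exp(rng.normal(log_or_median, log_or_sd, size=100_000))

        p_harm = np.mean(posterior_or < 1.0)     # probability the intervention worsened outcomes
        p_benefit = np.mean(posterior_or > 1.0)  # OR > 1 represents improvement in this trial

        print(f"P(harm)    = {p_harm:.3f}")  # roughly 0.95 under these assumptions
        print(f"P(benefit) = {p_benefit:.3f}")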

    The Importance of Hydration in Body Composition Assessment in Children Aged 6-16 Years.

    Body composition is associated with many noncommunicable diseases. The accuracy of many simple techniques used for the assessment of body composition is influenced by the fact that they do not take into account tissue hydration, and this can be particularly problematic in paediatric populations. The aims of this study were: (1) to assess the agreement of two dual-energy X-ray absorptiometry (DXA) systems for determining total and regional (arms, legs, trunk) fat, lean, and bone mass and (2) to compare lean soft tissue (LST) hydration correction methods in children. One hundred and twenty-four healthy children aged between 6 and 16 years old underwent DXA scans using two GE Healthcare Lunar systems (iDXA and Prodigy). Tissue hydration was calculated either by dividing total body water (TBW) by 4-component model derived fat-free mass (HFFM_TBW) or by using the age- and sex-specific coefficients of Lohman, 1986 (HFFM_Lohman), and was then used to correct LST. Regression analysis was performed to develop cross-calibration equations between DXA systems, and a paired-samples t-test was conducted to assess the difference between LST hydration correction methods. iDXA resulted in significantly lower estimates of total and regional fat and lean mass compared with Prodigy. HFFM_TBW showed a much larger age/sex-related variability than HFFM_Lohman. A 2.0% difference in LST was observed in the boys (34.5 kg vs 33.8 kg, respectively; p < 0.05) and a 2.5% difference in the girls (28.2 kg vs 27.5 kg, respectively; p < 0.05) when corrected using either HFFM_TBW or HFFM_Lohman. Care needs to be exercised when combining data from iDXA and Prodigy, as total and regional estimates of body composition can differ significantly. Furthermore, tissue hydration should be taken into account when assessing body composition, as it can vary considerably within a healthy paediatric population, even within specific age and/or sex groups.
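
    As a minimal sketch of two steps described above: the 4-component-model hydration estimate (HFFM_TBW = TBW / FFM) and a cross-calibration regression between the Prodigy and iDXA systems. The numbers and the simple linear form are assumptions for illustration, not the study's data or equations.

        import numpy as np

        # Hydration of fat-free mass from the 4-component model, as described in the
        # abstract: HFFM_TBW = TBW / FFM_4C. Values are hypothetical.
        tbw_kg = 25.0      # total body water
        ffm_4c_kg = 34.0   # fat-free mass from the 4-component model
        hffm_tbw = tbw_kg / ffm_4c_kg
        print(f"HFFM_TBW = {hffm_tbw:.3f}")  # the Lohman alternative substitutes published
                                             # age- and sex-specific constants

        # Cross-calibration between DXA systems: a simple linear regression of iDXA on
        # Prodigy lean soft tissue estimates (kg). Paired values are made up.
        prodigy_lst = np.array([30.1, 25.4, 41.0, 22.8, 35.6])
        idxa_lst = np.array([29.5, 24.9, 40.2, 22.1, 34.9])
        slope, intercept = np.polyfit(prodigy_lst, idxa_lst, deg=1)
        print(f"iDXA = {slope:.3f} * Prodigy + {intercept:+.3f}")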

    A Bayesian reanalysis of the Standard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial

    Background: Timing of initiation of kidney-replacement therapy (KRT) in critically ill patients remains controversial. The Standard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial compared two strategies of KRT initiation (accelerated versus standard) in critically ill patients with acute kidney injury and found neutral results for 90-day all-cause mortality. Probabilistic exploration of the trial endpoints may enable greater understanding of the trial findings. We aimed to perform a reanalysis using a Bayesian framework. Methods: We performed a secondary analysis of all 2927 patients randomized in the multi-national STARRT-AKI trial, performed at 168 centers in 15 countries. The primary endpoint, 90-day all-cause mortality, was evaluated using hierarchical Bayesian logistic regression. A spectrum of priors was used, including optimistic, neutral, and pessimistic priors, along with priors informed by earlier clinical trials. Secondary endpoints (KRT-free days and hospital-free days) were assessed using zero-one inflated beta regression. Results: The posterior probability of benefit comparing an accelerated versus a standard KRT initiation strategy for the primary endpoint suggested no important difference, regardless of the prior used (absolute difference of 0.13% [95% credible interval (CrI) −3.30%; 3.40%], −0.39% [95% CrI −3.46%; 3.00%], and 0.64% [95% CrI −2.53%; 3.88%] for neutral, optimistic, and pessimistic priors, respectively). There was a very low probability that the effect size was equal to or larger than a consensus-defined minimal clinically important difference. Patients allocated to the accelerated strategy had fewer KRT-free days (median absolute difference of −3.55 days [95% CrI −6.38; −0.48]), and the probability that the accelerated strategy was associated with more KRT-free days was 0.008. Hospital-free days were similar between strategies: the accelerated strategy had a median absolute difference of 0.48 more hospital-free days (95% CrI −1.87; 2.72) compared with the standard strategy, and the probability that the accelerated strategy had more hospital-free days was 0.66. Conclusions: In a Bayesian reanalysis of the STARRT-AKI trial, we found a very low probability that an accelerated strategy has clinically important benefits compared with the standard strategy. Patients receiving the accelerated strategy probably have fewer days alive and KRT-free. These findings do not support the adoption of an accelerated strategy of KRT initiation.
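
    As a minimal sketch of the spectrum-of-priors idea, the snippet below combines normal priors on an absolute risk difference with a normal likelihood summary and reports the posterior probability of benefit and of exceeding a minimal clinically important difference. The prior settings, data summary, and 2% MCID are assumptions for illustration; the paper's analysis was a hierarchical Bayesian logistic regression, not this simplified model.

        import numpy as np
        from scipy import stats

        # Illustrative normal-normal update on an absolute risk difference
        # ARD = mortality(accelerated) - mortality(standard), in percentage points.
        data_mean, data_sd = 0.1, 1.7  # assumed likelihood summary, not the trial data

        priors = {
            "neutral":     (0.0, 10.0),  # wide, centred on no effect
            "optimistic":  (-2.0, 2.0),  # favours the accelerated strategy
            "pessimistic": (2.0, 2.0),   # favours the standard strategy
        }
        mcid = 2.0  # assumed minimal clinically important difference (percentage points)

        for name, (prior_mean, prior_sd) in priors.items():
            # Precision-weighted posterior for a normal prior and normal likelihood.
            post_var = 1.0 / (1.0 / prior_sd**2 + 1.0 / data_sd**2)
            post_mean = post_var * (prior_mean / prior_sd**2 + data_mean / data_sd**2)
            post = stats.norm(post_mean, np.sqrt(post_var))
            p_benefit = post.cdf(0.0)   # P(ARD < 0): accelerated strategy reduces mortality
            p_mcid = post.cdf(-mcid)    # P(ARD <= -MCID): clinically important benefit
            print(f"{name:>11}: P(benefit) = {p_benefit:.2f}, P(benefit >= MCID) = {p_mcid:.3f}")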

    Effect of Antiplatelet Therapy on Survival and Organ Support–Free Days in Critically Ill Patients With COVID-19
