
    Smoking and Biochemical, Performance, and Muscle Adaptation to Military Training

    Purpose: To determine whether physical performance adaptation is impaired in smokers during early stages of military training, and to examine some of the putative mechanistic candidates that could explain any impairment. Methods: We examined measures of oxidative stress (malondialdehyde (MDA), lipid hydroperoxides), inflammation (C-reactive protein (CRP), interleukin-6), antioxidants (vitamins A, E and carotenes) and hormones (cortisol, testosterone, insulin-like growth factor-1) in 65 male British Army Infantry recruits (mean ± SD age: 21 ± 3 yr; mass: 75.5 ± 8.4 kg; height: 1.78 ± 0.07 m) at weeks 1, 5 and 10 of basic training. Physical performance (static lift, grip strength, jump height, 2.4 km run time and two-minute press-up and sit-up scores) was examined, and lower-leg muscle and adipose cross-sectional area (CSA) and density were measured by peripheral quantitative computed tomography. Results: Basic military training, irrespective of smoking status, elicited improvement in all physical performance parameters (main time effect; P < 0.05) except grip strength and jump height, and resulted in increased muscle area and decreased fat area in the lower leg (P < 0.05). MDA was higher in smokers at baseline, and both MDA and CRP were greater in smokers than non-smokers during training (main group effect; P < 0.05). Absolute performance measures, muscle characteristics of the lower leg, and other oxidative stress, antioxidant, endocrine and inflammatory markers were similar in the two groups. Conclusions: Oxidative stress and inflammation were elevated in habitual smokers during basic military training, but there was no clear evidence that this was detrimental to physical adaptation in this population over the timescale studied.

    Dietary Intake and Nitrogen Balance in British Army Infantry Recruits Undergoing Basic Training

    We assessed dietary intake and nitrogen balance during 14 weeks of Basic Training (BT) in British Army Infantry recruits. Nineteen men (mean ± SD: age 19.9 ± 2.6 years, height 175.7 ± 6.5 cm, body mass 80.3 ± 10.1 kg) at the Infantry Training Centre, Catterick (ITC(C)) volunteered. Nutrient intakes and 24-h urinary nitrogen balance were assessed in weeks 2, 6 and 11 of BT. Nutrient intake was assessed using researcher-led weighed food records and food diaries, and Nutritics professional dietary software. Data were compared between weeks using a repeated-measures analysis of variance (ANOVA) with statistical significance set at p ≤ 0.05. There was a significant difference in protein intake between weeks 2 and 11 of BT (115 ± 18 vs. 91 ± 20 g, p = 0.02, ES = 1.26). There was no significant difference in mean absolute daily energy (p = 0.44), fat (p = 0.79) or carbohydrate (CHO) intake (p = 0.06) between weeks. Nitrogen balance was maintained in weeks 2, 6 and 11, but declined across BT (week 2: 4.6 ± 4.1 g; week 6: 1.6 ± 4.5 g; week 11: −0.2 ± 5.5 g; p = 0.07). A protein intake of 1.5 g·kg−1·d−1 may be sufficient in the early stages of BT, but higher intakes may be needed by some individuals later in BT.
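The nitrogen balance figures above are conventionally derived from protein intake and urinary nitrogen excretion. A minimal illustrative sketch follows; the 6.25 g-protein-per-g-nitrogen conversion and the fixed allowance for non-urinary (faecal and integumental) losses are standard textbook assumptions, not values taken from this study:

```python
def nitrogen_balance(protein_intake_g: float,
                     urinary_n_g: float,
                     misc_losses_g: float = 4.0) -> float:
    """Estimate 24-h nitrogen balance (g N per day).

    Assumptions (not from the study itself):
    - dietary protein is ~16% nitrogen, so protein / 6.25 gives g N in;
    - faecal and integumental losses are approximated by a fixed
      allowance (misc_losses_g), a common simplification.
    """
    n_in = protein_intake_g / 6.25
    return n_in - (urinary_n_g + misc_losses_g)

# Week-2 intake of ~115 g protein with a hypothetical 10 g urinary N:
balance = nitrogen_balance(115, 10)  # ~4.4 g N/day, a positive balance
```

A positive value indicates net protein retention; the study's decline toward zero by week 11 corresponds to urinary nitrogen catching up with intake.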

    Does Protein Supplementation Support Adaptations to Arduous Concurrent Exercise Training? A Systematic Review and Meta-Analysis with Military Based Applications

    We evaluated the impact of protein supplementation on adaptations to arduous concurrent training in healthy adults, with potential applications to individuals undergoing military training. Peer-reviewed papers published in English meeting the population, intervention, comparison and outcome criteria were included. Database searches were completed in PubMed, Web of Science and SPORTDiscus. Study quality was evaluated using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. Of 11 studies included, nine focused on performance, six on body composition and four on muscle recovery. Cohen's d effect sizes showed that protein supplementation improved performance outcomes in response to concurrent training (ES = 0.89, 95% CI = 0.08–1.70). When analysed separately, improvements in muscle strength (SMD = +4.92 kg, 95% CI = −2.70–12.54 kg) were found, but not in aerobic endurance. Gains in fat-free mass (SMD = +0.75 kg, 95% CI = 0.44–1.06 kg) and reductions in fat mass (SMD = −0.99 kg, 95% CI = −1.43–0.23 kg) were greater with protein supplementation. Most studies did not report protein turnover, nitrogen balance and/or total daily protein intake, so further research is warranted. However, our findings suggest that protein supplementation may support lean-mass accretion and strength gains during arduous concurrent training in physically active populations, including military recruits.
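The pooled effect sizes above are standardised mean differences. As a minimal sketch, Cohen's d for two independent groups divides the mean difference by the pooled standard deviation; the group values below are hypothetical, not data from the review:

```python
import math

def cohens_d(mean1: float, sd1: float, n1: int,
             mean2: float, sd2: float, n2: int) -> float:
    """Cohen's d for two independent groups using the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical fat-free-mass change: protein group vs. placebo group
d = cohens_d(2.1, 1.0, 20, 1.2, 1.0, 20)  # ~0.9, a "large" effect
```

By the usual rule of thumb, d around 0.2 is small, 0.5 medium and 0.8 large, which is why the pooled ES of 0.89 reported above reads as a large performance effect.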

    Low fitness, low body mass and prior injury predict injury risk during military recruit training: a prospective cohort study in the British Army

    Background: Injuries sustained by military recruits during initial training impede training progression and military readiness while increasing financial costs. This study investigated training-related injuries and injury risk factors among British Army infantry recruits. Methods: Recruits starting infantry training at the British Army Infantry Training Centre between September 2008 and March 2010 were eligible to take part. Information regarding lifestyle behaviours and injury history was collected using the Military Pre-training Questionnaire. Sociodemographic, anthropometric, physical fitness and injury (lower limb and lower back) data were obtained from Army databases. Univariable and multivariable Cox regression models were used to explore the association between time to first training injury and potential risk factors. Results: 58% (95% CI 55% to 60%) of 1810 recruits sustained at least one injury during training. Overuse injuries were more common than traumatic injuries (65% and 35%, respectively). The lower leg accounted for 81% of all injuries, and non-specific soft tissue damage was the leading diagnosis (55% of all injuries). Injuries resulted in 122 (118 to 126) training days lost per 1000 person-days. Slower 2.4 km run time, low body mass, past injury and shin pain were independently associated with a higher risk of any injury. Conclusions: There was a high incidence of overuse injuries in British Army recruits undertaking infantry training. Recruits with lower pre-training fitness levels, low body mass and past injuries were at higher risk. Standards for 2.4 km run time and minimum body mass should be considered as physical entry criteria.
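The burden figure above (training days lost per 1000 person-days) is a simple rate of events over follow-up time. A minimal sketch with hypothetical inputs, chosen only to illustrate the calculation:

```python
def days_lost_rate(days_lost: float, person_days: float) -> float:
    """Training days lost per 1000 person-days of follow-up."""
    return 1000 * days_lost / person_days

# Hypothetical cohort: 12,200 training days lost over 100,000 person-days
rate = days_lost_rate(12_200, 100_000)  # 122.0 days lost per 1000 person-days
```

Expressing the burden per 1000 person-days, rather than per recruit, accounts for recruits contributing different lengths of follow-up (e.g. discharge or back-squadding).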

    Distal tibial bone properties and bone stress injury risk in young men undergoing arduous physical training

    Trabecular microarchitecture contributes to bone strength, but its role in bone stress injury (BSI) risk in young healthy adults is unclear. Tibial volumetric BMD (vBMD), geometry, and microarchitecture, whole-body areal BMD, lean and fat mass, biochemical markers of bone metabolism, aerobic fitness, and muscle strength and power were measured in 201 British Army male infantry recruits (age 20.7 [4.3] years, BMI 24.0 ± 2.7 kg·m−2) in week one of basic training. Tibial scans were performed at the ultra-distal site, 22.5 mm from the distal endplate of the non-dominant leg, using High Resolution Peripheral Quantitative Computed Tomography (XtremeCT, Scanco Medical AG, Switzerland). Binary logistic regression analysis was performed to identify associations with lower body BSI confirmed by MRI. Twenty recruits (10.0%) were diagnosed with a lower body BSI. Recruits who went on to sustain a BSI had lower cortical area, stiffness and estimated failure load at baseline (p = 0.029, 0.012 and 0.011, respectively), but tibial vBMD, geometry, and microarchitecture were not associated with BSI incidence when controlling for age, total body mass, lean body mass, height, total 25(OH)D, 2.4-km run time, peak power output and maximum dynamic lift strength. Infantry regiment (Parachute versus Line Infantry; OR 9.3 [95% CI 2.6–33.4], p ≤ 0.001) and 2.4-km best-effort run time (OR 1.06 [95% CI 1.02–1.10], p = 0.033) were significant predictors. Intrinsic risk factors, including ultra-distal tibial density, geometry, and microarchitecture, were not associated with lower body BSI during arduous infantry training. The ninefold increased risk of BSI in the Parachute Regiment compared with Line Infantry suggests that injury propensity is primarily a function of training load and that risk factors are population-specific.
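Odds ratios such as those above come from the exponentiated coefficients of the logistic regression model: a coefficient b with standard error se yields OR = exp(b) with a ~95% Wald interval of exp(b ± 1.96·se). A minimal sketch; the coefficient and standard error below are hypothetical, not the study's fitted values:

```python
import math

def odds_ratio_ci(coef: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a ~95% Wald confidence interval."""
    return (math.exp(coef),
            math.exp(coef - z * se),
            math.exp(coef + z * se))

# Hypothetical coefficient for a binary regiment indicator
or_point, ci_low, ci_high = odds_ratio_ci(2.23, 0.65)  # OR ~9.3
```

Because the interval is computed on the log-odds scale and then exponentiated, it is asymmetric around the point estimate, which is why reported CIs like 2.6–33.4 are much wider above the OR than below it.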

    Vitamin D metabolites are associated with musculoskeletal injury in young adults: a prospective cohort study.

    The relationship between vitamin D metabolites and lower body (pelvis and lower limb) overuse injury is unclear. In a prospective cohort study, we investigated the association between vitamin D metabolites and the incidence of lower body overuse musculoskeletal and bone stress injury in young adults undergoing initial military training during all seasons. In 1637 men and 530 women (age, 22.6 ± 7.5 years; BMI, 24.0 ± 2.6 kg∙m−2; 94.3% white ethnicity), we measured serum 25-hydroxyvitamin D (25(OH)D) and 24,25-dihydroxyvitamin D (24,25(OH)2D) by high-performance liquid chromatography tandem mass spectrometry, and 1,25-dihydroxyvitamin D (1,25(OH)2D) by immunoassay during week 1 of training. We examined whether 25(OH)D and the 1,25(OH)2D:24,25(OH)2D ratio were associated with overuse injury. During 12 weeks of training, 21.0% sustained ≥1 overuse musculoskeletal injury and 5.6% sustained ≥1 bone stress injury. After controlling for sex, BMI, 2.4 km run time, smoking, bone injury history, and Army training course (Officer, standard, or Infantry), lower body overuse musculoskeletal injury incidence was higher for participants within the second-lowest versus highest quartile of 24,25(OH)2D (OR: 1.62 [95% CI 1.13–2.32; P = 0.009]) and the lowest versus highest cluster of 25(OH)D and 1,25(OH)2D:24,25(OH)2D (OR: 6.30 [95% CI 1.89–21.2; P = 0.003]). Lower body bone stress injury incidence was higher for participants within the lowest versus highest quartile of 24,25(OH)2D (OR: 4.02 [95% CI 1.82–8.87; P < 0.001]) and the lowest versus highest cluster of 25(OH)D and 1,25(OH)2D:24,25(OH)2D (OR: 22.08 [95% CI 3.26–149.4; P = 0.001]), after controlling for the same covariates. Greater conversion of 25(OH)D to 24,25(OH)2D, relative to 1,25(OH)2D (i.e., low 1,25(OH)2D:24,25(OH)2D), and higher serum 24,25(OH)2D were associated with a lower incidence of lower body overuse musculoskeletal and bone stress injury. Serum 24,25(OH)2D may have a role in preventing overuse injury in young adults undertaking arduous physical training.

    Influence of vitamin D supplementation by sunlight or oral D3 on exercise performance

    Purpose: To determine the relationship between vitamin D status and exercise performance in a large, prospective cohort study of young men and women across seasons (Study 1). Then, in a randomized, placebo-controlled trial, to investigate the effects on exercise performance of achieving vitamin D sufficiency (serum 25(OH)D ≥ 50 nmol·L−1) by a unique comparison of safe, simulated sunlight and oral vitamin D3 supplementation in wintertime (Study 2). Methods: In Study 1, we determined the relationship of 25(OH)D with exercise performance in 967 military recruits. In Study 2, 137 men received either placebo, simulated sunlight (1.3× standard erythemal dose in T-shirt and shorts, three times per week for 4 weeks and then once per week for 8 weeks) or oral vitamin D3 (1,000 IU·day−1 for 4 weeks and then 400 IU·day−1 for 8 weeks). We measured serum 25(OH)D by LC-MS/MS and endurance, strength and power by 1.5-mile run, maximum dynamic lift and vertical jump, respectively. Results: In Study 1, only 9% of men and 36% of women were vitamin D sufficient during wintertime. After controlling for body composition, smoking and season, 25(OH)D was positively associated with endurance performance (P ≤ 0.01, ΔR2 = 0.03–0.06, small f2 effect sizes): 1.5-mile run time was ~0.5 seconds faster for every 1 nmol·L−1 increase in 25(OH)D. No significant effects on strength or power emerged (P > 0.05). In Study 2, safe simulated sunlight and oral vitamin D3 supplementation were similarly effective in achieving vitamin D sufficiency in almost all participants (97%); however, this did not improve exercise performance (P > 0.05). Conclusion: Vitamin D status was associated with endurance performance but not strength or power in a prospective cohort study. Achieving vitamin D sufficiency via safe, simulated summer sunlight or oral vitamin D3 supplementation did not improve exercise performance in a randomized controlled trial.
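The reported association (~0.5 s faster 1.5-mile run time per 1 nmol·L−1 of 25(OH)D) can be applied as simple linear arithmetic. A sketch assuming that slope, which is an observational association from the cohort, not a causal effect:

```python
def run_time_change_s(delta_25ohd_nmol_l: float,
                      slope_s_per_nmol: float = 0.5) -> float:
    """Predicted reduction in 1.5-mile run time (seconds) for a given
    increase in serum 25(OH)D (nmol/L), applying the reported ~0.5 s
    per nmol/L observational slope (an association, not a causal effect)."""
    return delta_25ohd_nmol_l * slope_s_per_nmol

# A rise from 30 to 50 nmol/L would associate with ~10 s faster run time
faster_by = run_time_change_s(20)  # 10.0
```

The trial result in Study 2 is the caution here: raising 25(OH)D by supplementation did not produce the performance gain this slope would predict.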

    Good perceived sleep quality protects against the raised risk of respiratory infection during sleep restriction in young adults

    Study Objectives: To prospectively examine the association between sleep restriction, perceived sleep quality (PSQ) and upper respiratory tract infection (URTI). Methods: In 1318 military recruits (68% male), self-reported sleep was assessed at the beginning and end of a 12-week training course. Sleep restriction was defined as an individualized reduction in sleep duration of ≥2 hours/night compared with civilian life. URTIs were retrieved from medical records. Results: On commencing training, approximately half of recruits were sleep restricted (52%; 2.1 ± 1.6 h); despite the sleep debt, 58% of recruits with sleep restriction reported good PSQ. Regression adjusted for covariates showed that recruits commencing training with sleep restriction were more likely to suffer URTI during the course (OR = 2.93, 95% CI 1.29–6.69, p = .011). Moderation analysis showed this finding was driven by poor PSQ (B = −1.12, SE 0.50, p = .023), as no significant association between sleep restriction and URTI was observed in recruits reporting good PSQ, despite a similar magnitude of sleep restriction during training. Associations remained in the population completing training, accounting for loss to follow-up. Recruits reporting poor PSQ when healthy at the start and end of training were more susceptible to URTI (OR = 3.16, 95% CI 1.31–7.61, p = .010, vs. good PSQ). Conclusion: Good perceived sleep quality was associated with protection against the raised risk of respiratory infection during sleep restriction. Future studies should determine whether improvements in sleep quality arising from behavioral sleep interventions translate to reduced respiratory infection during sleep restriction.

    Pre-sleep protein supplementation does not improve recovery from load carriage in British Army recruits (part 2)

    British Army basic training (BT) is physically demanding, with new recruits completing multiple bouts of physical activity each day with limited recovery. Load carriage is one of the most physically demanding BT activities and has been shown to induce acute exercise-induced muscle damage (EIMD) and impair muscle function. Protein supplementation can accelerate muscle recovery by attenuating EIMD and muscle function loss. This study investigated the impact of an additional daily bolus of protein prior to sleep throughout training on acute muscle recovery following a load carriage test in British Army recruits. Ninety-nine men and 23 women (mean ± SD: age 21.3 ± 3.5 years, height 174.8 ± 8.4 cm, body mass 75.4 ± 12.2 kg) were randomized to dietary control (CON), carbohydrate placebo (PLA), moderate (20 g; MOD) or high (60 g; HIGH) protein supplementation. Muscle function (maximal jump height), perceived muscle soreness and urinary markers of muscle damage were assessed before (PRE), immediately post (POST), 24 h post (24 h-POST) and 40 h post (40 h-POST) a load carriage test. There was no impact of supplementation on muscle function at POST (p = 0.752) or 40 h-POST (p = 0.989) load carriage, but jump height was greater in PLA compared to HIGH at 24 h-POST (p = 0.037). There was no impact of protein supplementation on muscle soreness at POST (p = 0.605), 24 h-POST (p = 0.182) or 40 h-POST (p = 0.333). All groups had increased concentrations of urinary myoglobin and 3-methylhistidine, but there was no statistical difference between groups at any timepoint (p > 0.05). We conclude that pre-sleep protein supplementation does not accelerate acute muscle recovery following load carriage in British Army recruits during basic training. The data suggest that consuming additional energy in the form of CHO or protein may help attenuate EIMD, although no statistical differences between groups were observed. Although EIMD did occur, as indicated by elevated urinary muscle damage markers, it is likely that the load carriage test was not arduous enough to reduce muscle function, limiting the potential impact of protein supplementation. Practically, protein supplementation above intakes of 1.2 g·kg−1·day−1 following load carriage over similar distances (4 km) with similar loads (15–20 kg) does not appear to be warranted.