36 research outputs found

    Considerations for the consumption of vitamin and mineral supplements in athlete populations

    Vitamins and minerals are of fundamental importance to numerous human functions that are essential to optimise athlete performance. Athletes incur a high turnover of key vitamins and minerals and are therefore dependent on sufficient energy intake to replenish nutrient stores. However, many athletes are poor at servicing their energy replenishment needs, especially female athletes, and although a ‘food first approach’ to meeting nutrient requirements is the primary goal, it may be important for some athletes to consider a vitamin and/or mineral supplement to meet their daily needs. When working to determine if an athlete requires vitamin or mineral supplements, practitioners should use a robust framework to assess the overall energy requirements, current dietary practices and the biological and clinical status of their athletes. Of note, any supplementation plan should account for the various factors that may impact the efficacy of the approach (e.g. athlete sex, the nutrient recommended dietary intake, supplement dose/timing, co-consumption of other foods and any food–drug interactions). Importantly, there are numerous vitamins and minerals of key importance to athletes, each having specific relevance to certain situations (e.g. iron and B vitamins are significant contributors to haematological adaptation, calcium and vitamin D are important to bone health and folate is important in the female athlete); therefore, the appropriate supplement for a given situation should be carefully considered and consumed with the goal to augment an athlete’s diet

    Dietary iron and the elite dancer

    Dancers are an athlete population at high risk of developing iron deficiency (ID). The aesthetic nature of the discipline means dancers potentially utilise dietary restriction to meet physique goals. In combination with high training demands, this means dancers are susceptible to problems related to low energy availability (LEA), which impacts nutrient intake. In the presence of LEA, ID is common because of a reduced mineral content within the low energy diet. Left untreated, ID becomes an issue that results in fatigue, reduced aerobic work capacity, and ultimately, iron deficient anaemia (IDA). Such progression can be detrimental to a dancer’s capacity given the physically demanding nature of training, rehearsal, and performances. Previous literature has focused on the manifestation and treatment of ID primarily in the context of endurance athletes; however, a dance-specific context addressing the interplay between dance training and performance, LEA and ID is essential for practitioners working in this space. By consolidating findings from identified studies of dancers and other relevant athlete groups, this review explores causal factors of ID and potential treatment strategies for dancers to optimise absorption from an oral iron supplementation regime to adequately support health and performance

    Six days of low carbohydrate, not energy availability, alters the iron and immune response to exercise in elite athletes

    Purpose To quantify the effects of a short-term (6-d) low carbohydrate (CHO) high fat (LCHF), and low energy availability (LEA) diet on immune, inflammatory, and iron-regulatory responses to exercise in endurance athletes. Methods Twenty-eight elite male race walkers completed two 6-d diet/training phases. During phase 1 (Baseline), all athletes consumed a high CHO/energy availability (CON) diet (65% CHO and ~40 kcal·kg−1 fat-free mass (FFM)·d−1). In phase 2 (Adaptation), athletes were allocated to either a CON (n = 10), LCHF (n = 8; <50 g·d−1 CHO and ~40 kcal·kg−1·FFM−1·d−1), or LEA diet (n = 10; 60% CHO and 15 kcal·kg−1·FFM−1·d−1). At the end of each phase, athletes completed a 25-km race walk protocol at ~75% V̇O2max. On each occasion, venous blood was collected before and after exercise for interleukin-6, hepcidin, cortisol, and glucose concentrations, as well as white blood cell counts. Results The LCHF athletes displayed a greater IL-6 (P = 0.019) and hepcidin (P = 0.011) response to exercise after Adaptation, compared with Baseline. Similarly, postexercise increases in total white blood cell counts (P = 0.026) and cortisol levels (P < 0.05) were evident after Adaptation. No differences between CON and LEA were evident for any of the measured biological markers (all P > 0.05). Conclusions Short-term adherence to an LCHF diet elicited small yet unfavorable iron, immune, and stress responses to exercise. In contrast, no substantial alterations to athlete health were observed when athletes restricted energy availability compared with athletes with adequate energy availability. Therefore, short-term restriction of CHO, rather than energy, may have greater negative impacts on athlete health.
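The dietary phases above are defined by energy availability, i.e. energy intake minus exercise energy expenditure, expressed per kilogram of fat-free mass (FFM) per day. A minimal sketch of that calculation (the intake, expenditure, and FFM figures below are hypothetical, chosen only to reproduce the ~40 and 15 kcal·kg FFM−1·d−1 targets):

```python
def energy_availability(intake_kcal, exercise_kcal, ffm_kg):
    """Energy availability (kcal per kg fat-free mass per day):
    dietary energy intake minus exercise energy expenditure,
    normalised to fat-free mass."""
    return (intake_kcal - exercise_kcal) / ffm_kg

FFM = 55.0  # hypothetical fat-free mass (kg)

# Adequate availability, as in the CON diet (~40 kcal·kg FFM−1·d−1)
print(energy_availability(3400, 1200, FFM))  # → 40.0

# Restricted availability, as in the LEA diet (15 kcal·kg FFM−1·d−1)
print(energy_availability(2025, 1200, FFM))  # → 15.0
```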

    Short severe energy restriction with refueling reduces body mass without altering training-associated performance improvement

    Purpose We investigated short-term (9 d) exposure to low energy availability (LEA) in elite endurance athletes during a block of intensified training on self-reported well-being, body composition, and performance. Methods Twenty-three highly trained race walkers undertook an ~3-wk research-embedded training camp during which they undertook baseline testing and 6 d of high energy/carbohydrate (HCHO) availability (40 kcal·kg FFM−1·d−1) before being allocated to 9 d continuation of this diet (n = 10 M, 2 F) or a significant decrease in energy availability to 15 kcal·kg FFM−1·d−1 (LEA: n = 10 M, 1 F). A real-world 10,000-m race walking event was undertaken before (baseline) and after (adaptation) these phases, with races being preceded by standardized carbohydrate fueling (8 g·kg body mass [BM]−1 for 24 h and 2 g·kg BM−1 prerace meal). Results Dual-energy x-ray absorptiometry–assessed body composition showed BM loss (2.0 kg, P < 0.001), primarily due to a 1.6-kg fat mass reduction (P < 0.001) in LEA, with smaller losses (BM = 0.9 kg, P = 0.008; fat mass = 0.9 kg, P < 0.001) in HCHO. The 76-item Recovery–Stress Questionnaire for Athletes, undertaken at the end of each dietary phase, showed significant diet–trial effects for overall stress (P = 0.021), overall recovery (P = 0.024), sport-specific stress (P = 0.003), and sport-specific recovery (P = 0.012). However, improvements in race performance were similar: 4.5% ± 4.1% and 3.5% ± 1.8% for HCHO and LEA, respectively (P < 0.001). The relationship between changes in performance and prerace BM was not significant (r = −0.08 [−0.49 to 0.35], P = 0.717). Conclusions A series of strategically timed but brief phases of substantially restricted energy availability might achieve ideal race weight as part of a long-term periodization of physique by high-performance athletes, but the relationship between BM, training quality, and performance in weight-dependent endurance sports is complicated
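The standardized prerace fueling above is prescribed per kilogram of body mass (8 g·kg BM−1 over 24 h, then a 2 g·kg BM−1 prerace meal). A small sketch converting those rates into absolute carbohydrate targets (the 65 kg body mass is a hypothetical example, not a figure from the study):

```python
def cho_targets(body_mass_kg, day_rate_g_per_kg=8.0, prerace_rate_g_per_kg=2.0):
    """Absolute carbohydrate targets (g) from per-kg fueling rates:
    a 24-h loading amount and a prerace meal amount."""
    return body_mass_kg * day_rate_g_per_kg, body_mass_kg * prerace_rate_g_per_kg

day_g, prerace_g = cho_targets(65.0)  # hypothetical 65 kg race walker
print(day_g, prerace_g)  # → 520.0 130.0
```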

    Managing female athlete health: Auditing the representation of female versus male participants among research in supplements to manage diagnosed micronutrient issues

    Micronutrient deficiencies and sub-optimal intakes among female athletes are a concern and are commonly prevented or treated with medical supplements. However, it is unclear how well women have been considered in the research underpinning current supplementation practices. We conducted an audit of the literature supporting the use of calcium, iron, and vitamin D. Of the 299 studies, including 25,171 participants, the majority (71%) of participants were women. Studies with exclusively female cohorts (37%) were also more prevalent than those examining males in isolation (31%). However, study designs considering divergent responses between sexes were sparse, accounting for 7% of the literature. Moreover, despite the abundance of female participants, the quality and quantity of the literature specific to female athletes was poor. Just 32% of studies including women defined menstrual status, while none implemented best-practice methodologies regarding ovarian hormonal control. Additionally, only 10% of studies included highly trained female athletes. Investigations of calcium supplementation were particularly lacking, with just two studies conducted in highly trained women. New research should focus on high-quality investigations specific to female athletes, alongside evaluating sex-based differences in the response to calcium, iron, and vitamin D, thus ensuring the specific needs of women have been considered in current protocols involving medical supplements

    The impact of acute calcium intake on bone turnover markers during a training day in elite male rowers

    Introduction: While an acute exercise session typically increases bone turnover markers (BTM), the impact of subsequent sessions and the interaction with pre-exercise calcium intake remains unclear despite the application to the ‘real life’ training of many competitive athletes. Methods: Using a randomized crossover design, elite male rowers (n = 16) completed two trials, a week apart, consisting of two 90-minute rowing ergometer sessions (Ex1, Ex2) separated by 150 minutes. Prior to each trial, participants consumed a high (CAL: ~1000 mg) or isocaloric low (CON: < 10 mg) calcium meal. Biochemical markers, including parathyroid hormone (PTH), serum ionised calcium (iCa) and bone turnover markers (C-terminal telopeptide of type I collagen, β-CTX-I; osteocalcin, OC), were monitored from baseline to 3 hours post Ex2. Results: While each session caused perturbations of serum iCa, CAL maintained calcium concentrations above those of CON for most time points, 4.5% and 2.4% higher post-Ex1 and post-Ex2, respectively. The decrease in iCa in CON was associated with an elevation of blood PTH (p < 0.05) and β-CTX-I (p < 0.0001) over this period of repeated training sessions and their recovery, particularly during and after Ex2. Pre-exercise intake of calcium-rich foods lowered BTM over the course of a day with several training sessions. Conclusions: Pre-exercise intake of a calcium-rich meal prior to training sessions undertaken within the same day had a cumulative and prolonged effect on the stabilisation of blood iCa during exercise. In turn, this reduced the post-exercise PTH response, potentially attenuating the increase in markers of bone resorption. Such practical strategies may be integrated into the athlete’s overall sports nutrition plan, with the potential to safeguard long-term bone health and reduce the risk of bone stress injuries.

    Short-term very high carbohydrate diet and gut-training have minor effects on gastrointestinal status and performance in highly trained endurance athletes

    We implemented a multi-pronged strategy (MAX) involving chronic (2 weeks high carbohydrate [CHO] diet + gut-training) and acute (CHO loading + 90 g·h−1 CHO during exercise) strategies to promote endogenous and exogenous CHO availability, compared with strategies reflecting lower ranges of current guidelines (CON) in two groups of athletes. Nineteen elite male race walkers (MAX: 9; CON: 10) undertook a 26 km race-walking session before and after the respective interventions to investigate gastrointestinal function (absorption capacity), integrity (epithelial injury), and symptoms (GIS). We observed considerable individual variability in responses, resulting in a statistically significant (p < 0.001) yet likely clinically insignificant increase (Δ 736 pg·mL−1) in I-FABP after exercise across all trials, with no significant differences in breath H2 across exercise (p = 0.970). MAX was associated with increased GIS in the second half of the exercise, especially in upper GIS (p < 0.01). Eighteen highly trained male and female distance runners (MAX: 10; CON: 8) then completed a 35 km run (28 km steady-state + 7 km time-trial) supported by either a slightly modified MAX or CON strategy. Inter-individual variability was observed, without major differences in epithelial cell intestinal fatty acid binding protein (I-FABP) or GIS, due to exercise, trial, or group, despite the 3-fold increase in exercise CHO intake in MAX post-intervention. The tight-junction (claudin-3) response decreased in both groups from pre- to post-intervention. Groups achieved a similar performance improvement from pre- to post-intervention (CON = 39 s [95% CI 15–63 s]; MAX = 36 s [13–59 s]; p = 0.002). Although this suggests that further increases in CHO availability above current guidelines do not confer additional advantages, limitations in our study execution (e.g., confounding loss of BM in several individuals despite a live-in training camp environment and significant increases in aerobic capacity due to intensified training) may have masked small differences. Therefore, athletes should meet the minimum CHO guidelines for training and competition goals, noting that, with practice, increased CHO intake can be tolerated, and may contribute to performance outcomes.
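The acute arm of the MAX strategy prescribes carbohydrate by hourly rate (90 g·h−1 during exercise). A trivial sketch of the resulting total intake for a session (the 2-h duration is an assumption for illustration, not a figure from the study):

```python
def in_exercise_cho(rate_g_per_h, duration_min):
    """Total carbohydrate (g) consumed during exercise
    at a fixed hourly intake rate."""
    return rate_g_per_h * duration_min / 60.0

# MAX in-exercise rate of 90 g·h−1 over a hypothetical 2-h session
print(in_exercise_cho(90, 120))  # → 180.0
```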

    Exercise and heat stress: Inflammation and the iron regulatory response

    This study determined the impact of heat stress on postexercise inflammation and hepcidin levels. Twelve moderately trained males completed three, 60-min treadmill running sessions under different conditions: (a) COOL, 18 °C with speed maintained at 80% maximum heart rate; (b) HOTHR, 35 °C with speed maintained at 80% maximum heart rate; and (c) HOTPACE, 35 °C completed at the average running speed from the COOL trial. Venous blood samples were collected pre-, post-, and 3-hr postexercise and analyzed for serum ferritin, interleukin-6 (IL-6), and hepcidin concentrations. Average HR was highest during HOTPACE compared with HOTHR and COOL (p < .001). Running speed was slowest in HOTHR compared with COOL and HOTPACE (p < .001). The postexercise increase in IL-6 was greatest during HOTPACE (295%; p = .003). No differences in the IL-6 response immediately postexercise between COOL (115%) and HOTHR (116%) were evident (p = .992). No differences in hepcidin concentrations between the three trials were evident at 3 hr postexercise (p = .407). Findings from this study suggest the IL-6 response to exercise is greatest in hot compared with cool conditions when the absolute running speed was matched. No differences in IL-6 between hot and cool conditions were evident when HR was matched, suggesting the increased physiological strain induced from training at higher intensities in hot environments, rather than the heat per se, is likely responsible for this elevated response. Environmental temperature had no impact on hepcidin levels, indicating that exercising in hot conditions is unlikely to further impact transient alterations in iron regulation, beyond that expected in temperate conditions
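The IL-6 responses above are expressed as percentage increases from the pre-exercise concentration. A trivial sketch of that calculation (the pre/post concentrations below are hypothetical, not study data):

```python
def pct_increase(pre, post):
    """Percentage increase from a pre-exercise to a
    postexercise concentration."""
    return (post - pre) * 100.0 / pre

# e.g. a hypothetical rise from 100 to 395 pg/mL is a 295% increase
print(pct_increase(100.0, 395.0))  # → 295.0
```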