Comparison of Common Field/Clinical Measures to Standard Laboratory Measures of Hydration Status
Context: Accurate determination of hydration status is a preventive measure for exertional heat illnesses (EHI).
Objective: To determine the validity of various field measures of urine specific gravity (Usg) compared to laboratory instruments.
Design: Observational research design comparing measures of hydration status: urine reagent strips (URS) and a urine color (Ucol) chart versus a refractometer.
Setting: Athletic training room of a Division I-A collegiate American football team.
Participants: Trial 1 involved urine samples of 69 veteran football players (age=20.1±1.2 yr; body mass=229.7±44.4 lb; height=72.2±2.1 in). Trial 2 involved samples from 5 football players (age=20.4±0.5 yr; body mass=261.4±39.2 lb; height=72.3±2.3 in).
Interventions: We administered the Heat Illness Index Score (HIIS) Risk Assessment to identify athletes at risk for EHI (Trial 1). For individuals at risk (Trial 2), we collected urine samples before and after 15 days of pre-season "two-a-day" practices in a hot, humid environment (mean on-field WBGT=28.84±2.36 °C).
Main Outcome Measures: Urine samples were immediately analyzed for Usg using a refractometer, Diascreen 7® (URS1), Multistix® (URS2), and Chemstrip10® (URS3). Ucol was measured using a Ucol chart. We calculated descriptive statistics for all main measures, used Pearson correlations to assess relationships between the refractometer, each URS, and Ucol, and transformed Ucol data to Z-scores for comparison with the refractometer.
Results: In Trial 1, we found a moderate relationship (r=0.491, p<.01) between URS1 (1.020±0.006) and the refractometer (1.026±0.010). In Trial 2, we found marked relationships for Ucol (5.6±1.6 shades, r=0.619, p<.01), URS2 (1.019±0.008, r=0.712, p<.01), and URS3 (1.022±0.007, r=0.689, p<.01) compared with the refractometer (1.028±0.008).
Conclusions: Our findings suggest that URS results were inconsistent between manufacturers; practitioners should therefore use the clinical refractometer to accurately determine Usg and monitor hydration status.
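The statistical steps named in this abstract (Pearson correlations between paired device readings, plus a Z-score transform of the ordinal Ucol scale for comparison against the refractometer) can be sketched as follows. This is a minimal illustration: the paired readings are invented for the example and are not the study's data.

```python
# Sketch of the analyses named in the abstract: Pearson correlation between
# paired device readings and a Z-score transform of the ordinal urine color
# (Ucol) scale. All values below are illustrative, not the study's data.
from statistics import mean, pstdev

def pearson_r(x, y):
    """Pearson product-moment correlation for paired samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (len(x) * pstdev(x) * pstdev(y))

def z_scores(values):
    """Standardize a sample: (value - mean) / SD."""
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

# Illustrative paired readings: refractometer Usg vs. Ucol chart shade (1-8)
refractometer = [1.010, 1.015, 1.020, 1.025, 1.030]
urine_color = [2, 3, 4, 6, 7]

# Z-scoring puts the ordinal color scale on a unitless footing comparable
# to Usg; note it leaves the Pearson correlation itself unchanged.
r = pearson_r(refractometer, z_scores(urine_color))
```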
Non-steroidal Anti-inflammatory Drugs on Core Body Temperature During Exercise: A Systematic Review
BACKGROUND: Because of their anti-pyretic effects, some individuals prophylactically use non-steroidal anti-inflammatory drugs (NSAIDs) to blunt core temperature (Tc) increases during exercise, thus potentially improving performance by preventing hyperthermia and/or exertional heat illness. However, NSAIDs induce gastrointestinal damage, alter renal function, and decrease cardiovascular function, which could compromise thermoregulation and increase Tc. The aim of this systematic review was to evaluate the effects of NSAIDs on Tc in exercising adult humans. METHODS: We conducted searches in MEDLINE, PubMed, Cochrane Reviews, and Google Scholar for literature published up to November 2020. We conducted a quality assessment review using the Physiotherapy Evidence Database scale. Nine articles achieved a score ≥ seven and were included in the review. RESULTS: Seven studies found aspirin, ibuprofen, and naproxen had no effect (p > .05) on Tc during walking, running, or cycling for ≤ 90 min in moderate to hot environments. Two studies found significant Tc changes. In one investigation, 81 mg of aspirin for 7-10 days prior to exercise significantly increased Tc during cycling (p < .001); final Tc at the end of exercise = 38.3 ± 0.1 °C vs. control = 38.1 ± 0.1 °C. In contrast, participants administered 50 mg rofecoxib for 6 days experienced significantly lower Tc during 45 min of cycling compared to placebo (NSAID Tc range ≈ 36.7-37.2 °C vs control ≈ 37.3-37.8 °C, p < 0.05). CONCLUSIONS: There are limited quality studies examining NSAID effects on Tc during exercise in humans. The majority suggest taking non-selective NSAIDs (e.g., aspirin) 1-14 days before exercise does not significantly affect Tc during exercise. However, it remains unclear whether Tc increases, decreases, or does not change during exercise with other NSAID drug types (e.g., naproxen), higher dosages, chronic use, greater exercise intensity, and/or greater environmental temperatures.
Examination of the Cumulative Risk Assessment and Nutritional Profiles among College Ballet Dancers
This study examined female collegiate ballet dancers' (n = 28) Female Athlete Triad (Triad) risk via the Cumulative Risk Assessment (CRA) and nutritional profiles (macro- and micronutrients; n = 26). The CRA identified Triad return-to-play criteria (RTP: Full Clearance, Provisional Clearance, or Restricted/Medical Disqualified) by assessing eating disorder risk, low energy availability, menstrual cycle dysfunction, and low bone mineral density. Seven-day dietary assessments identified any energy imbalances of macro- and micronutrients. Ballet dancers were identified as low, within normal, or high for each of the 19 nutrients assessed. Basic descriptive statistics assessed CRA risk classification and dietary macro- and micronutrient levels. Dancers averaged a total score of 3.5 ± 1.6 on the CRA. Based on these scores, the RTP outcomes revealed Full Clearance 7.1%, n = 2; Provisional Clearance 82.1%, n = 23; and Restricted/Medical Disqualification 10.7%, n = 3. Dietary reports revealed that 96.2% (n = 25) of ballet dancers were low in carbohydrates, 92.3% (n = 24) low in protein, 19.2% (n = 5) low in fat percent, 19.2% (n = 5) exceeding saturated fats, 100% (n = 26) low in Vitamin D, and 96.2% (n = 25) low in calcium. Due to the variability in individual risks and nutrient requirements, a patient-centered approach is a critical part of early prevention, evaluation, intervention, and healthcare for the Triad and nutrition-based clinical evaluations.
Examination of Eating Disorder Risk Among University Marching Band Artists
BACKGROUND: Marching band artists are a physically active population of approximately 27,000 people in the United States. University marching band artists face many of the same physical demands and mental stressors as student athletes, potentially predisposing them to injury, illness, and risk for eating disorders (EDs). The purpose of this study was to examine ED risk across sex in university marching band artists and to determine the type of risk based on the Eating Disorder Inventory-3 (EDI-3) and the Eating Disorder Inventory-3 Symptom Checklist (EDI-3 SC). A secondary aim examined marching band artists' pathogenic weight control behavior use across sex. METHODS: This was a cross-sectional study. A total of 150 marching band artists (female: n = 84, male: n = 66, age = 19.9 ± 1.1 years) from three National Collegiate Athletic Association Division I university marching bands participated. We screened for ED risk using the EDI-3 and the EDI-3 SC. RESULTS: Overall, marching band artists were at risk for EDs: using the EDI-3 alone, 45.3% (n = 68) were at risk, with females at significantly higher risk than males [χ² = 5.228, p = .022]; using the EDI-3 SC alone, 54% (n = 81) were at risk, and no significant differences were found across sex. Overall, 48% of all participants reported dieting and 20.7% engaged in excessive exercise to control weight. Significant differences were found between sexes for purging to control weight [χ² = 3.94, p = .047] and laxative use [χ² = 4.064, p = .044], with females engaging in these behaviors more than males. CONCLUSIONS: Eating disorder risk was prevalent for both female and male marching band artists, with females displaying higher risk for EDs than males. Furthermore, marching band artists are engaging in pathogenic behaviors to control their weight. Healthcare providers (e.g., physicians, athletic trainers, physical therapists, dietitians) working in this setting should be aware of the risk factors displayed by marching band artists and be able to provide education, prevention, and clinical interventions to this population. Additionally, marching band administrators should be aware of all medical risk factors and the benefit of having a healthcare provider (e.g., athletic trainer) oversee the healthcare and wellness of marching band artists.
Examination of the Prevalence of Female Athlete Triad Components among Competitive Cheerleaders
The purpose of this study was to examine individual and combined Female Athlete Triad components within collegiate cheerleaders, an at-risk group. Cheerleaders (n = 19; age: 20.3 ± 1.2 years) completed anthropometric measurements, health history questionnaires, resting metabolic rate testing, the Eating Disorder Inventory-3 and its symptom checklist, a blood sample, and a DXA scan. Participants completed dietary and exercise logs for 7 days and used heart rate monitors to track daily and exercise energy expenditure. Proportions were calculated for low energy availability (LEA) risk, disordered eating risk, and pathogenic behaviors. Chi-square analysis was used to determine the difference between cheerleaders who experienced LEA with or without disordered eating risk. All cheerleaders demonstrated LEA on the days they participated in cheerleading practice: 52.6% demonstrated LEA with eating disorder risk and 47.4% demonstrated LEA without eating disorder risk; 52.6% self-reported menstrual dysfunction, 14% experienced menstrual dysfunction via hormonal assessment, and 0% demonstrated low bone mineral density. Overall, 47.7% presented with one Triad component, 52.6% demonstrated two Triad components using self-reported menstrual data, and 10.5% demonstrated two Triad components using hormonal assessments. These findings support the need for increased education on the individual components of the Triad and their potential consequences by qualified personnel.
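The LEA screening described here (dietary logs for intake, heart rate monitors for exercise energy expenditure) rests on the standard energy availability formula, EA = (energy intake − exercise energy expenditure) / fat-free mass, with roughly 30 kcal/kg FFM/day the commonly used low-EA cutoff. A minimal sketch; the numbers are illustrative, not from the study:

```python
# Standard energy availability (EA) calculation underlying LEA screening.
# EA = (energy intake - exercise energy expenditure) / fat-free mass.
# Inputs below are illustrative, not the study's data.
def energy_availability(intake_kcal, exercise_kcal, ffm_kg):
    """Energy availability in kcal per kg fat-free mass per day."""
    return (intake_kcal - exercise_kcal) / ffm_kg

LOW_EA_THRESHOLD = 30.0  # kcal/kg FFM/day, the commonly used clinical cutoff

ea = energy_availability(2000, 600, 48.0)  # intake, exercise kcal, FFM in kg
low_ea = ea < LOW_EA_THRESHOLD
```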
Exertional Heat Illness Risk Factors and Physiological Responses of Youth Football Players
OBJECTIVE: To determine which intrinsic and extrinsic exertional heat illness (EHI) risk factors exist in youth American football players and to observe perceptual and physiological responses of players during events (games and practices). METHODS: Cross-sectional cohort study observing 63 youth football players of varying positions. Independent variables were league (weight-restricted (WR, n = 27) and age-restricted (AR, n = 36)) and event type. Dependent variables were anthropometrics, work-to-rest ratio, and wet bulb globe temperature. Descriptive variables included preparticipation examination and uniform configuration. A subset of 16 players underwent physiological monitoring (heart rate and gastrointestinal temperature). Data collection occurred during 7 AR and 8 WR nonconsecutive practices and the first 3 games of the season. RESULTS: Mean values for anthropometric variables were higher (p < 0.05) in the AR league than the WR league. Work time (χ²(1,111) = 4.232; p = 0.039) and rest time (χ²(1,111) = 43.41; p < 0.001) were significantly greater for games, but work-to-rest ratios were significantly higher for practices (χ²(1,111) = 40.62; p < 0.001). The majority of events observed (77%) were in black- and red-flag wet bulb globe temperature risk categories. A total of 57% of the players had a preparticipation examination, and up to 82% of events observed were in full uniforms. Individual gastrointestinal temperature and heart rate responses ranged widely, and no players reached critical thresholds. CONCLUSION: Extrinsic (disproportionate work-to-rest ratios, environmental conditions) and intrinsic (higher body mass index) EHI risk factors exist in youth football. Certain risk factors may be influenced by event and league type. National youth football organizations need to create thorough guidelines that address EHI risk factors for local leagues to adopt.
A 24 hour naproxen dose on gastrointestinal distress and performance during cycling in the heat
Using a double-blind, randomized, counterbalanced, cross-over design, we assessed naproxen's effects on gastrointestinal (GI) distress and performance in eleven volunteers (6 male, 5 female). Participants completed 4 trials: 1) placebo and ambient; 2) placebo and heat; 3) naproxen and ambient; and 4) naproxen and heat. Independent variables were one placebo or 220 mg naproxen pill every 8 hours (h) for 24 h and an ambient (22.7 ± 1.8°C) or thermal (35.7 ± 1.3°C) environment. Participants cycled 80 min at a steady heart rate, then 10 min for maximum distance. Perceived exertion was measured throughout cycling. Gastrointestinal distress was assessed pre-, during, post-, 3 h post-, and 24 h post-cycling using a GI index for upper, lower, and systemic symptoms. No statistically significant differences occurred between conditions at any time for GI symptoms, perceived exertion, distance, or heart rate during maximum effort. A 24 h naproxen dose did not significantly affect performance or cause more frequent or serious GI distress when participants were euhydrated and cycling at moderate intensity in a thermal environment.
An acute naproxen dose does not affect core temperature or Interleukin-6 during cycling in a hot environment
Non-steroidal anti-inflammatory drugs' anti-pyretic and anti-inflammatory effects have led some individuals to theorize that these medications may blunt core body temperature (Tc) increases during exercise. We utilized a double-blind, randomized, and counterbalanced cross-over design to examine the effects of a 24-h naproxen dose (3 × 220 mg naproxen pills) and placebo (0 mg naproxen) on Tc and plasma interleukin-6 (IL-6) concentrations during cycling in a hot or ambient environment. Participants (n = 11; 6 male, 5 female; age = 27.8 ± 6.5 years, weight = 79.1 ± 17.9 kg, height = 177 ± 9.5 cm) completed 4 conditions: 1) placebo and ambient (Control); 2) placebo and heat (Heat); 3) naproxen and ambient (Npx); and 4) naproxen and heat (NpxHeat). Dependent measures were taken before, during, and immediately after 90 min of cycling and then 3 h after cycling. Overall, Tc significantly increased pre- (37.1 ± 0.4 °C) to post-cycling (38.2 ± 0.3 °C, F1.7,67.3 = 150.5, p < 0.001) and decreased during rest (37.0 ± 0.3 °C, F2.0,81.5 = 201.6, p < 0.001). Rate of change and maximum Tc were not significantly different between conditions. IL-6 increased pre- (0.54 ± 0.06 pg/ml) to post-exercise (2.46 ± 0.28 pg/ml, p < 0.001) and remained significantly higher than pre-exercise at 3 h post-exercise (1.17 ± 0.14 pg/ml, 95% CI = −1.01 to −0.23, p = 0.001). No significant IL-6 differences occurred between conditions. A 24-h, over-the-counter naproxen dose did not significantly affect Tc or IL-6 among males and females cycling in hot or ambient environments.
Gastrointestinal Cell Injury and Perceived Symptoms after Running the Boston Marathon
Gastrointestinal (GI) disturbances are a prevalent cause of marathon-related complaints, and in extreme cases can promote life-threatening conditions such as exertional heat stroke. PURPOSE: Our aim was to study intestinal cell injury (via intestinal fatty acid binding protein [I-FABP]) and perceived GI distress symptoms among marathon runners. Potential risk factors (e.g., inadequate sleep) that could exacerbate GI disturbances in healthy, trained endurance runners were also examined. METHODS: A parallel mixed-methods study design was utilized. 2019 Boston Marathon participants were recruited via email. Before the race, subjects completed surveys describing demographics and training history. Immediately pre-race, post-race, and 24 hours post-race, participants completed a GI questionnaire to assess the presence and severity of symptoms, completed a survey regarding risk factors (e.g., recent illness, medications) that could promote GI disturbances, and provided a urine sample. Due to weather, blood samples were only collected immediately and 24 hours post-race. RESULTS: A total of 40 runners (males: n = 19, age = 44.9 ± 10.8 years; females: n = 21, age = 44.8 ± 10.6 years) completed this study. I-FABP significantly decreased from post-race (3367.5 ± 2633.5 pg/ml) to 24 hours post-race (1657.3 ± 950.7 pg/ml, t(39) = -4.228, p < .001, d = -.669). A significant difference in overall GI symptom scores across the three time points occurred (F(2, 39) = 41.37, p < .001). Compared to pre-race (.09 ± .12) and 24 hours post-race (.44 ± .28), the highest average score occurred post-race (.84 ± .68). Post-race I-FABP (r = .31, p = .048) and post-race urine specific gravity (r = .33, p = .041) were significantly correlated with post-race GI symptom scores.
CONCLUSION: Our study further supports the individualized presentation of GI disturbances, with participants experiencing a wide range of risk factors that can influence the extent of GI damage and perceived symptoms during and after exercise.
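The post-race to 24-hours-post I-FABP decline reported above is a paired comparison: a paired t statistic with a Cohen's d effect size for repeated measures. A minimal sketch of that calculation; the I-FABP values (pg/ml) below are illustrative, not the study's data:

```python
# Sketch of a paired pre/post comparison like the I-FABP drop reported above:
# paired t statistic plus Cohen's d for repeated measures.
# Values below are illustrative, not the study's data.
from statistics import mean, stdev
from math import sqrt

def paired_t_and_d(before, after):
    """Paired t statistic and Cohen's d (mean difference / SD of differences)."""
    diffs = [b - a for a, b in zip(before, after)]
    md, sd = mean(diffs), stdev(diffs)  # sample SD of the paired differences
    t = md / (sd / sqrt(len(diffs)))
    d = md / sd
    return t, d

# Illustrative I-FABP readings (pg/ml) for five runners at two time points
post_race = [3400, 2900, 4100, 2600, 3800]
post_24h = [1700, 1500, 2100, 1400, 1900]

t, d = paired_t_and_d(post_race, post_24h)  # both negative: values fell by 24 h
```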