
    Examining Eating Attitudes and Behaviors in Collegiate Athletes, the Association Between Orthorexia Nervosa and Eating Disorders

    Purpose: Orthorexia nervosa (Orthorexia) is a pattern of eating attitudes and behaviors marked by a fixation on healthy eating, while eating disorders (EDs) are clinically diagnosed psychiatric disorders involving marked disturbances in eating that may impair psychosocial and physical health. The purpose of this study was to examine risk for Orthorexia and EDs in student-athletes across sex and sport type and to determine the association between the two. Methods: Student-athletes (n = 1,090; age: 19.6 ± 1.4 years; females = 756; males = 334) completed a survey including demographics, the ORTO-15 test, the Eating Attitudes Test-26 (EAT-26; score >20), and additional questions about pathogenic behaviors to screen for EDs. Results: Using the ORTO-15 threshold of <40, 67.9% of student-athletes were at risk for Orthorexia; a more restrictive threshold of <35 yielded a 17.7% prevalence, with significant differences across sex [<40: χ²(1, 1,090) = 4.914, p = 0.027; <35: χ²(1, 1,090) = 5.923, p = 0.015]. Overall, ED risk (EAT-26 and/or pathogenic behavior use) showed a 20.9% prevalence, with significant differences across sex (χ² = 11.360, p < 0.001) and sport-type category (χ² = 10.312, p = 0.035). Multiple logistic regressions indicated a significant association between EAT-26 subscale scores and Orthorexia, and between Orthorexia positivity, ORTO-15 scores, and risk for EDs. Conclusions: Risk for Orthorexia and EDs is present in collegiate student-athletes. While healthy and balanced eating is important, an obsessive fixation on healthy eating may increase the risk for EDs in athletes. More education and awareness are warranted to minimize the risk for Orthorexia and EDs in student-athletes.
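
The screening logic described above can be sketched as a pair of threshold checks. This is a minimal illustration, not the study's analysis code: the cut-offs (ORTO-15 < 40 or the stricter < 35, EAT-26 > 20, or any reported pathogenic behavior) come from the abstract, while the function names are our own.

```python
def orthorexia_risk(orto15_score, threshold=40):
    """ORTO-15: lower scores indicate stronger orthorexic tendencies,
    so a score below the chosen threshold flags risk."""
    return orto15_score < threshold

def ed_risk(eat26_score, pathogenic_behaviors=False):
    """EAT-26 score > 20, or any reported pathogenic behavior,
    flags eating-disorder risk."""
    return eat26_score > 20 or pathogenic_behaviors

# A score of 38 is flagged at the <40 cut-off but not at the stricter <35.
orthorexia_risk(38)                # True
orthorexia_risk(38, threshold=35)  # False
```

The two thresholds explain the large gap in reported prevalence (67.9% vs. 17.7%): the same scores, classified under a stricter cut-off, flag far fewer athletes.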

    Examination of Anger Prevalence in NCAA Division I Student-Athletes

    Purpose: Anger associated with sports participation, coupled with an inability to acutely process it, may decrease performance and increase the likelihood of risk-taking behavior in collegiate athletes. Therefore, the purpose was to examine the prevalence of anger in collegiate student-athletes across sex, academic status, and sport type. Methods: In a cross-sectional study over a three-year period, 759 NCAA Division I student-athletes at one institution (age = 20 ± 1 years; males: n = 259; females: n = 500) completed an optional pre-participation behavioral health screening questionnaire, which included personal demographic information and the Anger Index Self-Test. Results: Overall, 37.2% (n = 282/759; males = 127/259, 49.0%; females = 155/500, 31.0%) of participants were at high risk for anger. We identified a significant difference between anger and sex [χ²(2, N = 759) = 28.1, P ≤ 0.01]. We also identified a significant difference between anger and sport type [χ²(8, N = 759) = 32.1, P ≤ 0.01], with 55.2% (n = 419/759) at moderate risk for anger regardless of sport type; the highest percentages at high risk for anger were within power sports (n = 64/116, 55.2%) and ball sports (n = 98/240, 40.8%). No significant differences were identified for anger risk and academic status (P = 0.66). Conclusions: Female collegiate student-athletes demonstrated a higher overall prevalence of anger than males, yet more males were at high risk. Most student-athletes displayed moderate risk for anger across sports. Anger did not differ significantly across academic status, implying that anger management and coping skills may need to be taught throughout a student-athlete's tenure to mitigate the identified risk. A collegiate student-athlete's inability to process anger may affect sports performance and have negative consequences on their personal and social life. A primary prevention opportunity exists to teach proper coping mechanisms for anger during sport before the onset of mental health conditions that could exacerbate the experience for the individual.
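
The chi-square tests of independence reported above (risk category by sex, risk category by sport type) can be computed from first principles. The sketch below uses only the standard library; the contingency counts in the usage line are hypothetical and are not the study's data.

```python
def chi_square(table):
    """Chi-square statistic and degrees of freedom for a contingency
    table given as a list of rows of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            # Expected count under independence of rows and columns.
            exp = row_totals[i] * col_totals[j] / grand
            stat += (obs - exp) ** 2 / exp
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof

# Hypothetical 2 (sex) x 3 (low/moderate/high anger risk) table.
stat, dof = chi_square([[60, 120, 79], [170, 230, 100]])
```

For a 2 x 3 table this yields 2 degrees of freedom, matching the χ²(2, N = 759) reported for the anger-by-sex comparison; the statistic would then be compared against the chi-square distribution to obtain the P value.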

    Examination of the Cumulative Risk Assessment and Nutritional Profiles among College Ballet Dancers

    This study examined female collegiate ballet dancers' (n = 28) Female Athlete Triad (Triad) risk via the Cumulative Risk Assessment (CRA) and nutritional profiles (macro- and micronutrients; n = 26). The CRA identified Triad return-to-play criteria (RTP: Full Clearance, Provisional Clearance, or Restricted/Medical Disqualification) by assessing eating disorder risk, low energy availability, menstrual cycle dysfunction, and low bone mineral density. Seven-day dietary assessments identified any energy imbalances of macro- and micronutrients. Ballet dancers were identified as low, within normal, or high for each of the 19 nutrients assessed. Basic descriptive statistics assessed CRA risk classification and dietary macro- and micronutrient levels. Dancers averaged a total CRA score of 3.5 ± 1.6. Based on these scores, the RTP outcomes were Full Clearance: 7.1%, n = 2; Provisional Clearance: 82.1%, n = 23; and Restricted/Medical Disqualification: 10.7%, n = 3. Dietary reports revealed that 96.2% (n = 25) of ballet dancers were low in carbohydrates, 92.3% (n = 24) low in protein, 19.2% (n = 5) low in fat percent, 19.2% (n = 5) exceeding saturated fats, 100% (n = 26) low in vitamin D, and 96.2% (n = 25) low in calcium. Due to the variability in individual risks and nutrient requirements, a patient-centered approach is a critical part of early prevention, evaluation, intervention, and healthcare for the Triad and nutrition-based clinical evaluations.

    Effects of a 4-Week Heart Rate Variability Biofeedback Intervention on Psychological and Performance Variables in Student-Athletes: A Pilot Study

    PURPOSE: To examine the effects of a 4-week biofeedback intervention on coherence, psychological, and performance variables in collegiate student-athletes. METHODS: Thirteen student-athletes were randomly assigned to the intervention group (one weekly biofeedback session for 4 weeks) or the control group (no sessions). Data were collected pre- and post-intervention using weekly averaged coherence scores, psychological measures of depression, arousal, stress, and resiliency, and performance outcome measures. RESULTS: A 3 (Time) x 4 (Week average) repeated measures ANOVA was conducted to examine differences between time and weekly coherence average for coherence scores. No significant differences were found for at-rest, pre-practice, or post-practice coherence scores. Separate 2 (Treatment group) x 4 (Week) repeated measures ANOVAs were conducted to examine differences between treatment groups and week average for performance, resilience, and recovery. A significant difference was found for performance by time (p = .029). For the psychological variables, 2 (Treatment group) x 2 (Time) repeated measures ANOVAs were independently conducted to examine differences between treatment group and time for the CESD, AD-ACL, CSSS, and ASSQ sleep score, and no significant differences were found. CONCLUSIONS: Overall, the biofeedback intervention did not improve coherence, psychological, or performance variables between the groups. While the intervention did not produce significant changes in this pilot study, future research could include male participants and adjust the timing of the intervention within the season.

    A 24 hour naproxen dose on gastrointestinal distress and performance during cycling in the heat

    Using a double-blind, randomized, counterbalanced, cross-over design, we assessed naproxen's effects on gastrointestinal (GI) distress and performance in eleven volunteers (6 male, 5 female). Participants completed 4 trials: 1) placebo and ambient; 2) placebo and heat; 3) naproxen and ambient; and 4) naproxen and heat. Independent variables were one placebo or 220 mg naproxen pill every 8 h for 24 h and an ambient (22.7 ± 1.8°C) or thermal (35.7 ± 1.3°C) environment. Participants cycled 80 min at a steady heart rate and then 10 min for maximum distance. Perceived exertion was measured throughout cycling. Gastrointestinal distress was assessed pre-, during, post-, 3 h post-, and 24 h post-cycling using a GI index for upper, lower, and systemic symptoms. No statistically significant differences occurred between conditions at any time for GI symptoms, or for perceived exertion, distance, or heart rate during maximum effort. A 24 h naproxen dose did not significantly affect performance or cause more frequent or serious GI distress when participants were euhydrated and cycling at moderate intensity in a thermal environment.

    An acute naproxen dose does not affect core temperature or Interleukin-6 during cycling in a hot environment

    Non-steroidal anti-inflammatory drugs' anti-pyretic and anti-inflammatory effects have led some individuals to theorize that these medications may blunt core body temperature (Tc) increases during exercise. We utilized a double-blind, randomized, and counterbalanced cross-over design to examine the effects of a 24-h naproxen dose (3 x 220 mg naproxen pills) and placebo (0 mg naproxen) on Tc and plasma interleukin-6 (IL-6) concentrations during cycling in a hot or ambient environment. Participants (n = 11; 6 male, 5 female; age = 27.8 ± 6.5 years, weight = 79.1 ± 17.9 kg, height = 177 ± 9.5 cm) completed 4 conditions: 1) placebo and ambient (Control); 2) placebo and heat (Heat); 3) naproxen and ambient (Npx); and 4) naproxen and heat (NpxHeat). Dependent measures were taken before, during, and immediately after 90 min of cycling and then 3 h after cycling. Overall, Tc significantly increased pre- (37.1 ± 0.4 °C) to post-cycling (38.2 ± 0.3 °C, F1.7,67.3 = 150.5, p < 0.001) and decreased during rest (37.0 ± 0.3 °C, F2.0,81.5 = 201.6, p < 0.001). Rate of change and maximum Tc were not significantly different between conditions. IL-6 increased pre- (0.54 ± 0.06 pg/ml) to post-exercise (2.46 ± 0.28 pg/ml, p < 0.001) and remained significantly higher than pre-exercise at 3 h post-exercise (1.17 ± 0.14 pg/ml, 95% CI = −1.01 to −0.23, p = 0.001). No significant IL-6 differences occurred between conditions. A 24-h, over-the-counter naproxen dose did not significantly affect Tc or IL-6 among males and females cycling in hot or ambient environments.

    Examination of Athlete Triad Symptoms Among Endurance-Trained Male Athletes: A Field Study

    Background: Studies examining the physiological consequences associated with deficits in energy availability (EA) in male athletes are sparse. Purpose: To examine male Athlete Triad components, namely low energy availability (LEA) with or without eating disorder risk (ED), reproductive hormone [testosterone (T)], and bone mineral density (BMD), in endurance-trained male athletes during different training periods. Methods: A cross-sectional design was used; 14 participants (age: 26.4 ± 4.2 years; weight: 70.6 ± 6.4 kg; height: 179.5 ± 4.3 cm; BMI: 21.9 ± 1.8 kg/m2) were recruited from the local community. Two separate training weeks [low (LV) and high (HV) training volume] were used to collect 7-day dietary and exercise logs and blood concentrations of T. Anthropometric measurements were taken prior to data collection. A one-time BMD measure (after the training weeks) and VO2max-HR regressions were utilized to calculate exercise energy expenditure (EEE). Results: Overall, EA was 27.6 ± 10.7 kcal/kgFFM·d-1, with 35% (n = 5) of participants demonstrating increased risk for ED. Examining male Triad components, 64.3% presented with LEA (≤30 kcal/kgFFM·d-1), while participants presented with T (1780.6 ± 1672.6 ng/dl) and BMD (1.31 ± .09 g/cm2) within normal reference ranges. No differences were found across the 2 training weeks for energy intake, with slight differences for EA and EEE. Most participants (89.3%) under-ingested carbohydrates (CHO) across both weeks, with no differences between weeks. Conclusion: The majority of endurance-trained male athletes presented with one compromised component of the Triad (LEA with or without ED risk); however, long-term negative effects on T and BMD were not demonstrated. Over 60% of the participants presented with an EA ≤30 kcal/kgFFM·d-1, and almost 90% did not meet CHO needs. These results suggest male endurance-trained athletes may be at risk for negative health outcomes similar to those associated with LEA, with or without ED, in female athletes.
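
The LEA classification above rests on the standard energy-availability formula: EA = (energy intake - exercise energy expenditure) / fat-free mass, in kcal/kgFFM per day. The sketch below applies the abstract's ≤30 kcal/kgFFM·d-1 cut-off; the intake, expenditure, and fat-free-mass numbers in the usage line are invented for illustration and are not participant data.

```python
LEA_THRESHOLD = 30.0  # kcal/kgFFM/day, the cut-off used in the abstract

def energy_availability(intake_kcal, eee_kcal, ffm_kg):
    """Daily energy availability: dietary intake minus exercise energy
    expenditure, normalized to fat-free mass (kcal/kgFFM/day)."""
    return (intake_kcal - eee_kcal) / ffm_kg

def is_lea(ea):
    """Flag low energy availability at or below the threshold."""
    return ea <= LEA_THRESHOLD

# Hypothetical athlete: 2,800 kcal intake, 1,200 kcal exercise
# expenditure, 58 kg fat-free mass.
ea = energy_availability(intake_kcal=2800, eee_kcal=1200, ffm_kg=58.0)
# ea is about 27.6 kcal/kgFFM/day, so this athlete would be flagged LEA
```

This makes the study's central finding concrete: an athlete can eat a seemingly large amount yet still fall below the threshold once training expenditure is subtracted and the remainder is scaled to fat-free mass.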