
    Preseason Functional Test Scores are Associated with Future Sports Injury in Female Collegiate Athletes

    Brumitt, J, Heiderscheit, B, Manske, R, Niemuth, PE, Mattocks, A, and Rauh, MJ. Preseason functional test scores are associated with future sports injury in female collegiate athletes. J Strength Cond Res 32(6): 1692–1701, 2018. Recent prospective cohort studies have reported preseason functional performance test (FPT) measures and their associations with future risk of injury; however, the findings of these studies have been equivocal. The purpose of this study was to determine the ability of a battery of FPTs, used as a preseason screening tool, to identify female Division III (D III) collegiate athletes who may be at risk for a noncontact time-loss injury to the lower quadrant (LQ = low back and lower extremities). One hundred six female D III athletes were recruited for this study. Athletes performed 3 FPTs: the standing long jump (SLJ), the single-leg hop (SLH) for distance, and the lower extremity functional test (LEFT). Time-loss sport-related injuries were tracked during the season. Thirty-two (24 initial and 8 subsequent) time-loss LQ injuries were sustained during the study. Ten of the 24 initial injuries occurred at the thigh and knee. At-risk athletes with suboptimal FPT measures (SLJ ≤79% height; SLH ≤64% height; LEFT ≥118 seconds) had significantly greater rates of initial (7.2 per 1,000 athletic exposures [AEs]) and total (7.6 per 1,000 AEs) time-loss thigh or knee injuries than the referent group (0.9 per 1,000 AEs and 1.0 per 1,000 AEs, respectively). At-risk athletes were 9 times more likely to experience a thigh or knee injury (odds ratio [OR] = 9.7, confidence interval [CI]: 2.3–39.9; p = 0.002) than athletes in the referent group. At-risk athletes with a history of LQ sports injury and lower off-season training habits had an 18-fold increased risk of a time-loss thigh or knee injury during the season (adjusted OR = 18.7, CI: 3.0–118.1; p = 0.002). This battery of FPTs appears useful as a tool for identifying female D III athletes at risk of an LQ injury, especially to the thigh or knee region.
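
The injury-rate and odds-ratio figures quoted in abstracts like the one above follow from simple count data. As a hedged illustration of the arithmetic only (the counts below are hypothetical and are not the study's raw data), a minimal Python sketch:

```python
import math

def rate_per_1000_aes(injuries: int, exposures: int) -> float:
    """Injury rate per 1,000 athletic exposures (AEs)."""
    return injuries / exposures * 1000

def odds_ratio(a: int, b: int, c: int, d: int):
    """Odds ratio with a 95% Wald confidence interval for a 2x2 table:
    a/b = injured/uninjured in the at-risk group,
    c/d = injured/uninjured in the referent group."""
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR) on the log scale (Woolf's method).
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts, chosen only to show the mechanics.
rate = rate_per_1000_aes(8, 1111)          # 8 injuries over 1,111 AEs ≈ 7.2/1,000 AEs
or_, (lo, hi) = odds_ratio(8, 20, 2, 76)   # 2x2 table: at-risk vs referent group
```

The wide confidence intervals reported in these studies (e.g., 2.3–39.9 around an OR of 9.7) are what this log-scale interval looks like when the injured-cell counts are small.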

    Epidemiology of NCAA Track and Field Injuries From 2010 to 2014

    Background: Track and field (T&F) athletes compete in a variety of events that require different skills and training characteristics. Descriptive epidemiology studies often fail to describe event-specific injury patterns. Purpose: To describe the epidemiology of injuries in National Collegiate Athletic Association (NCAA) T&F by sex, setting (practice vs competition), and time of season (indoor vs outdoor) and to compare injury patterns by event within the sport. Study Design: Descriptive epidemiology study. Methods: Data were obtained from the NCAA Injury Surveillance Program for all indoor and outdoor T&F injuries during the academic years 2009-2010 to 2013-2014. Injury rates, injury rate ratios, and injury proportion ratios (IPRs) were reported and compared by sex, injury setting, season, and event. Analysis included both time-loss and no-time-loss injuries. Results: Over the 5 seasons, the overall injury rate was 3.99 injuries per 1000 athletic exposures (95% CI, 3.79-4.20). After controlling for injury diagnoses, women’s T&F athletes experienced an 18% higher risk of injury (95% CI, 7% to 31%) and missed 41% more time after an injury (95% CI, 4% to 93%) compared with men. Among all athletes, the injury risk during competition was 71% higher (95% CI, 50% to 95%) than during practice and required 59% more time loss (95% CI, 7% to 135%). Distance running accounted for a significantly higher proportion of overuse injuries (IPR, 1.70; 95% CI, 1.40-2.05; P < .05) and required 168% more time loss (95% CI, 78% to 304%) than other events. The hip and thigh were the body regions most commonly injured; injury type, however, varied by T&F event. Sprinting accounted for the greatest proportion of hip and thigh injuries, distance running had the greatest proportion of lower leg injuries, and throwing had the greatest proportion of spine and upper extremity injuries. Conclusion: Injury risk in NCAA T&F varied by sex, season, and setting. Higher injury rates were found in women versus men, indoor versus outdoor seasons, and competition versus practice. The hip and thigh were the body regions most commonly injured; however, injury types varied by event. These findings may provide insight for programs aiming to reduce the risk of injury and associated time loss in collegiate T&F.
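
The between-group comparisons reported above (e.g., a 71% higher injury risk in competition than in practice) are injury rate ratios. A minimal sketch of the standard log-transform calculation, using hypothetical counts that are not the study's data:

```python
import math

def injury_rate_ratio(inj_a: int, ae_a: int, inj_b: int, ae_b: int):
    """Rate ratio of group A vs group B with a 95% CI.
    Uses the usual log-scale standard error sqrt(1/inj_a + 1/inj_b)."""
    ratio = (inj_a / ae_a) / (inj_b / ae_b)
    se = math.sqrt(1 / inj_a + 1 / inj_b)
    lo = math.exp(math.log(ratio) - 1.96 * se)
    hi = math.exp(math.log(ratio) + 1.96 * se)
    return ratio, (lo, hi)

# Hypothetical: 400 injuries in 60,000 competition AEs vs
# 900 injuries in 230,000 practice AEs (illustration only).
ratio, (lo, hi) = injury_rate_ratio(400, 60_000, 900, 230_000)
```

A ratio of 1.71 would correspond to the "71% higher" phrasing used in such abstracts; the percent excess is simply (ratio − 1) × 100.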

    Lower Extremity Functional Tests and Risk of Injury in Division III Collegiate Athletes

    Purpose/Background: Functional tests have been used primarily to assess an athlete’s fitness or readiness to return to sport. The purpose of this prospective cohort study was to determine the ability of the standing long jump (SLJ) test, the single-leg hop (SLH) for distance test, and the lower extremity functional test (LEFT), administered as preseason screening tools, to identify collegiate athletes who may be at increased risk for a time-loss sports-related low back or lower extremity injury. Methods: A total of 193 Division III athletes from 15 university teams (110 females, age 19.1 ± 1.1 y; 83 males, age 19.5 ± 1.3 y) were tested prior to their sports seasons. Athletes performed the functional tests in the following sequence: SLJ, SLH, LEFT. The athletes were then prospectively followed during their sports season for occurrence of low back or LE injury. Results: Female athletes who completed the LEFT in 118 s or more were 6 times more likely (OR = 6.4, 95% CI: 1.3, 31.7) to sustain a thigh or knee injury. Male athletes who completed the LEFT in 100 s or less were more likely to experience a time-loss injury to the low back or LE (OR = 3.2, 95% CI: 1.1, 9.5) or a foot or ankle injury (OR = 6.7, 95% CI: 1.5, 29.7) than male athletes who completed the LEFT in 101 s or more. Female athletes with greater than 10% side-to-side asymmetry between SLH distances had a 4-fold increase in foot or ankle injury (cut point: >10%; OR = 4.4, 95% CI: 1.2, 15.4). Male athletes with SLH distances (either leg) at least 75% of their height had at least a 3-fold increase (OR = 3.6, 95% CI: 1.2, 11.2 for the right LE; OR = 3.6, 95% CI: 1.2, 11.2 for the left LE) in low back or LE injury. Conclusions: The LEFT and the SLH tests appear useful in identifying Division III athletes at risk for a low back or lower extremity sports injury. Thus, these tests warrant further consideration as preparticipatory screening examination tools for sport injury in this population. Clinical Relevance: The single-leg hop for distance and the lower extremity functional test, when administered to Division III athletes during the preseason, may help identify those at risk for a time-loss low back or lower extremity injury.

    The Lower-Extremity Functional Test and Lower-Quadrant Injury in NCAA Division III Athletes: A Descriptive and Epidemiologic Report

    Context: The Lower-Extremity Functional Test (LEFT) has been used to assess readiness to return to sport after a lower extremity injury. Current recommendations suggest that women should complete the LEFT in 135 s (average; range 120-150 s) and men should complete the test in 100 s (average; range 90-125 s). However, these estimates are based on limited data and may not be reflective of college athletes. Thus, additional assessment, including normative data, of the LEFT in sport populations is warranted. Objective: To examine LEFT times based on descriptive information and off-season training habits in NCAA Division III (D III) athletes. In addition, this study prospectively examined the LEFT’s ability to discriminate sport-related injury occurrence. Design: Descriptive epidemiology. Setting: D III university. Subjects: 189 D III college athletes (106 women, 83 men) from 15 teams. Main Outcome Measures: LEFT times, preseason questionnaire, and time-loss injuries during the sport season. Results: Men completed the LEFT (105 ± 9 s) significantly faster than their female counterparts (117 ± 10 s) (P < .0001). Female athletes who reported >3-5 h/wk of plyometric training during the off-season had significantly slower LEFT scores than those who performed <3 h/wk of plyometric training (P = .03). The overall incidence of a lower-quadrant (LQ) time-loss injury was 4.5/1000 athletic exposures (AEs) for female athletes and 3.7/1000 AEs for male athletes. Female athletes with slower LEFT scores (>118 s) experienced a higher rate of LQ time-loss injuries than those with faster LEFT scores (<117 s) (P = .03). Conclusion: Only off-season plyometric training practices seemed to affect LEFT times among female athletes. Women with slower LEFT scores were more likely to be injured than those with faster LEFT scores. Injury rates in men were not influenced by performance on the LEFT.

    Demographic and Genetic Patterns of Variation among Populations of Arabidopsis thaliana from Contrasting Native Environments

    Background: Understanding the relationship between environment and genetics requires the integration of knowledge on the demographic behavior of natural populations. However, the demographic performance and genetic composition of Arabidopsis thaliana populations in the species' native environments remain largely uncharacterized. This information, in combination with advances in the study of gene function, will improve our understanding of the genetic mechanisms underlying adaptive evolution in A. thaliana. Methodology/Principal Findings: We report the extent of environmental, demographic, and genetic variation among 10 A. thaliana populations from Mediterranean (coastal) and Pyrenean (montane) native environments in northeast Spain. Geographic, climatic, landscape, and soil data were compared. Demographic traits, including the dynamics of the soil seed bank and the attributes of aboveground individuals followed over a complete season, were also analyzed. Genetic data based on genome-wide SNP markers were used to describe genetic diversity, differentiation, and structure. Coastal and montane populations significantly differed in terms of environmental, demographic, and genetic characteristics. Montane populations, at higher altitude and farther from the sea, are exposed to colder winters and prolonged spring moisture compared to coastal populations. Montane populations showed stronger secondary seed dormancy, higher seedling/juvenile mortality in winter, and initiated flowering later than coastal populations. Montane and coastal regions were genetically differentiated, with montane populations bearing lower genetic diversity than coastal ones. No significant isolation-by-distance pattern and no shared multilocus genotypes among populations were detected. Conclusions/Significance: Between-region variation in climatic patterns can account for differences in demographic traits, such as secondary seed dormancy, plant mortality, and recruitment, between coastal and montane A. thaliana populations. In addition, differences in plant mortality can partly account for differences in the genetic composition of coastal and montane populations. This study shows how the interplay between variation in environmental, demographic, and genetic parameters may operate in natural A. thaliana populations. © 2009 Montesinos et al.

    Clonal hematopoiesis is associated with risk of severe Covid-19.

    Acquired somatic mutations in hematopoietic stem and progenitor cells (clonal hematopoiesis or CH) are associated with advanced age, increased risk of cardiovascular and malignant diseases, and decreased overall survival. These adverse sequelae may be mediated by the altered inflammatory profiles observed in patients with CH. A pro-inflammatory immunologic profile is also associated with worse outcomes of certain infections, including SARS-CoV-2 and its associated disease Covid-19. Whether CH predisposes to severe Covid-19 or other infections is unknown. Among 525 individuals with Covid-19 from the Memorial Sloan Kettering (MSK) and Korean Clonal Hematopoiesis (KoCH) consortia, we show that CH is associated with severe Covid-19 outcomes (OR = 1.85, 95% CI = 1.15-2.99, p = 0.01), in particular CH characterized by non-cancer driver mutations (OR = 2.01, 95% CI = 1.15-3.50, p = 0.01). We further explore the relationship between CH and risk of other infections in 14,211 solid tumor patients at MSK. CH is significantly associated with risk of Clostridium difficile (HR = 2.01, 95% CI: 1.22-3.30, p = 6 × 10⁻³) and Streptococcus/Enterococcus infections (HR = 1.56, 95% CI = 1.15-2.13, p = 5 × 10⁻³). These findings suggest a relationship between CH and risk of severe infections that warrants further investigation.

    Plank Times and Lower Extremity Overuse Injury in Collegiate Track-and-Field and Cross Country Athletes

    Trunk muscle endurance has been theorized to play a role in running kinematics and lower extremity injury. However, the evidence examining the relationships between static trunk endurance tests, such as plank tests, and lower extremity injury in athletes is conflicting. The purpose of this study was to assess whether collegiate cross country and track-and-field athletes with shorter preseason prone and side plank hold times would have a higher incidence of lower extremity time-loss overuse injury during their competitive sport seasons. During the first week of their competitive season, 75 NCAA Division III uninjured collegiate cross country and track-and-field athletes (52% female; mean age 20.0 ± 1.3 years) performed three trunk endurance plank tests. Hold times for the prone plank (PP), right-side plank (RSP), and left-side plank (LSP) were recorded in seconds. Athletes were followed prospectively during the season for lower extremity overuse injury that resulted in limited or missed practices or competitions. Among the athletes, 25 (33.3%) experienced a lower extremity overuse injury. There were no statistically significant mean differences or associations found between PP, RSP, or LSP hold times (seconds) and occurrence of lower extremity overuse injury. In isolation, plank hold times appear to have limited utility as a screening test in collegiate track-and-field and cross country athletes.

    Attitudes and Beliefs towards Sport Specialization, College Scholarships, and Financial Investment among High School Baseball Parents

    Adolescent athletes are increasingly encouraged to specialize in a single sport year-round in an effort to receive a college scholarship. For collegiate baseball, only 11.7 scholarships are available for a 35-player team. The beliefs of the parents of baseball athletes towards sport specialization are unknown, along with whether they have an accurate understanding of college baseball scholarship availability. The parents of high school baseball athletes were recruited to complete an anonymous questionnaire that consisted of (1) parent and child demographics, (2) child baseball participation information, and (3) parent attitudes and beliefs regarding sport specialization and college baseball scholarships. One hundred and fifty-five parents completed the questionnaire (female: 52.9%, age: 49.4 ± 5.5 years). The parents spent a median of 3000 USD [interquartile range (IQR): 1500–6000] on their child’s baseball participation. Most parents believed that specialization increased their child’s chances of getting better at baseball (N = 121, 79.6%). The parents underestimated the number of college baseball scholarships available per team (median [IQR]: 5 [0–5]), yet 55 parents (35.9%) believed it was likely that their child would receive a college baseball scholarship. Despite having a realistic understanding of the limited college scholarships available, the parents were optimistic that their child would receive a baseball scholarship.

    Y-Balance Test Performance Does Not Determine Non-Contact Lower Quadrant Injury in Collegiate American Football Players

    Collegiate American football has a high rate of injury. The Lower Quarter Y-Balance Test (YBT-LQ), a dynamic assessment of lower extremity strength, mobility, and balance, has been purported to identify athletes at risk for injury in different sports, including football. Previous studies examining the association between YBT-LQ and injury have reported varied findings; therefore, the purpose of this study was to assess whether preseason YBT-LQ performance predicted whether football players would sustain a non-contact lower extremity or low back (lower quarter (LQ)) injury during the season. Fifty-nine male collegiate American football players (age 20.8 ± 1.3 y, height 1.8 ± 0.1 m, body mass 94.6 ± 14.2 kg) completed a survey of training and injury history and had their YBT-LQ performance assessed at the start of the season. Athletic training staff tracked the occurrence of non-contact LQ injuries during the season. There were no significant relationships found between preseason YBT-LQ values and incidence of non-contact LQ injury in this population of collegiate American football players. This study is consistent with recent reports that have not found a significant association between preseason YBT-LQ values and LQ injury. These results suggest that, in isolation, the YBT-LQ may have limited utility as a screening test for non-contact injury in collegiate football players.