
    Trends in movement quality in US Military Academy cadets 2005-17: A JUMP-ACL study

    Objectives: This study sought to determine if there were significant trends in lower extremity movement quality, as assessed by Landing Error Scoring System (LESS) scores and plane-specific LESS subscales, across 12 recent cohorts of incoming USMA cadets. Design: Prospective cohort study. Setting: United States Military Academy. Participants: 7,591 incoming cadets. Main outcome measures: Landing Error Scoring System (LESS) scores, adjusted for sex and ACL injury history. Results: Statistically significant inverse trends were found between total LESS score and year (p < 0.01) and between the sagittal plane subscale and year (p < 0.01). A statistically significant direct trend was found between the frontal/transverse plane subscale and year (p < 0.01). However, each of these trends had a small associated effect size, and none were considered clinically meaningful. Conclusions: There were no meaningful changes in lower extremity movement quality in incoming US Military Academy cadets between 2005 and 2017.
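
    As a rough illustration only (not the authors' analysis code), the sketch below shows how a linear trend in total LESS score across cohort years, adjusted for sex and ACL injury history, could be tested with an ordinary least-squares model; the file and column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-cadet dataset: total LESS score, cohort year, sex, ACL injury history.
df = pd.read_csv("less_cohorts.csv")

# Treating cohort_year as continuous makes its coefficient the average change
# in total LESS score per incoming class year, adjusted for the covariates.
model = smf.ols("less_total ~ cohort_year + C(sex) + C(acl_history)", data=df).fit()
print(model.params["cohort_year"], model.pvalues["cohort_year"])
```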

    Biomechanical risk factors for lower extremity stress fracture

    Objectives: Stress fracture injuries disproportionately affect athletes and military service members, yet little is known about the modifiable biomechanical risk factors associated with these injuries. The purpose of this study was to prospectively examine the association between neuromuscular and biomechanical factors upon entry to military service and the subsequent incidence of lower-extremity stress fracture injury during four years of follow-up. Methods: We analyzed data from the JUMP-ACL cohort, an existing prospective cohort study of military cadets. JUMP-ACL conducted detailed motion analysis during a jump-landing task at the initiation of each subject’s military career. We limited our analyses to the class years 2009-2013 (i.e., subjects who completed baseline testing in 2005-2008). There were 1895 subjects available for analysis. Fifty-two subjects reported a history of stress fracture at baseline and were excluded from further analysis, leaving 1843 subjects. Incident lower-extremity stress fracture cases were identified through the Defense Medical Surveillance System and the Cadet Injury and Illness Tracking System during the follow-up period. The electronic medical records of each potential incident case were reviewed, and each case was confirmed by an adjudication committee consisting of two sports medicine fellowship-trained orthopaedic surgeons. The primary outcome of interest was the incidence rate of lower-extremity stress fracture during the follow-up period. The associations between incident stress fracture and sagittal, frontal, and transverse plane hip and knee kinematics during the jump-landing task were examined at initial contact (IC) and at 15% (T15), 50% (T50), 85% (T85), and 100% (T100) of the stance phase. Descriptive plots of all biomechanical variables, along with 95% confidence intervals (CI), were generated during the stance phase of the jump-landing task. Univariate and multivariable Poisson regression models were used to estimate the association between baseline biomechanical factors and the incidence rate of lower-extremity stress fracture during follow-up. Results: Overall, 94 (5.1%, 95% CI: 4.14, 6.21) subjects sustained an incident stress fracture during the follow-up period. The incidence rate for stress fracture injuries among females was nearly three times greater than among males (IRR=2.86, 95% CI: 1.88, 4.34, p<0.001). Compared with those with greater than 5° of knee valgus, subjects with neutral or varus knee alignment experienced incidence rates for stress fracture that were 43%-53% lower at IC (IRR=0.57, 95% CI: 0.29, 1.11, p=0.10), T50 (IRR=0.47, 95% CI: 0.23, 1.00, p=0.05), and T85 (IRR=0.53, 95% CI: 0.29, 0.98, p=0.04). Subjects with greater than 5° of internal knee rotation exhibited rates for stress fracture that were 2-4 times higher at T15 (IRR=2.31, 95% CI: 1.01, 5.27, p=0.05), T50 (IRR=3.98, 95% CI: 0.99, 16.00, p=0.05), and T85 (IRR=2.31, 95% CI: 0.86, 6.23, p=0.10), when compared with those with neutral or external knee rotation alignment. Conclusion: Several potentially modifiable biomechanical factors at the time of entry into military service appear to be associated with the subsequent rate of stress fracture. Injury prevention programs targeting these biomechanical movement patterns may reduce the risk of stress fracture injury in athletes and military service members.
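
    The sketch below illustrates the kind of Poisson model described above for estimating incidence rate ratios; it is not the study code, and the dataset, variable names, and binary alignment coding are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical per-subject dataset:
#   stress_fx        - count of incident stress fractures during follow-up (0/1)
#   knee_valgus_gt5  - 1 if >5 deg knee valgus at the time point, 0 if neutral/varus
#   sex              - covariate
#   followup_days    - person-time at risk
df = pd.read_csv("jump_acl_followup.csv")

model = smf.glm(
    "stress_fx ~ knee_valgus_gt5 + C(sex)",
    data=df,
    family=sm.families.Poisson(),
    exposure=df["followup_days"],  # person-time; the log is taken internally
).fit()

irr = np.exp(model.params)     # exponentiated coefficients are incidence rate ratios
ci = np.exp(model.conf_int())  # 95% CIs on the IRR scale
print(pd.concat([irr.rename("IRR"), ci], axis=1))
```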

    Automated quantification of the Landing Error Scoring System with a markerless motion-capture system

    Context: The Landing Error Scoring System (LESS) can be used to identify individuals with an elevated risk of lower extremity injury. The limitation of the LESS is that raters identify movement errors from video replay, which is time-consuming and, therefore, may limit its use by clinicians. A markerless motion-capture system may be capable of automating LESS scoring, thereby removing this obstacle. Objective: To determine the reliability of an automated markerless motion-capture system for scoring the LESS. Design: Cross-sectional study. Setting: United States Military Academy. Patients or Other Participants: A total of 57 healthy, physically active individuals (47 men, 10 women; age = 18.6 ± 0.6 years, height = 174.5 ± 6.7 cm, mass = 75.9 ± 9.2 kg). Main Outcome Measure(s): Participants completed 3 jump-landing trials that were recorded by standard video cameras and a depth camera. Their movement quality was evaluated by expert LESS raters (standard video recording) using the LESS rubric and by software that automates LESS scoring (depth-camera data). We recorded an error for a LESS item if it was present on at least 2 of 3 jump-landing trials. We calculated κ statistics, prevalence- and bias-adjusted κ (PABAK) statistics, and percentage agreement for each LESS item. Interrater reliability was evaluated between the 2 expert rater scores and between a consensus expert score and the markerless motion-capture system score. Results: We observed reliability between the 2 expert LESS raters (average κ = 0.45 ± 0.35, average PABAK = 0.67 ± 0.34; percentage agreement = 0.83 ± 0.17). The markerless motion-capture system had similar reliability with consensus expert scores (average κ = 0.48 ± 0.40, average PABAK = 0.71 ± 0.27; percentage agreement = 0.85 ± 0.14). However, reliability was poor for 5 LESS items in both LESS score comparisons. Conclusions: A markerless motion-capture system had the same level of reliability as expert LESS raters, suggesting that an automated system can accurately assess movement. Therefore, clinicians can use the markerless motion-capture system to reliably score the LESS without being limited by the time requirements of manual LESS scoring.
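
    For readers unfamiliar with the agreement statistics reported above, the sketch below shows how κ, PABAK (2 × observed agreement − 1), and percentage agreement can be computed for a single binary LESS item from two raters' scores; the item scores shown are made up for illustration.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-participant scores for one LESS item (1 = error present).
rater_a = np.array([1, 0, 0, 1, 1, 0, 0, 0, 1, 0])
rater_b = np.array([1, 0, 1, 1, 1, 0, 0, 0, 0, 0])

p_o = np.mean(rater_a == rater_b)            # observed (percentage) agreement
kappa = cohen_kappa_score(rater_a, rater_b)  # chance-corrected agreement
pabak = 2 * p_o - 1                          # adjusts kappa for prevalence and bias

print(f"agreement = {p_o:.2f}, kappa = {kappa:.2f}, PABAK = {pabak:.2f}")
```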

    Automated Landing Error Scoring System Performance and the Risk of Bone Stress Injury in Military Trainees

    Context: Lower extremity bone stress injuries (BSIs) place a significant burden on the health and readiness of the US Armed Forces. Objective: To determine if preinjury baseline performance on an expanded and automated 22-item version of the Landing Error Scoring System (LESS-22) was associated with the incidence of BSIs in a military training population. Design: Prospective cohort study. Setting: US Military Academy at West Point, NY. Patients or Other Participants: A total of 2235 incoming cadets (510 females [22.8%]). Main Outcome Measure(s): Multivariable Poisson regression models were used to produce adjusted incidence rate ratios (IRRs) quantifying the association between preinjury LESS scores and the BSI incidence rate during follow-up, adjusted for pertinent risk factors. Risk factors were included as covariates in the final model if the 95% CI for the crude IRR did not contain 1.00. Results: A total of 54 BSIs occurred during the study period, resulting in an overall incidence rate of 0.07 BSI per 1000 person-days (95% CI = 0.05, 0.09). The mean number of exposure days was 345.4 ± 61.12 (range = 3–368 days). The final model was adjusted for sex and body mass index and yielded an adjusted IRR of 1.06 for the total LESS-22 score (95% CI = 1.002, 1.13; P = .04), indicating that each additional LESS error documented at baseline was associated with a 6.0% increase in the incidence rate of BSI during the follow-up period. In addition, 6 individual LESS-22 items, including 2 newly added items, were significantly associated with the BSI incidence. Conclusions: We provided evidence that performance on the expanded and automated version of the LESS was associated with the BSI incidence in a military training population. The automated LESS-22 may be a scalable solution for screening military training populations for BSI risk.
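
    As a worked illustration of the reported rate, the snippet below recomputes an incidence rate per 1000 person-days with an approximate Poisson 95% CI from the published counts; total person-time is not reported, so it is approximated here as the cohort size times the mean exposure days.

```python
import numpy as np

cases = 54
person_days = 2235 * 345.4          # assumption: cohort size * mean exposure days
rate = cases / person_days * 1000   # BSIs per 1000 person-days (~0.07)

# Large-sample CI on the log scale: rate * exp(+/- 1.96 / sqrt(cases)).
lo, hi = rate * np.exp(np.array([-1.96, 1.96]) / np.sqrt(cases))
print(f"rate = {rate:.3f} per 1000 person-days (95% CI {lo:.3f}, {hi:.3f})")
```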

    Barriers and facilitators to the adoption and implementation of evidence-based injury prevention training programmes: a narrative review

    While there is a substantial body of evidence supporting the efficacy of injury prevention training programmes, the literature investigating the implementation of these programmes is, in contrast, rather limited. This narrative review sought to describe the commonly reported barriers to and facilitators of the implementation of injury prevention training programmes among athletes in organised sport. We also aimed to identify the steps necessary to promote the uptake and sustainable use of these programmes in non-elite athletic communities. We identified 24 publications that discussed implementing evidence-based injury prevention training programmes. Frequently reported barriers to implementation include the perceived time and financial cost of the programme, coaches lacking confidence in their ability to implement it, and the programme including exercises that were difficult or confusing to follow. Frequently reported facilitators of implementation include the coach being aware of programme efficacy, shared motivation from both coaches and athletes to complete the programme, and the ability to easily integrate the programme into practice schedules. The current literature is focused on high-income, high-resource settings. We recommend that future studies focus on understanding the best practices of programme dissemination in culturally and economically diverse regions. Programmes ought to be of no financial burden to the user, be easily adaptable to different sports and individual athletes, and be available in easily accessible forms, such as a mobile smartphone application.

    Differences in Lower Extremity Movement Quality by Level of Sport Specialization in Cadets Entering a United States Service Academy

    Background: Sport specialization in youth athletes is associated with increased risk for musculoskeletal injury; however, little is known about whether sport specialization is associated with lower extremity movement quality. The purpose of this study was to examine differences in lower extremity movement quality by level of sport specialization in US Service Academy cadets. Hypothesis: Cadets who report a higher level of sport specialization will have lower movement quality than those who are less specialized. Study Design: Cross-sectional analysis from an ongoing prospective cohort study. Level of Evidence: Level 3. Methods: Cadets completed the Landing Error Scoring System (LESS) and a baseline questionnaire evaluating level of sport specialization during high school. Data were analyzed using separate 1-way analysis of variance models. Results: Among all participants (n = 1950), 1045 (53.6%) reported low sport specialization, 600 (30.8%) reported moderate sport specialization, and 305 (15.6%) reported high sport specialization at the time of data collection during the first week. Ages ranged from 17 to 23 years. Men (n = 1491) and women (n = 459) reported comparable specialization levels (P = 0.45). There were no statistically significant differences in lower extremity movement quality by level of specialization for all subjects combined (P = 0.15) or when only men were included in the analyses (P = 0.69). However, there were statistically significant differences in movement quality by level of specialization in women (P = 0.02). Moderately specialized women had the best movement quality (mean, 4.63; SD, 2.21), followed by those with high specialization (mean, 4.90; SD, 2.08) and those with low specialization (mean, 5.23; SD, 2.07). Conclusion: Women reporting moderate sport specialization had better movement quality and significantly better LESS scores compared with those reporting high or low specialization. Clinical Relevance: Athletes, especially women, should be encouraged to avoid early sport specialization to optimize movement quality, which may affect injury risk.
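
    A minimal sketch of the analysis described above (a separate 1-way ANOVA of LESS score by specialization level within one sex) might look like the following; the data file and column names are hypothetical.

```python
import pandas as pd
from scipy import stats

# Hypothetical per-cadet dataset with sex, specialization level, and total LESS score.
df = pd.read_csv("cadet_specialization.csv")
women = df[df["sex"] == "F"]

# One group of LESS scores per specialization level (low / moderate / high).
groups = [g["less_total"].values for _, g in women.groupby("specialization")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```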

    The Effects of an Injury Prevention Program on Landing Biomechanics over Time

    Background: Knowledge is limited regarding how long improvements in biomechanics remain after completion of a lower extremity injury prevention program. Purpose: To evaluate the effects of an injury prevention program on movement technique and peak vertical ground-reaction forces (VGRF) over time compared with a standard warm-up (SWU) program. Study Design: Controlled laboratory study. Methods: A total of 1104 incoming freshmen (age range, 17-22 years) at a military academy in the United States volunteered to participate. Participants were cluster-randomized by military company to either the Dynamic Integrated Movement Enhancement (DIME) injury prevention program or the SWU. A random subsample of participants completed a standardized jump-landing task at each time point: immediately before the intervention (PRE), immediately after (POST), and 2 (POST2M), 4 (POST4M), 6 (POST6M), and 8 months (POST8M) after the intervention. VGRF data collected during the jump-landing task were normalized to body weight (%BW). The Landing Error Scoring System (LESS) was used to evaluate movement technique during the jump landing. The change scores (δ) for each variable (LESS, VGRF) between the group's average value at PRE and each time point were calculated. Separate univariate analyses of variance were performed to evaluate group differences. Results: The results showed a greater decrease in mean (±SD) VGRF in the DIME group compared with the SWU group at all retention time points: POST2M (SWU [δ%BW], -0.13 ± 0.82; DIME, -0.62 ± 0.91; P = .001), POST4M (SWU, -0.15 ± 0.98; DIME, -0.46 ± 0.64; P = .04), POST6M (SWU, -0.04 ± 0.96; DIME, -0.53 ± 0.83; P = .004), and POST8M (SWU, 0.38 ± 0.95; DIME, -0.11 ± 0.98; P = .003), but there was not a significant improvement in the DIME group between PRE and POST8M (δ%BW, -0.11 ± 0.98). No group differences in δ LESS were observed. Conclusion: The study findings demonstrated that an injury prevention program performed as a warm-up can reduce vertical ground-reaction forces compared with a standard warm-up, but a maintenance program is likely necessary for continued benefit. Clinical Relevance: Injury prevention programs may need to be performed continually, or at least every sport season, for participants to maintain the protective effects against injury.
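
    The change-score calculation described above could be sketched as follows; this is illustrative only, with hypothetical long-format data and column names.

```python
import pandas as pd

# Hypothetical long-format data: one row per participant per time point,
# with columns group, time, peak_vgrf, body_weight.
df = pd.read_csv("dime_vgrf.csv")

df["vgrf_bw"] = df["peak_vgrf"] / df["body_weight"]  # normalize peak VGRF to body weight

pre = df.loc[df["time"] == "PRE"].groupby("group")["vgrf_bw"].mean()
delta = (
    df.loc[df["time"] != "PRE"]
    .groupby(["group", "time"])["vgrf_bw"].mean()
    .sub(pre, level="group")  # change score: each time point minus the group's PRE mean
)
print(delta)
```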

    Landing Error Scoring System (LESS) Items Are Associated with the Incidence Rate of Lower Extremity Stress Fracture

    Objectives: Lower-extremity stress fracture injuries are a major cause of morbidity in physically active populations. The ability to efficiently screen for modifiable risk factors associated with injury is critical in developing and implementing effective injury prevention programs. The purpose of this study was to determine if baseline Landing Error Scoring System (LESS) scores were associated with the incidence rate of lower-extremity stress fracture during four years of follow-up. Methods: To accomplish this objective, we conducted a prospective cohort study at a US Service Academy. A total of 1772 eligible subjects with complete baseline data and no history of lower-extremity stress fracture were included in this study. At baseline, we conducted motion analysis during a jump-landing task using the LESS. Incident lower-extremity stress fracture cases were identified during the four-year follow-up period using the injury surveillance systems at our institution. The primary outcome of interest was the incidence rate of lower-extremity stress fracture during follow-up. The electronic medical records of each potential incident case were reviewed, and case status was determined by an adjudication committee consisting of two sports medicine fellowship-trained orthopaedic surgeons who were blinded to baseline LESS data. The association between baseline LESS scores and the incidence rate of lower-extremity stress fracture was examined for the total LESS score and for each individual LESS item. Univariate and multivariable Poisson regression models were used to estimate the association between baseline LESS scores and the incidence rate of lower-extremity stress fracture during follow-up. Results: During the follow-up period, 94 incident lower-extremity stress fractures were documented in the study cohort, and the cumulative incidence of stress fracture was 5.3% (95% CI: 4.3%, 6.5%). In univariate analyses, the total LESS score at baseline was associated with the incidence rate of lower-extremity stress fracture during follow-up: for every additional movement error documented at baseline, there was a 15% increase in the incidence rate of lower-extremity stress fracture during follow-up (IRR=1.15; 95% CI: 1.02, 1.31, p=0.025). Based on univariate analyses, several individual LESS items at baseline were also associated with the incidence rate of stress fracture during follow-up: ankle flexion at initial contact (p=0.055), stance width at initial contact (p=0.026), asymmetrical landing at initial contact (p=0.003), trunk flexion at initial contact (p=0.036), and overall impression (p=0.021). In multivariable analyses controlling for sex and year of entry into the cohort, subjects who consistently landed flat-footed or heel-to-toe were 2.33 times (IRR=2.33; 95% CI: 1.36, 3.97, p=0.002) more likely to sustain a lower-extremity stress fracture during follow-up. Similarly, subjects who consistently demonstrated asymmetric landing at initial contact were 2.53 times (IRR=2.53; 95% CI: 1.34, 4.74, p=0.004) more likely to sustain a stress fracture during follow-up. Conclusion: These data suggest that specific LESS items may be predictive of lower-extremity stress fracture risk and may be helpful in injury screening and prevention.
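
    As a worked illustration, the snippet below recomputes the cumulative incidence and its 95% CI from the reported counts and shows how the per-error IRR of 1.15 compounds across multiple baseline errors.

```python
import numpy as np
from statsmodels.stats.proportion import proportion_confint

cases, n = 94, 1772
cum_inc = cases / n                                   # cumulative incidence (~5.3%)
lo, hi = proportion_confint(cases, n, method="beta")  # exact (Clopper-Pearson) 95% CI
print(f"cumulative incidence = {cum_inc:.1%} (95% CI {lo:.1%}, {hi:.1%})")

# An IRR of 1.15 per additional baseline LESS error is multiplicative:
# e.g., three extra errors scale the stress fracture rate by 1.15**3 ≈ 1.52.
print("rate multiplier for 3 extra errors:", round(1.15 ** 3, 2))
```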

    Effect of a lower extremity preventive training program on physical performance scores in military recruits

    Exercise-based preventive training programs are designed to improve movement patterns associated with lower extremity injury risk; however, the impact of these programs on general physical fitness has not been evaluated. The purpose of this study was to compare fitness scores between participants in a preventive training program and a control group. One thousand sixty-eight freshmen from a U.S. Service Academy were cluster-randomized into either the intervention or control group during 6 weeks of summer training. The intervention group performed a preventive training program, specifically the Dynamic Integrated Movement Enhancement (DIME), which is designed to improve lower extremity movement patterns. The control group performed the Army Preparation Drill (PD), a warm-up designed to prepare soldiers for training. Main outcome measures were the Army Physical Fitness Test (APFT) raw and scaled (for age and sex) scores. Independent t tests were used to assess between-group differences. Multivariable logistic regression models were used to control for the influence of confounding variables. Participants in the DIME group completed the APFT 2-mile run 20 seconds faster than those in the PD group (p < 0.001), which corresponded with significantly higher scaled scores (p < 0.001). APFT push-up scores were significantly higher in the DIME group (p = 0.041), but there were no significant differences in APFT sit-up scores. The DIME group had significantly higher total APFT scores compared with the PD group (p < 0.001). Similar results were observed in multivariable models after controlling for sex and body mass index (BMI). Committing time to the implementation of a preventive training program does not appear to negatively affect fitness test scores.
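
    A minimal sketch of the primary between-group comparison (an independent t test on APFT 2-mile run time) is shown below; it is not the study code, and the data file and column names are hypothetical.

```python
import pandas as pd
from scipy import stats

# Hypothetical per-participant APFT results with group labels and run times in seconds.
df = pd.read_csv("apft_scores.csv")
dime = df.loc[df["group"] == "DIME", "run_time_s"]
pd_grp = df.loc[df["group"] == "PD", "run_time_s"]

t_stat, p_value = stats.ttest_ind(dime, pd_grp)  # independent-samples t test
print(f"mean difference = {dime.mean() - pd_grp.mean():.1f} s, p = {p_value:.3f}")
```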

    The first decade of web-based sports injury surveillance: Descriptive epidemiology of injuries in US high school boys' soccer (2005-2006 through 2013-2014) and National Collegiate Athletic Association men's soccer (2004-2005 through 2013-2014)

    Context: The advent of Web-based sports injury surveillance via programs such as the High School Reporting Information Online system and the National Collegiate Athletic Association Injury Surveillance Program has aided the acquisition of boys' and men's soccer injury data. Objective: To describe the epidemiology of injuries sustained in high school boys' soccer in the 2005-2006 through 2013-2014 academic years and collegiate men's soccer in the 2004-2005 through 2013-2014 academic years using Web-based sports injury surveillance. Design: Descriptive epidemiology study. Setting: Online injury surveillance from soccer teams of high school boys (annual average = 100) and collegiate men (annual average = 41). Patients or Other Participants: Boys' or men's soccer players who participated in practices and competitions during the 2005-2006 through 2013-2014 academic years in high school and the 2004-2005 through 2013-2014 academic years in college, respectively. Main Outcome Measure(s): Athletic trainers collected time-loss (≥24 hours) injury and exposure data. Injury rates per 1000 athlete-exposures (AEs), injury rate ratios (IRRs) with 95% confidence intervals (CIs), and injury proportions by body site and diagnosis were calculated. Results: High School Reporting Information Online documented 2912 time-loss injuries during 1,592,238 AEs; the National Collegiate Athletic Association Injury Surveillance Program documented 4765 time-loss injuries during 686,918 AEs. The injury rate was higher in college than in high school (6.94 versus 1.83/1000 AEs; IRR = 3.79; 95% CI = 3.62, 3.97). Injury rates increased with smaller school size for high schools and were higher in Division I than in Divisions II and III. The injury rate was higher during competitions than during practices in both high school (IRR = 3.55; 95% CI = 3.30, 3.83) and college (IRR = 3.45; 95% CI = 3.26, 3.65). Most injuries were to the lower extremity. However, concussion was a common injury, particularly in collegiate goalkeepers and at all positions for high school players. Concussions accounted for more than one-fifth of injuries in high school games. Conclusions: Injury-prevention interventions should be tailored to reflect variations in the incidence and type of injury by level of competition, event type, and position.
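
    As a worked illustration, the snippet below recomputes the crude injury rates per 1000 athlete-exposures and the college-versus-high school rate ratio with a log-normal 95% CI from the reported counts.

```python
import numpy as np

hs_injuries, hs_ae = 2912, 1_592_238
col_injuries, col_ae = 4765, 686_918

hs_rate = hs_injuries / hs_ae * 1000     # ~1.83 injuries per 1000 AEs
col_rate = col_injuries / col_ae * 1000  # ~6.94 injuries per 1000 AEs

irr = col_rate / hs_rate                 # ~3.79
se_log = np.sqrt(1 / hs_injuries + 1 / col_injuries)  # SE of log(IRR)
lo, hi = irr * np.exp(np.array([-1.96, 1.96]) * se_log)
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```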