
    Trends in movement quality in US Military Academy cadets 2005-17: A JUMP-ACL study

    Objectives: This study sought to determine whether there were significant trends in lower extremity movement quality, as assessed by Landing Error Scoring System (LESS) scores and plane-specific LESS subscales, across 12 recent cohorts of incoming USMA cadets. Design: Prospective cohort study. Setting: United States Military Academy. Participants: 7,591 incoming cadets. Main outcome measures: Landing Error Scoring System (LESS) scores, adjusted for sex and ACL injury history. Results: Statistically significant inverse trends were found between total LESS score and year (p < 0.01) and between the sagittal plane subscale and year (p < 0.01). A statistically significant direct trend was found between the frontal/transverse plane subscale and year (p < 0.01). However, each of these trends had a small associated effect size, and none was considered clinically meaningful. Conclusions: There were no meaningful changes in lower extremity movement quality in incoming US Military Academy cadets between 2005 and 2017.
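    The trend test described above can be sketched as an ordinary least-squares model of total LESS score on cohort year, adjusted for sex and ACL injury history. This is only a plausible reconstruction of the analysis; the file and column names (less_cohorts.csv, total_less, year, sex, acl_history) are hypothetical.

```python
# Sketch: linear trend in total LESS score across cohort years, adjusted for
# sex and ACL injury history. Data file and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("less_cohorts.csv")  # hypothetical: one row per cadet

# Treating cohort year as continuous makes its coefficient the per-year trend.
model = smf.ols("total_less ~ year + C(sex) + C(acl_history)", data=df).fit()
print(model.summary())

# A small slope can be statistically significant in a sample of 7,591, which
# is why effect size matters before calling a trend clinically meaningful.
print(f"Change in total LESS score per cohort year: {model.params['year']:.3f}")
```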

    Validation of a commercially available markerless motion-capture system for trunk and lower extremity kinematics during a jump-landing assessment

    Context: Field-based, portable motion-capture systems can be used to help identify individuals at greater risk of lower extremity injury. Microsoft Kinect-based markerless motion-capture systems meet these requirements; however, until recently, these systems were generally not automated, required substantial data postprocessing, and were not commercially available. Objective: To validate the kinematic measures of a commercially available markerless motion-capture system. Design: Descriptive laboratory study. Setting: Laboratory. Patients or Other Participants: A total of 20 healthy, physically active university students (10 males, 10 females; age = 20.50 ± 2.78 years, height = 170.36 ± 9.82 cm, mass = 68.38 ± 10.07 kg, body mass index = 23.50 ± 2.40 kg/m²). Intervention(s): Participants completed 5 jump-landing trials. Kinematic data were simultaneously recorded using Kinect-based markerless and stereophotogrammetric motion-capture systems. Main Outcome Measure(s): Sagittal- and frontal-plane trunk, hip-joint, and knee-joint angles were identified at initial ground contact of the jump landing (IC), for the maximum joint angle during the landing phase of the initial landing (MAX), and for the joint-angle displacement from IC to MAX (DSP). Outliers were removed, and data were averaged across trials. We used intraclass correlation coefficients (ICCs [2,1]) to assess intersystem reliability and the paired-samples t test to examine mean differences (α < .05). Results: Agreement existed between the systems (ICC range = −1.52 to 0.96; ICC average = 0.58), with 75.00% (n = 24/32) of the measures being validated (P < .05). Agreement was better for sagittal- (ICC average = 0.84) than frontal- (ICC average = 0.35) plane measures. Agreement was best for MAX (ICC average = 0.77) compared with IC (ICC average = 0.56) and DSP (ICC average = 0.41) measures. Pairwise comparisons identified differences for 18.75% (6/32) of the measures. Fewer differences were observed for sagittal- (0.00%; 0/15) than for frontal- (35.29%; 6/17) plane measures. Between-systems differences were equivalent for MAX (18.18%; 2/11), DSP (18.18%; 2/11), and IC (20.00%; 2/10) measures. The markerless system underestimated sagittal-plane measures (86.67%; 13/15) and overestimated frontal-plane measures (76.47%; 13/17). No trends were observed for overestimating or underestimating IC, MAX, or DSP measures. Conclusions: Moderate agreement existed between markerless and stereophotogrammetric motion-capture systems. Better agreement existed for larger (eg, sagittal-plane, MAX) than for smaller (eg, frontal-plane, IC) joint angles. The DSP angles had the worst agreement. Markerless motion-capture systems may help clinicians identify individuals at greater risk of lower extremity injury.
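    A minimal sketch of the reliability analysis, assuming long-format data with hypothetical columns participant, system, and angle: ICC(2,1) between the two systems via pingouin, plus the paired-samples t test. The system labels "markerless" and "stereo" are assumptions.

```python
# Sketch: intersystem reliability for one kinematic measure.
import pandas as pd
import pingouin as pg
from scipy import stats

df = pd.read_csv("kinematics_long.csv")  # columns: participant, system, angle

# ICC(2,1): two-way random effects, absolute agreement, single measurement.
icc = pg.intraclass_corr(data=df, targets="participant",
                         raters="system", ratings="angle")
print(icc.loc[icc["Type"] == "ICC2", ["ICC", "CI95%"]])

# Paired-samples t test for mean differences between systems (alpha = .05).
wide = df.pivot(index="participant", columns="system", values="angle")
t, p = stats.ttest_rel(wide["markerless"], wide["stereo"])
print(f"t = {t:.2f}, p = {p:.3f}")
```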

    Solar-Cycle Characteristics Examined in Separate Hemispheres: Phase, Gnevyshev Gap, and Length of Minimum

    Research results from solar-dynamo models show that the northern and southern hemispheres may evolve separately throughout the solar cycle. The observed phase lag between the hemispheres provides information regarding the strength of hemispheric coupling. Using hemispheric sunspot-area and sunspot-number data from Cycles 12 - 23, we determine how out of phase the separate hemispheres are during the rising, maximum, and declining period of each solar cycle. Hemispheric phase differences range from 0 - 11, 0 - 14, and 2 - 19 months for the rising, maximum, and declining periods, respectively. The phases appear randomly distributed between zero months (in phase) and half of the rise (or decline) time of the solar cycle. An analysis of the Gnevyshev gap is conducted to determine if the double peak is caused by the averaging of two hemispheres that are out of phase. We confirm previous findings that the Gnevyshev gap is a phenomenon that occurs in the separate hemispheres and is not due to a superposition of sunspot indices from hemispheres slightly out of phase. Cross-hemispheric coupling could be strongest at solar minimum, when there are large quantities of magnetic flux at the Equator. We search for a correlation between the hemispheric phase difference near the end of the solar cycle and the length of solar-cycle minimum, but find none. Because magnetic flux diffusion across the Equator is a mechanism by which the hemispheres couple, we measure the magnetic flux crossing the Equator by examining magnetograms for Solar Cycles 21 - 23. We find, on average, a surplus of northern hemisphere magnetic flux crossing during the mid-declining phase of each solar cycle. However, we find no correlation between the magnitude of magnetic flux crossing the Equator, the length of solar minima, and the phase lag between the hemispheres. (15 pages, 7 figures)
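    One simple way to quantify the hemispheric phase lag discussed above is the lag that maximizes the cross-correlation of smoothed monthly hemispheric sunspot numbers. The sketch below assumes a hypothetical hemispheric_ssn.csv with monthly north and south columns, and it omits the paper's rising/maximum/declining segmentation.

```python
# Sketch: hemispheric phase lag from 13-month-smoothed sunspot numbers.
import numpy as np
import pandas as pd

df = pd.read_csv("hemispheric_ssn.csv")  # hypothetical monthly data
north = df["north"].rolling(13, center=True).mean().dropna().to_numpy()
south = df["south"].rolling(13, center=True).mean().dropna().to_numpy()

# Standardize so the cross-correlation is scale-free.
north = (north - north.mean()) / north.std()
south = (south - south.mean()) / south.std()

max_lag = 24  # search +/- 2 years
lags = np.arange(-max_lag, max_lag + 1)
corr = [np.corrcoef(north[max_lag + k:len(north) - max_lag + k],
                    south[max_lag:len(south) - max_lag])[0, 1] for k in lags]
best = lags[int(np.argmax(corr))]
# Under this pairing (north shifted by k against south), a negative best lag
# means the northern hemisphere leads the southern hemisphere.
print(f"Hemispheric phase lag: {best} months")
```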

    Lower Extremity Musculoskeletal Injury in US Military Academy Cadet Basic Training: A Survival Analysis Evaluating Sex, History of Injury, and Body Mass Index

    Background: Injury incidence for physically active populations with a high volume of physical load can exceed 79%. Little existing research has focused on the timing of injury and how that timing differs based on certain risk factors. Purpose/Hypothesis: The purpose of this study was to report both the incidence and the timing of lower extremity injuries during cadet basic training. We hypothesized that women, those with a history of injury, and those in the underweight and obese body mass index (BMI) categories would sustain lower extremity musculoskeletal injury earlier in the training period than men, those without an injury history, and those in the normal-weight BMI category. Study Design: Cohort study; Level of evidence, 2. Methods: Cadets from the class of 2022, arriving in 2018, served as the study population. Baseline information on sex and injury history was collected via questionnaire, and BMI was calculated from height and weight taken during week 1 at the United States Military Academy. Categories were underweight (BMI <20), middleweight (20-29.99), and obese (≥30). Injury surveillance was performed over the first 60 days of training via electronic medical record review and monitoring. Kaplan-Meier survival curves were used to estimate group differences in time to the first musculoskeletal injury. Cox proportional hazards regression was used to estimate hazard ratios (HRs). Results: A total of 595 cadets participated. The cohort was 76.8% male, with 29.9% reporting a previous injury history and 93.3% having a BMI between 20 and 30. Overall, 16.3% of cadets (12.3% of male cadets and 29.7% of female cadets) experienced an injury during the follow-up period. Women experienced a significantly greater incidence of injury than did men (P < .001). Separation of the survival curves comparing the sexes and injury history occurred at weeks 3 and 4, respectively. Hazards for first musculoskeletal injury were significantly greater for women versus men (HR, 2.63; 95% CI, 1.76-3.94) and for those who reported a history of injury versus no injury history (HR, 1.76; 95% CI, 1.18-2.64). No differences were observed between BMI categories. Conclusion: Female cadets and those reporting previous musculoskeletal injury demonstrated a greater hazard of musculoskeletal injury during cadet basic training. This study did not observe an association between BMI and injury.
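    The survival analysis maps naturally onto the lifelines package; the sketch below is a hedged reconstruction with hypothetical column names (days_to_injury, injured, female, prior_injury), not the authors' actual code.

```python
# Sketch: Kaplan-Meier curves by sex and a Cox model for time to first injury
# over the 60-day surveillance window. Column names are assumptions.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("cbt_injuries.csv")
# days_to_injury is censored at 60 days; injured is the event indicator (0/1).

kmf = KaplanMeierFitter()
for sex, group in df.groupby("female"):
    kmf.fit(group["days_to_injury"], group["injured"],
            label="female" if sex else "male")
    print(kmf.median_survival_time_)

# Cox proportional hazards model; exponentiated coefficients are the HRs
# reported above (e.g., HR = 2.63 for women vs men).
cph = CoxPHFitter()
cph.fit(df[["days_to_injury", "injured", "female", "prior_injury"]],
        duration_col="days_to_injury", event_col="injured")
cph.print_summary()
```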

    Risk of knee osteoarthritis over 24 months in individuals who decrease walking speed during a 12-month period: Data from the Osteoarthritis Initiative

    Objective. To assess the association between change in walking speed over a 12-month period and risk of developing radiographic knee osteoarthritis (rKOA) over a 24-month period. Methods. We included participants without rKOA from the Osteoarthritis Initiative. Change in walking speed was determined from a 20-m walk assessment, calculated as walking speed at the 12-month followup minus baseline speed and/or walking speed at the 24-month followup minus 12-month speed. Incident rKOA was defined as progressing to Kellgren-Lawrence arthritis grading scale ≥ 2 within 24 months (i.e., incidence between 12 and 36 mos or 24 and 48 mos). Self-reported significant knee injury during the exposure period, age, body mass index (BMI), and Physical Activity Scale for the Elderly (PASE) score were adjusted for analytically. Results. We included 2638 observations among 1460 unique participants (58% women; aged 59 ± 9 yrs, range 45-79). The mean change in walking speed over 12 months was 0.001 ± 0.13 m/s (range -0.6271 to 1.4968). About 5% of the sample (n = 122) developed rKOA over a 24-month period. After controlling for significant knee injury, age, BMI, and PASE score, we found an 8% relative increase in the risk of developing rKOA for every 0.1 m/s decrease in walking speed over a 12-month period (risk ratio 1.08, 95% CI 1.00-1.15, p = 0.05). Conclusion. Evaluating change in speed over a 12-month period using a 20-m walk test may be useful in identifying individuals at increased risk of developing rKOA over the subsequent 24 months. Identification of patients at high risk for developing rKOA would allow medical providers to implement early interventions to maximize joint health.
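    The reported risk ratio (an 8% relative increase per 0.1 m/s decrease) can be illustrated with a log-binomial model; this is one reasonable reading of the analysis, and all file and column names below are assumptions.

```python
# Sketch: risk ratio for incident rKOA per 0.1 m/s decrease in walking speed.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("oai_walking.csv")
# speed_decline: decrease in walking speed over 12 months, in 0.1 m/s units
# rkoa: incident radiographic knee OA over the next 24 months (0/1)

# Log-binomial GLM: exponentiated coefficients are risk ratios. (These models
# can be finicky to converge; Poisson with robust errors is a common fallback.)
model = smf.glm("rkoa ~ speed_decline + knee_injury + age + bmi + pase",
                data=df,
                family=sm.families.Binomial(link=sm.families.links.Log()))
result = model.fit()

rr = np.exp(result.params["speed_decline"])
print(f"RR per 0.1 m/s decrease in walking speed: {rr:.2f}")  # paper: 1.08
```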

    Trunk and lower extremity movement patterns, stress fracture risk factors, and biomarkers of bone turnover in military trainees

    Context: Military service members commonly sustain lower extremity stress fractures (SFx). How SFx risk factors influence bone metabolism is unknown. Understanding how SFx risk factors influence bone metabolism may help to optimize risk-mitigation strategies. Objective: To determine how SFx risk factors influence bone metabolism. Design: Cross-sectional study. Setting: Military service academy. Patients or Other Participants: Forty-five men (measured pre-CBT: age = 18.56 ± 1.39 years, height = 176.95 ± 7.29 cm, mass = 77.20 ± 9.40 kg, body mass index = 24.68 ± 2.87) who completed Cadet Basic Training (CBT). Individuals with neurologic or metabolic disorders were excluded. Intervention(s): We assessed SFx risk factors (independent variables) with (1) the Landing Error Scoring System (LESS), (2) self-reported injury and physical activity questionnaires, and (3) physical fitness tests. We assessed bone biomarkers (dependent variables; procollagen type I amino-terminal propeptide [PINP] and cross-linked collagen telopeptide [CTx-1]) via serum. Main Outcome Measure(s): A markerless motion-capture system was used to analyze trunk and lower extremity biomechanics via the LESS. Serum samples were collected post-CBT; enzyme-linked immunosorbent assays determined PINP and CTx-1 concentrations, and PINP:CTx-1 ratios were calculated. Linear regression models demonstrated associations between SFx risk factors and PINP and CTx-1 concentrations and the PINP:CTx-1 ratio. Biomarker concentration mean differences with 95% confidence intervals were calculated. Significance was set a priori at α ≤ .10 for simple and α ≤ .05 for multiple regression analyses. Results: The multiple regression models incorporating LESS and SFx risk factor data predicted the PINP concentration (R² = 0.47, P = .02) and the PINP:CTx-1 ratio (R² = 0.66, P = .01). The PINP concentration was increased by foot internal rotation, trunk flexion, CBT injury, sit-up score, and pre- to post-CBT mass changes. The CTx-1 concentration was increased by heel-to-toe landing and post-CBT mass. The PINP:CTx-1 ratio was increased by foot internal rotation, lower extremity sagittal-plane displacement (inversely), CBT injury, sit-up score, and pre- to post-CBT mass changes. Conclusions: Stress fracture risk factors accounted for 66% of the variability in the PINP:CTx-1 ratio, a potential surrogate for bone health. Our findings provide insight into how SFx risk factors influence bone health. This information can help guide SFx risk-mitigation strategies.
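    The multiple-regression step can be sketched as an ordinary least-squares model of the PINP:CTx-1 ratio on the risk factors the abstract names; the predictor and file names below are hypothetical.

```python
# Sketch: modeling the PINP:CTx-1 ratio from stress fracture risk factors.
# The abstract reports R^2 = 0.66 for this model. Column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bone_biomarkers.csv")
df["pinp_ctx_ratio"] = df["pinp"] / df["ctx1"]

model = smf.ols(
    "pinp_ctx_ratio ~ foot_internal_rotation + sagittal_displacement"
    " + cbt_injury + situp_score + mass_change", data=df).fit()
print(f"R^2 = {model.rsquared:.2f}")
print(model.summary())
```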

    Biomechanical risk factors for lower extremity stress fracture

    Objectives: Stress fracture injuries disproportionately affect athletes and military service members, and little is known about the modifiable biomechanical risk factors associated with these injuries. The purpose of this study was to prospectively examine the association between neuromuscular and biomechanical factors upon entry to military service and the subsequent incidence of lower-extremity stress fracture injury during four years of follow-up. Methods: We analyzed data from the JUMP-ACL cohort, an existing prospective cohort study of military cadets. JUMP-ACL conducted detailed motion analysis during a jump-landing task at the initiation of each subject's military career. We limited our analyses to the class years 2009-2013 (i.e., subjects who completed baseline testing in 2005-2008). There were 1895 subjects available for analysis. Fifty-two subjects reported a history of stress fracture at baseline and were excluded from further analysis, leaving 1843 subjects. Incident lower-extremity stress fracture cases were identified through the Defense Medical Surveillance System and the Cadet Injury and Illness Tracking System during the follow-up period. The electronic medical records of each potential incident case were reviewed, and each case was confirmed by an adjudication committee consisting of two sports medicine fellowship-trained orthopaedic surgeons. The primary outcome of interest was the incidence rate of lower-extremity stress fracture during the follow-up period. The associations between incident stress fracture and sagittal-, frontal-, and transverse-plane hip and knee kinematics during the jump-landing task were examined at initial contact (IC) and at 15% (T15), 50% (T50), 85% (T85), and 100% (T100) of stance phase. Descriptive plots of all biomechanical variables, along with 95% confidence intervals (CI), were generated during the stance phase of the jump-landing task. Univariate and multivariable Poisson regression models were used to estimate the association between baseline biomechanical factors and the incidence rate of lower-extremity stress fracture during follow-up. Results: Overall, 94 (5.1%, 95% CI: 4.14, 6.21) subjects sustained an incident stress fracture during the follow-up period. The incidence rate for stress fracture injuries among females was nearly three times greater than among males (IRR = 2.86, 95% CI: 1.88, 4.34, p < 0.001). Compared to those with greater than 5° of knee valgus, subjects with neutral or varus knee alignment experienced incidence rates for stress fracture that were 43%-53% lower at IC (IRR = 0.57, 95% CI: 0.29, 1.11, p = 0.10), T50 (IRR = 0.47, 95% CI: 0.23, 1.00, p = 0.05), and T85 (IRR = 0.53, 95% CI: 0.29, 0.98, p = 0.04). Subjects with greater than 5° of internal knee rotation exhibited rates for stress fracture that were 2-4 times higher at T15 (IRR = 2.31, 95% CI: 1.01, 5.27, p = 0.05), T50 (IRR = 3.98, 95% CI: 0.99, 16.00, p = 0.05), and T85 (IRR = 2.31, 95% CI: 0.86, 6.23, p = 0.10) compared to those with neutral or external knee rotation alignment. Conclusion: Several potentially modifiable biomechanical factors at the time of entry into military service appear to be associated with the subsequent rate of stress fracture. Injury prevention programs targeted to address these biomechanical movement patterns may reduce the risk of stress fracture injury in athletes and military service members.
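    The Poisson models above can be sketched with statsmodels: exponentiated coefficients from a Poisson regression with a log person-time offset are the IRRs quoted in the results. All column names below are assumptions.

```python
# Sketch: incidence rate ratios for stress fracture via Poisson regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("jump_acl.csv")
# stress_fracture: incident case indicator (0/1); followup_years: person-time
# knee_valgus_gt5: >5 degrees of knee valgus at initial contact (0/1)
model = smf.glm("stress_fracture ~ knee_valgus_gt5 + female",
                data=df, family=sm.families.Poisson(),
                offset=np.log(df["followup_years"])).fit()

irr = np.exp(model.params)
print(irr)  # e.g., the abstract reports IRR = 2.86 for females vs males
```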

    Automated quantification of the Landing Error Scoring System with a markerless motion-capture system

    Context: The Landing Error Scoring System (LESS) can be used to identify individuals with an elevated risk of lower extremity injury. The limitation of the LESS is that raters identify movement errors from video replay, which is time-consuming and, therefore, may limit its use by clinicians. A markerless motion-capture system may be capable of automating LESS scoring, thereby removing this obstacle. Objective: To determine the reliability of an automated markerless motion-capture system for scoring the LESS. Design: Cross-sectional study. Setting: United States Military Academy. Patients or Other Participants: A total of 57 healthy, physically active individuals (47 men, 10 women; age = 18.6 ± 0.6 years, height = 174.5 ± 6.7 cm, mass = 75.9 ± 9.2 kg). Main Outcome Measure(s): Participants completed 3 jump-landing trials that were recorded by standard video cameras and a depth camera. Their movement quality was evaluated by expert LESS raters (standard video recording) using the LESS rubric and by software that automates LESS scoring (depth-camera data). We recorded an error for a LESS item if it was present on at least 2 of 3 jump-landing trials. We calculated κ statistics, prevalence- and bias-adjusted κ (PABAK) statistics, and percentage agreement for each LESS item. Interrater reliability was evaluated between the 2 expert rater scores and between a consensus expert score and the markerless motion-capture system score. Results: We observed reliability between the 2 expert LESS raters (average κ = 0.45 ± 0.35, average PABAK = 0.67 ± 0.34; percentage agreement = 0.83 ± 0.17). The markerless motion-capture system had similar reliability with the consensus expert scores (average κ = 0.48 ± 0.40, average PABAK = 0.71 ± 0.27; percentage agreement = 0.85 ± 0.14). However, reliability was poor for 5 LESS items in both LESS score comparisons. Conclusions: A markerless motion-capture system had the same level of reliability as expert LESS raters, suggesting that an automated system can accurately assess movement. Therefore, clinicians can use the markerless motion-capture system to reliably score the LESS without being limited by the time requirements of manual LESS scoring.
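    The agreement statistics are straightforward to compute; the sketch below shows Cohen's κ, percentage agreement, and PABAK (2 × observed agreement − 1 for a binary item) for one hypothetical LESS item scored by two raters.

```python
# Sketch: kappa, percentage agreement, and PABAK for one binary LESS item.
import numpy as np
from sklearn.metrics import cohen_kappa_score

expert = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])  # hypothetical item scores
system = np.array([1, 0, 1, 0, 0, 0, 1, 0, 1, 1])

kappa = cohen_kappa_score(expert, system)
p_o = np.mean(expert == system)  # observed percentage agreement
pabak = 2 * p_o - 1              # PABAK for a 2x2 table

print(f"kappa = {kappa:.2f}, agreement = {p_o:.2f}, PABAK = {pabak:.2f}")
```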

    School-level determinants of incidence of sports-related concussion: Findings from the CARE Consortium

    OBJECTIVE: Epidemiologic research on sports-related concussion (SRC) has focused on individual risk factors, with limited research on institutional risk factors and variability in concussion rates. METHODS: This study used data from 53,822 athlete-seasons collected at 30 United States sites (26 civilian institutions and 4 military service academies) from the 2014/15 to 2018/19 academic years by the Concussion Assessment, Research, and Education Consortium. School-level risk factors included competitive division (DI, DII, DIII), school type (military/civilian), and a Sport Risk Index (SRI; Low, Medium, High). For comparability between civilian institutions and military academies, only NCAA athletes and concussions in sports games and practices were included. Random-intercepts log-binomial regression was used to estimate risk ratios (RRs) and to model variability in SRC risk. RESULTS: A total of 2,503 SRCs were observed during the study period, including 829 competition SRCs (33%) and 1,674 practice SRCs (67%). Most variability in SRC risk was at the level of the athlete or team (within-school) rather than at the school level. Specifically, across the three SRC outcomes (all [competition and practice combined], competition-only, and practice-only), within-school variability was 5 to 7 times greater than between-school variability. Three school-level risk factors (Division, School Type, and SRI) accounted for over one-third (36%) of between-school variability. SRI was the strongest school-level predictor of SRC risk (RR = 5.7; 95% CI: 4.2, 7.6 for High vs. Low). SRC risk was higher for Division I compared to Divisions II/III (RR = 1.6; 95% CI: 0.9, 2.9 for DI vs. DIII), and military academies had a moderately elevated risk of SRC (RR = 1.4; 95% CI: 0.7, 2.7). CONCLUSION: A large portion of the apparent variability between schools was attributable to structural factors (sport risk and competitive level), suggesting that there were minimal systemic differences in concussion identification between schools. While most variability is within-school, understanding school-level determinants of concussion risk may still be important in providing the implementation-science context for individual-level interventions.
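    The paper fits a random-intercepts log-binomial model; as a simpler stand-in, the sketch below estimates risk ratios for the school-level factors with a Poisson GLM and school-clustered robust standard errors. This is an approximation, not the authors' model, and all column names are assumptions.

```python
# Sketch: school-level risk ratios with a cluster-robust Poisson GLM.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("care_concussions.csv")
# One row per athlete-season: src (0/1), sport_risk (Low/Medium/High),
# division (DI/DII/DIII), military (0/1), school_id
model = smf.glm("src ~ C(sport_risk, Treatment('Low')) + C(division) + military",
                data=df, family=sm.families.Poisson())
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

print(np.exp(result.params))  # RRs, e.g., ~5.7 for High vs Low sport risk
```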