
    The Positional Match Running Performance of Elite Gaelic Football.

    There is currently limited information available on match running performance in Gaelic football. The objective of the current study was to report the match running profile of elite male Gaelic football and to assess positional running performance. In this observational study, 50 elite male Gaelic football players wore 4-Hz GPS units (VX Sport, New Zealand) across 30 competitive games, with a total of 212 full-game data sets collected. Activity was classed according to total distance, high-speed distance (≥17 km·h⁻¹), sprint distance (≥22 km·h⁻¹), mean velocity (km·h⁻¹), peak velocity (km·h⁻¹) and number of accelerations. The average match distance was 8160 ± 1482 m, reflecting a relative distance of 116 ± 21 m·min⁻¹, with 1731 ± 659 m covered at high speed, a relative high-speed distance of 25 ± 9 m·min⁻¹. The observed sprint distance was 445 ± 169 m, distributed across 44 sprint actions. The peak velocity was 30.3 ± 1.8 km·h⁻¹, with a mean velocity of 6.5 ± 1.2 km·h⁻¹. Players completed 184 ± 40 accelerations, representing 2.6 ± 0.5 accelerations·min⁻¹. There were significant differences between positional groups for total running distance, high-speed running distance and sprint distance, with midfielders covering more total and high-speed running distance compared to other positions (p < 0.001). There was a reduction in high-speed and sprint distance between the first and second half (p < 0.001). Reductions in running performance were position dependent, with the middle three positions experiencing the greatest decrement in performance. The current study is the first to communicate a detailed description of match running performance during competitive elite Gaelic football match play.

    Aerobic Fitness and Playing Experience Protect Against Spikes in Workload: The Role of the Acute:Chronic Workload Ratio on Injury Risk in Elite Gaelic Football.

    PURPOSE: To examine the association between combined session-RPE workload measures and injury risk in elite Gaelic footballers. METHODS: Thirty-seven elite Gaelic footballers (mean ± SD age 24.2 ± 2.9 yr) from one elite squad were involved in a single-season study. Weekly workload (session-RPE multiplied by duration) and all time-loss injuries (including subsequent-week injuries) were recorded during the period. Rolling weekly sums and week-to-week changes in workload were measured, allowing for the calculation of the acute:chronic workload ratio, calculated by dividing acute workload (i.e. 1-week workload) by chronic workload (i.e. rolling average 4-weekly workload). Workload measures were then modelled against all injury data sustained using a logistic regression model. Odds ratios (OR) were reported against a reference group. RESULTS: High 1-weekly workloads (≥2770 AU, OR = 1.63–6.75) were associated with significantly higher risk of injury compared to a low training load reference group. When exposed to spikes in workload (acute:chronic workload ratio >1.5), players with 1 year of experience had a higher risk of injury (OR = 2.22), while players with 2–3 (OR = 0.20) and 4–6 years (OR = 0.24) of experience had a lower risk of injury. Players with poorer aerobic fitness (estimated from a 1-km time trial) had a higher injury risk compared to players with higher aerobic fitness (OR = 1.50–2.50). An acute:chronic workload ratio of ≥2.0 demonstrated the greatest risk of injury. CONCLUSIONS: These findings highlight an increased risk of injury for elite Gaelic football players with high (>2.0) acute:chronic workload ratios and high weekly workloads. A high aerobic capacity and playing experience appear to offer protection against rapid changes in workload and high acute:chronic workload ratios. Moderate workloads, coupled with moderate-to-high changes in the acute:chronic workload ratio, appear to be protective for Gaelic football players.
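    The acute:chronic workload ratio described above is a simple rolling computation. A minimal sketch, using invented weekly loads (none of the values below come from the study):

```python
# Sketch of the acute:chronic workload ratio (ACWR) described above.
# Weekly workload = session-RPE x duration, in arbitrary units (AU);
# the example loads are hypothetical, chosen only for illustration.

def acwr(weekly_loads):
    """Acute workload (most recent week) divided by chronic workload
    (rolling average of the last 4 weeks, including the acute week)."""
    if len(weekly_loads) < 4:
        raise ValueError("need at least 4 weeks of workload data")
    acute = weekly_loads[-1]
    chronic = sum(weekly_loads[-4:]) / 4
    return acute / chronic

# A spike in the most recent week pushes the ratio above 1.5.
loads = [1600, 1700, 1650, 3400]  # AU per week (hypothetical)
print(round(acwr(loads), 2))  # -> 1.63
```

    Whether the acute week is included in the 4-week chronic average (the "coupled" form sketched here) varies between studies; the abstract's wording suggests a rolling 4-weekly average.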

    High chronic training loads and exposure to bouts of maximal velocity running reduce injury risk in elite Gaelic football.

    OBJECTIVES: To examine the relationship between chronic training loads, the number of exposures to maximal velocity, the distance covered at maximal velocity, the percentage of maximal velocity in training and match play, and subsequent injury risk in elite Gaelic footballers. DESIGN: Prospective cohort design. METHODS: Thirty-seven elite Gaelic footballers from one elite squad were involved in a one-season study. Training and game loads (session-RPE multiplied by duration in min) were recorded in conjunction with external match and training loads (using global positioning system technology) to measure the distance covered at maximal velocity, relative maximal velocity and the number of player exposures to maximal velocity across weekly periods during the season. Lower-limb injuries were also recorded. Training load and GPS data were modelled against injury data using logistic regression. Odds ratios (OR) were calculated based on chronic training load status, relative maximal velocity and number of exposures to maximal velocity, with these reported against the lowest reference group for each variable. RESULTS: Players who produced over 95% of maximal velocity on at least one occasion within training environments had a lower risk of injury compared to the reference group, who reached no more than 85% of maximal velocity (OR: 0.12, p = 0.001). Higher chronic training loads (≥4750 AU) allowed players to tolerate increased distances (between 90 and 120 m) and exposures to maximal velocity (between 10 and 15 exposures), with these exposures having a protective effect compared to lower exposures (OR: 0.22, p = 0.026) and distances (OR: 0.23, p = 0.055). CONCLUSIONS: Players who had higher chronic training loads (≥4750 AU) tolerated increased distances and exposures to maximal velocity compared to players exposed to low chronic training loads (<4750 AU). Under- and over-exposure of players to maximal velocity events (represented by a U-shaped curve) increased the risk of injury.

    The metabolic power and energetic demands of elite Gaelic football match play.

    BACKGROUND: Metabolic power has not yet been investigated within elite Gaelic football. The aim of the current investigation was to compare the metabolic power demands between positional groups and examine the temporal profile of elite Gaelic football match play. METHODS: Global positioning system (GPS) data were collected from 50 elite Gaelic football players from 4 inter-county teams during 35 elite competitive matches over a three-season period. A total of 351 complete match samples were obtained for final analysis. Players were categorised into positional groups: full-back, half-back, midfield, half-forward and full-forward. Instantaneous raw velocity data were obtained from the GPS units and exported to a customised spreadsheet which provided estimations of both speed-based and derived metabolic power and energy expenditure variables (total distance, high-speed distance, average metabolic power, high-power distance and total energy expenditure). RESULTS: Mean match distance was 9222 ± 1588 m, reflecting an average metabolic power of 9.5–12.5 W·kg⁻¹ and an average energy expenditure of 58–70 kJ·kg⁻¹, depending on position. There were significant differences between positional groups for both speed-based and metabolic power indices. Midfielders covered more total and high-speed distance, as well as greater average and overall energy expenditure, compared to other positions (p < 0.001). A reduction in total, high-speed and high-power distance, as well as average metabolic power, throughout the match (p < 0.001) was observed. CONCLUSIONS: Positional differences exist for both metabolic power and traditional running-based variables. The middle three positions (midfield, half-back and half-forward) possess greater activity profiles compared to other positional groups. The reductions in metabolic power and traditional running-based variables are comparable across match play. The current study demonstrates that metabolic power may contribute to our understanding of Gaelic football match play.

    The Integration of Internal and External Training Load Metrics in Hurling

    The current study aimed to assess the relationship between hurling players' fitness profiles and integrated training load (TL) metrics. Twenty-five hurling players performed treadmill testing for VO2max, the speed at blood lactate concentrations of 2 mmol·L⁻¹ (vLT) and 4 mmol·L⁻¹ (vOBLA), and the heart rate-blood lactate profile for calculation of individual training impulse (iTRIMP). The total distance (TD; m), high-speed distance (HSD; m) and sprint distance (SD; m) covered were measured using GPS technology (4-Hz, VX Sport, Lower Hutt, New Zealand), which allowed for the measurement of the external TL. The external TL was divided by the internal TL to form integration ratios. Pearson correlation analyses allowed for the assessment of the relationships between fitness measures and the ratios to performance during simulated match play. External measures of the TL alone showed limited correlations with fitness measures. Integrated TL ratios showed significant relationships with fitness measures in players. TD:iTRIMP was correlated with the aerobic fitness measures VO2max (r = 0.524; p = 0.006; 95% CI: 0.224 to 0.754; large) and vOBLA (r = 0.559; p = 0.003; 95% CI: 0.254 to 0.854; large). HSD:iTRIMP also correlated with the aerobic fitness markers vLT (r = 0.502; p = 0.009; 95% CI: 0.204 to 0.801; large) and vOBLA (r = 0.407; p = 0.039; 95% CI: 0.024 to 0.644; moderate). Interestingly, SD:iTRIMP also showed significant correlations with vLT (r = 0.611; p = 0.001; 95% CI: 0.324 to 0.754; large). The current study showed that TL ratios can provide practitioners with a measure of fitness, as external performance alone showed limited relationships with aerobic fitness measures. © Editorial Committee of Journal of Human Kinetics 2016
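    The integration ratios used in the study are simple quotients of external over internal load. A hedged sketch with hypothetical session values (the derivation of iTRIMP from the heart rate-blood lactate profile is not reproduced here):

```python
# Illustrative external:internal training-load ratio (e.g. TD:iTRIMP).
# Both inputs are hypothetical placeholders, not study data; computing
# iTRIMP itself from heart rate and blood lactate is out of scope.

def integration_ratio(external_tl, internal_tl):
    """External TL (e.g. total distance, m) per unit of internal TL (AU)."""
    if internal_tl <= 0:
        raise ValueError("internal TL must be positive")
    return external_tl / internal_tl

total_distance_m = 6500.0  # GPS-measured total distance (hypothetical)
itrimp_au = 185.0          # individual TRIMP for the session (hypothetical)
print(round(integration_ratio(total_distance_m, itrimp_au), 1))  # -> 35.1
```

    The same function applies unchanged to HSD:iTRIMP and SD:iTRIMP; only the external-load numerator differs.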

    Can the workload–injury relationship be moderated by improved strength, speed and repeated-sprint qualities?

    Objectives The aim of this study was to investigate potential moderators (i.e. lower-body strength, repeated-sprint ability [RSA] and maximal velocity) of injury risk within a team-sport cohort. Design Observational cohort study. Methods Forty male amateur hurling players (age: 26.2 ± 4.4 yr, height: 184.2 ± 7.1 cm, mass: 82.6 ± 4.7 kg) were recruited. During a two-year period, workload (session-RPE × duration), injury and physical qualities were assessed. The specific physical qualities assessed were a three-repetition maximum trap-bar deadlift, a 6 × 35-m repeated-sprint test (RSA) and 5-, 10- and 20-m sprint times. All derived workload and physical quality measures were modelled against injury data using regression analysis. Odds ratios (OR) were reported against a reference group. Results Moderate weekly loads between ≥1400 AU and ≤1900 AU were protective against injury during both the pre-season (OR: 0.44, 95% CI: 0.18–0.66) and in-season periods (OR: 0.59, 95% CI: 0.37–0.82) compared to a low-load reference group (≤1200 AU). When strength was considered as a moderator of injury risk, stronger athletes were better able to tolerate the given workload at a reduced risk. Stronger athletes were also better able to tolerate larger week-to-week changes (>550 to 1000 AU) in workload than weaker athletes (OR = 2.54–4.52). Athletes who were slower over 5 m (OR: 3.11, 95% CI: 2.33–3.87), 10 m (OR: 3.45, 95% CI: 2.11–4.13) and 20 m (OR: 3.12, 95% CI: 2.11–4.13) were at increased risk of injury compared to faster athletes. When repeated-sprint total time (RSAt) was considered as a moderator of injury risk at a given workload (≥1750 AU), athletes with poorer RSAt were at increased risk compared to those with better RSAt (OR: 5.55, 95% CI: 3.98–7.94). Conclusions These findings demonstrate that well-developed lower-body strength, RSA and speed are associated with better tolerance of higher workloads and reduced risk of injury in team-sport athletes.
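    The odds ratios reported against a reference group throughout these abstracts follow the standard definition. A minimal sketch with invented injury counts (none of these counts come from the studies):

```python
# How an odds ratio (OR) against a reference group is computed from
# injury counts. All counts below are invented for illustration only.

def odds_ratio(injured_exposed, healthy_exposed, injured_ref, healthy_ref):
    """(odds of injury in the exposed group) / (odds in the reference group)."""
    return (injured_exposed / healthy_exposed) / (injured_ref / healthy_ref)

# e.g. slower athletes (exposed) vs faster athletes (reference)
print(round(odds_ratio(12, 8, 6, 14), 2))  # -> 3.5
```

    An OR above 1 means higher injury odds than the reference group; below 1 (as with the high-chronic-load groups above) means a protective association. The studies derive their ORs from logistic regression rather than raw 2x2 counts, but the interpretation is the same.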

    The Lysosomal Diseases Testing Laboratory: A review of the past 47 years.

    Lysosomal disorders are diseases that involve mutations in genes responsible for the coding of lysosomal enzymes, transport proteins, activator proteins and protein-processing enzymes. These defects lead to the storage of specific metabolites within lysosomes, resulting in a great variety of clinical features depending on the tissues with the storage, the storage products and the extent of the storage. The methods for rapidly diagnosing patients started in the late 1960s, when the enzyme defects were identified, eliminating the need for tissue biopsies. The first requests for diagnostic help in this laboratory came in 1973. In that year, patients with Krabbe disease and Niemann-Pick type A were diagnosed. Since that time, samples from about 62 000 individuals have been received for diagnostic studies, and 4900 diagnoses have been made. The largest numbers of diagnosed individuals had metachromatic leukodystrophy and Krabbe disease, because of our research interest in leukodystrophies. A number of new disorders were identified and the primary defects in other disorders were clarified. Even with new methods for diagnosis, including newborn screening, molecular analysis and microarrays, there is still a need for biochemical confirmation before treatment is considered. With new treatments, including gene therapy, stem cell transplantation and enzyme replacement, used alone or in combination, becoming more available, the need for rapid, accurate diagnosis is critical.

    First-trimester or second-trimester screening, or both, for Down's syndrome

    BACKGROUND: It is uncertain how best to screen pregnant women for the presence of fetal Down's syndrome: to perform first-trimester screening, to perform second-trimester screening, or to use strategies incorporating measurements in both trimesters. METHODS: Women with singleton pregnancies underwent first-trimester combined screening (measurement of nuchal translucency, pregnancy-associated plasma protein A [PAPP-A], and the free beta subunit of human chorionic gonadotropin at 10 weeks 3 days through 13 weeks 6 days of gestation) and second-trimester quadruple screening (measurement of alpha-fetoprotein, total human chorionic gonadotropin, unconjugated estriol, and inhibin A at 15 through 18 weeks of gestation). We compared the results of stepwise sequential screening (risk results provided after each test), fully integrated screening (single risk result provided), and serum integrated screening (identical to fully integrated screening, but without nuchal translucency). RESULTS: First-trimester screening was performed in 38,167 patients; 117 had a fetus with Down's syndrome. At a 5 percent false-positive rate, the rates of detection of Down's syndrome were as follows: with first-trimester combined screening, 87 percent, 85 percent, and 82 percent for measurements performed at 11, 12, and 13 weeks, respectively; with second-trimester quadruple screening, 81 percent; with stepwise sequential screening, 95 percent; with serum integrated screening, 88 percent; and with fully integrated screening with first-trimester measurements performed at 11 weeks, 96 percent. Paired comparisons found significant differences between the tests, except for the comparison between serum integrated screening and combined screening. CONCLUSIONS: First-trimester combined screening at 11 weeks of gestation is better than second-trimester quadruple screening, but at 13 weeks has results similar to second-trimester quadruple screening. Both stepwise sequential screening and fully integrated screening have high rates of detection of Down's syndrome, with low false-positive rates.
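    The detection rates quoted at a fixed 5 percent false-positive rate can be derived mechanically from risk scores: choose the cutoff that flags 5 percent of unaffected pregnancies, then count the affected pregnancies at or above it. A sketch with synthetic scores (the trial's actual risk algorithm is not reproduced):

```python
# Detection rate at a fixed false-positive rate, computed from risk scores.
# Scores below are synthetic; ties at the cutoff are ignored for simplicity,
# and fpr * len(control_scores) is assumed to be at least 1.

def detection_rate_at_fpr(case_scores, control_scores, fpr=0.05):
    """Cutoff = the score that flags a fraction `fpr` of controls;
    return the fraction of cases at or above it (the detection rate)."""
    n_flagged = int(len(control_scores) * fpr)
    cutoff = sorted(control_scores, reverse=True)[n_flagged - 1]
    return sum(s >= cutoff for s in case_scores) / len(case_scores)

controls = list(range(100))   # 100 unaffected pregnancies (synthetic scores)
cases = [97, 98, 50, 96, 99]  # 5 affected pregnancies (synthetic scores)
print(detection_rate_at_fpr(cases, controls))  # -> 0.8
```

    Comparing screening strategies then amounts to comparing these detection rates at the same false-positive rate, which is how the 81 to 96 percent figures above are stated.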

    Process evaluation of appreciative inquiry to translate pain management evidence into pediatric nursing practice

    Background Appreciative inquiry (AI) is an innovative knowledge translation (KT) intervention that is compatible with the Promoting Action on Research in Health Services (PARiHS) framework. This study explored the innovative use of AI as a theoretically based KT intervention applied to a clinical issue in an inpatient pediatric care setting. The implementation of AI was explored in terms of its acceptability, fidelity, and feasibility as a KT intervention in pain management. Methods A mixed-methods case study design was used. The case was a surgical unit in a pediatric academic-affiliated hospital. The sample consisted of nurses in leadership positions and staff nurses interested in the study. Data on the AI intervention implementation were collected by digitally recording the AI sessions, maintaining logs, and conducting individual semistructured interviews. Data were analysed using qualitative and quantitative content analyses and descriptive statistics. Findings were triangulated in the discussion. Results Three nurse leaders and nine staff members participated in the study. Participants were generally satisfied with the intervention, which consisted of four 3-hour, interactive AI sessions delivered over two weeks to promote change based on positive examples of pain management in the unit and staff implementation of an action plan. The AI sessions were delivered with high fidelity and 11 of 12 participants attended all four sessions, where they developed an action plan to enhance evidence-based pain assessment documentation. Participants labeled AI a 'refreshing approach to change' because it was positive, democratic, and built on existing practices. Several barriers affected their implementation of the action plan, including a context of change overload, logistics, busyness, and a lack of organised follow-up. Conclusions Results of this case study supported the acceptability, fidelity, and feasibility of AI as a KT intervention in pain management. 
The AI intervention requires minor refinements (e.g., incorporating continued follow-up meetings) to enhance its clinical utility and sustainability. The implementation process and effectiveness of the modified AI intervention require evaluation in a larger multisite study.