High chronic training loads and exposure to bouts of maximal velocity running reduce injury risk in elite Gaelic football.
OBJECTIVES: To examine the relationship between chronic training loads, number of exposures to maximal velocity, the distance covered at maximal velocity, percentage of maximal velocity in training and match-play and subsequent injury risk in elite Gaelic footballers. DESIGN: Prospective cohort design. METHODS: Thirty-seven elite Gaelic footballers from one elite squad were involved in a one-season study. Training and game loads (session-RPE multiplied by duration in min) were recorded in conjunction with external match and training loads (using global positioning system technology) to measure the distance covered at maximal velocity, relative maximal velocity and the number of player exposures to maximal velocity across weekly periods during the season. Lower limb injuries were also recorded. Training load and GPS data were modelled against injury data using logistic regression. Odds ratios (OR) were calculated based on chronic training load status, relative maximal velocity and number of exposures to maximal velocity, with these reported against the lowest reference group for these variables. RESULTS: Players who produced over 95% maximal velocity on at least one occasion within training environments had lower risk of injury compared to the reference group of 85% maximal velocity on at least one occasion (OR: 0.12, p=0.001). Higher chronic training loads (≥4750 AU) allowed players to tolerate increased distances (between 90 and 120 m) and exposures to maximal velocity (between 10 and 15 exposures), with these exposures having a protective effect compared to lower exposures (OR: 0.22, p=0.026) and distance (OR: 0.23, p=0.055). CONCLUSIONS: Players who had higher chronic training loads (≥4750 AU) tolerated increased distances and exposures to maximal velocity when compared to players exposed to low chronic training loads (<4750 AU). Under- and over-exposure of players to maximal velocity events (represented by a U-shaped curve) increased the risk of injury.
Aerobic Fitness and Playing Experience Protect Against Spikes in Workload: The Role of the Acute:Chronic Workload Ratio on Injury Risk in Elite Gaelic Football.
PURPOSE: To examine the association between combined session-RPE workload measures and injury risk in elite Gaelic footballers. METHODS: Thirty-seven elite Gaelic footballers (mean ± SD age of 24.2 ± 2.9 yr) from one elite squad were involved in a single-season study. Weekly workload (session-RPE multiplied by duration) and all time-loss injuries (including subsequent week injuries) were recorded during the period. Rolling weekly sums and week-to-week changes in workload were measured, allowing for the calculation of the 'acute:chronic workload ratio', calculated by dividing acute workload (i.e. 1-week workload) by chronic workload (i.e. rolling average 4-weekly workload). Workload measures were then modelled against all injury data sustained using a logistic regression model. Odds ratios (OR) were reported against a reference group. RESULTS: High 1-weekly workloads (≥2770 AU, OR = 1.63–6.75) were associated with significantly higher risk of injury compared to a low training load reference group. At acute:chronic workload ratios above 1.5, players with 1 year of experience had a higher risk of injury (OR = 2.22), while players with 2-3 (OR = 0.20) and 4-6 years (OR = 0.24) of experience had a lower risk of injury. Players with poorer aerobic fitness (estimated from a 1 km time trial) had a higher injury risk compared to players with higher aerobic fitness (OR = 1.50-2.50). An acute:chronic workload ratio of ≥2.0 demonstrated the greatest risk of injury. CONCLUSIONS: These findings highlight an increased risk of injury for elite Gaelic football players with high (>2.0) acute:chronic workload ratios and high weekly workloads. A high aerobic capacity and playing experience appear to offer injury protection against rapid changes in workload and high acute:chronic workload ratios. Moderate workloads, coupled with moderate-high changes in the acute:chronic workload ratio, appear to be protective for Gaelic football players.
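The two workload measures described above are straightforward to compute. A minimal sketch, with hypothetical function names and sample loads (not study data): session-RPE load is RPE multiplied by session duration, and the acute:chronic workload ratio divides the most recent week's load by the rolling 4-week average.

```python
def session_rpe_load(rpe, duration_min):
    """Session workload in arbitrary units (AU): RPE multiplied by duration in minutes."""
    return rpe * duration_min

def acwr(weekly_loads):
    """Acute:chronic workload ratio for the most recent week:
    1-week (acute) load divided by the rolling 4-week average (chronic) load."""
    recent = weekly_loads[-4:]
    acute = recent[-1]
    chronic = sum(recent) / len(recent)
    return acute / chronic

# Hypothetical example: four weekly workloads in AU
weeks = [2000, 2100, 1900, 2800]
ratio = acwr(weeks)
print(round(ratio, 2))  # a spike week against a ~2200 AU chronic load
```

A ratio near 1.0 indicates the acute week matches the chronic average; the abstract's high-risk threshold of ≥2.0 would correspond to doubling the rolling average in a single week.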
Can the workload–injury relationship be moderated by improved strength, speed and repeated-sprint qualities?
Objectives The aim of this study was to investigate potential moderators (i.e. lower body strength, repeated-sprint ability [RSA] and maximal velocity) of injury risk within a team-sport cohort. Design Observational cohort study. Methods Forty male amateur hurling players (age: 26.2 ± 4.4 yr, height: 184.2 ± 7.1 cm, mass: 82.6 ± 4.7 kg) were recruited. During a two-year period, workload (session-RPE × duration), injury and physical qualities were assessed. Specific physical qualities assessed were a three-repetition maximum Trapbar deadlift, 6 × 35-m repeated sprints (RSA) and 5-, 10- and 20-m sprint time. All derived workload and physical quality measures were modelled against injury data using regression analysis. Odds ratios (OR) were reported against a reference group. Results Moderate weekly loads between ≥1400 AU and ≤1900 AU were protective against injury during both the pre-season (OR: 0.44, 95% CI: 0.18–0.66) and in-season periods (OR: 0.59, 95% CI: 0.37–0.82) compared to a low-load reference group (≤1200 AU). When strength was considered as a moderator of injury risk, stronger athletes were better able to tolerate the given workload at a reduced risk. Stronger athletes were also better able to tolerate larger week-to-week changes (>550 AU to 1000 AU) in workload than weaker athletes (OR = 2.54–4.52). Athletes who were slower over 5-m (OR: 3.11, 95% CI: 2.33–3.87), 10-m (OR: 3.45, 95% CI: 2.11–4.13) and 20-m (OR: 3.12, 95% CI: 2.11–4.13) were at increased risk of injury compared to faster athletes. When repeated-sprint total time (RSAt) was considered as a moderator of injury risk at a given workload (≥1750 AU), athletes with better RSAt were at reduced risk compared to those with poor RSAt (OR: 5.55, 95% CI: 3.98–7.94). Conclusions These findings demonstrate that well-developed lower-body strength, RSA and speed are associated with better tolerance to higher workloads and reduced risk of injury in team-sport athletes.
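The odds ratios reported in these abstracts compare the odds of injury in an exposed group against a reference group. A minimal sketch of that arithmetic, using made-up counts rather than any figures from these studies:

```python
def odds_ratio(injured_exp, uninjured_exp, injured_ref, uninjured_ref):
    """Odds of injury in the exposed group divided by odds in the reference group.
    OR > 1 means higher injury odds than the reference; OR < 1 is protective."""
    odds_exposed = injured_exp / uninjured_exp
    odds_reference = injured_ref / uninjured_ref
    return odds_exposed / odds_reference

# Hypothetical 2x2 table: slower sprinters vs faster (reference) sprinters
or_slow = odds_ratio(injured_exp=12, uninjured_exp=8,
                     injured_ref=6, uninjured_ref=14)
print(round(or_slow, 2))
```

In the studies themselves the ORs come from logistic regression models, which generalize this same comparison while adjusting for the other workload variables.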
Long-term simulation of growth stage-based irrigation scheduling in maize under various water constraints in Colorado, USA
© The Author(s) 2017. Due to varying crop responses to water stress at different growth stages, scheduling irrigation is a challenge for farmers, especially when water availability varies on a monthly, seasonal and yearly basis. The objective of this study was to optimize irrigation between the vegetative (V) and reproductive (R) phases of maize under different available water levels in Colorado. Long-term (1992-2013) scenarios simulated with the calibrated Root Zone Water Quality Model were designed to meet 40%-100% of crop evapotranspiration (ET) requirements at the V and R phases, subject to seasonal water availabilities (300, 400, 500 mm, and no water limit), with and without monthly limits (a total of 112 scenarios). Based on the simulations from 1992 to 2013, the most suitable irrigation between the V and R phases of maize was identified as 60/100, 80/100, and 100/100 of the crop ET requirement for 300, 400, and 500 mm of available water, respectively. When a monthly water limit was imposed, the corresponding suitable irrigation targets between the V and R stages were 60/100, 100/100, and 100/100 of the crop ET requirement for the above three seasonal water availabilities, respectively. Irrigation targets for producing higher crop yield with reduced risk of poor yield are discussed for projected five-year water availabilities.
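The V/R notation above (e.g., 60/100) means each phase is irrigated to a fraction of its crop-ET requirement. A minimal sketch of how a target pair translates into a seasonal water total; the phase ET values here are hypothetical, not taken from the study:

```python
def seasonal_irrigation(et_v, et_r, frac_v, frac_r):
    """Total irrigation (mm) when the vegetative (V) and reproductive (R)
    phases are irrigated to given fractions of their crop-ET requirements."""
    return frac_v * et_v + frac_r * et_r

# Hypothetical phase ET requirements (mm) with a 60/100 V/R target
total = seasonal_irrigation(et_v=250, et_r=300, frac_v=0.60, frac_r=1.00)
print(total)  # mm of seasonal irrigation implied by this target
```

Checking such totals against each seasonal cap (300, 400, or 500 mm) is what lets the simulation framework rank which V/R target pair is feasible and most productive under a given water constraint.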
Variations of training load, monotony, and strain and dose-response relationships with maximal aerobic speed, maximal oxygen uptake, and isokinetic strength in professional soccer players
This study aimed to identify variations in weekly training load, training monotony, and training strain across a 10-week period (spanning both the pre- and in-season phases), and to analyze the dose-response relationships between training markers and maximal aerobic speed (MAS), maximal oxygen uptake, and isokinetic strength. Twenty-seven professional soccer players (24.9 ± 3.5 years old) were monitored across the 10-week period using global positioning system units. Players were also tested for maximal aerobic speed, maximal oxygen uptake, and isokinetic strength before and after the 10 weeks of training. Large positive correlations were found between the sum of training load and extension peak torque in the right lower limb (r = 0.57, 90% CI [0.15; 0.82]) and the agonist/antagonist ratio in the right lower limb (r = 0.51, 90% CI [0.06; 0.78]). Loading measures fluctuated across the period of the study, and load was meaningfully associated with changes in the fitness status of players. However, the magnitudes of these correlations were small to large, suggesting that variations in fitness level cannot be exclusively explained by the accumulated load and loading profile.
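Training monotony and strain, as used in load-monitoring studies like this one, are commonly computed with Foster's definitions: monotony is the mean daily load divided by its standard deviation, and strain is the total weekly load multiplied by monotony. A minimal sketch with a hypothetical week of daily loads:

```python
from statistics import mean, stdev

def monotony(daily_loads):
    """Training monotony: mean daily load divided by its (sample) standard deviation.
    Higher values indicate less day-to-day variation in load."""
    return mean(daily_loads) / stdev(daily_loads)

def strain(daily_loads):
    """Training strain: total weekly load multiplied by monotony."""
    return sum(daily_loads) * monotony(daily_loads)

# One hypothetical training week in AU (two rest days)
week = [400, 600, 0, 500, 450, 700, 0]
m = monotony(week)
s = strain(week)
print(round(m, 2), round(s))
```

The rest days lower monotony here; a week with the same total load but identical daily sessions would produce a much higher monotony and strain.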
Development and assessment of the Alberta Context Tool
<p>Abstract</p> <p>Background</p> <p>The context of healthcare organizations such as hospitals is increasingly accepted as having the potential to influence the use of new knowledge. However, the mechanisms by which the organizational context influences evidence-based practices are not well understood. Current measures of organizational context lack a theory-informed approach, lack construct clarity and generally have modest psychometric properties. This paper presents the development and initial psychometric validation of the Alberta Context Tool (ACT), an eight dimension measure of organizational context for healthcare settings.</p> <p>Methods</p> <p>Three principles guided the development of the ACT: substantive theory, brevity, and modifiability. The Promoting Action on Research Implementation in Health Services (PARiHS) framework and related literature were used to guide selection of items in the ACT. The ACT was required to be brief enough to be tolerated in busy and resource stretched work settings and to assess concepts of organizational context that were potentially <it>modifiable</it>. The English version of the ACT was completed by 764 nurses (752 valid responses) working in seven Canadian pediatric care hospitals as part of its initial validation. Cronbach's alpha, exploratory factor analysis, analysis of variance, and tests of association were used to assess instrument reliability and validity.</p> <p>Results</p> <p>Factor analysis indicated a 13-factor solution (accounting for 59.26% of the variance in 'organizational context'). The composition of the factors was similar to those originally conceptualized. Cronbach's alpha for the 13 factors ranged from .54 to .91 with 4 factors performing below the commonly accepted alpha cut off of .70. Bivariate associations between instrumental research utilization levels (which the ACT was developed to predict) and the ACT's 13 factors were statistically significant at the 5% level for 12 of the 13 factors. 
Each factor also showed a trend of increasing mean score ranging from the lowest level to the highest level of instrumental research use, indicating construct validity.</p> <p>Conclusions</p> <p>To date, no completely satisfactory measures of organizational context are available for use in healthcare. The ACT assesses several core domains to provide a comprehensive account of organizational context in healthcare settings. The tool's strengths are its brevity (allowing it to be completed in busy healthcare settings) and its focus on dimensions of organizational context that are modifiable. Refinements of the instrument for acute, long-term care, and home care settings are ongoing.</p>
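Cronbach's alpha, the internal-consistency statistic reported for the ACT's factors, can be sketched in a few lines. The item responses below are hypothetical and only illustrate the formula: alpha scales the share of total-score variance not attributable to individual item variances.

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of items, each a list of responses
    from the same respondents in the same order."""
    k = len(items)
    sum_item_vars = sum(variance(item) for item in items)
    totals = [sum(resp) for resp in zip(*items)]  # each respondent's total score
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))

# Three hypothetical 5-point items answered by five respondents
items = [[4, 5, 3, 4, 2],
         [3, 5, 2, 4, 2],
         [4, 4, 3, 5, 1]]
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

Values above the conventional 0.70 cutoff mentioned in the abstract indicate that the items move together closely enough to be treated as one scale.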
Stress related epigenetic changes may explain opportunistic success in biological invasions in Antipode mussels
Different environmental factors can induce epigenetic changes, which are likely involved in the biological invasion process. Some of these factors are driven by humans, for example pollution and deliberate or accidental introductions, and others are due to natural conditions such as salinity. In this study, we analysed the relationship between different stress factors (time in the new location, pollution and salinity) and the methylation changes that could be involved in invasive species' tolerance to new environments. For this purpose, we analysed two different mussel species, reciprocally introduced in antipode areas: the Mediterranean blue mussel Mytilus galloprovincialis and the New Zealand pygmy mussel Xenostrobus securis, widely recognized invaders outside their native distribution ranges. Demethylation was higher in the more stressed populations, supporting the idea that epigenetics is involved in the plasticity process. These results could open up new management protocols that use epigenetic signals as a potential pollution monitoring tool. Such epigenetic marks could be used to recognise the invasive status of a population and to determine potential biopollutants.
Designing high-quality implementation research: development, application, feasibility and preliminary evaluation of the implementation science research development (ImpRes) tool and guide
Background: Designing implementation research can be a complex and daunting task, especially for applied health researchers who have not received specialist training in implementation science. We developed the Implementation Science Research Development (ImpRes) tool and supplementary guide to address this challenge and provide researchers with a systematic approach to designing implementation research. Methods: A multi-method and multi-stage approach was employed. An international, multidisciplinary expert panel engaged in an iterative brainstorming and consensus-building process to generate core domains of the ImpRes tool, representing core implementation science principles and concepts that researchers should consider when designing implementation research. Simultaneously, an iterative process of reviewing the literature and expert input informed the development and content of the tool. Once consensus had been reached, specialist expert input was sought on involving and engaging patients/service users, and on economic evaluation. ImpRes was then applied to 15 implementation and improvement science projects across the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South London, a research organisation in London, UK. Researchers who applied the ImpRes tool completed an 11-item questionnaire evaluating its structure, content and usefulness. Results: Consensus was reached on ten implementation science domains to be considered when designing implementation research. These include implementation theories, frameworks and models, determinants of implementation, implementation strategies, implementation outcomes and unintended consequences.
Researchers who used the ImpRes tool found it useful for identifying project areas where implementation science is lacking (median 5/5, IQR 4–5) and for improving the quality of implementation research (median 4/5, IQR 4–5), and agreed that it contained the key components that should be considered when designing implementation research (median 4/5, IQR 4–4). Qualitative feedback from researchers who applied the ImpRes tool indicated that a supplementary guide was needed to facilitate use of the tool. Conclusions: We have developed a feasible and acceptable tool, and supplementary guide, to facilitate consideration and incorporation of core principles and concepts of implementation science in applied health implementation research. Future research is needed to establish whether application of the tool and guide has an effect on the quality of implementation research.
Medical Student Professionalism Narratives: A Thematic Analysis and Interdisciplinary Comparative Investigation
<p>Abstract</p> <p>Background</p> <p>Professionalism development is influenced by the informal and hidden curriculum. The primary objective of this study was to better understand this experiential learning in the setting of the Emergency Department (ED). Secondarily, the study aimed to explore differences in the informal curriculum between Emergency Medicine (EM) and Internal Medicine (IM) clerkships.</p> <p>Methods</p> <p>A thematic analysis was conducted on 377 professionalism narratives from medical students completing a required EM clerkship from July 2008 through May 2010. The narratives were analyzed using established thematic categories from prior research as well as basic descriptive characteristics. Chi-square analysis was used to compare the frequency of thematic categories to prior research in IM. Finally, emerging themes not fully appreciated in the established thematic categories were created using grounded theory.</p> <p>Results</p> <p>Observations involving interactions between attending physician and patient were most abundant. The narratives were coded as positive 198 times, negative 128 times, and hybrid 37 times. The two most abundant narrative themes involved <it>manifesting respect </it>(36.9%) and <it>spending time </it>(23.7%). Both of these themes were statistically more likely to be noted by students on EM clerkships compared to IM clerkships. Finally, one new theme regarding <it>cynicism </it>emerged during analysis.</p> <p>Conclusions</p> <p>This analysis describes an informal curriculum that is diverse in themes. Student narratives suggest their clinical experiences to be influential on professionalism development. Medical students focus on different aspects of professionalism depending on clerkship specialty.</p>
Gene dispersion is the key determinant of the read count bias in differential expression analysis of RNA-seq data
Background: In differential expression analysis of RNA-sequencing (RNA-seq) read count data for two sample groups, it is known that highly expressed genes (or longer genes) are more likely to be differentially expressed, which is called read count bias (or gene length bias). This bias has a great effect on downstream Gene Ontology over-representation analysis. However, such a bias has not been systematically analyzed for different replicate types of RNA-seq data. Results: We show that the dispersion coefficient of a gene in the negative binomial modeling of read counts is the critical determinant of the read count bias (and gene length bias), by mathematical inference and tests on a number of simulated and real RNA-seq datasets. We demonstrate that the read count bias is mostly confined to data with small gene dispersions (e.g., technical replicates and some genetically identical replicates such as cell lines or inbred animals), and that many biological replicate data from unrelated samples do not suffer from such a bias, except for genes with some small counts. It is also shown that the sample-permuting GSEA method yields a considerable number of false positives caused by the read count bias, while the preranked method does not. Conclusion: We showed, for the first time, that small gene variance (similarly, dispersion) is the main cause of the read count bias (and gene length bias), analyzed the read count bias for different replicate types of RNA-seq data, and assessed its effect on gene-set enrichment analysis.
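The dispersion effect described above can be illustrated with a small simulation, not taken from the paper: under the negative binomial model commonly used for RNA-seq counts, variance = mu + phi * mu^2, so a small dispersion phi makes a highly expressed gene's counts look very precise relative to its mean, which is what drives the read count bias. All values below are hypothetical.

```python
import numpy as np

def nb_sample(mu, phi, size, rng):
    """Draw negative binomial counts with mean mu and dispersion phi,
    using the size/probability parameterization n = 1/phi, p = n/(n + mu)."""
    n = 1.0 / phi
    p = n / (n + mu)
    return rng.negative_binomial(n, p, size)

rng = np.random.default_rng(0)
mu = 1000  # a highly expressed gene
technical = nb_sample(mu, phi=0.001, size=10_000, rng=rng)   # technical-replicate-like
biological = nb_sample(mu, phi=0.2, size=10_000, rng=rng)    # biological-replicate-like

# Relative spread (coefficient of variation) is far larger with large dispersion,
# so high-count genes are much "easier" to call significant when phi is small.
print(technical.std() / mu < biological.std() / mu)
```

With phi = 0.001 the theoretical variance is about 2000 (CV ≈ 0.045), while with phi = 0.2 it is about 201,000 (CV ≈ 0.45), matching the paper's point that the bias is largely confined to small-dispersion data.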