189 research outputs found

    Aerobic Fitness and Playing Experience Protect Against Spikes in Workload: The Role of the Acute:Chronic Workload Ratio on Injury Risk in Elite Gaelic Football.

    PURPOSE: To examine the association between combined session-RPE workload measures and injury risk in elite Gaelic footballers. METHODS: Thirty-seven elite Gaelic footballers (mean ± SD age 24.2 ± 2.9 yr) from one elite squad were involved in a single-season study. Weekly workload (session-RPE multiplied by duration) and all time-loss injuries (including subsequent-week injuries) were recorded throughout the period. Rolling weekly sums and week-to-week changes in workload were measured, allowing calculation of the acute:chronic workload ratio, obtained by dividing the acute workload (i.e. 1-week workload) by the chronic workload (i.e. rolling 4-week average workload). Workload measures were then modelled against all injury data using a logistic regression model, and odds ratios (OR) were reported against a reference group. RESULTS: High 1-weekly workloads (≥2770 AU, OR = 1.63-6.75) were associated with a significantly higher risk of injury than a low training load reference group. When the acute:chronic workload ratio was high (>1.5), players with 1 year of experience had a higher risk of injury (OR = 2.22), while players with 2-3 (OR = 0.20) and 4-6 years (OR = 0.24) of experience had a lower risk of injury. Players with poorer aerobic fitness (estimated from a 1 km time trial) had a higher injury risk than players with better aerobic fitness (OR = 1.50-2.50). An acute:chronic workload ratio of ≥2.0 demonstrated the greatest risk of injury. CONCLUSIONS: These findings highlight an increased risk of injury for elite Gaelic football players with high (>2.0) acute:chronic workload ratios and high weekly workloads. A high aerobic capacity and playing experience appear to offer injury protection against rapid changes in workload and high acute:chronic workload ratios. Moderate workloads, coupled with moderate-to-high changes in the acute:chronic workload ratio, appear to be protective for Gaelic football players.
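
    The acute:chronic workload ratio described above reduces to a simple rolling calculation: the most recent week's session-RPE load divided by the rolling 4-week average. A minimal sketch, using hypothetical weekly loads in arbitrary units (AU) rather than the study's data:

        # Sketch of the acute:chronic workload ratio (ACWR) described above.
        # Assumes one summed session-RPE workload value (AU) per week; values are hypothetical.

        def acwr(weekly_loads, chronic_window=4):
            """Return (acute, chronic, ratio) for the most recent week."""
            if len(weekly_loads) < chronic_window:
                raise ValueError("need at least chronic_window weeks of data")
            acute = weekly_loads[-1]                                          # 1-week workload
            chronic = sum(weekly_loads[-chronic_window:]) / chronic_window    # rolling 4-week average
            return acute, chronic, acute / chronic

        # Example: a spike to 2900 AU after three weeks around 1800 AU.
        acute, chronic, ratio = acwr([1750, 1820, 1790, 2900])
        print(f"acute={acute} AU, chronic={chronic:.0f} AU, ACWR={ratio:.2f}")
        # Prints ACWR=1.40; the abstract reports the greatest injury risk at ratios >=2.0.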

    High chronic training loads and exposure to bouts of maximal velocity running reduce injury risk in elite Gaelic football.

    OBJECTIVES: To examine the relationship between chronic training loads, the number of exposures to maximal velocity, the distance covered at maximal velocity, the percentage of maximal velocity reached in training and match-play, and subsequent injury risk in elite Gaelic footballers. DESIGN: Prospective cohort design. METHODS: Thirty-seven elite Gaelic footballers from one elite squad were involved in a one-season study. Training and game loads (session-RPE multiplied by duration in min) were recorded in conjunction with external match and training loads (using global positioning system technology) to measure the distance covered at maximal velocity, relative maximal velocity and the number of player exposures to maximal velocity across weekly periods during the season. Lower limb injuries were also recorded. Training load and GPS data were modelled against injury data using logistic regression. Odds ratios (OR) were calculated based on chronic training load status, relative maximal velocity and number of exposures to maximal velocity, with these reported against the lowest reference group for each variable. RESULTS: Players who produced over 95% maximal velocity on at least one occasion within training environments had a lower risk of injury compared to the reference group of 85% maximal velocity on at least one occasion (OR: 0.12, p=0.001). Higher chronic training loads (≥4750 AU) allowed players to tolerate increased distances (between 90 and 120 m) and exposures to maximal velocity (between 10 and 15 exposures), with these exposures having a protective effect compared to lower exposures (OR: 0.22, p=0.026) and distance (OR: 0.23, p=0.055). CONCLUSIONS: Players who had higher chronic training loads (≥4750 AU) tolerated increased distances and exposures to maximal velocity when compared to players exposed to low chronic training loads (≤4750 AU). Under- and over-exposure of players to maximal velocity events (represented by a U-shaped curve) increased the risk of injury.
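
    The abstract does not detail how maximal-velocity exposures were extracted from the GPS data. Purely as an illustration of the kind of processing involved, the sketch below counts discrete bouts above 95% of a player's individual maximum speed; the threshold, sampling and values are assumptions, not the authors' pipeline:

        # Illustrative only: counting weekly "exposures" to near-maximal velocity from a GPS
        # speed trace. Thresholds and data are assumed; this is not the study's actual method.

        def count_exposures(speeds_m_s, max_velocity_m_s, threshold=0.95):
            """Count discrete bouts in which speed reaches threshold * individual max velocity."""
            cutoff = threshold * max_velocity_m_s
            exposures, in_bout = 0, False
            for v in speeds_m_s:
                if v >= cutoff and not in_bout:
                    exposures += 1          # a new bout starts
                    in_bout = True
                elif v < cutoff:
                    in_bout = False         # the bout has ended
            return exposures

        # Example: a player with an individual maximum of 8.8 m/s.
        trace = [5.1, 7.9, 8.5, 8.6, 6.0, 8.5, 4.2]           # hypothetical GPS samples (m/s)
        print(count_exposures(trace, max_velocity_m_s=8.8))   # -> 2 bouts above 95%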

    Can the workload–injury relationship be moderated by improved strength, speed and repeated-sprint qualities?

    Objectives: The aim of this study was to investigate potential moderators (i.e. lower body strength, repeated-sprint ability [RSA] and maximal velocity) of injury risk within a team-sport cohort. Design: Observational cohort study. Methods: Forty male amateur hurling players (age: 26.2 ± 4.4 yr, height: 184.2 ± 7.1 cm, mass: 82.6 ± 4.7 kg) were recruited. During a two-year period, workload (session-RPE × duration), injury and physical qualities were assessed. The specific physical qualities assessed were a three-repetition maximum trap bar deadlift, a 6 × 35-m repeated-sprint test (RSA) and 5-, 10- and 20-m sprint times. All derived workload and physical quality measures were modelled against injury data using regression analysis. Odds ratios (OR) were reported against a reference group. Results: Moderate weekly loads (1400-1900 AU) were protective against injury during both the pre-season (OR: 0.44, 95% CI: 0.18-0.66) and in-season periods (OR: 0.59, 95% CI: 0.37-0.82) compared to a low load reference group (≤1200 AU). When strength was considered as a moderator of injury risk, stronger athletes were better able to tolerate the given workload at a reduced risk. Stronger athletes were also better able to tolerate larger week-to-week changes (>550 to 1000 AU) in workload than weaker athletes (OR = 2.54-4.52). Athletes who were slower over 5 m (OR: 3.11, 95% CI: 2.33-3.87), 10 m (OR: 3.45, 95% CI: 2.11-4.13) and 20 m (OR: 3.12, 95% CI: 2.11-4.13) were at increased risk of injury compared to faster athletes. When repeated-sprint total time (RSAt) was considered as a moderator of injury risk at a given workload (≥1750 AU), athletes with better RSAt were at reduced risk compared to those with poor RSAt (OR: 5.55, 95% CI: 3.98-7.94). Conclusions: These findings demonstrate that well-developed lower-body strength, RSA and speed are associated with better tolerance of higher workloads and a reduced risk of injury in team-sport athletes.
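
    The moderation analysis summarised above is, in effect, a logistic regression in which the workload-injury slope is allowed to vary with a physical quality via an interaction term. A rough sketch on simulated data follows; the variable names, effect sizes and data are assumptions, not the study's dataset or exact model specification:

        # Rough sketch: injury risk vs. weekly workload with lower-body strength as a
        # moderator, fitted as a logistic regression with an interaction term.
        # All names, effects and data are simulated; not the study's dataset or model.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 200
        load = rng.uniform(1000, 3000, n)        # weekly session-RPE load (AU)
        strength = rng.uniform(120, 220, n)      # 3RM trap bar deadlift (kg)

        # Assumed relationship: risk rises with load; strength dampens that rise.
        logit = -2 + 0.002 * (load - 1800) - 0.00002 * (load - 1800) * (strength - 170)
        injured = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        df = pd.DataFrame({"injured": injured, "load": load, "strength": strength})
        fit = smf.logit("injured ~ load * strength", data=df).fit(disp=0)
        print(np.exp(fit.params))   # odds ratios; a load:strength OR below 1 would indicate moderation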

    Environmental control on the distribution of metabolic strategies of benthic microbial mats in Lake Fryxell, Antarctica

    Ecological theories posit that heterogeneity in environmental conditions greatly affects community structure and function. However, the degree to which ecological theory developed using plant- and animal-dominated systems applies to microbiomes is unclear. Investigating the metabolic strategies found in microbiomes is particularly informative for testing the universality of ecological theories because microorganisms have far wider metabolic capacity than plants and animals. We used metagenomic analyses to explore the relationships between the energy and physicochemical gradients in Lake Fryxell and the metabolic capacity of its benthic microbiome. Statistical analysis of the relative abundance of metabolic marker genes and gene family diversity shows that oxygenic photosynthesis, carbon fixation, and flavin-based electron bifurcation differentiate mats growing in different environmental conditions. The pattern of gene family diversity points to the likely importance of temporal environmental heterogeneity in addition to resource gradients. Overall, we found that the environmental heterogeneity of photosynthetically active radiation (PAR) and oxygen concentration ([O2]) in Lake Fryxell provides the framework by which the metabolic diversity and composition of the community are structured, in accordance with its phylogenetic structure. The organization of the resulting microbial ecosystems is consistent with the maximum power principle and the species sorting model.
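
    As a toy illustration of the two marker-gene summaries referred to above (relative abundance and gene family diversity), the sketch below computes per-sample relative abundances and a Shannon index from hypothetical read counts; the gene families and numbers are assumptions, not the study's metagenomic data or pipeline:

        # Toy illustration: relative abundance of metabolic marker gene families and their
        # Shannon diversity in one sample. Counts and family labels are hypothetical.
        import math

        marker_counts = {
            "psbA (oxygenic photosynthesis)": 420,
            "rbcL (carbon fixation)": 310,
            "etfAB (flavin-based electron bifurcation)": 120,
            "nifH (nitrogen fixation)": 55,
        }

        total = sum(marker_counts.values())
        rel_abundance = {gene: count / total for gene, count in marker_counts.items()}
        shannon = -sum(p * math.log(p) for p in rel_abundance.values() if p > 0)

        for gene, p in rel_abundance.items():
            print(f"{gene}: {p:.1%}")
        print(f"Shannon diversity across marker families: {shannon:.2f}")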

    A hot and fast ultra-stripped supernova that likely formed a compact neutron star binary.

    Compact neutron star binary systems are produced from binary massive stars through stellar evolution involving up to two supernova explosions. The final stages in the formation of these systems have not been directly observed. We report the discovery of iPTF 14gqr (SN 2014ft), a type Ic supernova with a fast-evolving light curve indicating an extremely low ejecta mass (≈0.2 solar masses) and low kinetic energy (≈2 × 10⁵⁰ erg). Early photometry and spectroscopy reveal evidence of shock cooling of an extended helium-rich envelope, likely ejected in an intense pre-explosion mass-loss episode of the progenitor. Taken together, we interpret iPTF 14gqr as evidence for ultra-stripped supernovae that form neutron stars in compact binary systems.
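
    The quoted ejecta mass and kinetic energy are mutually consistent for a characteristic ejecta velocity of order 10,000 km/s, typical of stripped-envelope supernovae. A quick order-of-magnitude check using E ≈ ½ M v², where the velocity is an assumed round number rather than a value from the abstract:

        # Order-of-magnitude check: ~0.2 solar masses of ejecta at ~10,000 km/s carries
        # roughly 2e50 erg of kinetic energy, matching the figures quoted above.
        M_SUN_G = 1.989e33             # solar mass in grams
        ejecta_mass = 0.2 * M_SUN_G    # g, from the abstract
        velocity = 1.0e9               # cm/s (10,000 km/s, an assumed characteristic value)

        kinetic_energy = 0.5 * ejecta_mass * velocity**2   # erg
        print(f"{kinetic_energy:.1e} erg")                 # -> ~2.0e50 erg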

    Mapping Migratory Bird Prevalence Using Remote Sensing Data Fusion

    Background: Improved maps of species distributions are important for effective management of wildlife under increasing anthropogenic pressures. Recent advances in lidar and radar remote sensing have shown considerable potential for mapping forest structure and habitat characteristics across landscapes. However, their relative efficacies and integrated use in habitat mapping remain largely unexplored. We evaluated the use of lidar, radar and multispectral remote sensing data in predicting multi-year bird detections or prevalence for 8 migratory songbird species in the unfragmented temperate deciduous forests of New Hampshire, USA. Methodology and Principal Findings: A set of 104 predictor variables describing vegetation vertical structure and variability from lidar, phenology from multispectral data and backscatter properties from radar data was derived. We tested the accuracy of these variables in predicting prevalence using Random Forests regression models. All data sets showed more than 30% predictive power, with radar models having the lowest and multi-sensor synergy ("fusion") models having the highest accuracies. Fusion explained between 54% and 75% of the variance in prevalence for all the birds considered. Stem density from discrete return lidar and phenology from multispectral data were among the best predictors. Further analysis revealed different relationships between the remote sensing metrics and bird prevalence. Spatial maps of prevalence were consistent with known habitat preferences for the bird species. Conclusion and Significance: Our results highlight the potential of integrating multiple remote sensing data sets using machine-learning methods to improve habitat mapping. Multi-dimensional habitat structure maps such as those generated from this study can significantly advance forest management and ecological research by facilitating fine-scale studies at both stand and landscape levels.
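
    The modelling step described above (regressing bird prevalence on remote sensing predictors with Random Forests and reporting variance explained) can be sketched as follows; the synthetic features stand in for the study's 104 lidar, radar and multispectral variables and are not its actual data:

        # Sketch of the Random Forests regression step described above, on synthetic data.
        # The five features stand in for the study's 104 remote sensing predictors.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(42)
        n_plots = 300
        X = rng.normal(size=(n_plots, 5))   # e.g. stem density, canopy height, phenology, backscatter...
        prevalence = 0.4 * X[:, 0] - 0.2 * X[:, 2] + rng.normal(scale=0.5, size=n_plots)

        model = RandomForestRegressor(n_estimators=500, random_state=0)
        r2 = cross_val_score(model, X, prevalence, cv=5, scoring="r2")
        print(f"cross-validated variance explained: {r2.mean():.2f}")

        model.fit(X, prevalence)
        print(model.feature_importances_)   # which predictors drive the prediction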

    Design, rationale, and baseline characteristics of a cluster randomized controlled trial of pay for performance for hypertension treatment: study protocol

    Background: Despite compelling evidence of the benefits of treatment and well-accepted guidelines for treatment, hypertension is controlled in less than one-half of United States citizens. Methods/design: This randomized controlled trial tests whether explicit financial incentives promote the translation of guideline-recommended care for hypertension into clinical practice and improve blood pressure (BP) control in the primary care setting. Using constrained randomization, we assigned 12 Veterans Affairs hospital outpatient clinics to four study arms: physician-level incentive; group-level incentive; combination of physician and group incentives; and no incentives (control). All participants at the hospital (cluster) were assigned to the same study arm. We enrolled 83 full-time primary care physicians and 42 non-physician personnel. The intervention consisted of an educational session about guideline-recommended care for hypertension, five audit and feedback reports, and five disbursements of incentive payments. Incentive payments rewarded participants for chart-documented use of guideline-recommended antihypertensive medications, BP control, and appropriate responses to uncontrolled BP during a prior four-month performance period over the 20-month intervention. To identify potential unintended consequences of the incentives, the study team interviewed study participants, as well as non-participant primary care personnel and leadership at study sites. Chart reviews included data collection on quality measures not related to hypertension. To evaluate the persistence of the effect of the incentives, the study design includes a washout period. Discussion: We briefly describe the rationale for the interventions being studied, as well as the major design choices. Rigorous research designs such as the one described here are necessary to determine whether performance-based payment arrangements such as financial incentives result in meaningful quality improvements. Trial Registration: http://www.clinicaltrials.gov NCT00302718.
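
    Constrained randomization with only 12 clusters typically means sampling candidate allocations and keeping one that balances a baseline covariate across arms. The sketch below uses a hypothetical balance criterion and made-up baseline BP control rates; it is not the trial's actual allocation procedure:

        # Hypothetical sketch of constrained randomization: shuffle 12 clinics into 4 arms
        # and accept an allocation only if mean baseline BP control is similar across arms.
        # Balance criterion and clinic data are made up; not the trial's actual procedure.
        import random

        baseline_bp_control = {f"clinic_{i}": rate for i, rate in enumerate(
            [0.42, 0.55, 0.48, 0.61, 0.39, 0.52, 0.57, 0.44, 0.50, 0.46, 0.59, 0.41])}

        def balanced_allocation(max_spread=0.05, seed=1, max_tries=100_000):
            rng = random.Random(seed)
            clinics = list(baseline_bp_control)
            arm_names = ["physician", "group", "combined", "control"]
            for _ in range(max_tries):
                rng.shuffle(clinics)
                arms = [clinics[i::4] for i in range(4)]      # 3 clinics per arm
                means = [sum(baseline_bp_control[c] for c in arm) / len(arm) for arm in arms]
                if max(means) - min(means) <= max_spread:     # balance constraint satisfied
                    return dict(zip(arm_names, arms))
            raise RuntimeError("no allocation met the balance constraint")

        print(balanced_allocation())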

    Microbial transformations of selenite by methane-oxidizing bacteria

    Methane-oxidizing bacteria are well known for their role in the global methane cycle and their potential for microbial transformation of a wide range of hydrocarbon and chlorinated hydrocarbon pollutants. Recently, it has also emerged that methane-oxidizing bacteria interact with inorganic pollutants in the environment. Here we report what we believe to be the first study of the interaction of pure strains of methane-oxidizing bacteria with selenite. Results indicate that the commonly used laboratory model strains of methane-oxidizing bacteria, Methylococcus capsulatus (Bath) and Methylosinus trichosporium OB3b, are both able to reduce toxic selenite (SeO₃²⁻), but not selenate (SeO₄²⁻), to red spherical nanoparticulate elemental selenium (Se⁰), which was characterised via EDX and EXAFS. The cultures also produced volatile selenium-containing species, which suggests that both strains may have an additional activity that can transform either Se⁰ or selenite into volatile methylated forms of selenium. Transmission electron microscopy (TEM) measurements and experiments with cell fractions (cytoplasm, cell wall and cell membrane) show that the nanoparticles are formed mainly on the cell wall. Collectively, these results are promising for the use of methane-oxidizing bacteria in bioremediation and suggest possible uses in the production of selenium nanoparticles for biotechnology.

    Effective health care for older people living and dying in care homes: A realist review

    Background: Care home residents in England have variable access to health care services. There is currently no coherent policy or consensus about the best arrangements to meet these needs. The purpose of this review was to explore the evidence for how different service delivery models for care home residents support and/or improve wellbeing and health-related outcomes in older people living and dying in care homes. Methods: We conceptualised models of health care provision to care homes as complex interventions. We used a realist review approach to develop a preliminary understanding of what supported good health care provision to care homes. We completed a scoping of the literature and interviewed National Health Service and Local Authority commissioners, providers of services to care homes, representatives from the Regulator, care home managers, residents and their families. We used these data to develop theoretical propositions to be tested against the literature to explain why an intervention may be effective in some situations and not others. We searched electronic databases and related grey literature. Finally, the findings were reviewed with an external advisory group. Results: Strategies that support and sustain relational working between care home staff and visiting health care professionals explained the observed differences in how health care interventions were accepted and embedded into care home practice. Actions that encouraged visiting health care professionals and care home staff jointly to identify, plan and implement care home appropriate protocols for care, when supported by ongoing facilitation from visiting clinicians, were important. Contextual factors such as financial incentives or sanctions, agreed protocols, clinical expertise and structured approaches to assessment and care planning could enable relational working, but by themselves appeared insufficient to achieve change. Conclusion: How relational working is structured between health and care home staff is key to whether health service interventions achieve health-related outcomes for residents and their respective organisations. The belief that paying clinicians to do more in care homes and/or investing in training of care home staff is, by itself, sufficient for better outcomes was not supported. This research was funded by the National Institute for Health Research Health Services and Delivery Research programme (HSDR 11/021/02).