
    Organised chaos in late specialisation team sports: Weekly training loads of elite adolescent rugby union players participating with multiple teams

    The aim of this study was to quantify the mean weekly training load (TL) of elite adolescent rugby union players participating in multiple teams, and to examine differences between playing positions. Twenty elite male adolescent rugby union players (17.4 ± 0.7 years) were recruited from a regional academy and categorised by playing position: forwards (n=10) and backs (n=10). Global positioning system and accelerometer microtechnology were used to quantify external TL, and session rating of perceived exertion (sRPE) was used to quantify internal TL during all sessions throughout a 10-week in-season period. A total of 97 complete observations (5 ± 3 weeks per participant) were analysed, and differences between positions were assessed using Cohen's d effect sizes (ES) and magnitude-based inferences. Mean weekly sRPE was 1217 ± 364 AU (between-subject coefficient of variation (CV) = 30%), with a total distance (TD) of 11629 ± 3445 m (CV = 30%) and PlayerLoad™ (PL) of 1124 ± 330 AU (CV = 29%). Within-subject CV ranged between 5-78% for sRPE, 24-82% for TD, and 19-84% for PL. Mean TD (13063 ± 3933 vs. 10195 ± 2242 m) and PL (1246 ± 345 vs. 1002 ± 279 AU) were both likely greater for backs compared to forwards (moderate ES); however, differences in sRPE were unclear (small ES). Although mean internal TLs and volumes were low, external TLs were higher than previously reported during pre-season and in-season periods in senior professional players. Additionally, the large between-subject and within-subject variation in weekly TL suggests players participate in a chaotic training system.
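    The between-position comparisons and variability figures above rest on two standard statistics: Cohen's d (with a pooled standard deviation) and the coefficient of variation. A minimal sketch of both calculations, using illustrative numbers rather than the study's data:

    ```python
    import statistics

    def cohens_d(group_a, group_b):
        """Cohen's d effect size using the pooled standard deviation."""
        na, nb = len(group_a), len(group_b)
        pooled_var = ((na - 1) * statistics.variance(group_a)
                      + (nb - 1) * statistics.variance(group_b)) / (na + nb - 2)
        return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_var ** 0.5

    def coefficient_of_variation(values):
        """CV: sample standard deviation expressed as a percentage of the mean."""
        return statistics.stdev(values) / statistics.mean(values) * 100

    # Illustrative weekly total distances (m) for backs and forwards,
    # not the study's data
    backs = [13500, 12800, 14000, 12100, 13900]
    forwards = [10400, 9800, 10900, 10100, 10600]
    d = cohens_d(backs, forwards)          # positive -> higher for backs
    cv = coefficient_of_variation(backs)   # between-subject spread for backs
    ```

    Magnitude-based inference then maps d onto qualitative bands (e.g., small, moderate), with thresholds that vary by convention.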

    Trends in resources for neonatal intensive care at delivery hospitals for infants born younger than 30 weeks' gestation, 2009-2020

    Importance: In an ideal regionalized system, all infants born very preterm would be delivered at a large tertiary hospital capable of providing all necessary care. Objective: To examine whether the distribution of extremely preterm births changed between 2009 and 2020 based on neonatal intensive care resources at the delivery hospital. Design, setting, and participants: This retrospective cohort study was conducted at 822 Vermont Oxford Network (VON) centers in the US between 2009 and 2020. Participants included infants born at 22 to 29 weeks' gestation, delivered at or transferred to centers participating in the VON. Data were analyzed from February to December 2022. Exposures: Hospital of birth at 22 to 29 weeks' gestation. Main outcomes and measures: Birthplace neonatal intensive care unit (NICU) level was classified as A, restriction on assisted ventilation or no surgery; B, major surgery; or C, cardiac surgery requiring bypass. Level B centers were further divided into low-volume (<50 inborn infants at 22 to 29 weeks' gestation per year) and high-volume (≥50 inborn infants at 22 to 29 weeks' gestation per year) centers. High-volume level B and level C centers were combined, resulting in 3 distinct NICU categories: level A, low-volume B, and high-volume B and C NICUs. The main outcome was the change in the percentage of births at hospitals with level A, low-volume B, and high-volume B or C NICUs overall and by US Census region. Results: A total of 357 181 infants (mean [SD] gestational age, 26.4 [2.1] weeks; 188 761 [52.9%] male) were included in the analysis. Across regions, the Pacific (20 239 births [38.3%]) had the lowest, and the South Atlantic (48 348 births [62.7%]) the highest, percentage of births at a hospital with a high-volume B- or C-level NICU.
Births at hospitals with A-level NICUs increased by 5.6% (95% CI, 4.3% to 7.0%), and births at low-volume B-level NICUs increased by 3.6% (95% CI, 2.1% to 5.0%), while births at hospitals with high-volume B- or C-level NICUs decreased by 9.2% (95% CI, -10.3% to -8.1%). By 2020, less than half of the births for infants at 22 to 29 weeks' gestation occurred at hospitals with high-volume B- or C-level NICUs. Most US Census regions followed the nationwide trends; for example, births at hospitals with high-volume B- or C-level NICUs decreased by 10.9% (95% CI, -14.0% to -7.8%) in the East North Central region and by 21.1% (95% CI, -24.0% to -18.2%) in the West South Central region. Conclusions and relevance: This retrospective cohort study identified concerning deregionalization trends in birthplace hospital level of care for infants born at 22 to 29 weeks' gestation. These findings should encourage policy makers to identify and enforce strategies to ensure that infants at the highest risk of adverse outcomes are born at the hospitals where they have the best chances to attain optimal outcomes.

    First Report of Soybean Vein Necrosis Disease Caused by Soybean vein necrosis-associated virus in Wisconsin and Iowa

    Several viral diseases of soybean (Glycine max) have been identified in the north-central U.S. soybean production area, which includes Wisconsin and Iowa (2). Previously, Soybean vein necrosis disease (SVND) caused by Soybean vein necrosis-associated virus (SVNaV) was reported in Arkansas, Tennessee, and other southern states (4). In September 2012, soybean plants with symptoms similar to those reported for SVND (4) were observed in fields across Wisconsin and Iowa. Symptoms included leaf-vein and leaf chlorosis, followed by necrosis of the leaf veins and eventually necrosis of the entire leaf. Six samples with symptoms indicative of SVND were collected from research plots at the West Madison Agricultural Research Station in Madison, WI. An additional three samples were collected from three locations in central Iowa. Total RNA extracted from each sample using the Trizol Plus RNA purification kit (Invitrogen, Carlsbad, CA) was used to generate complementary DNA (cDNA) using the iScript cDNA synthesis kit (Bio-Rad Laboratories, Hercules, CA) following the manufacturers' suggested protocols. The resulting cDNA was used as template in a PCR with SVNaV-specific primers, SVNaV-f1 and SVNaV-r1 (3). PCRs of two of the six Wisconsin samples and two Iowa samples were positive. Amplification products were not detected in the other five samples. The amplification products from the four strongly positive samples were purified using the Wizard SV Gel and PCR Purification Kit (Promega, Madison, WI) following the manufacturer's suggested protocol and were subjected to automated sequencing (University of Wisconsin Biotechnology Center or Iowa State University, DNA Sequencing Facilities). BLASTn (1) alignments of the 915-bp consensus sequence revealed 98% and >99% identity of the Wisconsin and Iowa samples, respectively, with the ‘S’ segment of the SVNaV ‘TN’ isolate (GenBank Accession No. GU722319.1).
Samples from the same leaf tissue used above were subjected to serological tests for SVNaV using antigen-coated indirect ELISA (3). Asymptomatic soybeans grown in the greenhouse were used as a source of leaves for negative controls. These tests confirmed the presence of SVNaV in eight symptomatic soybean leaflets collected in Wisconsin and Iowa. The asymptomatic control and one Iowa sample, which was also PCR-negative, were also negative by serological testing. Six additional samples from soybean fields in as many Wisconsin counties (Fond Du Lac, Grant, Green, Juneau, Richland, Rock) tested positive for SVNaV using specific primers that amplify the ‘L’ segment (4). The sequenced amplification products (297-bp) showed 99 to 100% identity to the L segment of the TN isolate (GU722317.1). To our knowledge, this is the first report of SVNaV associated with soybean and the first report of SVND in Wisconsin and Iowa. Considering that little is known about SVNaV, it is assumed that it is like other Tospoviruses and can cause significant yield loss (4). Soybean is a major cash crop for Wisconsin and Iowa, and infection by SVNaV could result in potential yield loss in years where epidemics begin early and at a high initial inoculum level.
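The percent-identity figures reported from the BLASTn alignments reduce to a match count over an aligned length. A toy sketch of that calculation (gap handling here is deliberately simplified relative to BLAST's scoring, and the fragments are invented, not the study's sequences):

```python
def percent_identity(aligned_a, aligned_b):
    """Percent identity over two equal-length, gapped alignment strings;
    positions where either sequence carries a gap count as mismatches here."""
    if len(aligned_a) != len(aligned_b):
        raise ValueError("alignment strings must be the same length")
    matches = sum(1 for a, b in zip(aligned_a, aligned_b)
                  if a == b and a != '-')
    return matches / len(aligned_a) * 100

# Toy aligned fragments, not the study's 915-bp consensus sequence
identity = percent_identity("ACGTAC-TGA", "ACGTACGTGA")
```

At 915 bp, a 98% identity corresponds to roughly 18 mismatched or gapped positions; >99% leaves fewer than 10.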

    The effect of physical contact on changes in fatigue markers following rugby union field-based training.

    Repeated physical contact in rugby union is thought to contribute to post-match fatigue; however, no evidence exists on the effect of contact activity during field-based training on fatigue responses. Therefore, the purpose of this study was to examine the effect of contact during training on fatigue markers in rugby union players. Twenty academy rugby union players participated in the cross-over study. The magnitude of change in upper- and lower-body neuromuscular function (NMF), whole blood creatine kinase concentration [CK] and perception of well-being was assessed pre-training (baseline), immediately and 24 h post-training following contact and non-contact, field-based training. Training load was measured using mean heart rate, session rating of perceived exertion (sRPE) and microtechnology (Catapult Optimeye S5). The inclusion of contact during field-based training almost certainly increased mean heart rate (9.7; ±3.9%) and sRPE (42; ±29.2%) and resulted in likely and very likely greater decreases in upper-body NMF (-7.3; ±4.7% versus 2.7; ±5.9%) and perception of well-being (-8.0; ±4.8% versus -3.4; ±2.2%) 24 h post-training, respectively, and almost certainly greater elevations in [CK] (88.2; ±40.7% versus 3.7; ±8%). The exclusion of contact from field-based training almost certainly increased running intensity (19.8; ±5%) and distance (27.5; ±5.3%), resulting in possibly greater decreases in lower-body NMF (-5.6; ±5.2% versus 2.3; ±2.4%). Practitioners should be aware of the different demands and fatigue responses of contact and non-contact, field-based training and can use this information to appropriately schedule such training in the weekly microcycle.

    Crc Is Involved in Catabolite Repression Control of the bkd Operons of Pseudomonas putida and Pseudomonas aeruginosa

    Crc (catabolite repression control) protein of Pseudomonas aeruginosa has been shown to be involved in carbon regulation of several pathways. In this study, the role of Crc in catabolite repression control has been studied in Pseudomonas putida. The bkd operons of P. putida and P. aeruginosa encode the inducible multienzyme complex branched-chain keto acid dehydrogenase, which is regulated in both species by catabolite repression. We report here that this effect is mediated in both species by Crc. A 13-kb cloned DNA fragment containing the P. putida crc gene region was sequenced. Crc regulates the expression of branched-chain keto acid dehydrogenase, glucose-6-phosphate dehydrogenase, and amidase in both species but not urocanase, although the carbon sources responsible for catabolite repression in the two species differ. Transposon mutants affected in their expression of BkdR, the transcriptional activator of the bkd operon, were isolated and identified as crc and vacB (rnr) mutants. These mutants suggested that catabolite repression in pseudomonads might, in part, involve control of BkdR levels. Originally published in Journal of Bacteriology, Vol. 182, No. 4, Feb 200

    Applied Sport Science for Male Age-Grade Rugby Union in England

    Rugby union (RU) is a skill-collision team sport played at junior and senior levels worldwide. Within England, age-grade rugby governs the participation and talent development of youth players. The RU player development pathway has recently been questioned, regarding player performance and wellbeing, which sport science research can address. The purpose of this review was to summarise and critically appraise the literature in relation to the applied sport science of male age-grade RU players in England, focusing upon 1) match-play characteristics, 2) training exposures, 3) physical qualities, 4) fatigue and recovery, 5) nutrition, 6) psychological challenges and development, and 7) injury. Current research evidence suggests that age, playing level and position influence the match-play characteristics of age-grade RU. Training exposures of players are described as ‘organised chaos’ due to the multiple environments and stakeholders involved in coordinating training schedules. Fatigue is apparent up to 72 hours post match-play. Well-developed physical qualities are important for player development and injury risk reduction. The nutritional requirements are high due to the energetic costs of collisions. Concerns around psychological characteristics have also been identified (e.g., perfectionism). Injury risk is an important consideration, with prevention strategies available. This review highlights the important multi-disciplinary aspects of sport science for developing age-grade RU players for continued participation and player development. The review describes where some current practices may not be optimal, provides a framework to assist practitioners to effectively prepare age-grade players for the holistic demands of youth RU and considers areas for future research.

    The appropriateness of training exposures for match-play preparation in adolescent schoolboy and academy rugby union players

    The aim of this study was to compare the physical and movement demands between training and match-play in schoolboy and academy adolescent rugby union (RU) players. Sixty-one adolescent male RU players (mean ± SD; age 17.0 ± 0.7 years) were recruited from four teams representing school and regional academy standards. Players were categorised into four groups based on playing standard and position: schoolboy forwards (n=15), schoolboy backs (n=15), academy forwards (n=16) and academy backs (n=15). Global positioning system and accelerometry measures were obtained from training and match-play to assess within-group differences between conditions. Data were analysed from 79 match files across 8 matches (1.3 ± 0.5 matches per participant) and 152 training files across 15 training sessions (2.5 ± 0.5 training sessions per participant). Schoolboy forwards were underprepared for low-intensity activities experienced during match-play, with schoolboy backs underprepared for all movement demands. Academy forwards were exposed to similar physical demands in training to matches, with academy backs similar to or exceeding values for all measured variables. Schoolboy players were underprepared for many key, position-specific aspects of match-play, which could place them at greater risk of injury and hinder performance, unlike academy players who were better prepared.

    The effect of rugby training on indirect markers of gut permeability and gut damage in academy level rugby players.

    PURPOSE: To assess indirect markers of intestinal endothelial cell damage and permeability in academy rugby players in response to rugby training at the beginning and end of preseason. METHODS: Blood and urinary measures (intestinal fatty acid binding protein and lactulose:rhamnose) as measures of gastrointestinal cell damage and permeability were taken at rest and after a standardised collision-based rugby training session in 19 elite male academy rugby players (age: 20 ± 1 years; backs: 89.3 ± 8.4 kg; forwards: 111.8 ± 7.6 kg) at the start of preseason. A subsample (n = 5) repeated the protocol after six weeks of preseason training. Gastrointestinal symptoms (GIS; range of thirteen standard symptoms), aerobic capacity (30-15 intermittent fitness test), and strength (1 repetition maximum) were also measured. RESULTS: Following the rugby training session at the start of preseason, there was an increase (median; interquartile range) in intestinal fatty acid binding protein (2140; 1260-2730 to 3245; 1985-5143 pg/ml, p = 0.003) and lactulose:rhamnose (0.31; 0.26-0.34 to 0.97; 0.82-1.07, p < 0.001). After six weeks of preseason training, players' physical qualities improved, and the same trends in blood and urinary measures were observed within the subsample. Overall, the frequency and severity of GIS were low and not correlated to markers of endothelial damage. CONCLUSIONS: Rugby training resulted in increased intestinal endothelial cell damage and permeability compared to rest. A similar magnitude of effect was observed after six weeks of preseason training. This was not related to the experience of GIS.

    The organised chaos of English adolescent rugby union; Influence of weekly match frequency on the variability of match and training loads

    The aims of this study were to determine the variability of weekly match and training loads in adolescent rugby union players across a competitive season, and to investigate the effect of match frequency on load distribution across different activities. Internal match and training load data (i.e., session rating of perceived exertion: sRPE) were collected daily from 20 players from a regional academy across a 14-week season. Data were analysed using a mixed-effects linear model, and variability was reported as a coefficient of variation (CV). Differences between 0-, 1-, 2-, and 3-match weeks were assessed using Cohen's d effect sizes and magnitude-based inferences. Mean weekly total match and training sRPE load was 1425 ± 545 arbitrary units (AU), with a between-player CV of 10 ± 6% and within-player CV of 37 ± 3%. Mean week-to-week change in total sRPE load was 497 ± 423 AU (35%), and 40% of weekly observations were outside of the suggested acute:chronic workload ratio ‘safe zone’. Total weekly sRPE loads increased substantially with match frequency (1210 ± 571, 1511 ± 489, and 1692 ± 517 AU for 0-, 1-, and 2-match weeks, respectively), except for 3-match weeks (1520 ± 442 AU). Weekly match and training loads were highly variable for adolescent rugby players during the competitive season, and match frequency has a substantial effect on the distribution of loads. Therefore, match and training loads should be coordinated, monitored, and managed on an individual basis to protect players from negative training consequences, and to promote long-term athlete development.
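    The ‘safe zone’ finding above relies on the acute:chronic workload ratio. A minimal sketch of a coupled ACWR (the current week's load divided by a rolling 4-week mean that includes that week), with the commonly cited 0.8-1.3 band as an assumed threshold and illustrative loads rather than the study's data:

    ```python
    def acwr(weekly_loads, chronic_window=4):
        """Coupled acute:chronic workload ratio: each week's load divided by
        the rolling mean of that week and the preceding chronic_window - 1 weeks."""
        ratios = []
        for i in range(chronic_window - 1, len(weekly_loads)):
            chronic = sum(weekly_loads[i - chronic_window + 1:i + 1]) / chronic_window
            ratios.append(weekly_loads[i] / chronic)
        return ratios

    # Illustrative weekly sRPE loads (AU), not the study's data
    weekly = [1200, 1500, 900, 1700, 2100, 800]
    ratios = acwr(weekly)
    # Weeks falling outside the assumed 0.8-1.3 'safe zone'
    outside = [r for r in ratios if not 0.8 <= r <= 1.3]
    ```

    With the week-to-week swings the study reports (mean change of 497 AU, 35%), spikes and troughs of this kind would routinely push the ratio outside such a band.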