204 research outputs found

    Sanctioning of breakdown infringements during the knockout stage of the 2019 Rugby World Cup

    The breakdown is a complex and dynamic facet of rugby union, often containing multiple players from each team. It is the responsibility of coaches and trainers to ensure players are prepared to comply with the World Rugby laws of the game to encourage safe and fair play among all participants. The aim of this study was to investigate player adherence and sanctioning of infringements at the breakdown during the knockout stages of the 2019 Rugby World Cup. Breakdown infringements according to World Rugby laws were identified using match video recordings of the eight knockout matches. Each breakdown was individually analysed by coding any infringement that had occurred and the sanctioning outcome of the breakdown. A total of 898 breakdowns were coded, of which 37.7% (n = 339) were deemed to involve illegal play. 79.9% of breakdowns involving illegal play were not penalised, with the most common infringements being “head and shoulders below hips” (33.5%, n = 163), “off feet” (13%, n = 63) and “offside” (10.5%, n = 51). The attacking team were responsible for 70.0% (n = 340) of all breakdown infringements despite being penalised less often than the defending team. A high number of infringements occurred at the breakdown and went unsanctioned in the knockout stages of the 2019 Rugby World Cup. Future work focused on technology, training or rule amendments may be required to improve player adherence and sanctioning of infringements at the breakdown, such that they protect players and remain in keeping with the dynamics of the modern game.

    Force experienced by the head during heading is influenced more by speed than the mechanical properties of the football

    There are growing concerns about the risk of neurodegenerative diseases associated with heading in football. It is essential to understand the biomechanics of football heading to guide player protection strategies to reduce the severity of the impact. The aim of this study was to assess the effect of football speed, mass, and stiffness on the forces experienced during football heading using mathematical and human body computational model simulations. Previous research indicates that a football header can be represented by a lumped mass mathematical model with elastic contact. Football headers were then reconstructed using a human body modeling approach. Simulations were run by independently varying the football mass, speed, and stiffness. Peak contact force experienced by the head was extracted from each simulation. The mathematical and human body computational model simulations indicate that the force experienced by the head was directly proportional to the speed of the ball and to the square root of both the ball stiffness and the ball mass. Over the practical range of ball speed, mass, and stiffness, the force experienced by the head during football heading is mainly influenced by the speed of the ball rather than its mass or stiffness. The findings suggest that it would be more beneficial to develop player protection strategies that aim to reduce the speed at which the ball is traveling when headed by a player. Law changes reducing high ball speeds could be trialed at certain age grades or as a phased introduction to football heading.
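
    As a rough illustration of the proportionalities reported above, the simplest lumped mass picture treats a ball of effective mass m striking the head at speed v, with the contact acting as a linear spring of stiffness k. The model and symbols below are an illustrative sketch of that idea, not equations taken from the paper.

        % Kinetic energy of the ball is stored in the contact "spring" at peak compression
        \tfrac{1}{2} m v^{2} = \tfrac{1}{2} k x_{\max}^{2}
        \quad\Rightarrow\quad x_{\max} = v \sqrt{m / k}
        % Peak contact force then follows from Hooke's law
        F_{\max} = k \, x_{\max} = v \sqrt{k m}

    In this sketch the peak force grows linearly with ball speed but only with the square root of stiffness and mass, consistent with the abstract's conclusion that reducing ball speed is the more effective lever.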

    Does tackle height influence offload success in rugby union? Analysis from the 2019 Rugby World Cup

    Offloads are an effective way of breaking through a defensive line in rugby union. Higher tackle heights are considered an effective strategy to defend against offloads. However, in a bid to reduce head injuries, there is a cultural shift within the rules of the game towards tackling lower down on the body. This study used match video analysis of ten games from the 2019 Rugby World Cup to investigate whether tackle height influences offload success for the ball carrier. Each legal tackle was categorised based on tackle height (e.g. shoulder), player body position (e.g. upright), tackle type (e.g. shoulder tackle), tackle direction (e.g. front on) and player position (e.g. tight forwards). For each characteristic, the Odds Ratio (OR) and 95% Confidence Interval (CI) were calculated based on offload success outcome. Tackles at the hip (OR = 1.81, 95% CI 1.10 to 2.96, p = 0.018) and upper leg (OR = 1.94, 95% CI 1.30 to 2.90, p = 0.001) had a greater propensity to result in offload success, while tackles at shoulder height reduced offload success (OR = 0.09, 95% CI 0.04 to 0.22, p < 0.001). A bent-at-the-waist tackler against an upright ball carrier had a greater propensity to result in offload success (OR = 1.74, 95% CI 1.19 to 2.54, p = 0.004). Tackling lower increased the chances of offload success for the ball carrier. The cultural shift towards lower tackle heights is likely to result in an increased number of offloads, and it is up to players, coaches and defensive systems to adapt to this.
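
    For readers unfamiliar with the statistic, the odds ratios and 95% confidence intervals quoted above can be computed from a 2x2 table of tackle height against offload outcome. The Python sketch below uses the standard log-odds (Woolf) approximation; the counts are invented for illustration and are not data from the study.

        import math

        def odds_ratio_ci(a, b, c, d, z=1.96):
            # a = exposed with outcome, b = exposed without outcome,
            # c = unexposed with outcome, d = unexposed without outcome
            odds_ratio = (a * d) / (b * c)
            se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR), Woolf method
            lower = math.exp(math.log(odds_ratio) - z * se_log_or)
            upper = math.exp(math.log(odds_ratio) + z * se_log_or)
            return odds_ratio, lower, upper

        # Hypothetical counts: offloads after hip-height tackles vs all other tackles
        print(odds_ratio_ci(a=40, b=180, c=60, d=490))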

    Can tackle height influence tackle gainline success outcomes in elite level rugby union?

    In rugby union, effective defensive play is highly technical and essential for game outcomes. Therefore, the aim of this study was to identify tackle heights, for given tackle types, that had a greater propensity to result in tackle gainline success for the tackler using match video evidence. The results indicated that tackling the upper legs of the ball carrier had a greater propensity to result in tackler success for both front-on (OR = 3.27; 95% CI = 1.34–7.95; p < 0.01) and side-on (OR = 5.31; 95% CI = 2.08–13.6; p < 0.01) arm tackles. For shoulder tackles, tackling at the lower trunk for front-on tackles (OR = 1.70; 95% CI = 1.04–2.79; p = 0.03) and the mid trunk for side-on tackles (OR = 3.11; 95% CI = 1.31–7.37; p < 0.01) had a greater propensity to result in tackler success. For smother tackles, tackling at the mid trunk had a greater propensity to result in tackler success during front-on (OR = 3.49; 95% CI = 1.81–6.74; p < 0.01) and side-on (OR = 5.11; 95% CI = 2.42–10.8; p < 0.01) tackles. The results highlight the importance of tackle height when coaching the tackle. The findings also suggest that technically proficient players can advance to more challenging contact techniques than aiming for the ball carrier’s centre of gravity.

    The effect of technique on tackle gainline success outcomes in elite level rugby union

    Tackling is a major component of rugby union, and effective attacking and defensive play are essential for game outcomes. In this study, a number of pre-contact, contact and post-contact tackle characteristics that had an influence on tackle gainline success for the ball carrier and tackler were identified using match video evidence from European Rugby Champions Cup games. A total of 122 front-on tackles and 111 side-on tackles were analysed. For each ball carrier and tackler characteristic, the Odds Ratio and 95% Confidence Interval were calculated based on a gainline success outcome. Chi-Square, Phi and Cramer’s V calculations were also conducted. A Chi-Square test then identified any statistically significant differences (p < 0.05) in proficiency characteristics between playing positions. For both the ball carrier and tackler, tackle characteristics that were indicative of strong and powerful tackle technique, such as ‘explosiveness on contact’ and ‘leg drive on contact’, were effective for achieving the desired gainline outcome. Playing position had an influence on only two proficiency characteristics that were statistically significant for gainline success: ‘fending into contact’ for ball carriers and ‘straight back, centre of gravity forward of support base’ for tacklers.
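
    The Chi-Square and Cramer's V statistics mentioned above test whether a proficiency characteristic is distributed differently across playing positions and how strong that association is. A minimal Python sketch follows, assuming SciPy is available; the contingency table is hypothetical and not data from the study.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Hypothetical counts of a proficiency characteristic (shown / not shown)
        # for three positional groups; not data from the study.
        observed = np.array([[30, 12],   # tight forwards
                             [22, 20],   # loose forwards
                             [15, 23]])  # backs

        chi2, p_value, dof, expected = chi2_contingency(observed)
        n = observed.sum()
        cramers_v = np.sqrt(chi2 / (n * (min(observed.shape) - 1)))
        print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}, Cramer's V = {cramers_v:.2f}")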

    Autism as a disorder of neural information processing: directions for research and targets for therapy

    The broad variation in phenotypes and severities within autism spectrum disorders suggests the involvement of multiple predisposing factors, interacting in complex ways with normal developmental courses and gradients. Identification of these factors, and the common developmental path into which they feed, is hampered by the large degrees of convergence from causal factors to altered brain development, and divergence from abnormal brain development into altered cognition and behaviour. Genetic, neurochemical, neuroimaging and behavioural findings on autism, as well as studies of normal development and of genetic syndromes that share symptoms with autism, offer hypotheses as to the nature of causal factors and their possible effects on the structure and dynamics of neural systems. Such alterations in neural properties may in turn perturb activity-dependent development, giving rise to a complex behavioural syndrome many steps removed from the root causes. Animal models based on genetic, neurochemical, neurophysiological, and behavioural manipulations offer the possibility of exploring these developmental processes in detail, as do human studies addressing endophenotypes beyond the diagnosis itself.

    Do computerised clinical decision support systems for prescribing change practice? A systematic review of the literature (1990-2007)

    Computerised clinical decision support systems (CDSSs) are used widely to improve quality of care and patient outcomes. This systematic review evaluated the impact of CDSSs in targeting specific aspects of prescribing, namely initiating, monitoring and stopping therapy. We also examined the influence of clinical setting (institutional vs ambulatory care), system- or user-initiation of CDSS, multi-faceted vs stand-alone CDSS interventions and clinical target on practice changes in line with the intent of the CDSS. We searched Medline, Embase and PsycINFO for publications from 1990-2007 detailing CDSS prescribing interventions. Pairs of independent reviewers extracted the key features and prescribing outcomes of methodologically adequate studies (experiments and strong quasi-experiments). 56 studies met our inclusion criteria, 38 addressing initiating, 23 monitoring and three stopping therapy. At the time of initiating therapy, CDSSs appear to be somewhat more effective after, rather than before, drug selection has occurred (7/12 versus 12/26 studies reporting statistically significant improvements in favour of CDSSs on ≥ 50% of prescribing outcomes reported). CDSSs also appeared to be effective for monitoring therapy, particularly using laboratory test reminders (4/7 studies reporting significant improvements in favour of CDSSs on the majority of prescribing outcomes). None of the studies addressing stopping therapy demonstrated impacts in favour of CDSSs over comparators. The most consistently effective approaches used system-initiated advice to fine-tune existing therapy by making recommendations to improve patient safety, adjust the dose, duration or form of prescribed drugs, or increase the laboratory testing rates for patients on long-term therapy. CDSSs appeared to perform better in institutional compared to ambulatory settings and when decision support was initiated automatically by the system as opposed to user initiation. CDSSs implemented with other strategies such as education were no more successful in improving prescribing than stand-alone interventions. Cardiovascular disease was the most studied clinical target, but few studies demonstrated significant improvements on the majority of prescribing outcomes. Our understanding of CDSS impacts on specific aspects of the prescribing process remains relatively limited. Future implementation should build on effective approaches, including the use of system-initiated advice to address safety issues and improve the monitoring of therapy.

    Evolving health information technology and the timely availability of visit diagnoses from ambulatory visits: A natural experiment in an integrated delivery system

    Background: Health information technology (HIT) may improve health care quality and outcomes, in part by making information available in a timelier manner. However, there are few studies documenting the changes in timely availability of data with the use of a sophisticated electronic medical record (EMR), nor a description of how the timely availability of data might differ with different types of EMRs. We hypothesized that timely availability of data would improve with use of increasingly sophisticated forms of HIT. Methods: We used an historical observation design (2004–2006) using electronic data from office visits in an integrated delivery system with three types of HIT: Basic, Intermediate, and Advanced. We calculated the monthly percentage of visits using the various types of HIT for entry of visit diagnoses into the delivery system's electronic database, and the time between the visit and the availability of the visit diagnoses in the database. Results: In January 2004, when only Basic HIT was available, 10% of office visits had diagnoses entered on the same day as the visit and 90% within a week; 85% of office visits used paper forms for recording visit diagnoses, 16% used Basic at that time. By December 2006, 95% of all office visits had diagnoses available on the same day as the visit, when 98% of office visits used some form of HIT for entry of visit diagnoses (Advanced HIT for 67% of visits). Conclusion: Use of HIT systems is associated with dramatic increases in the timely availability of diagnostic information, though the effects may vary by sophistication of HIT system. Timely clinical data are critical for real-time population surveillance, and valuable for routine clinical care.

    CTCF Prevents the Epigenetic Drift of EBV Latency Promoter Qp

    The establishment and maintenance of Epstein-Barr Virus (EBV) latent infection requires distinct viral gene expression programs. These gene expression programs, termed latency types, are determined largely by promoter selection, and controlled through the interplay between cell-type specific transcription factors, chromatin structure, and epigenetic modifications. We used a genome-wide chromatin-immunoprecipitation (ChIP) assay to identify epigenetic modifications that correlate with different latency types. We found that the chromatin insulator protein CTCF binds at several key regulatory nodes in the EBV genome and may compartmentalize epigenetic modifications across the viral genome. Highly enriched CTCF binding sites were identified at the promoter regions upstream of Cp, Wp, EBERs, and Qp. Since Qp is essential for long-term maintenance of viral genomes in type I latency and epithelial cell infections, we focused on the role of CTCF in regulating Qp. Purified CTCF bound ∼40 bp upstream of the EBNA1 binding sites located at +10 bp relative to the transcriptional initiation site at Qp. Mutagenesis of the CTCF binding site in EBV bacmids resulted in a decrease in the recovery of stable hygromycin-resistant episomes in 293 cells. EBV lacking the Qp CTCF site showed a decrease in Qp transcription initiation and a corresponding increase in Cp and Fp promoter utilization at 8 weeks post-transfection. However, by 16 weeks post-transfection, bacmids lacking CTCF sites had no detectable Qp transcription and showed high levels of histone H3 K9 methylation and CpG DNA methylation at the Qp initiation site. These findings provide direct genetic evidence that CTCF functions as a chromatin insulator that prevents the promiscuous transcription of surrounding genes and blocks the epigenetic silencing of an essential promoter, Qp, during EBV latent infection.

    The Antiquity and Evolutionary History of Social Behavior in Bees

    A long-standing controversy in bee social evolution concerns whether highly eusocial behavior has evolved once or twice within the corbiculate Apidae. Corbiculate bees include the highly eusocial honey bees and stingless bees, the primitively eusocial bumble bees, and the predominantly solitary or communal orchid bees. Here we use a model-based approach to reconstruct the evolutionary history of eusociality and date the antiquity of eusocial behavior in apid bees, using a recent molecular phylogeny of the Apidae. We conclude that eusociality evolved once in the common ancestor of the corbiculate Apidae, advanced eusociality evolved independently in the honey and stingless bees, and that eusociality was lost in the orchid bees. Fossil-calibrated divergence time estimates reveal that eusociality first evolved at least 87 Mya (78 to 95 Mya) in the corbiculates, much earlier than in other groups of bees with less complex social behavior. These results provide a robust new evolutionary framework for studies of the organization and genetic basis of social behavior in honey bees and their relatives.