
    The first decade of web-based sports injury surveillance: Descriptive epidemiology of injuries in United States high school football (2005-2006 through 2013-2014) and National Collegiate Athletic Association football (2004-2005 through 2013-2014)

    Context: The advent of Web-based sports injury surveillance via programs such as the High School Reporting Information Online system and the National Collegiate Athletic Association Injury Surveillance Program has aided the acquisition of football injury data. Objective: To describe the epidemiology of injuries sustained in high school football in the 2005-2006 through 2013-2014 academic years and collegiate football in the 2004-2005 through 2013-2014 academic years using Web-based sports injury surveillance. Design: Descriptive epidemiology study. Setting: Online injury surveillance from football teams of high school boys (annual average = 100) and collegiate men (annual average = 43). Patients or Other Participants: Football players who participated in practices and competitions during the 2005-2006 through 2013-2014 academic years in high school or the 2004-2005 through 2013-2014 academic years in college. Main Outcome Measure(s): Athletic trainers collected time-loss injury (≥24 hours) and exposure data. Injury rates per 1000 athlete-exposures (AEs), injury rate ratios (IRRs) with 95% confidence intervals (CIs), and injury proportions by body site and diagnosis were calculated. Results: The High School Reporting Information Online system documented 18 189 time-loss injuries during 4 539 636 AEs; the National Collegiate Athletic Association Injury Surveillance Program documented 22 766 time-loss injuries during 3 121 476 AEs. The injury rate was higher among collegiate than high school (7.29 versus 4.01/1000 AEs; IRR = 1.82; 95% CI = 1.79, 1.86) athletes. Most injuries occurred during competitions in high school (53.2%) and practices in college (60.9%). The competition injury rate was higher than the practice injury rate among both high school (IRR = 5.62; 95% CI = 5.46, 5.78) and collegiate (IRR = 6.59; 95% CI = 6.41, 6.76) players. Most injuries at both levels affected the lower extremity and the shoulder/clavicle and were diagnosed as ligament sprains and muscle/tendon strains. However, concussion was a common injury during competitions among most positions. Conclusions: Injury rates were higher in college than in high school and higher for competitions than for practices. Concussion was a frequent injury sustained during competitions, which confirms the need to develop interventions to mitigate its incidence and severity.
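    The rates and rate ratios reported in these surveillance abstracts can be reproduced directly from the published counts. A minimal Python sketch, assuming the standard log-scale Wald interval for a rate ratio (function names are illustrative, not from the studies):

    ```python
    import math

    def rate_per_1000(injuries, exposures):
        """Injury rate per 1000 athlete-exposures (AEs)."""
        return 1000 * injuries / exposures

    def irr_with_ci(inj_a, ae_a, inj_b, ae_b, z=1.96):
        """Injury rate ratio (group A vs. group B) with a 95% CI
        computed on the log scale, SE = sqrt(1/a + 1/b)."""
        irr = rate_per_1000(inj_a, ae_a) / rate_per_1000(inj_b, ae_b)
        se = math.sqrt(1 / inj_a + 1 / inj_b)
        lo = math.exp(math.log(irr) - z * se)
        hi = math.exp(math.log(irr) + z * se)
        return irr, lo, hi

    # Counts reported above: collegiate vs. high school football
    irr, lo, hi = irr_with_ci(22766, 3121476, 18189, 4539636)
    print(f"college rate = {rate_per_1000(22766, 3121476):.2f}/1000 AEs")  # 7.29
    print(f"HS rate      = {rate_per_1000(18189, 4539636):.2f}/1000 AEs")  # 4.01
    print(f"IRR = {irr:.2f} (95% CI {lo:.2f}, {hi:.2f})")  # 1.82 (1.79, 1.86)
    ```

    Running this on the football counts recovers the abstract's 7.29 and 4.01 per 1000 AEs and the IRR of 1.82 (95% CI = 1.79, 1.86); the same arithmetic applies to the lacrosse reports below.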

    The first decade of web-based sports injury surveillance: Descriptive epidemiology of injuries in US high school girls' lacrosse (2008-2009 through 2013-2014) and National Collegiate Athletic Association women's lacrosse (2004-2005 through 2013-2014)

    Context: The advent of Web-based sports injury surveillance via programs such as the High School Reporting Information Online (HS RIO) system and the National Collegiate Athletic Association Injury Surveillance Program (NCAA-ISP) has aided the acquisition of girls' and women's lacrosse injury data. Objective: To describe the epidemiology of injuries sustained in high school girls' lacrosse in the 2008-2009 through 2013-2014 academic years and collegiate women's lacrosse in the 2004-2005 through 2013-2014 academic years using Web-based sports injury surveillance. Design: Descriptive epidemiology study. Setting: Online injury surveillance from high school girls' (annual average = 55) and collegiate women's (annual average = 19) lacrosse teams. Patients or Other Participants: Female lacrosse players who participated in practices or competitions during the 2008-2009 through 2013-2014 academic years for high school or the 2004-2005 through 2013-2014 academic years for college. Main Outcome Measure(s): Athletic trainers collected time-loss injury (≥24 hours) and exposure data. We calculated injury rates per 1000 athlete-exposures (AEs), injury rate ratios (IRRs) with 95% confidence intervals (CIs), and injury proportions by body site and diagnosis. Results: HS RIO documented 700 time-loss injuries during 481 687 AEs; the NCAA-ISP documented 1027 time-loss injuries during 287 856 AEs. The total injury rate during 2008-2009 through 2013-2014 was higher in college than in high school (2.55 versus 1.45/1000 AEs; IRR = 1.75; 95% CI = 1.54, 1.99). Most injuries occurred during competitions in high school (51.1%) and practices in college (63.8%). Rates were higher during competitions compared with practices in high school (IRR = 2.32; 95% CI = 2.00, 2.69) and college (IRR = 2.38; 95% CI = 2.09, 2.70). Concussion was the most common diagnosis among all high school and most collegiate player positions, and the main mechanism of contact was with a playing apparatus (eg, stick, ball). Ligament sprains were also common (HS RIO practices = 22.2%, competitions = 30.3%; NCAA-ISP practices = 25.5%, competitions = 30.9%). Conclusions: Rates of injury were higher in college versus high school female lacrosse players and in competitions versus practices. Injury-prevention strategies are essential to decrease the incidence and severity of concussions and ligament sprains.

    The first decade of web-based sports injury surveillance: Descriptive epidemiology of injuries in US high school boys' lacrosse (2008-2009 through 2013-2014) and National Collegiate Athletic Association men's lacrosse (2004-2005 through 2013-2014)

    Context: The advent of Web-based sports injury surveillance via programs such as the High School Reporting Information Online system and the National Collegiate Athletic Association Injury Surveillance Program has aided the acquisition of boys' and men's lacrosse injury data. Objective: To describe the epidemiology of injuries sustained in high school boys' lacrosse in the 2008-2009 through 2013-2014 academic years and collegiate men's lacrosse in the 2004-2005 through 2013-2014 academic years using Web-based sports injury surveillance. Design: Descriptive epidemiology study. Setting: Online injury surveillance from lacrosse teams of high school boys (annual average = 55) and collegiate men (annual average = 14). Patients or Other Participants: Boys' and men's lacrosse players who participated in practices and competitions during the 2008-2009 through 2013-2014 academic years in high school or the 2004-2005 through 2013-2014 academic years in college. Main Outcome Measure(s): Athletic trainers collected time-loss (≥24 hours) injury and exposure data. Injury rates per 1000 athlete-exposures (AEs), injury rate ratios (IRRs) with 95% confidence intervals (CIs), and injury proportions by body site and diagnosis were calculated. Results: High School Reporting Information Online documented 1407 time-loss injuries during 662 960 AEs. The National Collegiate Athletic Association Injury Surveillance Program documented 1882 time-loss injuries during 390 029 AEs. The total injury rate from 2008-2009 through 2013-2014 was higher in college than in high school (3.77 versus 2.12/1000 AEs; IRR = 1.78; 95% CI = 1.63, 1.94). Most injuries occurred during competitions in high school (61.4%) and practices in college (61.4%). Injury rates were higher in competitions compared with practices in high school (IRR = 3.59; 95% CI = 3.23, 4.00) and college (IRR = 3.38; 95% CI = 3.08, 3.71). Lower limb injuries, muscle strains, and ligament sprains were common at both levels. Concussion was the most frequent competition diagnosis for all high school player positions. Conclusions: Rates of time-loss injury were higher in college versus high school and in competitions versus practices. Attention to preventing common lower leg injuries and concussions, especially at the high school level, is essential to decrease their incidence and severity.

    Heavy quarkonium: progress, puzzles, and opportunities

    A golden age for heavy quarkonium physics dawned a decade ago, initiated by the confluence of exciting advances in quantum chromodynamics (QCD) and an explosion of related experimental activity. The early years of this period were chronicled in the Quarkonium Working Group (QWG) CERN Yellow Report (YR) in 2004, which presented a comprehensive review of the status of the field at that time and provided specific recommendations for further progress. However, the broad spectrum of subsequent breakthroughs, surprises, and continuing puzzles could only be partially anticipated. Since the release of the YR, the BESII program concluded only to give birth to BESIII; the B-factories and CLEO-c flourished; quarkonium production and polarization measurements at HERA and the Tevatron matured; and heavy-ion collisions at RHIC have opened a window on the deconfinement regime. All these experiments leave legacies of quality, precision, and unsolved mysteries for quarkonium physics, and therefore beg for continuing investigations. The plethora of newly found quarkonium-like states unleashed a flood of theoretical investigations into new forms of matter such as quark-gluon hybrids, mesonic molecules, and tetraquarks. Measurements of the spectroscopy, decays, production, and in-medium behavior of c\bar{c}, b\bar{b}, and b\bar{c} bound states have been shown to validate some theoretical approaches to QCD and highlight the lack of quantitative success of others. The intriguing details of quarkonium suppression in heavy-ion collisions that have emerged from RHIC have elevated the importance of separating hot- and cold-nuclear-matter effects in quark-gluon plasma studies. This review systematically addresses all these matters and concludes by prioritizing directions for ongoing and future efforts.
    Comment: 182 pages, 112 figures. Editors: N. Brambilla, S. Eidelman, B. K. Heltsley, R. Vogt. Section Coordinators: G. T. Bodwin, E. Eichten, A. D. Frawley, A. B. Meyer, R. E. Mitchell, V. Papadimitriou, P. Petreczky, A. A. Petrov, P. Robbe, A. Vair

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that cover a variety of research fields such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article/s. The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
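    Two of the baseline methods named above, Okapi BM25 and TF-IDF, score document relevance from term statistics. A minimal pure-Python sketch of the TF-IDF variant on a toy corpus (the documents, tokenization, and weighting details are illustrative and not the RELISH evaluation pipeline):

    ```python
    import math
    from collections import Counter

    # Toy stand-ins for PubMed abstracts (hypothetical text).
    docs = [
        "concussion rates in high school football players",
        "ligament sprains in collegiate lacrosse athletes",
        "web based surveillance of football injury rates",
    ]

    def tfidf_vectors(corpus):
        """Raw term frequency weighted by log inverse document frequency."""
        n = len(corpus)
        tokenized = [doc.split() for doc in corpus]
        df = Counter(term for doc in tokenized for term in set(doc))
        idf = {t: math.log(n / df[t]) for t in df}
        return [{t: tf * idf[t] for t, tf in Counter(doc).items()}
                for doc in tokenized]

    def cosine(u, v):
        """Cosine similarity between two sparse term-weight dicts."""
        dot = sum(w * v.get(t, 0.0) for t, w in u.items())
        norm = (math.sqrt(sum(x * x for x in u.values()))
                * math.sqrt(sum(x * x for x in v.values())))
        return dot / norm if norm else 0.0

    vecs = tfidf_vectors(docs)
    # The surveillance abstract shares "football"/"rates" with doc 0 only,
    # so it should score closer to doc 0 than to doc 1.
    sims = [cosine(vecs[2], vecs[i]) for i in (0, 1)]
    print(sims)
    ```

    A seed-article recommender in this style ranks the corpus by `cosine` against the seed's vector; the paper's finding that BM25, TF-IDF, and PubMed Related Articles retrieve distinct sets is why a hybrid of such scorers is suggested.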

    A modelling approach to estimate the effect of exotic pollinators on exotic weed population dynamics: bumblebees and broom in Australia

    The role of mutualisms in contributing to species invasions is rarely considered, inhibiting effective risk analysis and management options. Potential ecological consequences of invasion of non-native pollinators include increased pollination and seed set of invasive plants, with subsequent impacts on population growth rates and rates of spread. We outline a quantitative approach for evaluating the impact of a proposed introduction of an invasive pollinator on existing weed population dynamics and demonstrate the use of this approach on a relatively data-rich case study: the impacts on Cytisus scoparius (Scotch broom) from proposed introduction of Bombus terrestris. Three models have been used to assess population growth (matrix model), spread speed (integrodifference equation), and equilibrium occupancy (lattice model) for C. scoparius. We use available demographic data for an Australian population to parameterize two of these models. Increased seed set due to more efficient pollination resulted in a higher population growth rate in the density-independent matrix model, whereas simulations of enhanced pollination scenarios had a negligible effect on equilibrium weed occupancy in the lattice model. This is attributed to strong microsite limitation of recruitment in invasive C. scoparius populations observed in Australia and incorporated in the lattice model. A lack of information regarding secondary ant dispersal of C. scoparius prevents us from parameterizing the integrodifference equation model for Australia, but studies of invasive populations in California suggest that spread speed will also increase with higher seed set. For microsite-limited C. scoparius populations, increased seed set has minimal effects on equilibrium site occupancy. However, for density-independent rapidly invading populations, increased seed set is likely to lead to higher growth rates and spread speeds. 
The impacts of introduced pollinators on native flora and fauna and the potential for promoting range expansion in pollinator-limited 'sleeper weeds' also remain substantial risks.
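    The density-independent result above follows from the fact that a matrix model's asymptotic population growth rate is the dominant eigenvalue of the stage matrix, so raising the fecundity entry (seed set per adult) raises the growth rate. A sketch with a hypothetical three-stage (seed bank, juvenile, adult) matrix whose entries are made up for illustration, not the paper's fitted Australian parameters:

    ```python
    def stage_matrix(seeds_per_adult):
        """Hypothetical stage matrix: rows/cols are seed bank, juvenile, adult."""
        return [
            [0.20, 0.00, seeds_per_adult],  # seed-bank survival + new seed input
            [0.05, 0.30, 0.00],             # germination + juvenile stasis
            [0.00, 0.10, 0.90],             # maturation + adult survival
        ]

    def growth_rate(A, iters=2000):
        """Dominant eigenvalue (asymptotic growth rate) via power iteration."""
        n = [1.0] * len(A)
        lam = 1.0
        for _ in range(iters):
            n = [sum(A[i][j] * n[j] for j in range(len(A))) for i in range(len(A))]
            lam = max(n)               # max-norm estimate of the eigenvalue
            n = [x / lam for x in n]   # renormalize the stage vector
        return lam

    baseline = growth_rate(stage_matrix(50.0))
    enhanced = growth_rate(stage_matrix(65.0))  # e.g. +30% seed set from bumblebees
    print(baseline, enhanced)  # enhanced pollination yields the larger lambda
    ```

    In this density-independent setting any increase in the fecundity entry strictly raises the dominant eigenvalue; capturing the microsite limitation that muted the effect in the lattice model would require adding a recruitment ceiling that a simple matrix projection lacks.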

    Extracellular matrix and its role in spermatogenesis

    In the adult mammalian testis, such as that of the rat, Sertoli cells and germ cells at different stages of their development in the seminiferous epithelium are in close contact with the basement membrane, a modified form of extracellular matrix (ECM). In essence, Sertoli cells and germ cells, in particular spermatogonia, are “resting” on the basement membrane at different stages of the seminiferous epithelial cycle, relying on its structural and hormonal support. Thus, it is not entirely unexpected that the ECM plays a significant role in regulating spermatogenesis, particularly spermatogonia and Sertoli cells and the blood-testis barrier (BTB) constituted by Sertoli cells, since these cells are in physical contact with the basement membrane. Additionally, the basement membrane is in close contact with the underlying collagen network and the myoid cell layers, which, together with the lymphatic network, constitute the tunica propria. The seminiferous epithelium and the tunica propria, in turn, constitute the seminiferous tubule, the functional unit that produces spermatozoa via its interaction with Leydig cells in the interstitium. In short, the basement membrane and the underlying collagen network that create the acellular zone of the tunica propria may even facilitate cross-talk between the seminiferous epithelium, the myoid cells, and cells in the interstitium. Recent studies in the field have illustrated the crucial role of the ECM in supporting Sertoli and germ cell function in the seminiferous epithelium, including BTB dynamics. In this chapter, we summarize some of the latest findings regarding the functional role of the ECM in spermatogenesis, using the adult rat testis as a model, and highlight specific areas of research that deserve attention from investigators in the field.