39 research outputs found

    A 2-year prospective study of injury epidemiology in elite Australian rugby sevens: Exploration of incidence rates, severity, injury type, and subsequent injury in men and women

    BACKGROUND: Injuries are common in rugby sevens, but studies to date have been limited to short, noncontinuous periods and reporting of match injuries only. PURPOSE: To report the injury incidence rate (IIR), severity, and burden of injuries sustained by men and women in the Australian rugby sevens program and to provide the first longitudinal investigation of subsequent injury occurrence in rugby sevens, looking beyond tournament injuries only. STUDY DESIGN: Descriptive epidemiology study. METHODS: Ninety international rugby sevens players (55 men and 35 women) were prospectively followed over 2 consecutive seasons (2015-2016 and 2016-2017). All medical attention injuries were reported, irrespective of time loss. Individual exposure in terms of minutes, distance, and high-speed distance was captured for each player for matches and on-field training with the use of global positioning system devices. The IIR and injury burden (IIR × days lost to injury) were calculated per 1000 player-hours, and descriptive analyses were performed. RESULTS: Seventy-three players (81.1%) sustained 365 injuries, at an IIR of 43.2 per 1000 player-hours (95% CI, 43.0-43.3). Compared with male players, female players experienced a lower IIR (incidence rate ratio, 0.91; 95% CI, 0.90-0.91). Female players also sustained a higher proportion of injuries to the trunk region (relative risk, 1.75; 95% CI, 1.28-2.40) but a lower proportion to the head/neck region (relative risk, 0.58; 95% CI, 0.37-0.93; P = .011). The majority (80.7%) of subsequent injuries occurred at a different site and were of a different nature than the previous injury. A trend toward a reduced number of days, participation time, distance, and high-speed distance completed before the next injury was observed with each successive injury. CONCLUSION: Female players have a lower IIR than male players, with variation in injury profiles observed between sexes. Over a 2-year surveillance period, subsequent injuries accounted for the majority of injuries sustained in rugby sevens, and they were typically different in site and nature from previous injuries. After each successive injury, the risk profile for future injury occurrence appears to be altered, which warrants further investigation to inform injury prevention strategies in rugby sevens.
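
    As a worked illustration of the arithmetic reported above (IIR = injuries per 1000 player-hours; burden = IIR × days lost), the following sketch computes these quantities. The exposure hours, mean days lost, and the simple Poisson-style confidence interval are assumptions for illustration only and do not reproduce the study's own data or statistical methods.

    ```python
    # Illustrative sketch of the rate calculations described in the abstract above.
    # Exposure hours, days lost, and the CI method are assumptions, not study data.
    import math

    def injury_incidence_rate(injuries: int, exposure_hours: float) -> float:
        """Injuries per 1000 player-hours."""
        return injuries / exposure_hours * 1000.0

    def poisson_rate_ci(injuries: int, exposure_hours: float, z: float = 1.96):
        """Approximate 95% CI for the rate, treating the injury count as Poisson."""
        rate = injury_incidence_rate(injuries, exposure_hours)
        se_log = 1.0 / math.sqrt(injuries)  # standard error of log(rate)
        return rate * math.exp(-z * se_log), rate * math.exp(z * se_log)

    def injury_burden(iir: float, mean_days_lost: float) -> float:
        """Burden = IIR x mean days lost, i.e. days lost per 1000 player-hours."""
        return iir * mean_days_lost

    # Hypothetical inputs: 365 injuries over an assumed 8450 player-hours of exposure
    iir = injury_incidence_rate(365, 8450.0)
    low, high = poisson_rate_ci(365, 8450.0)
    print(f"IIR = {iir:.1f} per 1000 player-hours (approx. 95% CI {low:.1f}-{high:.1f})")
    print(f"Burden = {injury_burden(iir, 10.0):.0f} days lost per 1000 player-hours")  # assumed mean of 10 days lost per injury
    ```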

    A cross sectional survey of international horse-racing authorities on injury data collection and reporting practices for professional jockeys

    Jockey injuries are common in professional horse-racing and can result in life-threatening or career-ending outcomes. Robust injury data are essential to understand the circumstances of injury occurrence and, ultimately, to identify prevention opportunities. This study aimed to identify the jockey injury surveillance practices of international horse-racing authorities (HRAs) and the specific data items collected and reported by each HRA. A cross-sectional survey of representatives (e.g. Chief Medical Officer) from international HRAs was conducted. An online and paper questionnaire comprising 32 questions was designed. Questions addressed the barriers and facilitators to data collection within each HRA and, where available, what data were collected and reported by HRAs. Representatives from 15 international racing jurisdictions were included, of which 12 reported collecting race day injuries or falls, using varied definitions of medical attention and time loss. Six HRAs did not have a definition for a jockey injury, and eight HRAs had no parameters for describing injury severity. Race day exposure was collected by two HRAs. Results were most commonly presented by HRAs as the number of injuries (n = 9/15) or the proportion of injured jockeys (n = 6/15). The lack of a designated role for the collection, collation and reporting of data was the main barrier to injury surveillance. Twelve HRAs agreed that mandatory collection would be a strong facilitator of improved practice. Enhancement and standardization of international jockey injury surveillance are required to move forward with evidence-informed prevention. Concurrent investigation of how reporting practices can best be supported within existing HRA structures is recommended.

    Comparison of subsequent injury categorisation (SIC) models and their application in a sporting population

    Background: The original subsequent injury categorisation (SIC-1.0) model aimed to classify relationships between chronological injury sequences to provide insight into the complexity and causation of subsequent injury occurrence. An updated model has recently been published, but data coded according to the original and revised subsequent injury categorisation models (SIC-1.0 and SIC-2.0) have not yet been formally compared. Methods: Medical attention injury data were prospectively collected for 42 elite water polo players over an 8-month surveillance period. The SIC-1.0 and SIC-2.0 models were retrospectively applied to the injury data, and the categorisations from the two models were compared using descriptive statistics. Results: Seventy-four injuries were sustained by the 42 players (median = 2, range = 0-5), of which 32 injuries (43.2%) occurred subsequent to a previous injury. The majority of subsequent injuries were coded as occurring at a different site and being of a different nature, while also being considered clinically unrelated to the previous injury (SIC-1.0 category 10 = 57.9%; SIC-2.0 clinical category 16 = 54.4%). Application of the SIC-2.0 model resulted in a wider distribution of category allocations than the SIC-1.0 model, reflecting the greater precision of SIC-2.0. Conclusions: Subsequent injury categorisation of sport injury data can be undertaken using either the original (SIC-1.0) or the revised (SIC-2.0) model to obtain similar results. However, the SIC-2.0 model offers the ability to identify a larger number of mutually exclusive categories while not relying on clinical adjudication for category allocation. The increased precision of SIC-2.0 is advantageous for clinical application and for consideration of injury relationships.

    Factors influencing quality of life following lower limb amputation for peripheral arterial occlusive disease: a systematic review of the literature

    Background: The majority of lower limb amputations are undertaken in people with peripheral arterial occlusive disease, and approximately 50% have diabetes. Quality of life is an important outcome in lower limb amputations; little is known about what influences it, and therefore how to improve it. Objectives: The aim of this systematic review was to identify the factors that influence quality of life after lower limb amputation for peripheral arterial occlusive disease. Methods: MEDLINE, EMBASE, CINAHL, PsycINFO, Web of Science and Cochrane databases were searched to identify articles that quantitatively measured quality of life in those with a lower limb amputation for peripheral arterial occlusive disease. Articles were quality assessed by two assessors, evidence tables summarised each article and a narrative synthesis was performed. Study design: Systematic review. Results: Twelve articles were included. Study designs and outcome measures used varied. Quality assessment scores ranged from 36% to 92%. The ability to walk successfully with a prosthesis had the greatest positive impact on quality of life. A trans-femoral amputation was negatively associated with quality of life due to increased difficulty in walking with a prosthesis. Other factors such as older age, being male, longer time since amputation, level of social support and presence of diabetes also negatively affected quality of life. Conclusion: Being able to walk with a prosthesis is of primary importance to improve quality of life for people with lower limb amputation due to peripheral arterial occlusive disease. To further understand and improve the quality of life of this population, there is a need for more prospective longitudinal studies with a standardised outcome measure.

    Injury in starting and replacement players from five professional men’s rugby unions

    Objectives: The aim of this study was to compare the incidence, severity, and burden of injury in starting and replacement players from professional men’s teams of five rugby unions. Methods: Match injuries of greater than 24 h time-loss (including data on severity, match quarter, event, and body region) and player minutes of match exposure were collated for all starting and replacement players in the men’s English Premiership, Welsh Pro14 (both 2016/17–2018/19 seasons), and Australian, New Zealand, and South African Super Rugby (all 2016–2018 seasons) teams. Injury incidences and mean injury burden (incidence × days missed) were calculated, and rate ratios (RRs) with 95% confidence intervals (CIs) were used to compare injury incidence and burden between starting (reference group) and replacement players. Results: Overall injury incidence did not differ between starters and replacements for all injuries (RR = 0.98, 95% CI 0.88–1.10) or for concussions (RR = 0.85, 95% CI 0.66–1.11). Mean injury burden was higher for replacement players (RR = 1.31, 95% CI 1.17–1.46). Replacement injury incidence was lower than that of starters in the third (RR = 0.68, 95% CI 0.51–0.92) and fourth (RR = 0.78, 95% CI 0.67–0.92) match quarters. Injury incidence did not differ between starters and replacements for any match event or body region, but compared with starters, replacements’ injury burden was higher for lower limb injuries (RR = 1.24, 95% CI 1.05–1.46) and for the tackled player (RR = 1.30, 95% CI 1.01–1.66). Conclusion: This study demonstrated a lower injury incidence in replacement players compared with starters in the second half of matches, together with a higher injury burden for replacement players due to higher mean injury severity.
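
    The rate-ratio comparison reported above (replacements vs starters, with starters as the reference group) can be sketched in the same way. The injury counts and exposure hours below are hypothetical, and the log-based confidence interval is one common approximation rather than the paper's own analysis.

    ```python
    # Hypothetical rate-ratio comparison: replacements (group A) vs starters (group B, reference).
    # Counts and exposure hours are invented for illustration only.
    import math

    def rate_ratio_ci(inj_a: int, hours_a: float, inj_b: int, hours_b: float, z: float = 1.96):
        """Rate ratio of group A vs group B with an approximate 95% CI (log method)."""
        rr = (inj_a / hours_a) / (inj_b / hours_b)
        se_log = math.sqrt(1.0 / inj_a + 1.0 / inj_b)  # standard error of log(RR)
        return rr, rr * math.exp(-z * se_log), rr * math.exp(z * se_log)

    rr, low, high = rate_ratio_ci(120, 1500.0, 480, 5900.0)
    print(f"RR = {rr:.2f} (95% CI {low:.2f}-{high:.2f})")
    ```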

    Safe sport is more than injury prevention

    No full text

    Musculoskeletal injury in military special operations forces: A systematic review

    No full text
    Introduction: Special Operations Forces conduct military activities using specialised and unconventional techniques that offer a unique and complementary capability to conventional forces. These activities expose Special Operations Forces personnel to different injury risks in comparison with personnel in the conventional forces. Consequently, different injury patterns are expected in this population. The purpose of this research is to establish high-level evidence informing what is known about musculoskeletal injury epidemiology in Special Operations Forces. Methods: A systematic review was conducted using three online databases to identify original studies reporting musculoskeletal injury data in Special Operations Forces. A critical appraisal tool was applied to all included studies. Descriptive data were extracted for demographics, study design details and injuries (eg, injury frequency, injury type, body part injured, activity, mechanism, severity). Results were narratively synthesised. Results: Twenty-one studies were included. Trainees undertaking qualification training had the highest injury frequency, with up to 68% injured in a training period. The ankle, knee and lumbar spine were the most commonly affected body parts. Parachuting caused the most severe injuries. Physical training was the most common activity causing injury, accounting for up to 80% of injuries. Running and lifting were common injury mechanisms. Injury causation information was frequently not reported, and many studies were limited by only partially validated surveillance methods. Conclusions: Injuries are prevalent in Special Operations Forces. Future research should prioritise identifying injury causation information that supports prevention. A focus on improving surveillance methods, to enhance the accuracy and comparability of results across cohorts, is also recommended.

    The violin technique of Italian solo sonata in the 17th century

    Objectives: To conduct a document and content analysis of exertional heat illness (EHI)-related documents published by sports organisations in Victoria, Australia, to determine their scope and evidence base against current international best practice recommendations. Methods: A qualitative document and content analysis was conducted. Official documents relating to EHI were identified through a search of 22 Victorian sport organisation websites, supplemented by a general internet search. The content of these documents was evaluated against recommendations presented in three current international position statements on the prevention and management of EHI. Results: A range of document types addressing EHI were identified (n=25), including specific heat policies, match day guides, and rules and regulations. Recommendations about prevention measures were the most common information presented, but these were largely limited to event modification/cancellation guidelines (n=22; 88%). Most documents provided information on hydration as a preventive measure (n=20; 80%), but the importance of cooling strategies (n=7; 28%) and heat acclimatisation (n=5; 20%) received inadequate emphasis. Details on EHI, including its definition, the signs and symptoms to look for, and common risk factors (beyond humidity and high temperatures), were lacking in most documents. Conclusion: There is considerable variation across formal documents in their content and quality of information, so continued efforts to bridge the evidence-to-practice gap in sports safety are important. This study highlights the challenge for community sport, which relies on high-level policy and governance, across settings and populations that can differ substantially in their needs.