108 research outputs found
High Sport Specialization Is Associated With More Musculoskeletal Injuries in Canadian High School Students
Objective: To describe levels of sport specialization in Canadian high school students and investigate whether sport specialization and/or sport participation volume is associated with history of musculoskeletal injury and/or concussion. Design: Cross-sectional study. Setting: High schools, Alberta, Canada. Participants: High school students (14-19 years) participating in various sports. Independent Variables: Level of sport specialization (high, moderate, low) and sport participation volume (hours per week and months per year). Main Outcome Measures: Twelve-month injury history (musculoskeletal and concussion). Results: Of the 1504 students who completed the survey, 31% were categorized as highly specialized (7.5% before the age of 12 years). Using multivariable negative binomial regression (adjusted for sex, age, total yearly training hours, and clustering by school), highly specialized students had a significantly higher musculoskeletal injury rate [incidence rate ratio (IRR) = 1.36; 95% confidence interval (CI), 1.07-1.73], but not a higher lower extremity injury or concussion rate, compared with low specialization students. Participating in one sport for more than 8 months of the year significantly increased the musculoskeletal injury rate (IRR = 1.27; 95% CI, 1.02-1.58). Increased training hours significantly increased the musculoskeletal injury rate (IRR = 1.18; 95% CI, 1.13-1.25), lower extremity injury rate (IRR = 1.16; 95% CI, 1.09-1.24), and concussion rate (IRR = 1.31; 95% CI, 1.24-1.39). Conclusions: Approximately one-third of Canadian high school students playing sports were categorized as highly specialized. The musculoskeletal injury rate was higher for high sport specialization students than for low sport specialization students. Musculoskeletal injuries and concussions were also more common in students who trained more and spent more than 8 months per year in one sport.
Canadian high school rugby coaches' readiness for an injury prevention strategy implementation: Evaluating a Train-the-Coach workshop
Background: Canadian rugby coach injury prevention beliefs and attitudes have not been studied, yet are key to informing injury prevention strategy implementation. Despite neuromuscular training (NMT) warm-up success in reducing injury, adoption of these programs is variable. Therefore, the objectives of this study were (1) to describe Canadian youth rugby coach injury prevention beliefs and attitudes and current warm-up practices, and (2) to evaluate intention to use a rugby-specific NMT warm-up. Methods: High school rugby coaches completed a questionnaire before and after a rugby-specific NMT warm-up workshop. The pre-workshop questionnaire captured demographics, current warm-up practice, and NMT warm-up knowledge and use. Both questionnaires captured injury prevention beliefs, attitudes, and behavioral intention. Results: Forty-eight coaches participated in the workshops. Pre-workshop, 27% of coaches were aware of NMT warm-ups. Coaches primarily included aerobic and stretching components, while balance components were not common in their warm-ups over the past year. Additionally, 92% of coaches agreed to some extent that they would “complete a rugby-specific warm-up program prior to every game and training session this season.” Post-workshop, 86% of coaches agreed to some extent that they would use the program in every rugby session. No differences were observed between pre- and post-workshop intention to implement the warm-up (p = 0.10). Interpretation: This is the first study to examine current Canadian youth rugby coach warm-up practices and intention to use NMT warm-ups. Canadian rugby coach intention to use a rugby-specific NMT warm-up is high, providing ample opportunity to investigate the efficacy of an NMT warm-up in youth rugby.
Validation of previously identified serum biomarkers for breast cancer with SELDI-TOF MS: a case control study
Background: Serum protein profiling seems promising for early detection of breast cancer. However, the approach has also been criticized, partly because of difficulties in validating discriminatory proteins. This study's aim was to validate three proteins previously reported to discriminate between breast cancer cases and healthy controls. These proteins had been identified as a fragment of inter-alpha trypsin inhibitor H4 (4.3 kDa), a C-terminally truncated form of C3a des arginine anaphylatoxin (8.1 kDa), and C3a des arginine anaphylatoxin (8.9 kDa). Methods: Serum protein profiles of 48 breast cancer patients and 48 healthy controls were analyzed with surface-enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF MS). Differences in protein intensity between breast cancer cases and controls were measured with the Mann-Whitney U test and adjusted for confounding in a multivariate logistic regression model. Results: Four peaks, with mass-to-charge ratios (m/z) 4276, 4292, 8129, and 8941, were found that were assumed to represent the previously reported proteins. M/z 4276 and 4292 were statistically significantly decreased in breast cancer cases compared with healthy controls (p < 0.001). M/z 8941 was decreased in breast cancer cases (p < 0.001), and m/z 8129 was not related to breast cancer (p = 0.87). Adjustment for sample preparation day, sample storage duration, and age did not substantially alter the results. Conclusion: M/z 4276 and 4292 both represented the previously reported 4.3 kDa protein and were both decreased in breast cancer patients, in accordance with the results of most previous studies. M/z 8129 was, in contrast with previous studies, not related to breast cancer. Remarkably, m/z 8941 was decreased in breast cancer cases, whereas in previous studies it was increased. Differences in patient populations and pre-analytical sample handling could have contributed to these discrepancies. Further research is needed before conclusions can be drawn on the relevance of these proteins as breast cancer biomarkers.
Searching for early breast cancer biomarkers by serum protein profiling of pre-diagnostic serum; a nested case-control study
Background: Serum protein profiles have been investigated frequently to discover early biomarkers for breast cancer. So far, these studies have used biological samples collected at or after diagnosis. This may limit their value in the search for cancer biomarkers because of the often advanced tumor stage and the consequent risk of reverse causality. We present, for the first time, pre-diagnostic serum protein profiles in relation to breast cancer, using the Prospect-EPIC (European Prospective Investigation into Cancer and nutrition) cohort. Methods: In a nested case-control design, we compared 68 women diagnosed with breast cancer within three years after enrollment with 68 matched controls for differences in serum protein profiles. All samples were analyzed with SELDI-TOF MS (surface-enhanced laser desorption/ionization time-of-flight mass spectrometry). In a subset of 20 case-control pairs, the serum proteome was identified and relatively quantified using isobaric Tags for Relative and Absolute Quantification (iTRAQ) and online two-dimensional nano-liquid chromatography coupled with tandem MS (2D-nanoLC-MS/MS). Results: Two SELDI-TOF MS peaks with m/z 3323 and 8939, which probably represent doubly charged apolipoprotein C-I and C3a des-arginine anaphylatoxin (C3adesArg), were higher in pre-diagnostic breast cancer serum (p = 0.02 and p = 0.06, respectively). With 2D-nanoLC-MS/MS, afamin, apolipoprotein E, and isoform 1 of inter-alpha trypsin inhibitor heavy chain H4 (ITIH4) were found to be higher in pre-diagnostic breast cancer (p < 0.05), while alpha-2-macroglobulin and ceruloplasmin were lower (p < 0.05). C3adesArg and ITIH4 have previously been related to the presence of symptomatic and/or mammographically detectable breast cancer. Conclusions: We show that serum protein profiles are already altered up to three years before breast cancer detection.
Rib Cage Deformities Alter Respiratory Muscle Action and Chest Wall Function in Patients with Severe Osteogenesis Imperfecta
Osteogenesis imperfecta (OI) is an inherited connective tissue disorder characterized by bone fragility, multiple fractures, and significant chest wall deformities. Cardiopulmonary insufficiency is the leading cause of death in these patients. Seven patients with severe OI type III, 15 with moderate OI type IV, and 26 healthy subjects were studied. In addition to standard spirometry, rib cage geometry, breathing pattern, and regional chest wall volume changes at rest in the seated and supine positions were assessed by opto-electronic plethysmography to investigate whether structural modifications of the rib cage in OI have consequences on ventilatory pattern. One-way or two-way analysis of variance was performed to compare the results between the three groups and the two postures. […] compared to predicted values, on condition that updated reference equations are considered. In both positions, ventilation was lower in OI patients than in controls because of a lower tidal volume (p<0.01). In contrast to OI type IV patients, whose chest wall geometry and function were normal, OI type III patients were characterized by a reduced (p<0.01) angle at the sternum (pectus carinatum), paradoxical inspiratory inward motion of the pulmonary rib cage, significant thoraco-abdominal asynchronies, and rib cage distortions in the supine position (p<0.001). In conclusion, the restrictive respiratory pattern of osteogenesis imperfecta is closely related to the severity of the disease and to the sternal deformities. Pectus carinatum characterizes OI type III patients and alters respiratory muscle coordination, leading to chest wall and rib cage distortions and an inefficient ventilatory pattern. OI type IV is characterized by milder alterations in respiratory function. These findings suggest that functional assessment and treatment of OI should be differentiated in these two forms of the disease.
Candida albicans-produced farnesol stimulates Pseudomonas quinolone signal production in LasR-defective Pseudomonas aeruginosa strains
Candida albicans has been previously shown to stimulate the production of Pseudomonas aeruginosa phenazine toxins in dual-species colony biofilms. Here, we report that P. aeruginosa lasR mutants, which lack the master quorum sensing system regulator, regain the ability to produce quorum-sensing-regulated phenazines when cultured with C. albicans. Farnesol, a signalling molecule produced by C. albicans, was sufficient to stimulate phenazine production in LasR− laboratory strains and clinical isolates. P. aeruginosa ΔlasR mutants are defective in production of the Pseudomonas quinolone signal (PQS) due to their inability to properly induce pqsH, which encodes the enzyme necessary for the last step in PQS biosynthesis. We show that expression of pqsH in a ΔlasR strain was sufficient to restore PQS production, and that farnesol restored pqsH expression in ΔlasR mutants. The farnesol-mediated increase in pqsH required RhlR, a transcriptional regulator downstream of LasR, and farnesol led to higher levels of N-butyryl-homoserine lactone, the small-molecule activator of RhlR. Farnesol promotes the production of reactive oxygen species (ROS) in a variety of species. Because the antioxidant N-acetylcysteine suppressed farnesol-induced RhlR activity in LasR− strains, and hydrogen peroxide was sufficient to restore PQS production in las mutants, we propose that ROS are responsible for the activation of downstream portions of this quorum sensing pathway. LasR mutants frequently arise in the lungs of patients chronically infected with P. aeruginosa. The finding that C. albicans, farnesol, or ROS stimulate virulence factor production in lasR strains provides new insight into the virulence potential of these strains.
Socioeconomic status and hospitalization in the very old: a retrospective study
Background: Socioeconomic status could affect the demand for hospital care. The aim of the present study was to assess the role of age, socioeconomic status, and comorbidity in acute hospital admissions among the elderly. Methods: We retrospectively examined the discharge abstract data of acute care hospital admissions of residents of Rome aged 75 or more years in the period 1997–2000. We used the Hospital Information System of Rome, the Tax Register, and the Population Register of Rome for socio-economic data. The rate of hospitalization, modified Charlson's index of comorbidity, and level of income in the census tract of residence were obtained. Rate ratios and 95% confidence limits were computed to assess the relationship between income deciles and the rate of hospitalization. Cross-tabulation was used to explore the distribution of the index of comorbidity by deciles of income. Analyses were repeated for patients grouped according to selected diseases. Results: Age was associated with a marginal increase in the rate of hospitalization. However, the hospitalization rate was inversely related to income in both sexes. Higher income was associated with lower comorbidity. The same associations were observed in patients admitted with a principal diagnosis of a chronic condition (diabetes mellitus, heart failure, chronic obstructive pulmonary disease) or stroke, but not hip fracture. Conclusion: Lower social status and associated comorbidity, more than age per se, are associated with a higher rate of hospitalization in very old patients.
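Rate ratios with 95% confidence limits of the kind this study computed are typically derived from event counts and person-time on the log scale. A minimal sketch with hypothetical counts for two income deciles (not the study's data):

```python
import math

# Hypothetical counts: hospitalizations and person-years in the lowest and
# highest income deciles (numbers are illustrative, not from the study).
events_low_income,  py_low_income  = 1200, 10000.0
events_high_income, py_high_income = 800,  10000.0

rate_low  = events_low_income / py_low_income
rate_high = events_high_income / py_high_income
rr = rate_low / rate_high                       # rate ratio, low vs high income

# Approximate 95% CI on the log scale; SE(log RR) = sqrt(1/a + 1/b)
se = math.sqrt(1 / events_low_income + 1 / events_high_income)
lo = math.exp(math.log(rr) - 1.96 * se)
hi = math.exp(math.log(rr) + 1.96 * se)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR = 1.50 here
```

An RR above 1 with a CI excluding 1, as in this sketch, is the pattern behind the study's finding that hospitalization rates rise as income falls.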
Erratum: Global, regional, and national comparative risk assessment of 84 behavioural, environmental and occupational, and metabolic risks or clusters of risks for 195 countries and territories, 1990–2017: a systematic analysis for the Global Burden of Disease Study 2017
Interpretation: By quantifying levels and trends in exposures to risk factors and the resulting disease burden, this assessment offers insight into where past policy and programme efforts might have been successful and highlights current priorities for public health action. Decreases in behavioural, environmental, and occupational risks have largely offset the effects of population growth and ageing, in relation to trends in absolute burden. Conversely, the combination of increasing metabolic risks and population ageing will probably continue to drive the increasing trends in non-communicable diseases at the global level, which presents both a public health challenge and opportunity. We see considerable spatiotemporal heterogeneity in levels of risk exposure and risk-attributable burden. Although levels of development underlie some of this heterogeneity, O/E ratios show risks for which countries are overperforming or underperforming relative to their level of development. As such, these ratios provide a benchmarking tool to help to focus local decision making. Our findings reinforce the importance of both risk exposure monitoring and epidemiological research to assess causal connections between risks and health outcomes, and they highlight the usefulness of the GBD study in synthesising data to draw comprehensive and robust conclusions that help to inform good policy and strategic health planning.
Global, regional, and national comparative risk assessment of 84 behavioural, environmental and occupational, and metabolic risks or clusters of risks for 195 countries and territories, 1990-2017: a systematic analysis for the Global Burden of Disease Study 2017
Background
The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2017 comparative risk assessment (CRA) is a comprehensive approach to risk factor quantification that offers a useful tool for synthesising evidence on risks and risk–outcome associations. With each annual GBD study, we update the GBD CRA to incorporate improved methods, new risks and risk–outcome pairs, and new data on risk exposure levels and risk–outcome associations.
Methods
We used the CRA framework developed for previous iterations of GBD to estimate levels and trends in exposure, attributable deaths, and attributable disability-adjusted life-years (DALYs), by age group, sex, year, and location for 84 behavioural, environmental and occupational, and metabolic risks or groups of risks from 1990 to 2017. This study included 476 risk–outcome pairs that met the GBD study criteria for convincing or probable evidence of causation. We extracted relative risk and exposure estimates from 46 749 randomised controlled trials, cohort studies, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. Using the counterfactual scenario of theoretical minimum risk exposure level (TMREL), we estimated the portion of deaths and DALYs that could be attributed to a given risk. We explored the relationship between development and risk exposure by modelling the relationship between the Socio-demographic Index (SDI) and risk-weighted exposure prevalence and estimated expected levels of exposure and risk-attributable burden by SDI. Finally, we explored temporal changes in risk-attributable DALYs by decomposing those changes into six main component drivers of change as follows: (1) population growth; (2) changes in population age structures; (3) changes in exposure to environmental and occupational risks; (4) changes in exposure to behavioural risks; (5) changes in exposure to metabolic risks; and (6) changes due to all other factors, approximated as the risk-deleted death and DALY rates, where the risk-deleted rate is the rate that would be observed had we reduced the exposure levels to the TMREL for all risk factors included in GBD 2017.
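The counterfactual TMREL step can be illustrated with the standard attributable-fraction formula on which CRA frameworks are built: PAF = (Σ pᵢRRᵢ − Σ p′ᵢRRᵢ) / Σ pᵢRRᵢ, where pᵢ is observed exposure prevalence in category i, p′ᵢ the counterfactual (TMREL) prevalence, and RRᵢ the relative risk. A sketch with purely illustrative prevalences, relative risks, and death counts:

```python
# Exposure categories for one hypothetical risk factor: none / moderate / high.
# All numbers below are illustrative only, not GBD estimates.
prevalence_observed = [0.5, 0.3, 0.2]   # observed population distribution
prevalence_tmrel    = [1.0, 0.0, 0.0]   # counterfactual: everyone at TMREL
relative_risk       = [1.0, 1.4, 2.0]   # RR per category vs unexposed

obs = sum(p * rr for p, rr in zip(prevalence_observed, relative_risk))
cf  = sum(p * rr for p, rr in zip(prevalence_tmrel, relative_risk))
paf = (obs - cf) / obs                  # population attributable fraction

# Apply the PAF to cause-specific burden to get attributable deaths
attributable_deaths = paf * 10_000
print(f"PAF = {paf:.3f}; attributable deaths = {attributable_deaths:.0f}")
```

Summing such attributable deaths and DALYs across 476 risk–outcome pairs, with mediation and joint-exposure adjustments, is what produces the aggregate figures reported in the Findings.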
Findings
In 2017, 34·1 million (95% uncertainty interval [UI] 33·3–35·0) deaths and 1·21 billion (1·14–1·28) DALYs were attributable to GBD risk factors. Globally, 61·0% (59·6–62·4) of deaths and 48·3% (46·3–50·2) of DALYs were attributed to the GBD 2017 risk factors. When ranked by risk-attributable DALYs, high systolic blood pressure (SBP) was the leading risk factor, accounting for 10·4 million (9·39–11·5) deaths and 218 million (198–237) DALYs, followed by smoking (7·10 million [6·83–7·37] deaths and 182 million [173–193] DALYs), high fasting plasma glucose (6·53 million [5·23–8·23] deaths and 171 million [144–201] DALYs), high body-mass index (BMI; 4·72 million [2·99–6·70] deaths and 148 million [98·6–202] DALYs), and short gestation for birthweight (1·43 million [1·36–1·51] deaths and 139 million [131–147] DALYs). In total, risk-attributable DALYs declined by 4·9% (3·3–6·5) between 2007 and 2017. In the absence of demographic changes (ie, population growth and ageing), changes in risk exposure and risk-deleted DALYs would have led to a 23·5% decline in DALYs during that period. Conversely, in the absence of changes in risk exposure and risk-deleted DALYs, demographic changes would have led to an 18·6% increase in DALYs during that period. The ratios of observed risk exposure levels to exposure levels expected based on SDI (O/E ratios) increased globally for unsafe drinking water and household air pollution between 1990 and 2017. This result suggests that development is occurring more rapidly than are changes in the underlying risk structure in a population. Conversely, nearly universal declines in O/E ratios for smoking and alcohol use indicate that, for a given SDI, exposure to these risks is declining. In 2017, the leading Level 4 risk factor for age-standardised DALY rates was high SBP in four super-regions: central Europe, eastern Europe, and central Asia; north Africa and Middle East; south Asia; and southeast Asia, east Asia, and Oceania. 
The leading risk factor in the high-income super-region was smoking, in Latin America and Caribbean was high BMI, and in sub-Saharan Africa was unsafe sex. O/E ratios for unsafe sex in sub-Saharan Africa were notably high, and those for alcohol use in north Africa and the Middle East were notably low.
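The O/E ratios used for benchmarking divide observed exposure prevalence by the level expected at a country's SDI. A toy sketch of the idea (GBD estimates the expected curve with far more sophisticated models; the linear fit and every number here are illustrative only):

```python
import numpy as np

# Illustrative countries: SDI and observed smoking prevalence (made up)
sdi     = np.array([0.30, 0.45, 0.55, 0.65, 0.80])
smoking = np.array([0.18, 0.25, 0.22, 0.30, 0.20])

# "Expected" prevalence at each SDI from a simple least-squares fit;
# GBD uses flexible exposure-vs-SDI models rather than a straight line.
slope, intercept = np.polyfit(sdi, smoking, 1)
expected = slope * sdi + intercept
oe_ratio = smoking / expected

for s, oe in zip(sdi, oe_ratio):
    flag = "above expected" if oe > 1 else "at/below expected"
    print(f"SDI {s:.2f}: O/E = {oe:.2f} ({flag})")
```

Countries with O/E above 1 have higher exposure than their development level predicts (underperforming on that risk); ratios below 1 indicate overperformance, which is how the super-region comparisons above are read.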
Interpretation
By quantifying levels and trends in exposures to risk factors and the resulting disease burden, this assessment offers insight into where past policy and programme efforts might have been successful and highlights current priorities for public health action. Decreases in behavioural, environmental, and occupational risks have largely offset the effects of population growth and ageing, in relation to trends in absolute burden. Conversely, the combination of increasing metabolic risks and population ageing will probably continue to drive the increasing trends in non-communicable diseases at the global level, which presents both a public health challenge and opportunity. We see considerable spatiotemporal heterogeneity in levels of risk exposure and risk-attributable burden. Although levels of development underlie some of this heterogeneity, O/E ratios show risks for which countries are overperforming or underperforming relative to their level of development. As such, these ratios provide a benchmarking tool to help to focus local decision making. Our findings reinforce the importance of both risk exposure monitoring and epidemiological research to assess causal connections between risks and health outcomes, and they highlight the usefulness of the GBD study in synthesising data to draw comprehensive and robust conclusions that help to inform good policy and strategic health planning.