
    Effects of low energy availability on bone health in endurance athletes and high-impact exercise as a potential countermeasure: a narrative review

    Endurance athletes expend large amounts of energy in prolonged high-intensity exercise and, due to the weight-sensitive nature of most endurance sports, often practice periods of dietary restriction. The Female Athlete Triad and Relative Energy Deficiency in Sport models consider endurance athletes to be at high risk of low energy availability and its associated health complications, including an increased chance of bone stress injury. Several studies have examined the effects of low energy availability on various parameters of bone structure and markers of bone (re)modelling; however, findings and research methods differ, and critical summaries are lacking. It is difficult for athletes to reduce energy expenditure or increase energy intake (to restore energy availability) in an environment where performance is a priority, so development of an alternative tool to help protect bone health would be beneficial. High-impact exercise can be highly osteogenic and energy efficient; however, at present, it is rarely used to promote bone health in endurance athletes. Therefore, with a view to reducing the prevalence of bone stress injury, the objectives of this review are to evaluate the effects of low energy availability on bone health in endurance athletes and to explore whether a high-impact exercise intervention may help to prevent those effects from occurring.

    The stepped wedge trial design: a systematic review

    BACKGROUND: Stepped wedge randomised trial designs involve sequential roll-out of an intervention to participants (individuals or clusters) over a number of time periods. By the end of the study, all participants will have received the intervention, although the order in which participants receive the intervention is determined at random. The design is particularly relevant where it is predicted that the intervention will do more good than harm (making a parallel design, in which certain participants do not receive the intervention, unethical) and/or where, for logistical, practical or financial reasons, it is impossible to deliver the intervention simultaneously to all participants. Stepped wedge designs offer a number of opportunities for data analysis, particularly for modelling the effect of time on the effectiveness of an intervention. This paper presents a review of 12 studies (or protocols) that use (or plan to use) a stepped wedge design. One aim of the review is to highlight the potential of the stepped wedge design, given its infrequent use to date. METHODS: A comprehensive literature review of studies or protocols using a stepped wedge design was conducted. Data were extracted from the studies in three categories for subsequent consideration: study information (epidemiology, intervention, number of participants), reasons for using a stepped wedge design, and methods of data analysis. RESULTS: The 12 studies included in this review describe evaluations of a wide range of interventions, across different diseases in different settings. However, the stepped wedge design appears to have found a niche for evaluating interventions in developing countries, specifically those concerned with HIV. There were few consistent motivations for employing a stepped wedge design or methods of data analysis across studies. The methodological descriptions of stepped wedge studies, including methods of randomisation, sample size calculations and methods of analysis, are not always complete. CONCLUSION: While the stepped wedge design offers a number of opportunities for use in future evaluations, a more consistent approach to reporting and data analysis is required.
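    As a purely illustrative sketch of the roll-out logic described above (not drawn from any of the reviewed studies), the Python snippet below randomly orders a set of clusters and assigns them to sequential crossover steps, so that every cluster starts in the control condition and all clusters have received the intervention by the final period. The cluster count, number of steps, and function name are assumptions made for the example.

    import random

    def stepped_wedge_schedule(n_clusters=12, n_steps=4, seed=42):
        """Assign clusters to crossover steps for a stepped wedge roll-out.

        Clusters are randomly ordered, then split evenly across steps; the
        returned dict maps each cluster to the period in which it switches
        from control to intervention (period 1 is an all-control baseline).
        """
        rng = random.Random(seed)
        clusters = list(range(1, n_clusters + 1))
        rng.shuffle(clusters)                      # random order of roll-out
        per_step = n_clusters // n_steps           # assumes even divisibility
        schedule = {}
        for step in range(n_steps):
            for cluster in clusters[step * per_step:(step + 1) * per_step]:
                schedule[cluster] = step + 2
        return schedule

    if __name__ == "__main__":
        for cluster, period in sorted(stepped_wedge_schedule().items()):
            print(f"cluster {cluster:2d} receives the intervention from period {period}")

    In an actual trial, outcomes would also be measured in every period for every cluster, which is what allows the effect of time to be modelled separately from the effect of the intervention.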

    Cabozantinib versus everolimus, nivolumab, axitinib, sorafenib and best supportive care: A network meta-analysis of progression-free survival and overall survival in second line treatment of advanced renal cell carcinoma

    Background: The relative effect of therapies indicated for the treatment of advanced renal cell carcinoma (aRCC) after failure of first-line treatment is currently not known. The objective of the present study is to evaluate progression-free survival (PFS) and overall survival (OS) of cabozantinib compared to everolimus, nivolumab, axitinib, sorafenib, and best supportive care (BSC) in aRCC patients who progressed after previous VEGFR tyrosine-kinase inhibitor (TKI) treatment. Methodology & findings: A systematic literature search identified 5 studies for inclusion in this analysis. Assessment of the proportional hazard (PH) assumption between the survival curves for the different treatment arms in the identified studies showed that the curves in two of the studies did not fulfil the PH assumption, making comparisons based on constant hazard ratios (HRs) inappropriate. Consequently, a parametric survival network meta-analysis model was implemented, with five families of distributions jointly fitted in a Bayesian framework to the PFS, and then the OS, data on all treatments. The comparison relied on data digitized from the Kaplan-Meier curves of published studies, except for cabozantinib and its comparator everolimus, for which patient-level data were available. This analysis applied a Bayesian fixed-effects network meta-analysis model to compare the PFS and OS of cabozantinib versus its comparators. The log-normal fixed-effects model displayed the best fit to the data for both PFS and OS, and showed that patients on cabozantinib had a higher probability of longer PFS and OS than patients exposed to comparators. The survival advantage of cabozantinib increased over time for OS. For PFS, the survival advantage reached its maximum at the end of the first year of treatment and then decreased over time to zero. Conclusion: With all five families of distributions, cabozantinib was superior to all its comparators, with a higher probability of longer PFS and OS during the analyzed 3 years, except with the Gompertz model, where nivolumab was preferred after 24 months.
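    The move away from constant hazard ratios described above reflects the fact that parametric families such as the log-normal do not, in general, satisfy the proportional hazards assumption: the implied hazard ratio between two arms changes over follow-up time. The short Python sketch below illustrates this with arbitrary, assumed parameters (not the study's fitted estimates) by computing the hazard h(t) = f(t)/S(t) for two log-normal survival curves and printing their ratio at several time points.

    import numpy as np
    from scipy.stats import lognorm

    def lognormal_hazard(t, mu, sigma):
        """Hazard h(t) = f(t) / S(t) for a log-normal survival time with
        log-scale location mu and log-scale standard deviation sigma."""
        dist = lognorm(s=sigma, scale=np.exp(mu))
        return dist.pdf(t) / dist.sf(t)

    # Arbitrary illustrative parameters for two arms (not fitted values).
    times = np.linspace(3, 36, 12)                     # months of follow-up
    hr = (lognormal_hazard(times, mu=2.6, sigma=0.9)
          / lognormal_hazard(times, mu=2.2, sigma=0.9))

    for t, ratio in zip(times, hr):
        print(f"month {t:5.1f}: implied hazard ratio = {ratio:.3f}")
    # The ratio drifts with time rather than staying constant, which is why a
    # time-varying parametric network meta-analysis was used in place of a
    # single constant HR.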

    The Common Swift Louse Fly, Crataerina pallida: An Ideal Species for Studying Host-Parasite Interactions

    Little is known of the life history of many parasitic species. This hinders a full understanding of host-parasite interactions. The common swift louse fly, Crataerina pallida Latreille (Diptera: Hippoboscidae), an obligate haematophagous parasite of the Common Swift, Apus apus Linnaeus 1758, is one such species. No detrimental effect of its parasitism upon the host has been found. This may be because too little is known about C. pallida ecology, and therefore detrimental effects are also unknown. This review summarises what is known about the life history of this parasite, with the aim of promoting understanding of its ecology. New, previously unreported observations of C. pallida, made at a nesting swift colony, are described. Unanswered questions that may aid understanding of this host-parasite system are highlighted. C. pallida may prove a suitable model species for the study of other host-parasite relationships.

    Critical Review of Norovirus Surrogates in Food Safety Research: Rationale for Considering Volunteer Studies

    The inability to propagate human norovirus (NoV) or to clearly differentiate infectious from noninfectious virus particles has led to the use of surrogate viruses, such as feline calicivirus (FCV) and murine norovirus-1 (MNV), which can be propagated in cell culture. The use of surrogates is predicated on the assumption that they generally mimic the viruses they represent; however, studies are proving this assumption invalid. In direct comparisons between FCV and MNV, their susceptibility to temperatures, environmental and food processing conditions, and disinfectants is dramatically different. Differences have also been noted between the inactivation of NoV and that of its surrogates, further questioning the validity of surrogates. Considerable research funding is provided globally each year to conduct surrogate studies on NoVs; however, there is little demonstrated benefit derived from these studies with regard to the development of virus inactivation techniques or food processing strategies. Human challenge studies are needed to determine which processing techniques are effective in reducing NoVs in foods. A major obstacle to clinical trials on NoVs is the perception that such trials are too costly and risky, but in reality there is far more cost and risk in allowing millions of unsuspecting consumers to contract NoV illness each year, when practical interventions are only a few volunteer studies away. A number of clinical trials have been conducted, providing important insights into NoV inactivation. A shift in research priorities from surrogate research to volunteer studies is essential if we are to identify realistic, practical, and scientifically valid processing approaches to improve food safety.

    Folic Acid Transport to the Human Fetus Is Decreased in Pregnancies with Chronic Alcohol Exposure

    During pregnancy, the demand for folic acid increases, since the fetus requires this nutrient for its rapid growth and cell proliferation. The placenta concentrates folic acid into the fetal circulation; as a result, fetal levels are 2 to 4 times higher than maternal levels. Animal and in vitro studies have suggested that alcohol may impair transport of folic acid across the placenta by decreasing expression of transport proteins. We aimed to determine whether folate transfer to the fetus is altered in human pregnancies with chronic alcohol consumption. Serum folate was measured in maternal blood and umbilical cord blood at the time of delivery in pregnancies with chronic and heavy alcohol exposure (n = 23) and in non-drinking controls (n = 24). In the alcohol-exposed pairs, the fetal:maternal serum folate ratio was ≤ 1.0 in over half (n = 14), whereas it was > 1.0 in all but one of the control pairs. Mean folate in cord samples was lower in the alcohol-exposed group than in the controls (33.15 ± 19.89 vs 45.91 ± 20.73, p = 0.04). Our results demonstrate that chronic and heavy alcohol use in pregnancy impairs folate transport to the fetus. Altered folate concentrations within the placenta and in the fetus may in part contribute to the deficits observed in the fetal alcohol spectrum disorders.
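    For illustration only, the sketch below mirrors the two summary analyses reported above: per-pair fetal:maternal folate ratios and a between-group comparison of cord folate. The values are hypothetical placeholders rather than the study's data, and Welch's t-test is an assumption, since the abstract does not name the test behind the reported p-value.

    import numpy as np
    from scipy.stats import ttest_ind

    # Hypothetical placeholder serum folate values (not the study's data).
    cord_exposed = np.array([28.0, 31.5, 22.4, 40.1, 35.2, 26.8])
    maternal_exposed = np.array([30.2, 29.8, 25.0, 33.6, 30.1, 27.5])
    cord_control = np.array([48.3, 52.0, 39.7, 44.5, 50.2, 41.9])

    # Paired fetal:maternal ratio for each alcohol-exposed pregnancy; the study
    # flagged pairs in which this ratio was <= 1.0.
    ratios = cord_exposed / maternal_exposed
    print(f"pairs with ratio <= 1.0: {int(np.sum(ratios <= 1.0))} of {ratios.size}")

    # Between-group comparison of cord folate (Welch's t-test, unequal variances).
    t_stat, p_value = ttest_ind(cord_exposed, cord_control, equal_var=False)
    print(f"cord folate: exposed {cord_exposed.mean():.2f} vs control "
          f"{cord_control.mean():.2f}, p = {p_value:.3f}")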

    Calcium Dependent CAMTA1 in Adult Stem Cell Commitment to a Myocardial Lineage

    The phenotype of somatic cells has recently been found to be reversible. Direct reprogramming of one cell type into another has been achieved by transduction and overexpression of exogenous, defined transcription factors, emphasizing their role in specifying cell fate. To discover early, novel endogenous transcription factors that may have a role in adult-derived stem cell acquisition of a cardiomyocyte phenotype, mesenchymal stem cells from human and mouse bone marrow and rat liver were co-cultured with neonatal cardiomyocytes as an in vitro cardiogenic microenvironment. Cell-cell communications develop between the two cell types as early as 24 hrs in co-culture and are required for elaboration of a myocardial phenotype in the stem cells 8-16 days later. These intercellular communications are associated with novel Ca(2+) oscillations in the stem cells that are synchronous with the Ca(2+) transients in adjacent cardiomyocytes and are detected in the stem cells as early as 24-48 hrs in co-culture. Early and significant up-regulation of the Ca(2+)-dependent effectors CAMTA1 and RCAN1 ensues before a myocardial program is activated. CAMTA1 loss-of-function minimizes the activation of the cardiac gene program in the stem cells. While the expression of RCAN1 suggests involvement of the well-characterized calcineurin-NFAT pathway as a response to a Ca(2+) signal, up-regulation of CAMTA1 in response to such a signal in the stem cells had not previously been described. Cell-cell communications between the stem cells and adjacent cardiomyocytes induce Ca(2+) signals that activate a myocardial gene program in the stem cells via a novel and early Ca(2+)-dependent intermediate: up-regulation of CAMTA1.

    Individuals with Le(a+b−) Blood Group Have Increased Susceptibility to Symptomatic Vibrio cholerae O1 Infection

    Cholera remains a severe diarrheal disease, capable of causing extensive outbreaks and high mortality. Blood group is one of the genetic factors determining predisposition to disease, including infectious diseases. Expression of different Lewis or ABO blood group types has been shown to be associated with the risk of different enteric infections. For example, individuals of blood group O have a higher risk of severe illness due to V. cholerae than those with non-O blood groups. In this study, we determined the relationship of the Lewis blood group antigen phenotypes with the risk of symptomatic cholera as well as with the severity of disease and immune responses following infection. We show that individuals expressing the Le(a+b−) phenotype were more susceptible to symptomatic cholera, while Le(a−b+)-expressing individuals were less susceptible. Individuals with the Le(a−b−) blood group had a longer duration of diarrhea when infected, required more intravenous fluid replacement, and had lower plasma IgA antibody responses to V. cholerae LPS on day 7 following infection. We conclude that there is an association between the Lewis blood group and the risk of cholera, and that it may affect the outcome of infection as well as possibly the efficacy of vaccination.