4,461 research outputs found

    Resource guide for health and fitness program development

    Statement of the Third International Exercise-Associated Hyponatremia Consensus Development Conference, Carlsbad, California, 2015

    The third International Exercise-Associated Hyponatremia (EAH) Consensus Development Conference convened in Carlsbad, California in February 2015 with a panel of 17 international experts. The delegates represented 4 countries and 9 medical and scientific sub-specialties pertaining to athletic training, exercise physiology, sports medicine, water/sodium metabolism, and body fluid homeostasis. The primary goal of the panel was to review the existing data on EAH and update the 2008 Consensus Statement [1]. This document replaces the second International EAH Consensus Development Conference Statement and launches an educational campaign designed to address the morbidity and mortality associated with a preventable and treatable fluid imbalance. The following statement summarizes the data synthesized by the 2015 EAH Consensus Panel and represents an evolution of the most current knowledge on EAH. It summarizes the most current information on the prevalence, etiology, diagnosis, treatment, and prevention of EAH for medical personnel, athletes, athletic trainers, and the greater public. The EAH Consensus Panel strove to articulate clearly what we agreed upon, did not agree upon, and did not know, including minority viewpoints supported by clinical experience and experimental data. Further updates will be necessary to (1) remain current with our understanding and (2) critically assess the effectiveness of our present recommendations. Suggestions for future research and educational strategies to reduce the incidence and prevalence of EAH, as well as areas of controversy that remain on this topic, are provided at the end of the document. [excerpt]

    In situ labeling of DNA reveals interindividual variation in nuclear DNA breakdown in hair and may be useful to predict success of forensic genotyping of hair

    Hair fibers are formed by keratinocytes of the hair follicle in a process that involves breakdown of the nucleus, including its DNA. Accordingly, DNA can be isolated with high yield from the hair bulb, which contains living keratinocytes, whereas it is difficult to recover from the distal portions of hair fibers and from shed hair. Nevertheless, forensic investigations succeed in a fraction of shed hair samples found at crime scenes. Here, we report that interindividual differences in the completeness of DNA removal from hair corneocytes are major determinants of DNA content and of the success rates of forensic investigations of hair. Distal hair samples were permeabilized with ammonia and incubated with the DNA-specific dye Hoechst 33258 to label DNA in situ. Residual nuclear DNA was visualized under the fluorescence microscope. Hair from some donors did not contain any stainable nuclei, whereas hair of other donors contained a variable number of DNA-positive nuclear remnants. The number of DNA-containing nuclear remnants per millimeter of hair correlated with the amount of DNA that could be extracted and amplified by quantitative PCR. When individual hairs were investigated, only hairs in which DNA could be labeled in situ gave positive results in short tandem repeat typing. This study reveals that the completeness of DNA degradation during cornification of the hair is a polymorphic trait. Furthermore, our results suggest that in situ labeling of DNA in hair may be useful for predicting the probability of success of forensic analysis of nuclear DNA in shed hair.
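    The analysis implied above, counting DNA-positive nuclear remnants per millimeter of hair and relating that count to qPCR yield and STR success, can be sketched as follows. This is a minimal illustration with invented numbers; the variable names, counts, and screening threshold are assumptions, not data or code from the study.

```python
import numpy as np
from scipy import stats

# DNA-positive nuclear remnants per mm of hair (one hypothetical value per hair)
remnants_per_mm = np.array([0.0, 0.0, 0.5, 1.2, 2.0, 3.5, 5.1, 7.8])
# DNA recovered from the same hairs, quantified by qPCR (hypothetical pg values)
qpcr_dna_pg = np.array([0.0, 0.1, 0.8, 2.5, 4.0, 7.2, 11.0, 15.3])

# Spearman correlation tolerates the skewed, zero-inflated remnant counts
rho, p = stats.spearmanr(remnants_per_mm, qpcr_dna_pg)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")

# Screening rule mirroring the paper's qualitative finding: only hairs with
# in-situ stainable DNA are promising candidates for STR typing
str_candidates = np.flatnonzero(remnants_per_mm > 0)
print("Hairs worth submitting to STR typing:", str_candidates)
```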

    Association Between Hemoglobin Levels and Efficacy of Intravenous Ferric Carboxymaltose in Patients With Acute Heart Failure and Iron Deficiency: An AFFIRM-AHF Subgroup Analysis

    BACKGROUND: Iron deficiency, with or without anemia, is an adverse prognostic factor in heart failure (HF). In AFFIRM-AHF (a randomized, double-blind, placebo-controlled trial comparing the effect of intravenous ferric carboxymaltose on hospitalizations and mortality in iron-deficient subjects admitted for acute heart failure), intravenous ferric carboxymaltose (FCM), although having no significant effect on the primary end point, reduced the risk of HF hospitalization (hHF) and improved quality of life versus placebo in iron-deficient patients stabilized after an acute HF (AHF) episode. These prespecified AFFIRM-AHF subanalyses explored the association between hemoglobin levels and FCM treatment effects. METHODS: AFFIRM-AHF was a multicenter, double-blind, randomized, placebo-controlled trial of FCM in hospitalized AHF patients with iron deficiency. Patients were stratified by baseline hemoglobin level (<12 versus ≄12 g/dL). In each subgroup, the primary composite outcome (total hHF and cardiovascular death) and secondary outcomes (total hHF; total cardiovascular hospitalizations and cardiovascular death; time to cardiovascular death; and time to first/days lost due to hHF or cardiovascular death) were assessed with FCM versus placebo at week 52. Sensitivity analyses using the World Health Organization anemia definition (hemoglobin level <12 g/dL [women] or <13 g/dL [men]) were performed, among others. RESULTS: Of 1108 AFFIRM-AHF patients, 1107 were included in these subanalyses: 464 (FCM group, 228; placebo group, 236) had a hemoglobin level <12 g/dL, and 643 (FCM, 329; placebo, 314) had a hemoglobin level ≄12 g/dL. Patients with a hemoglobin level <12 g/dL were older (mean, 73.7 versus 69.1 years), with more frequent previous HF (75.0% versus 68.7%), serum ferritin <100 ÎŒg/L (75.4% versus 68.1%), and transferrin saturation <20% (87.9% versus 81.4%). For the primary outcome, annualized event rates per 100 patient-years with FCM versus placebo were 71.1 versus 73.6 (rate ratio [RR], 0.97 [95% CI, 0.66-1.41]) in the hemoglobin <12 g/dL subgroup and 48.5 versus 72.9 (RR, 0.67 [95% CI, 0.48-0.93]) in the ≄12 g/dL subgroup. No significant interactions between hemoglobin subgroup and treatment effect were observed for the primary (P for interaction=0.15) or secondary outcomes. Changes from baseline in hemoglobin, serum ferritin, and transferrin saturation were significantly greater with FCM versus placebo in both subgroups between weeks 6 and 52. Findings were similar using the World Health Organization definition of anemia. CONCLUSIONS: The effects of intravenous FCM on outcomes in iron-deficient patients stabilized after an AHF episode, including improvements in iron parameters over time, did not differ between patients with hemoglobin levels <12 and ≄12 g/dL.
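    As a minimal sketch of the arithmetic behind figures like "71.1 versus 73.6 per 100 patient-years (RR, 0.97)", the following computes an annualized event rate and a simple Poisson rate ratio with a Wald 95% CI. The trial itself used more sophisticated recurrent-event models, and the event counts and follow-up times below are invented for illustration.

```python
import math

def rate_per_100py(events: int, person_years: float) -> float:
    """Annualized event rate per 100 patient-years."""
    return 100.0 * events / person_years

def rate_ratio_ci(e1: int, py1: float, e2: int, py2: float, z: float = 1.96):
    """Poisson rate ratio (group 1 vs group 2) with a Wald 95% CI on the log scale."""
    rr = (e1 / py1) / (e2 / py2)
    se_log_rr = math.sqrt(1.0 / e1 + 1.0 / e2)  # SE of log(RR) for Poisson counts
    return rr, rr * math.exp(-z * se_log_rr), rr * math.exp(z * se_log_rr)

# Invented example: 140 events over 197 patient-years vs 145 events over 197
rr, lo, hi = rate_ratio_ci(140, 197.0, 145, 197.0)
print(f"rate, FCM arm: {rate_per_100py(140, 197.0):.1f} per 100 patient-years")
print(f"RR = {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```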

    Ferric Carboxymaltose in Iron-Deficient Patients with Hospitalized Heart Failure and Reduced Kidney Function

    Background: Reduced kidney function is common among patients with heart failure. In patients with heart failure and/or kidney disease, iron deficiency is an independent predictor of adverse outcomes. In the AFFIRM-AHF trial, patients with acute heart failure and iron deficiency treated with intravenous ferric carboxymaltose demonstrated a reduced risk of heart failure hospitalization and improved quality of life. We aimed to further characterize the impact of ferric carboxymaltose among patients with coexisting kidney impairment. Methods: The double-blind, placebo-controlled AFFIRM-AHF trial randomized 1132 stabilized adults with acute heart failure (left ventricular ejection fraction <50%) and iron deficiency. Patients on dialysis were excluded. The primary end point was a composite of total heart failure hospitalizations and cardiovascular death during the 52-week follow-up period. Additional end points included cardiovascular hospitalizations, total heart failure hospitalizations, and days lost to heart failure hospitalizations or cardiovascular death. For this subgroup analysis, patients were stratified according to baseline eGFR. Results: Overall, 60% of patients had an eGFR <60 mL/min per 1.73 m². Treatment effects of ferric carboxymaltose on the primary and secondary end points were consistent across eGFR subgroups (all P > 0.05). Conclusions: In a cohort of patients with acute heart failure, left ventricular ejection fraction <50%, and iron deficiency, the safety and efficacy of ferric carboxymaltose were consistent across a range of eGFR values. Clinical trial registry name and registration number: Study to Compare Ferric Carboxymaltose With Placebo in Patients With Acute Heart Failure and Iron Deficiency (AFFIRM-AHF), NCT02937454.
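    For readers wanting to see how an eGFR stratification like this is typically operationalized, here is a small sketch. The abstract does not state which eGFR equation was used; the CKD-EPI 2009 creatinine equation below is an assumed, commonly used choice, and the patient values are hypothetical.

```python
def ckd_epi_2009(scr_mg_dl: float, age: int, female: bool, black: bool = False) -> float:
    """eGFR in mL/min per 1.73 m^2 via the CKD-EPI 2009 creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Stratify a hypothetical patient the way the subgroup analysis does
egfr = ckd_epi_2009(scr_mg_dl=1.4, age=72, female=False)
subgroup = "eGFR <60" if egfr < 60 else "eGFR >=60"
print(f"eGFR = {egfr:.1f} mL/min/1.73 m^2 -> {subgroup}")
```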

    The Magnitude of Androgen Receptor Positivity in Breast Cancer Is Critical for Reliable Prediction of Disease Outcome

    Purpose: Consensus is lacking regarding the androgen receptor (AR) as a prognostic marker in breast cancer. The objectives of this study were to comprehensively review the literature on AR prognostication and determine optimal criteria for AR as an independent predictor of breast cancer survival. Experimental Design: AR positivity was assessed by immunostaining in two clinically validated primary breast cancer cohorts [training cohort, n = 219; validation cohort, n = 418; 77% and 79% estrogen receptor alpha (ERα) positive, respectively]. The optimal AR cut-point was determined by ROC analysis in the training cohort and applied to both cohorts. Results: AR was an independent prognostic marker of breast cancer outcome in 22 of 46 (48%) previous studies that performed multivariate analyses. Most studies used cut-points of 1% or 10% nuclear positivity. Herein, neither the 1% nor the 10% cut-point was robustly prognostic. ROC analysis revealed that a higher AR cut-point (78% positivity) provided optimal sensitivity and specificity to predict breast cancer survival in the training (HR, 0.41; P = 0.015) and validation (HR, 0.50; P = 0.014) cohorts. Tenfold cross-validation confirmed the robustness of this AR cut-point. Patients with ERα-positive tumors and AR positivity ≄78% had the best survival in both cohorts (P < 0.0001), and patients with a high AR:ERα positivity ratio (>0.87) had the best outcomes (P < 0.0001). Conclusions: This study defines an optimal AR cut-point to reliably predict breast cancer survival. Testing this cut-point in prospective cohorts is warranted for implementation of AR as a prognostic factor in the clinical management of breast cancer.
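    A minimal sketch of the cut-point procedure described above (ROC analysis choosing the threshold that best separates outcomes, here via the Youden index) could look like the following. The simulated AR percentages and outcome labels are purely illustrative, and the abstract does not specify which ROC optimality criterion the authors used.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
n = 219  # matches the training-cohort size; the data themselves are simulated
ar_percent = rng.uniform(0, 100, n)  # % AR-positive nuclei per tumor

# Simulate better survival with higher AR positivity (label 1 = survived follow-up)
survived = (ar_percent + rng.normal(0, 25, n) > 70).astype(int)

# ROC curve of AR positivity as a predictor of survival
fpr, tpr, thresholds = roc_curve(survived, ar_percent)
youden = tpr - fpr  # Youden index: sensitivity + specificity - 1
best = thresholds[np.argmax(youden)]
print(f"Optimal AR cut-point by Youden index: {best:.0f}% positivity")
```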

    ProPIG - Organic pig health, welfare and environmental impact across Europe

    Organic production is perceived by consumers as superior in animal welfare and sustainability, and the demand for organic pork products is slowly increasing. Within the past ten years a variety of husbandry and management systems have developed across the EU, ranging from farms keeping pigs outdoors all year round using local breeds to farms housing pigs with concrete outside runs and using conventional breeds (CorePIG, Rousing et al., 2011). So far, mainly clinical parameters have been used to describe the health situation on organic pig farms, identifying some key problems such as weaning diarrhoea and piglet mortality. Organic pig production is characterised, among other things, by a holistic approach based on EU Regulation (EC) No 834/2007 and the IFOAM principles of 'health, ecology, fairness and care'. These clearly state that health is more than the absence of clinical symptoms, and they recognise the relation between animals and their environment: 'health' is defined as 'the wholeness and integrity of living systems. It is not simply the absence of illness, but the maintenance of physical, mental, social and ecological well-being' (IFOAM, 2006). Concepts of animal welfare include physical and mental welfare as well as the concept of naturalness (Fraser, 2003), which is often interpreted as the ability to perform natural behaviour. Verhoog et al. (2003) describe three main approaches within organic agriculture's concept of nature and naturalness: the no-chemicals approach, the agro-ecology approach and the integrity approach. Applying those concepts to organic pig production can highlight potential conflicts: outdoor systems are perceived as the optimal housing system for pigs because they allow natural behaviour such as rooting. However, this behaviour can damage the grass cover, and the fate of manure in outdoor areas must also be considered. A few studies on outdoor pig production have shown a clear N and P surplus and a high degree of distribution heterogeneity in outdoor areas, increasing the risk of N and P losses (Watson et al., 2003). Robust and competitive organic pig production needs to combine low environmental impacts with good animal health and welfare, yet few studies have quantified both aspects in different pig husbandry systems. In addition, the theory that improving animal health and welfare reduces environmental impacts through decreased medicine use, improved growth rates and better feed conversion efficiency has yet to be verified. The aim of the CoreOrganic2 project ProPIG (2011-2014, carried out in eight European countries) is to examine the relationship between health, welfare and environmental impact. On-farm assessment protocols will be carried out on 75 farms across three pig husbandry systems (outdoor, partly outdoor, and indoor with a concrete outside run). Environmental impact will be assessed using both Life Cycle Assessment and calculations of nutrient balances at farm and outdoor-area level. Animal health and welfare will be evaluated from animal-based parameters, including clinical and selected behavioural parameters. Results will be fed back to the farmers and used to decide farm-specific goals and strategies to achieve them. As an outcome, all farms will create an individual health, welfare and environmental plan, which will be reviewed after one year to allow continuous development. This will provide the opportunity not only to investigate, but also to improve, the influence of organic pig farming systems on animal welfare and environmental impact, fulfilling the fourth IFOAM principle of care: 'Organic Agriculture should be managed in a precautionary and responsible manner to protect the health and well-being of current and future generations and the environment' (IFOAM, 2006).
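    The nutrient balances mentioned above are, in essence, farm-gate input/output accounts: nutrients imported minus nutrients exported, expressed per hectare. A minimal sketch under assumed item names and invented quantities (not project data) might look like this:

```python
def nutrient_balance_kg_per_ha(inputs_kg: dict, outputs_kg: dict, area_ha: float) -> float:
    """Farm-gate surplus: nutrient imported minus exported, per hectare and year."""
    surplus = sum(inputs_kg.values()) - sum(outputs_kg.values())
    return surplus / area_ha

# Hypothetical nitrogen flows for one farm (kg N per year); real balances
# itemize many more flows (seed, animals purchased, losses, etc.)
n_inputs = {"purchased feed": 4200.0, "bedding": 150.0, "fixation/deposition": 300.0}
n_outputs = {"pigs sold": 1900.0, "manure exported": 400.0}

print(f"N surplus: {nutrient_balance_kg_per_ha(n_inputs, n_outputs, 50.0):.1f} kg N/ha/yr")
```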
    • 

    corecore