Consumers' salient beliefs regarding dairy products in the functional food era: a qualitative study using concepts from the theory of planned behaviour
Background: Inadequate consumption of dairy products without appropriate dietary substitution may have deleterious health consequences. Social research reveals the factors that may impede compliance with dietary recommendations. This is particularly important given the recent introduction of functional dairy products. One of the challenges for public health professionals is to demonstrate the efficacy of nutrition education in improving attitudes toward nutrient-rich foods. The aim of this study was to explore the salient beliefs of adult weight loss trial participants regarding both traditional and functional dairy products and to compare these with a control group not exposed to nutrition education. Methods: Six focus groups were conducted, three with weight loss trial completers (n = 15) who had received nutrition education and three with individuals from the same region (n = 14) to act as controls. Transcribed focus groups were coded using the Theory of Planned Behaviour theoretical framework. Results: Non-trial participants perceived dairy foods as weight inducing and were sceptical of functional dairy products. A lack of time/ability to decipher dairy food labels was also discussed by these individuals. In contrast, trial participants discussed several health benefits related to dairy foods, practised label reading and were confident in their ability to incorporate dairy foods into their diet. Normative beliefs expressed were similar for both groups, indicating that these were more static and less amenable to change through nutrition education than control and behavioural beliefs. Conclusions: Nutrition education provided as a result of weight loss trial participation influenced behavioural and control beliefs relating to dairy products. This study provides a proof-of-concept indication that nutrition education may improve attitudes towards dairy products and may thus be an important target for public health campaigns seeking to increase intake of this food group.
Variants of the EAAT2 Glutamate Transporter Gene Promoter Are Associated with Cerebral Palsy in Preterm Infants
© 2017, The Author(s). Preterm delivery is associated with neurodevelopmental impairment caused by environmental and genetic factors. Dysfunction of the excitatory amino acid transporter 2 (EAAT2) and the resultant impaired glutamate uptake can lead to neurological disorders. In this study, we investigated the role of single nucleotide polymorphisms (SNPs; g.-200C>A and g.-181A>C) in the EAAT2 promoter in susceptibility to brain injury and neurodisability in very preterm infants born at or before 32 weeks' gestation. DNA isolated from newborns' dried blood spots was used for pyrosequencing to detect both SNPs. Association between EAAT2 genotypes and cerebral palsy, cystic periventricular leukomalacia and a low developmental score was then assessed. The two SNPs were concordant in 89.4% of infants, resulting in three common genotypes all carrying two C and two A alleles in different combinations. However, in 10.6% of cases, non-concordance was found, generating six additional rare genotypes. The A alleles at both loci appeared to be detrimental and, consequently, the risk of developing cerebral palsy increased four- and sixfold for each additional detrimental allele at -200 and -181 bp, respectively. The two SNPs altered the regulation of EAAT2 promoter activity and glutamate homeostasis. This study highlights the significance of glutamate in the pathogenesis of preterm brain injury and the subsequent development of cerebral palsy and neurodevelopmental disabilities. Furthermore, the described EAAT2 SNPs may be an early biomarker of vulnerability to neurodisability and may aid the development of targeted treatment strategies.
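For illustration only, the sketch below shows how per-allele odds ratios of the kind reported above are typically combined under an additive (allele-dose) model. The four- and sixfold values come from the abstract; the multiplicative combination and everything else are assumptions, not the study's actual analysis.

```python
# Illustrative additive (allele-dose) risk model, assuming per-allele odds
# ratios of ~4 at g.-200 and ~6 at g.-181, as reported in the abstract.
# The multiplicative combination across loci is an assumption for this sketch.

OR_MINUS_200 = 4.0   # per additional detrimental A allele at g.-200
OR_MINUS_181 = 6.0   # per additional detrimental A allele at g.-181

def relative_odds(n_alleles_200: int, n_alleles_181: int) -> float:
    """Odds of cerebral palsy relative to a carrier of zero detrimental alleles."""
    return (OR_MINUS_200 ** n_alleles_200) * (OR_MINUS_181 ** n_alleles_181)

# e.g. one detrimental allele at each locus -> 4 * 6 = 24-fold higher odds
print(relative_odds(1, 1))
```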
Mortality and pulmonary complications in patients undergoing surgery with perioperative SARS-CoV-2 infection: an international cohort study
Background: The impact of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on postoperative recovery needs to be understood to inform clinical decision making during and after the COVID-19 pandemic. This study reports 30-day mortality and pulmonary complication rates in patients with perioperative SARS-CoV-2 infection. Methods: This international, multicentre, cohort study at 235 hospitals in 24 countries included all patients undergoing surgery who had SARS-CoV-2 infection confirmed within 7 days before or 30 days after surgery. The primary outcome measure was 30-day postoperative mortality and was assessed in all enrolled patients. The main secondary outcome measure was pulmonary complications, defined as pneumonia, acute respiratory distress syndrome, or unexpected postoperative ventilation. Findings: This analysis includes 1128 patients who had surgery between Jan 1 and March 31, 2020, of whom 835 (74·0%) had emergency surgery and 280 (24·8%) had elective surgery. SARS-CoV-2 infection was confirmed preoperatively in 294 (26·1%) patients. 30-day mortality was 23·8% (268 of 1128). Pulmonary complications occurred in 577 (51·2%) of 1128 patients; 30-day mortality in these patients was 38·0% (219 of 577), accounting for 81·7% (219 of 268) of all deaths. In adjusted analyses, 30-day mortality was associated with male sex (odds ratio 1·75 [95% CI 1·28–2·40], p<0·0001), age 70 years or older versus younger than 70 years (2·30 [1·65–3·22], p<0·0001), American Society of Anesthesiologists grades 3–5 versus grades 1–2 (2·35 [1·57–3·53], p<0·0001), malignant versus benign or obstetric diagnosis (1·55 [1·01–2·39], p=0·046), emergency versus elective surgery (1·67 [1·06–2·63], p=0·026), and major versus minor surgery (1·52 [1·01–2·31], p=0·047). Interpretation: Postoperative pulmonary complications occur in half of patients with perioperative SARS-CoV-2 infection and are associated with high mortality. Thresholds for surgery during the COVID-19 pandemic should be higher than during normal practice, particularly in men aged 70 years and older. Consideration should be given for postponing non-urgent procedures and promoting non-operative treatment to delay or avoid the need for surgery. Funding: National Institute for Health Research (NIHR), Association of Coloproctology of Great Britain and Ireland, Bowel and Cancer Research, Bowel Disease Research Foundation, Association of Upper Gastrointestinal Surgeons, British Association of Surgical Oncology, British Gynaecological Cancer Society, European Society of Coloproctology, NIHR Academy, Sarcoma UK, Vascular Society for Great Britain and Ireland, and Yorkshire Cancer Research.
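As a minimal sketch of the kind of adjusted analysis summarised above, a multivariable logistic regression whose exponentiated coefficients are adjusted odds ratios with 95% CIs. The DataFrame, file name, and column names are hypothetical placeholders, not the study's actual code or variables.

```python
# Hedged sketch: adjusted odds ratios via multivariable logistic regression.
# Assumes a hypothetical patient-level file "cohort.csv" with a binary 0/1
# outcome column `death_30d` and binary covariates matching the factors above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical: one row per patient

model = smf.logit(
    "death_30d ~ C(sex) + C(age_70_plus) + C(asa_3_to_5) + "
    "C(malignant_dx) + C(emergency_surgery) + C(major_surgery)",
    data=df,
).fit()

# Exponentiate coefficients and confidence limits to obtain adjusted ORs.
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(odds_ratios)
```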
The Changing Landscape for Stroke Prevention in AF: Findings From the GLORIA-AF Registry Phase 2
Background GLORIA-AF (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation) is a prospective, global registry program describing antithrombotic treatment patterns in patients with newly diagnosed nonvalvular atrial fibrillation at risk of stroke. Phase 2 began when dabigatran, the first non–vitamin K antagonist oral anticoagulant (NOAC), became available. Objectives This study sought to describe phase 2 baseline data and compare these with the pre-NOAC era collected during phase 1. Methods During phase 2, 15,641 consenting patients were enrolled (November 2011 to December 2014); 15,092 were eligible. This pre-specified cross-sectional analysis describes eligible patients' baseline characteristics. Atrial fibrillation disease characteristics, medical outcomes, and concomitant diseases and medications were collected. Data were analyzed using descriptive statistics. Results Of the total patients, 45.5% were female; median age was 71 (interquartile range: 64, 78) years. Patients were from Europe (47.1%), North America (22.5%), Asia (20.3%), Latin America (6.0%), and the Middle East/Africa (4.0%). Most had high stroke risk (CHA2DS2-VASc [Congestive heart failure, Hypertension, Age ≥75 years, Diabetes mellitus, previous Stroke, Vascular disease, Age 65 to 74 years, Sex category] score ≥2; 86.1%); 13.9% had moderate risk (CHA2DS2-VASc = 1). Overall, 79.9% received oral anticoagulants, of whom 47.6% received NOAC and 32.3% vitamin K antagonists (VKA); 12.1% received antiplatelet agents; 7.8% received no antithrombotic treatment. For comparison, the proportion of phase 1 patients (of N = 1,063 all eligible) prescribed VKA was 32.8%, acetylsalicylic acid 41.7%, and no therapy 20.2%. In Europe in phase 2, treatment with NOAC was more common than VKA (52.3% and 37.8%, respectively); 6.0% of patients received antiplatelet treatment; and 3.8% received no antithrombotic treatment. In North America, 52.1%, 26.2%, and 14.0% of patients received NOAC, VKA, and antiplatelet drugs, respectively; 7.5% received no antithrombotic treatment. NOAC use was less common in Asia (27.7%), where 27.5% of patients received VKA, 25.0% antiplatelet drugs, and 19.8% no antithrombotic treatment. Conclusions The baseline data from GLORIA-AF phase 2 demonstrate that in newly diagnosed nonvalvular atrial fibrillation patients, NOAC have been highly adopted into practice, becoming more frequently prescribed than VKA in Europe and North America. Worldwide, however, a large proportion of patients remain undertreated, particularly in Asia and North America. (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients With Atrial Fibrillation [GLORIA-AF]; NCT01468701)
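For readers unfamiliar with the risk score named above, the sketch below scores the CHA2DS2-VASc components spelled out in the abstract (congestive heart failure, hypertension, age ≥75, diabetes, prior stroke, vascular disease, age 65-74, female sex). It is an illustrative helper, not code from the registry; the function name and signature are assumptions.

```python
# Minimal sketch of the CHA2DS2-VASc stroke-risk score as expanded in the
# abstract: CHF (1), hypertension (1), age >=75 (2), diabetes (1),
# prior stroke/TIA/thromboembolism (2), vascular disease (1),
# age 65-74 (1), female sex (1).
def cha2ds2_vasc(age: int, female: bool, chf: bool, hypertension: bool,
                 diabetes: bool, prior_stroke: bool, vascular_disease: bool) -> int:
    score = 0
    score += 1 if chf else 0
    score += 1 if hypertension else 0
    score += 2 if age >= 75 else (1 if 65 <= age <= 74 else 0)
    score += 1 if diabetes else 0
    score += 2 if prior_stroke else 0
    score += 1 if vascular_disease else 0
    score += 1 if female else 0
    return score

# e.g. a 71-year-old woman with hypertension: 1 (age 65-74) + 1 + 1 = 3,
# which falls in the "high stroke risk" (score >= 2) group described above.
print(cha2ds2_vasc(71, True, False, True, False, False, False))
```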
Application of Ligninolytic Enzymes in the Production of Biofuels from Cotton Wastes
The application of ligninolytic fungi and enzymes is an option to overcome the issues related to the production of biofuels from cotton wastes. In this dissertation, ligninolytic fungi and enzymes were evaluated as a pretreatment for the biochemical conversion of Cotton Gin Trash (CGT) into ethanol and as a treatment for the transformation of cotton waste biochar into other substances.
In biochemical conversion, seven combinations of three pretreatments (ultrasonication, liquid hot water and ligninolytic enzymes) were evaluated on CGT. The best results were achieved by the sequential combination of ultrasonication, hot water, and ligninolytic enzymes, with an improvement of 10% in ethanol yield. To improve these results, alkaline ultrasonication was evaluated. Additionally, Fourier Transform Infrared spectroscopy (FT-IR) and principal component analysis (PCA) were employed as a fast methodology to identify structural differences in the biomass. The combination of ultrasonication-alkali hydrolysis, liquid hot water, and ligninolytic enzymes using 15% NaOH improved ethanol yield by 35% compared with the original treatment. Additionally, FT-IR and PCA identified modifications in the biomass structure after different types of pretreatments and conditions.
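The FT-IR/PCA workflow mentioned above reduces each spectrum to a few principal-component scores so that structural differences between pretreatments can be compared. The sketch below illustrates the general idea under stated assumptions; the file name, preprocessing, and data layout are hypothetical and not taken from the dissertation.

```python
# Hedged sketch of PCA on FT-IR spectra: rows = samples (one per pretreated
# biomass sample), columns = absorbance at each wavenumber. File name and
# preprocessing choices are hypothetical placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

spectra = np.loadtxt("ftir_spectra.csv", delimiter=",")  # shape: (n_samples, n_wavenumbers)

scaled = StandardScaler().fit_transform(spectra)  # centre/scale each wavenumber
pca = PCA(n_components=2)
scores = pca.fit_transform(scaled)                # PC scores per sample

print("explained variance ratio:", pca.explained_variance_ratio_)
print("PC scores for each pretreated sample:\n", scores)
```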
In thermal conversion, this study evaluated the biodepolymerization of cotton waste biochar using chemical and biological treatments. The chemical depolymerization evaluated three chemical agents (KMnO4, H2SO4, and NaOH) at three concentrations and two environmental conditions. The sulfuric acid treatments produced the largest transformations of the biochar solid phase, whereas the KMnO4 treatments achieved the greatest depolymerization. The compounds released into the liquid phase were correlated with fulvic and humic acids and silicon compounds.
The biological depolymerization utilized four ligninolytic fungi: Phanerochaete chrysosporium, Ceriporiopsis subvermispora, Postia placenta, and Bjerkandera adusta. The greatest depolymerization was obtained with C. subvermispora. The depolymerization kinetics of C. subvermispora showed the production of laccase and manganese peroxidase and a correlation between depolymerization and the production of ligninolytic enzymes. The modifications observed in the liquid and solid phases showed the production of humic and fulvic acids in the cultures with C. subvermispora.
The results of this research are the initial steps toward the development of new processes that use ligninolytic fungi and their enzymes for the production of biofuels from cotton wastes.
Measuring the health-related Sustainable Development Goals in 188 countries: a baseline analysis from the Global Burden of Disease Study 2015
Background In September, 2015, the UN General Assembly established the Sustainable Development Goals (SDGs). The SDGs specify 17 universal goals, 169 targets, and 230 indicators leading up to 2030. We provide an analysis of 33 health-related SDG indicators based on the Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015). Methods We applied statistical methods to systematically compiled data to estimate the performance of 33 health-related SDG indicators for 188 countries from 1990 to 2015. We rescaled each indicator on a scale from 0 (worst observed value between 1990 and 2015) to 100 (best observed). Indices representing all 33 health-related SDG indicators (health-related SDG index), health-related SDG indicators included in the Millennium Development Goals (MDG index), and health-related indicators not included in the MDGs (non-MDG index) were computed as the geometric mean of the rescaled indicators by SDG target. We used spline regressions to examine the relations between the Socio-demographic Index (SDI, a summary measure based on average income per person, educational attainment, and total fertility rate) and each of the health-related SDG indicators and indices. Findings In 2015, the median health-related SDG index was 59.3 (95% uncertainty interval 56.8-61.8) and varied widely by country, ranging from 85.5 (84.2-86.5) in Iceland to 20.4 (15.4-24.9) in the Central African Republic. SDI was a good predictor of the health-related SDG index (r² = 0.88) and the MDG index (r² = 0.92), whereas the non-MDG index had a weaker relation with SDI (r² = 0.79). Between 2000 and 2015, the health-related SDG index improved by a median of 7.9 (IQR 5.0-10.4), and gains on the MDG index (a median change of 10.0 [6.7-13.1]) exceeded those on the non-MDG index (a median change of 5.5 [2.1-8.9]). Since 2000, pronounced progress occurred for indicators such as met need with modern contraception, under-5 mortality, and neonatal mortality, as well as the indicator for universal health coverage tracer interventions. Moderate improvements were found for indicators such as HIV and tuberculosis incidence, minimal changes took place for hepatitis B incidence, and childhood overweight considerably worsened. Interpretation GBD provides an independent, comparable avenue for monitoring progress towards the health-related SDGs. Our analysis not only highlights the importance of income, education, and fertility as drivers of health improvement but also emphasises that investments in these areas alone will not be sufficient. Although considerable progress on the health-related MDG indicators has been made, these gains will need to be sustained and, in many cases, accelerated to achieve the ambitious SDG targets. The minimal improvement in or worsening of health-related indicators beyond the MDGs highlights the need for additional resources to effectively address the expanded scope of the health-related SDGs.
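The index construction described above (rescale each indicator from 0, the worst observed value, to 100, the best observed, then take a geometric mean) can be sketched in a few lines. This is an illustrative simplification under stated assumptions; indicator orientation handling and the example numbers are hypothetical, not GBD code or data.

```python
# Sketch of the abstract's index construction: rescale indicators to 0-100
# against the worst/best observed values, then aggregate by geometric mean.
# The `higher_is_better` flag and the example values are assumptions.
import numpy as np

def rescale(values: np.ndarray, higher_is_better: bool = True) -> np.ndarray:
    """Map observed values to 0 (worst observed) .. 100 (best observed)."""
    worst, best = (values.min(), values.max()) if higher_is_better else (values.max(), values.min())
    return 100 * (values - worst) / (best - worst)

def geometric_mean(rescaled_indicators: np.ndarray) -> float:
    """Geometric mean of rescaled indicators (clipped to avoid log(0))."""
    return float(np.exp(np.mean(np.log(np.clip(rescaled_indicators, 1e-9, None)))))

# e.g. a country scoring 80, 60, and 90 on three rescaled indicators
print(geometric_mean(np.array([80.0, 60.0, 90.0])))  # ~75.6
```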
Genomic evidence for the evolution of Streptococcus equi: host restriction, increased virulence, and genetic exchange with human pathogens
The continued evolution of bacterial pathogens has major implications for both human and animal disease, but the exchange of genetic material between host-restricted pathogens is rarely considered. Streptococcus equi subspecies equi (S. equi) is a host-restricted pathogen of horses that has evolved from the zoonotic pathogen Streptococcus equi subspecies zooepidemicus (S. zooepidemicus). These pathogens share approximately 80% genome sequence identity with the important human pathogen Streptococcus pyogenes. We sequenced and compared the genomes of S. equi 4047 and S. zooepidemicus H70 and screened S. equi and S. zooepidemicus strains from around the world to uncover evidence of the genetic events that have shaped the evolution of the S. equi genome and led to its emergence as a host-restricted pathogen. Our analysis provides evidence of functional loss due to mutation and deletion, coupled with pathogenic specialization through the acquisition of bacteriophage encoding a phospholipase A2 toxin and four superantigens, and an integrative conjugative element carrying a novel iron acquisition system with similarity to the high pathogenicity island of Yersinia pestis. We also highlight that S. equi, S. zooepidemicus, and S. pyogenes share a common phage pool that enhances cross-species pathogen evolution. We conclude that the complex interplay of functional loss, pathogenic specialization, and genetic exchange between S. equi, S. zooepidemicus, and S. pyogenes continues to influence the evolution of these important streptococci.