
    Coral-reef-derived dimethyl sulfide and the climatic impact of the loss of coral reefs

    Dimethyl sulfide (DMS) is a naturally occurring aerosol precursor gas which plays an important role in the global sulfur budget, aerosol formation and climate. While DMS is produced predominantly by phytoplankton, recent observational literature has suggested that corals and their symbionts produce a comparable amount of DMS, which is unaccounted for in models. It has further been hypothesised that the coral reef source of DMS may modulate regional climate. This hypothesis presents a particular concern given the current threat to coral reefs under anthropogenic climate change. In this paper, a global climate model with online chemistry and aerosol is used to explore the influence of coral-reef-derived DMS on atmospheric composition and climate. A simple representation of coral-reef-derived DMS is developed and added to a common DMS surface water climatology, resulting in an additional flux of 0.3 Tg yr−1 S, or 1.7 % of the global sulfur flux from DMS. By comparing both nudged and free-running ensemble simulations with and without coral-reef-derived DMS, the influence of coral-reef-derived DMS on regional climate is quantified. In the Maritime Continent–Australian region, where the highest density of coral reefs exists, a small decrease in nucleation- and Aitken-mode aerosol number concentration and mass is found when coral reef DMS emissions are removed from the system. However, these small responses have no robust effect on regional climate via direct and indirect aerosol effects. This work emphasises the complexities of the aerosol–climate system and highlights the limitations of current modelling capabilities, in particular surrounding convective responses to changes in aerosol. In conclusion, we find no robust evidence that coral-reef-derived DMS influences global and regional climate
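    A quick back-of-the-envelope check of the flux figures quoted above (a sketch in Python; the 0.3 Tg yr−1 S and 1.7 % values come from the abstract, and the implied global total is derived from them):

```python
# Back-of-the-envelope check of the DMS sulfur flux figures quoted above.
coral_flux_tg_s = 0.3    # coral-reef-derived DMS flux, Tg S per year (from the abstract)
coral_fraction = 0.017   # stated share of the global sulfur flux from DMS (1.7 %)

# Global DMS sulfur flux implied by the two numbers above.
global_flux_tg_s = coral_flux_tg_s / coral_fraction
print(f"Implied global DMS flux: {global_flux_tg_s:.1f} Tg S per year")  # ~17.6
```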

    Predictive validity of the CriSTAL tool for short-term mortality in older people presenting at Emergency Departments: a prospective study

    Objective: To determine the validity of the Australian clinical prediction tool Criteria for Screening and Triaging to Appropriate aLternative care (CriSTAL), based on objective clinical criteria, in identifying risk of death within 3 months of admission among older patients. Methods: Prospective study of patients aged ≥ 65 years presenting at emergency departments in five Australian (Aus) and four Danish (DK) hospitals. Logistic regression analysis was used to model factors for death prediction; sensitivity, specificity, area under the ROC curve and calibration with bootstrapping techniques were used to describe predictive accuracy. Results: 2493 patients were included, with a median age of 78 (DK) to 80 (Aus) years. Patients who died had significantly higher mean CriSTAL scores than survivors: Australian mean 8.1 (95% CI 7.7–8.6) vs. 5.8 (95% CI 5.6–5.9), and Danish mean 7.1 (95% CI 6.6–7.5) vs. 5.5 (95% CI 5.4–5.6). The model with the Fried frailty score was optimal for the Australian cohort, but prediction with the Clinical Frailty Scale (CFS) was also good (AUROC 0.825 and 0.81, respectively). Values for the Danish cohort were AUROC 0.764 with Fried and 0.794 with CFS. The most significant independent predictors of short-term death in both cohorts were advanced malignancy, frailty, male gender and advanced age. CriSTAL's accuracy was only modest for in-hospital death prediction in either setting. Conclusions: The modified CriSTAL tool (with CFS instead of Fried's frailty instrument) has good discriminant power to improve prognostic certainty of short-term mortality for ED physicians in both health systems. This shows promise in enhancing clinicians' confidence in initiating earlier end-of-life discussions
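    As a rough illustration of the validation approach described above (logistic regression plus bootstrapped AUROC), the sketch below uses synthetic stand-in data; the variable names and simulated predictors are assumptions, not the study's data:

```python
# Sketch: fit a logistic model on CriSTAL-style predictors and estimate the
# AUROC with a bootstrap confidence interval. All data here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 4))  # stand-ins for age, frailty, malignancy, etc.
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ np.array([0.8, 0.6, 0.4, 0.2]) - 1.5))))

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]
print("apparent AUROC:", round(roc_auc_score(y, scores), 3))

# Bootstrap the AUROC to describe predictive accuracy, as the paper does.
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    if len(np.unique(y[idx])) < 2:  # skip resamples with a single class
        continue
    boot.append(roc_auc_score(y[idx], scores[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap 95% CI: {lo:.3f} to {hi:.3f}")
```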

    The impact of diabetes prevention on labour force participation and income of older Australians: an economic study

    Background: Globally, diabetes is estimated to affect 246 million people and is increasing. In Australia diabetes has been made a national health priority. While the direct costs of treating diabetes are substantial, and rising, the indirect costs are considered greater. There is evidence that interventions to prevent diabetes are effective, and cost-effective, but the impact on labour force participation and income has not been assessed. In this study we quantify the potential impact of implementing a diabetes prevention program, using screening and either metformin or a lifestyle intervention, on individual economic outcomes of pre-diabetic Australians aged 45-64. Methods: The output of an epidemiological microsimulation model of the reduction in prevalence of diabetes from a lifestyle or metformin intervention, and another microsimulation model, Health&WealthMOD, of health and the associated impacts on labour force participation, personal income, savings, government revenue and expenditure, were used to quantify the estimated outcomes of the two interventions. Results: An additional 753 person years in the labour force would have been achieved from 1993 to 2003 for the male cohort aged 60-64 years in 2003, if a lifestyle intervention had been introduced in 1983, with 890 person years for the equivalent female group. The impact on labour force participation was lower for the metformin intervention, and increased with age for both interventions. The male cohort aged 60-64 years in 2003 would have earned an additional $30 million in income with the metformin intervention, and the equivalent female cohort would have earned an additional $25 million. If the lifestyle intervention was introduced, the same male and female cohorts would have earned an additional $34 million and $28 million respectively from 1993 to 2003. For the individuals involved, on average, males would have earned an additional $44,600 per year and females an additional $31,800 per year, if they had continued to work as a result of preventing diabetes. Conclusions: In addition to improved health and wellbeing, considerable benefits to individuals, in terms of both additional working years and increased personal income, could be made by introducing either a lifestyle or metformin intervention to prevent diabetes
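    As a rough consistency check of the headline figures (illustrative arithmetic only; the underlying Health&WealthMOD microsimulation is far more detailed):

```python
# Divide the extra income of the male lifestyle-arm cohort by its extra
# person years in the labour force to recover an average annual figure.
person_years = 753     # additional male person years, lifestyle intervention
extra_income = 34e6    # additional income for the same cohort, dollars

print(f"${extra_income / person_years:,.0f} per additional year worked")
# ~ $45,000, in line with the ~ $44,600 average annual figure quoted above
```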

    Protective CD8+ T lymphocytes in Primates Immunized with Malaria Sporozoites

    Live attenuated malaria vaccines are more potent than the recombinant protein, bacterial or viral platform vaccines that have been tested, and an attenuated sporozoite vaccine against falciparum malaria is being developed for humans. In mice, attenuated malaria sporozoite vaccines induce CD8+ T cells that kill parasites developing in the liver. We were curious to know if CD8+ T cells were also important in protecting primates against malaria. We immunized 9 rhesus monkeys with radiation-attenuated Plasmodium knowlesi sporozoites, and found that 5 did not develop blood stage infections after challenge with live sporozoites. We then injected 4 of these protected monkeys with cM-T807, a monoclonal antibody against the CD8 molecule that depletes T cells. The fifth monkey received equivalent doses of normal IgG. In 3 of the 4 monkeys receiving cM-T807, circulating CD8+ T cells were profoundly depleted. When re-challenged with live sporozoites, all 3 of these depleted animals developed blood stage malaria. The fourth monkey receiving cM-T807 retained many circulating CD8+ T cells. This monkey, and the vaccinated monkey receiving normal IgG, did not develop blood stage malaria at re-challenge with live sporozoites. Animals were treated with antimalarial drugs and rested for 4 months. During this interval CD8+ T cells re-appeared in the circulation of the depleted monkeys. When all vaccinated animals received a third challenge with live sporozoites, all 5 monkeys were once again protected and did not develop blood stage malaria infections. These data indicate that CD8+ T cells are important effector cells protecting monkeys against malaria sporozoite infection. We believe that malaria vaccines which induce effector CD8+ T cells in humans will have the best chance of protecting against malaria

    Herpes simplex virus and rates of cognitive decline or whole brain atrophy in the Dominantly Inherited Alzheimer Network

    Objective: To investigate whether herpes simplex virus type 1 (HSV-1) infection was associated with rates of cognitive decline or whole brain atrophy among individuals from the Dominantly Inherited Alzheimer Network (DIAN). Methods: Among two subsets of the DIAN cohort (age range 19.6–66.6 years; median follow-up 3.0 years), we examined (i) the rate of cognitive decline (N = 164), using change in mini-mental state examination (MMSE) score, and (ii) the rate of whole brain atrophy (N = 149), derived from serial MR imaging and calculated using the boundary shift integral (BSI) method. HSV-1 antibodies were assayed in baseline sera collected from 2009 to 2015. Linear mixed-effects models were used to compare outcomes by HSV-1 seropositivity and by high HSV-1 IgG titres/IgM status. Results: There was no association between baseline HSV-1 seropositivity and rates of cognitive decline or whole brain atrophy. Having high HSV-1 IgG titres/IgM was associated with a slightly greater decline in MMSE points per year (difference in slope −0.365, 95% CI: −0.958 to −0.072), but not with the rate of whole brain atrophy. Symptomatic mutation carriers declined fastest on both MMSE and BSI measures; however, this was not influenced by HSV-1. Among asymptomatic mutation carriers, rates of decline on MMSE and BSI were slightly greater among those who were HSV-1 seronegative. Among mutation-negative individuals, no differences were seen by HSV-1. Stratifying by APOE4 status yielded inconsistent results. Interpretation: We found no evidence for a major role of HSV-1, measured by serum antibodies, in cognitive decline or whole brain atrophy among individuals at high risk of early-onset AD
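    A minimal sketch of the kind of linear mixed-effects model described above, testing whether high HSV-1 IgG titres/IgM status changes the MMSE slope; the file and column names are illustrative assumptions, not the DIAN data dictionary:

```python
# Sketch: linear mixed-effects model of MMSE over time, with an interaction
# term for HSV-1 status. Input file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# One row per visit: subject_id, years_from_baseline, mmse, hsv_high (0/1)
df = pd.read_csv("dian_visits.csv")  # hypothetical input

model = smf.mixedlm(
    "mmse ~ years_from_baseline * hsv_high",  # interaction = difference in slope
    data=df,
    groups=df["subject_id"],
    re_formula="~years_from_baseline",        # random intercept and slope
)
result = model.fit()
print(result.summary())
# The years_from_baseline:hsv_high coefficient estimates the extra decline in
# MMSE points per year associated with high titres/IgM (−0.365 in the paper).
```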

    Prevalence of transfusion-transmitted Chagas disease among multitransfused patients in Brazil

    Background: Blood transfusion has always been an important route for Chagas disease (CD) transmission. The high prevalence of CD in Latin America and its lifelong asymptomatic clinical picture pose a threat to the safety of the blood supply. The outcome of measures designed to improve transfusion safety can be assessed by evaluating the prevalence of CD among multitransfused patients. Methods: In order to assess the impact of CD control measures on the safety of the blood supply, an observational cross-sectional study was designed to determine the prevalence of CD in 351 highly transfused patients in whom vectorial transmission was excluded. This study compared patients that received transfusion products before (n = 230) and after (n = 121) 1997, when measures to control transfusion-transmitted CD were fully implemented in Brazil. Results: The study group consisted of 351 patients exposed to high numbers of blood products during their lifetime (median number of units transfused = 51, range 10–2086). A higher prevalence of transfusion-transmitted CD (1.30%) was observed among multitransfused patients who received their first transfusion before 1997, compared with no cases of transfusion-transmitted CD among those first transfused after that year. The magnitude of exposure to blood products was similar in both groups (mean number of units transfused per year of exposure = 25.00 ± 26.46 and 23.99 ± 30.58, respectively; P = 0.75, Mann-Whitney test). Conclusion: Multiple initiatives aimed at controlling vectorial and parenteral transmission of CD can significantly decrease transfusion-transmitted CD in Brazil. Our data suggest that mandatory donor screening for CD represents the most important measure to interrupt transmission of CD by blood transfusions.
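    A minimal sketch of the exposure comparison reported above (a Mann-Whitney U test on units transfused per year in the two groups); the simulated arrays are stand-ins shaped to the published means and standard deviations, not the study data:

```python
# Sketch: compare transfusion exposure between the pre-1997 and post-1997
# groups with a Mann-Whitney U test. Data below are simulated stand-ins.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
pre_1997 = rng.gamma(shape=1.0, scale=25.0, size=230)   # ~25.0 ± 26.5 units/yr
post_1997 = rng.gamma(shape=0.8, scale=30.0, size=121)  # ~24.0 ± 30.6 units/yr

stat, p = mannwhitneyu(pre_1997, post_1997, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.2f}")
# A non-significant p (P = 0.75 in the paper) indicates similar exposure, so
# the prevalence difference is not explained by transfusion burden.
```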

    Percutaneous coronary revascularization in patients with formerly "refractory angina pectoris in end-stage coronary artery disease" – Not "end-stage" after all

    Background: Patients with refractory angina pectoris in end-stage coronary artery disease represent a severe condition, with a greater reduction of life expectancy and quality of life compared to patients with stable coronary artery disease. The purpose of this study was to invasively re-evaluate highly symptomatic patients with formerly diagnosed refractory angina pectoris in end-stage coronary artery disease for feasible options of myocardial revascularization. Methods: Thirty-four patients formerly characterized as having end-stage coronary artery disease with refractory angina pectoris were retrospectively followed for coronary interventions. Results: Of those 34 patients, 21 (61.8%) were eventually revascularized with percutaneous coronary intervention (PCI). Due to complex coronary morphology (angulation, chronic total occlusion), PCI demanded an above-average amount of time (66 ± 42 minutes, range 25–206 minutes) and materials (contrast media 247 ± 209 ml, range 50–750 ml; PCI guiding wires 2.0 ± 1.4, range 1–6 wires). Of the PCI patients, 7 (33.3%) showed a new lesion as a sign of progression of atherosclerosis. The clinical success rate, defined as a reduction to angina class II or lower, was 71.4% at 30 days. Surgery was performed in a total of 8 (23.5%) patients, with a clinical success rate of 62.5%. On an intention-to-treat basis, 2 of the original 8 patients (25%) demonstrated clinical success. Mortality during follow-up (1–18 months) was 4.8% in patients who underwent PCI, 25% in patients treated surgically and 25% in those treated medically only. Conclusion: The majority of patients with end-stage coronary artery disease can be treated effectively with conventional invasive treatment modalities. Therefore, even though it is challenging and demanding, PCI should be considered as a first choice before experimental interventions are considered.

    Protection from Experimental Cerebral Malaria with a Single Dose of Radiation-Attenuated, Blood-Stage Plasmodium berghei Parasites

    BACKGROUND: Whole malaria parasites are highly effective in inducing immunity against malaria. Due to the limited success of subunit based vaccines in clinical studies, there has been a renewed interest in whole parasite-based malaria vaccines. Apart from attenuated sporozoites, there have also been efforts to use live asexual stage parasites as vaccine immunogens. METHODOLOGY AND RESULTS: We used radiation exposure to attenuate the highly virulent asexual blood stages of the murine malaria parasite P. berghei to a non-replicable, avirulent form. We tested the ability of the attenuated blood stage parasites to induce immunity to parasitemia and the symptoms of severe malaria disease. Depending on the mouse genetic background, a single high dose immunization without adjuvant protected mice from parasitemia and severe disease (CD1 mice) or from experimental cerebral malaria (ECM) (C57BL/6 mice). A low dose immunization did not protect against parasitemia or severe disease in either model after one or two immunizations. The protection from ECM was associated with a parasite specific antibody response and also with a lower level of splenic parasite-specific IFN-γ production, which is a mediator of ECM pathology in C57BL/6 mice. Surprisingly, there was no difference in the sequestration of CD8+ T cells and CD45+ CD11b+ macrophages in the brains of immunized, ECM-protected mice. CONCLUSIONS: This report further demonstrates the effectiveness of a whole parasite blood-stage vaccine in inducing immunity to malaria and explicitly demonstrates its effectiveness against ECM, the most pathogenic consequence of malaria infection. This experimental model will be important to explore the formulation of whole parasite blood-stage vaccines against malaria and to investigate the immune mechanisms that mediate protection against parasitemia and cerebral malaria

    Genome of the Avirulent Human-Infective Trypanosome—Trypanosoma rangeli

    Background: Trypanosoma rangeli is a hemoflagellate protozoan parasite infecting humans and other wild and domestic mammals across Central and South America. It does not cause human disease, but it can be mistaken for the etiologic agent of Chagas disease, Trypanosoma cruzi. We have sequenced the T. rangeli genome to provide new tools for elucidating the distinct and intriguing biology of this species and the key pathways related to interaction with its arthropod and mammalian hosts. Methodology/Principal Findings: The T. rangeli haploid genome is ~24 Mb in length, and is the smallest and least repetitive trypanosomatid genome sequenced thus far. This parasite genome has shorter subtelomeric sequences than those of T. cruzi and T. brucei, displays intraspecific karyotype variability, and lacks minichromosomes. Of the predicted 7,613 protein-coding sequences, functional annotations could be determined for 2,415, while 5,043 are hypothetical proteins, some with evidence of protein expression. A total of 7,101 genes (93%) are shared with other trypanosomatids that infect humans. An ortholog of the dcl2 gene involved in the T. brucei RNAi pathway was found in T. rangeli, but the RNAi machinery is non-functional since the other genes in this pathway are pseudogenized. T. rangeli is highly susceptible to oxidative stress, a phenotype that may be explained by a smaller number of anti-oxidant defense enzymes and heat-shock proteins. Conclusions/Significance: Phylogenetic comparison of nuclear and mitochondrial genes indicates that T. rangeli and T. cruzi are equidistant from T. brucei. In addition to revealing new aspects of trypanosome co-evolution within the vertebrate and invertebrate hosts, comparative genomic analysis with pathogenic trypanosomatids provides valuable new information that can be further explored with the aim of developing better diagnostic tools and/or therapeutic targets

    Rapid and Sensitive Detection of Yersinia pestis Using Amplification of Plague Diagnostic Bacteriophages Monitored by Real-Time PCR

    BACKGROUND: Yersinia pestis, the agent of plague, has caused many millions of human deaths and still poses a serious threat to global public health. Timely and reliable detection of such a dangerous pathogen is of critical importance. Lysis by specific bacteriophages remains an essential method of Y. pestis detection and plague diagnostics. METHODOLOGY/PRINCIPAL FINDINGS: The objective of this work was to develop an alternative to conventional phage lysis tests: a rapid and highly sensitive method of indirect detection of live Y. pestis cells based on quantitative real-time PCR (qPCR) monitoring of the amplification of reporter Y. pestis-specific bacteriophages. The plague diagnostic phages phiA1122 and L-413C were shown to be highly effective diagnostic tools for the detection and identification of Y. pestis by qPCR with primers specific for phage DNA. The template DNA extraction step that usually precedes qPCR was omitted. phiA1122-specific qPCR enabled the detection of an initial bacterial concentration of 10^3 CFU/ml (equivalent to as few as one Y. pestis cell per 1-µl sample) in four hours. L-413C-mediated detection of Y. pestis was less sensitive (up to 100 bacteria per sample) but more specific, and thus we propose parallel qPCR for the two phages as a rapid and reliable method of Y. pestis identification. Importantly, phiA1122 propagated in simulated clinical blood specimens containing EDTA, and its titer rise was detected by both a standard plating test and qPCR. CONCLUSIONS/SIGNIFICANCE: Thus, we developed a novel assay for the detection and identification of Y. pestis using amplification of specific phages monitored by qPCR. The method is simple, rapid, highly sensitive, and specific, and it allows the detection of only live bacteria
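    A quick unit check of the sensitivity figure quoted above (arithmetic only; no assay details are implied):

```python
# 10^3 CFU/ml corresponds to one cell per microlitre of sample.
detection_limit_cfu_per_ml = 1e3  # from the abstract
ul_per_ml = 1000.0

cells_per_ul = detection_limit_cfu_per_ml / ul_per_ml
print(f"{cells_per_ul:.0f} cell per 1-ul sample")  # 1, as stated above
```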