54 research outputs found

    Commentary: reconstructing four centuries of temperature-induced coral bleaching on the great barrier reef

    Coral reefs are spectacular ecosystems found along tropical coastlines, where they provide goods and services to hundreds of millions of people. While under threat from local factors, coral reefs are increasingly susceptible to ocean warming from anthropogenic climate change. One of the signature disturbances is the large-scale, and often deadly, breakdown of the symbiosis between corals and dinoflagellates. This is referred to as mass coral bleaching and often causes mass mortality. The first scientific records of mass bleaching date to the early 1980s (Hoegh-Guldberg et al., 2017). Kamenos and Hennige (2018, hereafter KH18), however, claim to show that mass coral bleaching is not a recent phenomenon, and has occurred regularly over the past four centuries (1572–2001) on the Great Barrier Reef (GBR), Australia. They support their claim by developing a putative proxy for coral bleaching that uses the suggested relationship between elevated sea surface temperatures (SSTs) and reduced linear extension rates of 44 Porites spp. coral cores from 28 GBR reefs. If their results are correct, then mass coral bleaching events have been a frequent feature for hundreds of years, in sharp contrast to the vast majority of scientific evidence. There are, however, major flaws in the KH18 methodology. Their use of the Extended Reconstructed Sea Surface Temperature (ERSST) dataset (based on ship and buoy observations) for reef temperatures from 1854 to 2001 ignores the increasing unreliability of these data, which become sparse, less rigorous, and more interpolated going back in time. To demonstrate how the quality of these data degrades, we plot the average number of SST observations per month that contribute to each 200 × 200 km ERSST pixel (Figure 1A, black line). Note that from 1854 to 1900 the four ERSST pixels used by KH18 averaged only 0.85 observations per month, and 82% of these months had no observations at all. Given the heterogeneous nature of SST at local and regional scales, using broad-scale data such as ERSST is likely to produce substantial errors at reef scales (Figure 1A, red line prior to 1900).
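    The sparsity audit described above (mean observations per month and the fraction of empty months per ERSST pixel) can be sketched as follows. This is a minimal illustration with simulated monthly counts, not the actual ICOADS/ERSST source data used for Figure 1A:

    ```python
    import random

    random.seed(0)

    # Hypothetical monthly observation counts for one 200 x 200 km ERSST
    # pixel, 1854-1900. Real counts would come from the underlying ship and
    # buoy observation archive; these are simulated to illustrate the audit.
    n_months = (1900 - 1854 + 1) * 12
    counts = [random.choice([0, 0, 0, 0, 1, 2]) for _ in range(n_months)]

    # Two summary statistics analogous to those reported for KH18's pixels:
    mean_obs = sum(counts) / n_months
    frac_empty = sum(1 for c in counts if c == 0) / n_months

    print(f"mean observations per month: {mean_obs:.2f}")
    print(f"fraction of months with no observations: {frac_empty:.0%}")
    ```

    Applied to the real per-pixel counts, these two statistics quantify how much of the early record is pure interpolation rather than measurement.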

    Effectiveness of Provider and Community Interventions to Improve Treatment of Uncomplicated Malaria in Nigeria: A Cluster Randomized Controlled Trial

    The World Health Organization recommends that malaria be confirmed by parasitological diagnosis before treatment using Artemisinin-based Combination Therapy (ACT). Despite this, many health workers in malaria-endemic countries continue to diagnose malaria based on symptoms alone. This study evaluates interventions to help bridge this gap between guidelines and provider practice. A stratified cluster-randomized trial in 42 communities in Enugu state compared three scenarios: Rapid Diagnostic Tests (RDTs) with basic instruction (control); RDTs with provider training (provider arm); and RDTs with provider training plus a school-based community intervention (provider-school arm). The primary outcome was the proportion of patients treated according to guidelines, a composite indicator requiring patients to be tested for malaria and given treatment consistent with the test result. The primary outcome was evaluated among 4946 (93%) of the 5311 patients invited to participate. A total of 40 communities (12 in control, 14 per intervention arm) were included in the analysis. There was no evidence of differences between the three arms in terms of our composite indicator (p = 0.36): the stratified risk difference was 14% (95% CI -8.3%, 35.8%; p = 0.26) in the provider arm and 1% (95% CI -21.1%, 22.9%; p = 0.19) in the provider-school arm, compared with control. The level of testing was low across all arms (34% in control; 48% provider arm; 37% provider-school arm; p = 0.47). Presumptive treatment of uncomplicated malaria remains an ingrained behaviour that is difficult to change. With or without extensive supporting interventions, levels of testing in this study remained critically low. Governments and researchers must continue to explore alternative ways of encouraging providers to deliver appropriate treatment and avoid the misuse of valuable medicines.
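    The composite "treated according to guidelines" indicator can be made concrete with a small sketch: a patient counts as adherent only if they were tested AND the treatment matched the test result (ACT if positive, no ACT if negative). The patient records below are hypothetical, not trial data:

    ```python
    def adherent(tested, rdt_positive, given_act):
        """Composite indicator: tested AND treatment consistent with result."""
        if not tested:
            return False  # untested patients can never satisfy the indicator
        return given_act if rdt_positive else not given_act

    # Illustrative records: (tested, rdt_positive, given_act)
    control_arm = [
        (True, True, True),    # tested positive, given ACT  -> adherent
        (False, False, True),  # presumptive treatment       -> not adherent
        (True, False, False),  # tested negative, no ACT     -> adherent
        (False, True, True),   # never tested                -> not adherent
    ]
    provider_arm = [
        (True, True, True),
        (True, False, False),
        (True, True, True),
        (False, False, True),
    ]

    p_control = sum(adherent(*r) for r in control_arm) / len(control_arm)
    p_provider = sum(adherent(*r) for r in provider_arm) / len(provider_arm)
    risk_difference = p_provider - p_control
    print(p_control, p_provider, risk_difference)  # 0.5 0.75 0.25
    ```

    The trial's stratified risk differences are the cluster-level analogue of this simple arm-level contrast, adjusted for the randomization strata.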

    Chlorpromazine versus placebo for schizophrenia


    BCG Revaccination Does Not Protect Against Leprosy in the Brazilian Amazon: A Cluster Randomised Trial

    BCG is a vaccine developed and used to protect against tuberculosis, but it can also protect against leprosy. In Brazil, children receive BCG at birth, and since 1996 a trial has been conducted to find out if a second dose of BCG administered to schoolchildren gives additional protection against tuberculosis. We use this trial to find out if such vaccination protects against leprosy. The trial was conducted in the Brazilian Amazon, involving almost 100,000 children aged 7–14 years who had received neonatal BCG. Half of them received a second dose of BCG at school, and the other half did not. We followed the children for 6 years and observed that there were as many new cases of leprosy in the vaccinated children as in the unvaccinated children. Therefore, we concluded that a second dose of BCG given at school age in the Brazilian Amazon offers no additional protection against leprosy.

    Predicting flood insurance claims with hydrologic and socioeconomic demographics via machine learning: exploring the roles of topography, minority populations, and political dissimilarity

    Current research on flooding risk often focuses on understanding hazards, de-emphasizing the complex pathways of exposure and vulnerability. We investigated the use of both hydrologic and social demographic data for flood exposure mapping with Random Forest (RF) regression and classification algorithms trained to predict both parcel- and tract-level flood insurance claims within New York State, US. Topographic characteristics best described flood claim frequency, but RF prediction skill was improved at both spatial scales when socioeconomic data were incorporated. Substantial improvements occurred at the tract level when the percentage of minority residents, housing stock value and age, and the political dissimilarity index of voting precincts were used to predict insurance claims. Census tracts with higher numbers of claims and greater densities of low-lying tax parcels tended to have low proportions of minority residents, newer houses, and less political similarity to state-level government. We compared this data-driven approach with a physically based pluvial flood routing model for prediction of the spatial extents of flooding claims in two nearby catchments of differing land use. The floodplain we defined with physically based modeling agreed well with existing federal flood insurance rate maps, but underestimated the spatial extents of historical claim-generating areas. In contrast, RF classification incorporating hydrologic and socioeconomic demographic data likely overestimated the flood-exposed areas. Our research indicates that quantitative incorporation of social data can improve flooding exposure estimates.
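    The core modeling step, predicting claim occurrence from combined hydrologic and socioeconomic features with a Random Forest, can be sketched as below. This is a minimal illustration assuming scikit-learn is available; the feature names and synthetic data are stand-ins, not the authors' New York State dataset or pipeline:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 500
    # Columns mimic the kinds of predictors described in the abstract:
    X = np.column_stack([
        rng.normal(50, 20, n),   # elevation (m): topographic predictor
        rng.uniform(0, 1, n),    # proportion of minority residents
        rng.normal(60, 25, n),   # median housing-stock age (years)
        rng.uniform(0, 1, n),    # political dissimilarity index
    ])
    # Synthetic rule: low-lying parcels are more likely to generate claims.
    y = (X[:, 0] + rng.normal(0, 10, n) < 40).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("test accuracy:", rf.score(X_te, y_te))
    print("feature importances:", rf.feature_importances_.round(2))
    ```

    On real data, the relative feature importances are what support the paper's claim that topography dominates but socioeconomic variables add skill.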

    The Ontario printed educational message (OPEM) trial to narrow the evidence-practice gap with respect to prescribing practices of general and family physicians: a cluster randomized controlled trial, targeting the care of individuals with diabetes and hypertension in Ontario, Canada

    Background: There are gaps between what family practitioners do in clinical practice and the evidence-based ideal. The most commonly used strategy to narrow these gaps is the printed educational message (PEM); however, the attributes of successful printed educational messages and their overall effectiveness in changing physician practice are not clear. The current endeavor aims to determine whether such messages change prescribing quality in primary care practice, and whether these effects differ with the format of the message.
    Methods/design: The design is a large, simple, factorial, unblinded cluster-randomized controlled trial. PEMs will be distributed with "informed", a quarterly evidence-based synopsis of current clinical information produced by the Institute for Clinical Evaluative Sciences, Toronto, Canada, and will be sent to all eligible general and family practitioners in Ontario. There will be three replicates of the trial, with three different educational messages, each aimed at narrowing a specific evidence-practice gap as follows: 1) angiotensin-converting enzyme inhibitors, hypertension treatment, and cholesterol-lowering agents for diabetes; 2) retinal screening for diabetes; and 3) diuretics for hypertension.
    For each of the three replicates there will be three intervention groups. The first group will receive "informed" with an attached postcard-sized, short, directive "outsert". The second intervention group will receive "informed" with a two-page explanatory "insert" on the same topic. The third intervention group will receive "informed" with both the above-mentioned outsert and insert. The control group will receive "informed" only, without either an outsert or insert.
    Routinely collected physician billing, prescription, and hospital data found in Ontario's administrative databases will be used to monitor pre-defined prescribing changes relevant and specific to each replicate, following delivery of the educational messages. Multi-level modeling will be used to study patterns in physician prescribing quality over four quarters, before and after each of the three interventions. Subgroup analyses will be performed to assess the association between the characteristics of the physician's place of practice and target behaviours. A further analysis of the immediate and delayed impacts of the PEMs will be performed using time-series analysis and interventional autoregressive integrated moving average (ARIMA) modeling.
    Trial registration: Current controlled trial ISRCTN72772651.
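    The "immediate impact" part of the planned analysis can be illustrated with a stripped-down interrupted time-series sketch: estimate the level change in quarterly prescribing quality at the quarter the PEM is mailed. The quarterly values are hypothetical, and the protocol's ARIMA modelling would additionally adjust for trend and autocorrelation, which this sketch ignores:

    ```python
    # Hypothetical quarterly prescribing-quality proportions around a mailing.
    pre = [0.52, 0.53, 0.55, 0.54]   # four quarters before the PEM
    post = [0.58, 0.60, 0.59, 0.61]  # four quarters after the PEM

    # Naive immediate-impact estimate: difference in mean quality.
    level_change = sum(post) / len(post) - sum(pre) / len(pre)
    print(f"estimated level change: {level_change:.3f}")
    ```

    In the trial itself this contrast is modelled per physician with multi-level and ARIMA models rather than a raw difference in means, so that secular trends are not mistaken for intervention effects.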

    Whole-genome sequencing reveals host factors underlying critical COVID-19

    Critical COVID-19 is caused by immune-mediated inflammatory lung injury. Host genetic variation influences the development of illness requiring critical care [1] or hospitalization [2–4] after infection with SARS-CoV-2. The GenOMICC (Genetics of Mortality in Critical Care) study enables the comparison of genomes from individuals who are critically ill with those of population controls to find underlying disease mechanisms. Here we use whole-genome sequencing in 7,491 critically ill individuals compared with 48,400 controls to discover and replicate 23 independent variants that significantly predispose to critical COVID-19. We identify 16 new independent associations, including variants within genes that are involved in interferon signalling (IL10RB and PLSCR1), leucocyte differentiation (BCL11A) and blood-type antigen secretor status (FUT2). Using transcriptome-wide association and colocalization to infer the effect of gene expression on disease severity, we find evidence that implicates multiple genes—including reduced expression of a membrane flippase (ATP11A), and increased expression of a mucin (MUC1)—in critical disease. Mendelian randomization provides evidence in support of causal roles for myeloid cell adhesion molecules (SELE, ICAM5 and CD209) and the coagulation factor F8, all of which are potentially druggable targets. Our results are broadly consistent with a multi-component model of COVID-19 pathophysiology, in which at least two distinct mechanisms can predispose to life-threatening disease: failure to control viral replication; or an enhanced tendency towards pulmonary inflammation and intravascular coagulation. We show that comparison between cases of critical illness and population controls is highly efficient for the detection of therapeutically relevant mechanisms of disease.
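    The Mendelian randomization logic mentioned above can be illustrated with the simplest single-instrument estimator, the Wald ratio: the causal effect of an exposure (e.g. gene expression) on an outcome is estimated as the variant-outcome effect divided by the variant-exposure effect. The effect sizes below are made up for illustration and are not results from the paper:

    ```python
    # Hypothetical per-allele effect sizes for one genetic instrument.
    beta_exposure = 0.40   # SNP effect on gene expression (e.g. a flippase)
    beta_outcome = -0.12   # same SNP's effect on critical COVID-19 (log-odds)

    # Wald ratio: implied causal effect of the exposure on the outcome.
    wald_ratio = beta_outcome / beta_exposure
    print(f"Wald ratio estimate: {wald_ratio:.2f}")
    ```

    Multi-instrument MR methods, as used in studies like this one, combine many such ratios and test for pleiotropy rather than relying on a single variant.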