
    Bayesian modelling of high-throughput sequencing assays with malacoda.

    NGS studies have uncovered an ever-growing catalog of human variation while leaving an enormous gap between observed variation and experimental characterization of variant function. High-throughput screens powered by NGS have greatly increased the rate of variant functionalization, but the development of comprehensive statistical methods to analyze screen data has lagged. In the massively parallel reporter assay (MPRA), short barcodes are counted by sequencing DNA libraries transfected into cells and the cells' output RNA in order to simultaneously measure the shifts in transcription induced by thousands of genetic variants. These counts present many statistical challenges, including overdispersion, depth dependence, and uncertain DNA concentrations. So far, the statistical methods used have been rudimentary, employing transformations on count-level data and disregarding experimental and technical structure while failing to quantify uncertainty in the statistical model. We have developed an extensive framework for the analysis of NGS functionalization screens, available as an R package called malacoda (available from github.com/andrewGhazi/malacoda). Our software implements a probabilistic, fully Bayesian model of screen data. The model uses the negative binomial distribution with gamma priors to model sequencing counts while accounting for effects from input library preparation and sequencing depth. The method leverages the high-throughput nature of the assay to estimate the priors empirically. External annotations such as ENCODE data or DeepSea predictions can also be incorporated to obtain more informative priors, a transformative capability for data integration. The package also includes quality control and utility functions, including automated barcode counting and visualization methods. To validate our method, we analyzed several datasets using malacoda and alternative MPRA analysis methods. These data include experiments from the literature, simulated assays, and primary MPRA data. We also used luciferase assays to experimentally validate several hits from our primary data, as well as variants for which the various methods disagree and variants detectable only with the aid of external annotations.
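The negative binomial count model with gamma priors described above can be illustrated with a simplified simulation. The sketch below is not the malacoda API; the variable names, dispersion values, and the naive ratio estimator are all assumptions chosen only to show why an overdispersed count model fits MPRA data. It draws depth-dependent DNA and RNA barcode counts for one allele and recovers its transcriptional activity after depth normalization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration (not the malacoda API): simulate barcode counts
# for one variant allele under a negative binomial model, as the abstract
# describes, with uncertain per-barcode DNA concentrations.
n_barcodes, dna_depth, rna_depth = 10, 1e6, 1e6

dna_conc = rng.gamma(shape=5.0, scale=1.0, size=n_barcodes)  # latent DNA input
activity = 1.5  # assumed true transcription-rate multiplier for this allele

def nb_counts(mean, dispersion, rng):
    """Draw negative binomial counts with the given mean and dispersion
    (variance = mean + mean**2 / dispersion, i.e. overdispersed vs Poisson)."""
    p = dispersion / (dispersion + mean)
    return rng.negative_binomial(dispersion, p)

dna = nb_counts(dna_conc * dna_depth / 1e4, 10.0, rng)
rna = nb_counts(dna_conc * activity * rna_depth / 1e4, 10.0, rng)

# Depth-normalized activity estimate, averaged across barcodes; a Bayesian
# model would instead place priors on activity and integrate over uncertainty.
est = np.mean((rna / rna_depth) / (dna / dna_depth))
print(round(est, 2))  # should land near the assumed activity of 1.5
```

The dispersion parameter is the key modelling choice: it inflates the variance beyond the Poisson `variance = mean`, which is what sequencing counts typically demand.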

    Functionalization of CD36 Cardiovascular Disease and Expression Associated Variants by Interdisciplinary High Throughput Analysis.

    CD36 is a platelet membrane glycoprotein whose engagement with oxidized low-density lipoprotein (oxLDL) results in platelet activation. The CD36 gene has been associated with platelet count and platelet volume, as well as with lipid levels and CVD risk, by genome-wide association studies. Platelet CD36 expression levels have been shown to be associated with both the platelet oxLDL response and an elevated risk of thrombo-embolism. Several genomic variants have been identified as associated with platelet CD36 levels; however, none has been conclusively demonstrated to be causative. We screened 81 expression quantitative trait loci (eQTL) single nucleotide polymorphisms (SNPs) associated with platelet CD36 expression by a massively parallel reporter assay (MPRA) and analyzed the results with a novel Bayesian statistical method. Ten eQTLs, located 13 kb to 55 kb upstream of the CD36 transcriptional start site of transcript ENST00000309881 and 49 kb to 92 kb upstream of transcript ENST00000447544, demonstrated significant transcription shifts between their minor and major alleles in the MPRA. Of these, rs2366739 and rs1194196, separated by only 20 bp, were confirmed by luciferase assay to alter transcriptional regulation. In addition, electromobility shift assays demonstrated differential DNA:protein complex formation between the two alleles of this locus. Furthermore, deletion of the genomic locus by CRISPR/Cas9 in K562 and Meg-01 cells results in upregulation of CD36 transcription. These data indicate that we have identified a variant that regulates expression of CD36, which in turn affects platelet function. To assess the clinical relevance of our findings, we used the PhenoScanner tool, which aggregates large-scale GWAS findings; the results reinforce the clinical relevance of our variants and the utility of the MPRA. The study demonstrates a generalizable paradigm for functional testing of genetic variants to inform mechanistic studies, support patient management, and develop precision therapies.

    Functional annotation of the transcriptome of Sorghum bicolor in response to osmotic stress and abscisic acid

    Background: Higher plants exhibit remarkable phenotypic plasticity allowing them to adapt to an extensive range of environmental conditions. Sorghum is a cereal crop that exhibits exceptional tolerance to adverse conditions, in particular, water-limiting environments. This study utilized next generation sequencing (NGS) technology to examine the transcriptome of sorghum plants challenged with osmotic stress and exogenous abscisic acid (ABA) in order to elucidate genes and gene networks that contribute to sorghum's tolerance to water-limiting environments, with a long-term aim of developing strategies to improve plant productivity under drought. Results: RNA-Seq results revealed transcriptional activity of 28,335 unique genes from sorghum root and shoot tissues subjected to polyethylene glycol (PEG)-induced osmotic stress or exogenous ABA. Differential gene expression analyses in response to osmotic stress and ABA revealed a strong interplay among various metabolic pathways including abscisic acid and 13-lipoxygenase, salicylic acid, jasmonic acid, and plant defense pathways. Transcription factor analysis indicated that groups of genes may be co-regulated by similar regulatory sequences to which the expressed transcription factors bind. We successfully exploited the data presented here in conjunction with published transcriptome analyses for rice, maize, and Arabidopsis to discover more than 50 differentially expressed, drought-responsive gene orthologs for which no function had been previously ascribed. Conclusions: The present study provides an initial assemblage of sorghum genes and gene networks regulated by osmotic stress and hormonal treatment. We are providing an RNA-Seq data set and an initial collection of transcription factors, which offer a preliminary look into the cascade of global gene expression patterns that arise in a drought-tolerant crop subjected to abiotic stress. These resources will allow scientists to query gene expression and functional annotation in response to drought.

    Mapping geographical inequalities in access to drinking water and sanitation facilities in low-income and middle-income countries, 2000-17

    Background: Universal access to safe drinking water and sanitation facilities is an essential human right, recognised in the Sustainable Development Goals as crucial for preventing disease and improving human wellbeing. Comprehensive, high-resolution estimates are important to inform progress towards achieving this goal. We aimed to produce high-resolution geospatial estimates of access to drinking water and sanitation facilities. Methods: We used a Bayesian geostatistical model and data from 600 sources across more than 88 low-income and middle-income countries (LMICs) to estimate access to drinking water and sanitation facilities on continuous continent-wide surfaces from 2000 to 2017, and aggregated results to policy-relevant administrative units. We estimated mutually exclusive and collectively exhaustive subcategories of facilities for drinking water (piped water on or off premises, other improved facilities, unimproved, and surface water) and sanitation facilities (septic or sewer sanitation, other improved, unimproved, and open defecation) with use of ordinal regression. We also estimated the number of diarrhoeal deaths in children younger than 5 years attributed to unsafe facilities and estimated deaths that were averted by increased access to safe facilities in 2017, and analysed geographical inequality in access within LMICs. Findings: Across LMICs, access to both piped water and improved water overall increased between 2000 and 2017, with progress varying spatially. For piped water, the safest water facility type, access increased from 40·0% (95% uncertainty interval [UI] 39·4–40·7) to 50·3% (50·0–50·5), but was lowest in sub-Saharan Africa, where access to piped water was mostly concentrated in urban centres. Access to both sewer or septic sanitation and improved sanitation overall also increased across all LMICs during the study period. 
For sewer or septic sanitation, access was 46·3% (95% UI 46·1–46·5) in 2017, compared with 28·7% (28·5–29·0) in 2000. Although some units improved access to the safest drinking water or sanitation facilities since 2000, a large absolute number of people continued to not have access in several units with high access to such facilities (>80%) in 2017. More than 253 000 people did not have access to sewer or septic sanitation facilities in the city of Harare, Zimbabwe, despite 88·6% (95% UI 87·2–89·7) access overall. Many units were able to transition from the least safe facilities in 2000 to safe facilities by 2017; for units in which populations primarily practised open defecation in 2000, 686 (95% UI 664–711) of the 1830 (1797–1863) units transitioned to the use of improved sanitation. Geographical disparities in access to improved water across units decreased in 76·1% (95% UI 71·6–80·7) of countries from 2000 to 2017, and in 53·9% (50·6–59·6) of countries for access to improved sanitation, but remained evident subnationally in most countries in 2017. Interpretation: Our estimates, combined with geospatial trends in diarrhoeal burden, identify where efforts to increase access to safe drinking water and sanitation facilities are most needed. By highlighting areas with successful approaches or in need of targeted interventions, our estimates can enable precision public health to effectively progress towards universal access to safe water and sanitation.
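The mutually exclusive, collectively exhaustive facility subcategories lend themselves to a small worked example. The sketch below uses made-up proportions (it is not the paper's geostatistical model) to show how ordered subcategory estimates aggregate into an overall "improved water" figure of the kind reported above.

```python
# Illustrative only: the ordered drinking-water subcategories from the
# abstract, with invented proportions that must sum to 1 because the
# categories are mutually exclusive and collectively exhaustive.
props = {
    "surface water": 0.05,
    "unimproved": 0.20,
    "other improved": 0.25,
    "piped": 0.50,
}
assert abs(sum(props.values()) - 1.0) < 1e-9  # collectively exhaustive

# "Improved water overall" aggregates piped water with other improved sources.
improved = props["piped"] + props["other improved"]
print(f"improved access: {improved:.0%}")  # improved access: 75%
```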

    Identifying associations between diabetes and acute respiratory distress syndrome in patients with acute hypoxemic respiratory failure: an analysis of the LUNG SAFE database

    Background: Diabetes mellitus is a common co-existing disease in the critically ill. Diabetes mellitus may reduce the risk of acute respiratory distress syndrome (ARDS), but data from previous studies are conflicting. The objective of this study was to evaluate associations between pre-existing diabetes mellitus and ARDS in critically ill patients with acute hypoxemic respiratory failure (AHRF). Methods: An ancillary analysis of a global, multi-centre prospective observational study (LUNG SAFE) was undertaken. LUNG SAFE evaluated all patients admitted to an intensive care unit (ICU) over a 4-week period who required mechanical ventilation and met AHRF criteria. Patients who had their AHRF fully explained by cardiac failure were excluded. Important clinical characteristics were included in a stepwise selection approach (forward and backward selection combined with a significance level of 0.05) to identify a set of independent variables associated with having ARDS at any time, developing ARDS (defined as ARDS occurring after day 2 from meeting AHRF criteria), and with hospital mortality. Furthermore, propensity score analysis was undertaken to account for the differences in baseline characteristics between patients with and without diabetes mellitus, and the association between diabetes mellitus and outcomes of interest was assessed on matched samples. Results: Of the 4107 patients with AHRF included in this study, 3022 (73.6%) patients fulfilled ARDS criteria at admission or developed ARDS during their ICU stay. Diabetes mellitus was a pre-existing co-morbidity in 913 patients (22.2% of patients with AHRF). In multivariable analysis, there was no association between diabetes mellitus and having ARDS (OR 0.93 (0.78-1.11); p = 0.39), developing ARDS late (OR 0.79 (0.54-1.15); p = 0.22), or hospital mortality in patients with ARDS (OR 1.15 (0.93-1.42); p = 0.19). 
In a matched sample of patients, there was no association between diabetes mellitus and outcomes of interest. Conclusions: In a large, global observational study of patients with AHRF, no association was found between diabetes mellitus and having ARDS, developing ARDS, or outcomes from ARDS. Trial registration: NCT02010073. Registered on 12 December 2013.
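The propensity-score analysis described above pairs each patient with diabetes to a patient without diabetes who had similar baseline characteristics. A minimal sketch of 1:1 nearest-neighbour matching with a caliper, on synthetic data (the variable names and propensity scores are illustrative assumptions, not drawn from the LUNG SAFE database), might look like:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic cohort: exposure flag plus a propensity score per patient.
# In practice the score would come from a logistic model of baseline
# characteristics; here it is simply simulated.
n = 200
diabetes = rng.integers(0, 2, size=n).astype(bool)
propensity = np.clip(rng.normal(0.4 + 0.2 * diabetes, 0.1), 0.01, 0.99)

def match_pairs(treated_ps, control_ps, caliper=0.05):
    """Greedily pair each treated unit with the closest unmatched control
    whose propensity score lies within the caliper."""
    unmatched = list(range(len(control_ps)))
    pairs = []
    for i, ps in enumerate(treated_ps):
        if not unmatched:
            break
        j = min(unmatched, key=lambda k: abs(control_ps[k] - ps))
        if abs(control_ps[j] - ps) <= caliper:
            pairs.append((i, j))
            unmatched.remove(j)  # 1:1 matching without replacement
    return pairs

pairs = match_pairs(propensity[diabetes], propensity[~diabetes])
print(len(pairs), "matched pairs")
```

Outcomes would then be compared only within the matched sample, which is what lets the analysis report an exposure effect net of the measured baseline differences.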

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome : Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence and the outcomes associated with hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled criteria of ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
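The categorization rules above are simple thresholds: hyperoxemia is PaO2 > 100 mmHg, hypoxemia is PaO2 < 55 mmHg, and excess oxygen use is FIO2 ≥ 0.60 while hyperoxemic. A hypothetical helper (the function and field names are illustrative, not from the study's code) applying them to one patient's day 1 and day 2 values could look like:

```python
def categorize(pao2_d1, fio2_d1, pao2_d2):
    """Apply the abstract's day-1/day-2 oxygenation definitions:
    hyperoxemia PaO2 > 100 mmHg, hypoxemia PaO2 < 55 mmHg,
    excess oxygen use FIO2 >= 0.60 during hyperoxemia."""
    hyperox_d1 = pao2_d1 > 100
    return {
        "hypoxemia_day1": pao2_d1 < 55,
        "hyperoxemia_day1": hyperox_d1,
        "sustained_hyperoxemia": hyperox_d1 and pao2_d2 > 100,
        "excess_fio2_day1": hyperox_d1 and fio2_d1 >= 0.60,
    }

# A patient hyperoxemic on day 1 at high FIO2, normoxemic by day 2:
result = categorize(pao2_d1=120, fio2_d1=0.70, pao2_d2=95)
print(result)
# {'hypoxemia_day1': False, 'hyperoxemia_day1': True,
#  'sustained_hyperoxemia': False, 'excess_fio2_day1': True}
```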

    Canagliflozin and renal outcomes in type 2 diabetes and nephropathy

    BACKGROUND Type 2 diabetes mellitus is the leading cause of kidney failure worldwide, but few effective long-term treatments are available. In cardiovascular trials of inhibitors of sodium–glucose cotransporter 2 (SGLT2), exploratory results have suggested that such drugs may improve renal outcomes in patients with type 2 diabetes. METHODS In this double-blind, randomized trial, we assigned patients with type 2 diabetes and albuminuric chronic kidney disease to receive canagliflozin, an oral SGLT2 inhibitor, at a dose of 100 mg daily or placebo. All the patients had an estimated glomerular filtration rate (GFR) of 30 to <90 ml per minute per 1.73 m2 of body-surface area and albuminuria (ratio of albumin [mg] to creatinine [g], >300 to 5000) and were treated with renin–angiotensin system blockade. The primary outcome was a composite of end-stage kidney disease (dialysis, transplantation, or a sustained estimated GFR of <15 ml per minute per 1.73 m2), a doubling of the serum creatinine level, or death from renal or cardiovascular causes. Prespecified secondary outcomes were tested hierarchically. RESULTS The trial was stopped early after a planned interim analysis on the recommendation of the data and safety monitoring committee. At that time, 4401 patients had undergone randomization, with a median follow-up of 2.62 years. The relative risk of the primary outcome was 30% lower in the canagliflozin group than in the placebo group, with event rates of 43.2 and 61.2 per 1000 patient-years, respectively (hazard ratio, 0.70; 95% confidence interval [CI], 0.59 to 0.82; P=0.00001). The relative risk of the renal-specific composite of end-stage kidney disease, a doubling of the creatinine level, or death from renal causes was lower by 34% (hazard ratio, 0.66; 95% CI, 0.53 to 0.81; P<0.001), and the relative risk of end-stage kidney disease was lower by 32% (hazard ratio, 0.68; 95% CI, 0.54 to 0.86; P=0.002). 
The canagliflozin group also had a lower risk of cardiovascular death, myocardial infarction, or stroke (hazard ratio, 0.80; 95% CI, 0.67 to 0.95; P=0.01) and hospitalization for heart failure (hazard ratio, 0.61; 95% CI, 0.47 to 0.80; P<0.001). There were no significant differences in rates of amputation or fracture. CONCLUSIONS In patients with type 2 diabetes and kidney disease, the risk of kidney failure and cardiovascular events was lower in the canagliflozin group than in the placebo group at a median follow-up of 2.62 years.
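The percentage risk reductions quoted above follow directly from the reported hazard ratios as 1 − HR, the conventional reading under the proportional-hazards assumption:

```python
# Reported hazard ratios from the trial; relative risk reduction = 1 - HR.
hazard_ratios = {
    "primary outcome": 0.70,
    "renal-specific composite": 0.66,
    "end-stage kidney disease": 0.68,
}
reductions = {label: round((1 - hr) * 100) for label, hr in hazard_ratios.items()}
print(reductions)
# {'primary outcome': 30, 'renal-specific composite': 34, 'end-stage kidney disease': 32}
```

These match the 30%, 34%, and 32% figures stated in the abstract.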

    Condensed-phase biogenic–anthropogenic interactions with implications for cold cloud formation

    Anthropogenic and biogenic gas emissions contribute to the formation of secondary organic aerosol (SOA). When present, soot particles from fossil fuel combustion can acquire a coating of SOA. We investigate SOA-soot biogenic-anthropogenic interactions and their impact on ice nucleation in relation to the particles' organic phase state. SOA particles were generated from the OH oxidation of naphthalene, α-pinene, longifolene, or isoprene, with or without the presence of sulfate or soot particles. Corresponding particle glass transition temperatures (Tg) and full deliquescence relative humidity (FDRH) were estimated using a numerical diffusion model. Longifolene SOA particles are solid-like, and all biogenic SOA-sulfate mixtures exhibit a core-shell configuration (i.e., a sulfate-rich core coated with SOA). Biogenic SOA with or without sulfate formed ice at conditions expected for homogeneous ice nucleation, in agreement with the respective Tg and FDRH. α-pinene SOA-coated soot particles nucleated ice above the homogeneous freezing temperature, with soot acting as ice nuclei (IN). At lower temperatures the α-pinene SOA coating can be semisolid, inducing ice nucleation. Naphthalene SOA-coated soot particles acted as ice nuclei above and below the homogeneous freezing limit, which can be explained by the presence of a highly viscous SOA phase. Our results suggest that biogenic SOA does not play a significant role in mixed-phase cloud formation, and the presence of sulfate renders this even less likely. However, anthropogenic SOA may have an enhancing effect on cloud glaciation under mixed-phase and cirrus cloud conditions compared to the biogenic SOA that dominated during pre-industrial times or that dominates in pristine areas.

    Testing a global standard for quantifying species recovery and assessing conservation impact.

    Recognizing the imperative to evaluate species recovery and conservation impact, in 2012 the International Union for Conservation of Nature (IUCN) called for development of a "Green List of Species" (now the IUCN Green Status of Species). A draft Green Status framework for assessing species' progress toward recovery, published in 2018, proposed 2 separate but interlinked components: a standardized method (i.e., measurement against benchmarks of species' viability, functionality, and preimpact distribution) to determine current species recovery status (herein species recovery score) and application of that method to estimate past and potential future impacts of conservation based on 4 metrics (conservation legacy, conservation dependence, conservation gain, and recovery potential). We tested the framework with 181 species representing diverse taxa, life histories, biomes, and IUCN Red List categories (extinction risk). Based on the observed distribution of species' recovery scores, we propose the following species recovery categories: fully recovered, slightly depleted, moderately depleted, largely depleted, critically depleted, extinct in the wild, and indeterminate. Fifty-nine percent of tested species were considered largely or critically depleted. Although there was a negative relationship between extinction risk and species recovery score, variation was considerable. Some species in lower risk categories were assessed as farther from recovery than those at higher risk. This emphasizes that species recovery is conceptually different from extinction risk and reinforces the utility of the IUCN Green Status of Species to more fully understand species conservation status. Although extinction risk did not predict conservation legacy, conservation dependence, or conservation gain, it was positively correlated with recovery potential. 
Only 1.7% of tested species were categorized as zero across all 4 of these conservation impact metrics, indicating that conservation has played, or will play, a role in improving or maintaining species status for the vast majority of these species. Based on our results, we devised an updated assessment framework that introduces the option of using a dynamic baseline to assess future impacts of conservation over the short term, to avoid the misleading results generated in a small number of cases, and that redefines short term as 10 years to better align with conservation planning. These changes are reflected in the IUCN Green Status of Species Standard.

    Socializing One Health: an innovative strategy to investigate social and behavioral risks of emerging viral threats

    In an effort to strengthen global capacity to prevent, detect, and control infectious diseases in animals and people, the United States Agency for International Development's (USAID) Emerging Pandemic Threats (EPT) PREDICT project funded development of regional, national, and local One Health capacities for early disease detection, rapid response, disease control, and risk reduction. From the outset, the EPT approach was inclusive of social science research methods designed to understand the contexts and behaviors of communities living and working at human-animal-environment interfaces considered high-risk for virus emergence. Using qualitative and quantitative approaches, PREDICT behavioral research aimed to identify and assess a range of socio-cultural behaviors that could be influential in zoonotic disease emergence, amplification, and transmission. This broad approach to behavioral risk characterization enabled us to identify and characterize human activities that could be linked to the transmission dynamics of new and emerging viruses. This paper provides a discussion of the implementation of a social science approach within a zoonotic surveillance framework. We conducted in-depth ethnographic interviews and focus groups to better understand the individual- and community-level knowledge, attitudes, and practices that potentially put participants at risk for zoonotic disease transmission from the animals they live and work with, across 6 interface domains. When we asked highly exposed individuals (i.e., bushmeat hunters, wildlife or guano farmers) about the risk they perceived in their occupational activities, most did not perceive it to be risky, whether because it was normalized by years (or generations) of doing such an activity, or due to lack of information about potential risks. 
Integrating the social sciences allows investigation of the specific human activities that are hypothesized to drive disease emergence, amplification, and transmission, in order to better substantiate behavioral disease drivers along with the social dimensions of infection and transmission dynamics. Understanding these dynamics is critical to achieving health security (protection from threats to health), which requires investments in both collective and individual health security. Incorporating the behavioral sciences into zoonotic disease surveillance allowed us to push toward fuller community integration and engagement, and toward dialogue and implementation of recommendations for disease prevention and improved health security.