Temporal Subsampling Diminishes Small Spatial Scales in Recurrent Neural Network Emulators of Geophysical Turbulence
The immense computational cost of traditional numerical weather and climate
models has sparked the development of machine learning (ML)-based emulators.
Because ML methods benefit from long records of training data, it is common to
use datasets that are temporally subsampled relative to the time steps required
for the numerical integration of differential equations. Here, we investigate
how this often overlooked processing step affects the quality of an emulator's
predictions. We implement two ML architectures from a class of methods called
reservoir computing: (1) a form of Nonlinear Vector Autoregression (NVAR), and
(2) an Echo State Network (ESN). Despite their simplicity, it is well
documented that these architectures excel at predicting low-dimensional chaotic
dynamics. We are therefore motivated to test these architectures in an
idealized setting: predicting high-dimensional geophysical turbulence as
represented by Surface Quasi-Geostrophic dynamics. In all cases, subsampling
the training data consistently leads to an increased bias at small spatial
scales that resembles numerical diffusion. Interestingly, the NVAR architecture
becomes unstable when the temporal resolution is increased, indicating that the
polynomial-based interactions are insufficient at capturing the detailed
nonlinearities of the turbulent flow. The ESN architecture is found to be more
robust, suggesting a benefit to its more expensive but more general structure.
Spectral errors are reduced by including a penalty on the kinetic energy
density spectrum during training, although the subsampling-related errors
persist. Future work is warranted to understand how the temporal resolution of
training data affects other ML architectures.
Cost Savings of Universal Decolonization to Prevent Intensive Care Unit Infection: Implications of the REDUCE MRSA Trial
Objective. To estimate and compare the impact on healthcare costs of 3 alternative strategies for reducing bloodstream infections in the intensive care unit (ICU): methicillin-resistant Staphylococcus aureus (MRSA) nares screening and isolation; targeted decolonization (ie, screening, isolation, and decolonization of MRSA carriers or infections); and universal decolonization (ie, no screening and decolonization of all ICU patients). Design. Cost analysis using decision modeling. Methods. We developed a decision-analysis model to estimate the healthcare costs of targeted decolonization and universal decolonization strategies compared with a strategy of MRSA nares screening and isolation. Effectiveness estimates were derived from a recent randomized trial of the 3 strategies, and cost estimates were derived from the literature. Results. In the base case, universal decolonization was the dominant strategy and was estimated to have both lower intervention costs and lower total ICU costs than either screening and isolation or targeted decolonization. Compared with screening and isolation, universal decolonization was estimated to save $171,000 and prevent 9 additional bloodstream infections for every 1,000 ICU admissions. The dominance of universal decolonization persisted under a wide range of cost and effectiveness assumptions. Conclusions. A strategy of universal decolonization for patients admitted to the ICU would both reduce bloodstream infections and likely reduce healthcare costs compared with strategies of MRSA nares screening and isolation or screening and isolation coupled with targeted decolonization.
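The dominance logic of the cost analysis can be illustrated with a toy decision model. All numbers below (intervention costs, infection rates per 1,000 admissions, and the attributable cost of a bloodstream infection) are hypothetical placeholders, not the trial's estimates; the sketch only shows the comparison structure: intervention cost plus infection-attributable cost, per strategy.

```python
# Hypothetical inputs (not the trial's numbers), per 1,000 ICU admissions.
COST_PER_BSI = 20_000  # assumed attributable cost of one bloodstream infection, $

strategies = {
    # name: (intervention cost per 1,000 admissions, BSIs per 1,000 admissions)
    "screening_isolation":      (80_000, 35),
    "targeted_decolonization":  (90_000, 30),
    "universal_decolonization": (60_000, 26),
}

def total_cost(intervention_cost, bsi_count):
    """Total expected cost: intervention plus attributable infection costs."""
    return intervention_cost + bsi_count * COST_PER_BSI

results = {name: total_cost(*v) for name, v in strategies.items()}
dominant = min(results, key=results.get)  # lowest total cost wins
```

A strategy is dominant when it has both the lowest intervention cost and the fewest infections, so it wins under any positive per-infection cost; that is the pattern the abstract reports for universal decolonization.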
Increased importance of methane reduction for a 1.5 degree target
To understand the importance of methane for the level of carbon emission reductions required to achieve temperature goals, a process-based approach is necessary rather than reliance on the Transient Climate Response to Emissions. We show that plausible levels of methane (CH4) mitigation can make a substantial difference to the feasibility of achieving the Paris climate targets by increasing the allowable carbon emissions. This benefit is enhanced by the indirect effects of CH4 on ozone (O3). Here the differing effects of CH4 and CO2 on land carbon storage, including the effects of surface O3, lead to an additional increase in the allowable carbon emissions with CH4 mitigation. We find a simple, robust relationship between the change in the 2100 CH4 concentration and the extra allowable cumulative carbon emissions between now and 2100 (0.27 ± 0.05 GtC per ppb CH4). This relationship is independent of modelled climate sensitivity and of the precise temperature target, although later mitigation of CH4 reduces its value and thus the effectiveness of methane reduction. Up to 12% of this increase in allowable emissions is due to the effect of surface ozone. We conclude that early mitigation of CH4 emissions would significantly increase the feasibility of stabilising global warming below 1.5 °C, alongside having co-benefits for human and ecosystem health.
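The quoted relationship (0.27 ± 0.05 GtC of extra allowable cumulative emissions per ppb of CH4 concentration change) lends itself to a small worked example. The 400 ppb reduction used below is a hypothetical scenario for illustration, not a figure from the abstract.

```python
# The abstract's robust relationship: extra allowable cumulative carbon
# emissions to 2100 scale with the change in the 2100 CH4 concentration
# at 0.27 +/- 0.05 GtC per ppb CH4.
SLOPE_GTC_PER_PPB = 0.27
SLOPE_UNCERTAINTY = 0.05

def extra_allowable_carbon(delta_ch4_ppb):
    """Extra allowable cumulative emissions (GtC) for a given CH4
    reduction, returned as (central, low, high).  delta_ch4_ppb > 0
    means the 2100 concentration is lower by that amount."""
    central = SLOPE_GTC_PER_PPB * delta_ch4_ppb
    low = (SLOPE_GTC_PER_PPB - SLOPE_UNCERTAINTY) * delta_ch4_ppb
    high = (SLOPE_GTC_PER_PPB + SLOPE_UNCERTAINTY) * delta_ch4_ppb
    return central, low, high

# Hypothetical example: a 400 ppb reduction in the 2100 CH4 concentration
# gives roughly 108 GtC of extra allowable emissions (range ~88-128 GtC).
central, low, high = extra_allowable_carbon(400)
```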
Cas9 gRNA engineering for genome editing, activation and repression
We demonstrate that by altering the length of the Cas9-associated guide RNA (gRNA) we were able to control Cas9 nuclease activity and simultaneously perform genome editing and transcriptional regulation with a single Cas9 protein. We exploited these principles to engineer mammalian synthetic circuits with combined transcriptional regulation and kill functions governed by a single multifunctional Cas9 protein.
Funding: National Human Genome Research Institute (U.S.) (P50 HG005550); United States Department of Energy (DE-FG02-02ER63445); Wyss Institute for Biologically Inspired Engineering; United States Army Research Office (DARPA W911NF-11-2-0054); National Science Foundation (U.S.); United States National Institutes of Health (5R01CA155320-04); United States National Institutes of Health (P50 GM098792); National Cancer Institute (U.S.) (5T32CA009216-34); Massachusetts Institute of Technology, Department of Biological Engineering; Harvard Medical School, Department of Genetics; Defense Threat Reduction Agency (DTRA) (HDTRA1-14-1-0006).
The Sorcerer II Global Ocean Sampling Expedition: Northwest Atlantic through Eastern Tropical Pacific
The world's oceans contain a complex mixture of micro-organisms that are, for the most part, uncharacterized both genetically and biochemically. We report here a metagenomic study of the marine planktonic microbiota in which surface (mostly marine) water samples were analyzed as part of the Sorcerer II Global Ocean Sampling expedition. These samples, collected across a several-thousand-km transect from the North Atlantic through the Panama Canal and ending in the South Pacific, yielded an extensive dataset consisting of 7.7 million sequencing reads (6.3 billion bp). Though a few major microbial clades dominate the planktonic marine niche, the dataset contains great diversity, with 85% of the assembled sequence and 57% of the unassembled data being unique at a 98% sequence identity cutoff. Using the metadata associated with each sample and sequencing library, we developed new comparative genomic and assembly methods. One comparative genomic method, termed “fragment recruitment,” addressed questions of genome structure, evolution, and taxonomic or phylogenetic diversity, as well as the biochemical diversity of genes and gene families. A second method, termed “extreme assembly,” made possible the assembly and reconstruction of large segments of abundant but clearly nonclonal organisms. Within all abundant populations analyzed, we found extensive intra-ribotype diversity in several forms: (1) extensive sequence variation within orthologous regions throughout a given genome (despite coverage of individual ribotypes approaching 500-fold, most individual sequencing reads are unique); (2) numerous changes in gene content, some with direct adaptive implications; and (3) hypervariable genomic islands that are too variable to assemble. The intra-ribotype diversity is organized into genetically isolated populations that have overlapping but independent distributions, implying distinct environmental preferences.
We present novel methods for measuring the genomic similarity between metagenomic samples and show how they may be grouped into several community types. Specific functional adaptations can be identified both within individual ribotypes and across the entire community, including proteorhodopsin spectral tuning and the presence or absence of the phosphate-binding gene PstS.
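The 98% sequence-identity uniqueness criterion can be illustrated with a toy greedy clustering sketch. Real metagenomic pipelines use gapped alignment over reads of varying length; the ungapped, equal-length comparison below is a deliberate simplification, with made-up 100 bp reads.

```python
def percent_identity(a, b):
    """Ungapped percent identity between two equal-length sequences
    (a simplification: real pipelines use gapped alignment)."""
    assert len(a) == len(b)
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

def unique_at_cutoff(reads, cutoff=98.0):
    """Greedy clustering: a read counts as 'unique' only if it matches
    no previously kept read at >= cutoff percent identity."""
    kept = []
    for r in reads:
        if not any(len(r) == len(k) and percent_identity(r, k) >= cutoff
                   for k in kept):
            kept.append(r)
    return kept

base = "ACGTACGTAC" * 10      # toy 100 bp read
near = base[:-1] + "T"        # 1 mismatch  -> 99% identity, clusters with base
far = "TTT" + base[3:]        # 3 mismatches -> 97% identity, stays unique
unique = unique_at_cutoff([base, near, far])
```

At a 98% cutoff over 100 bp, up to 2 mismatches still cluster a read with an existing one; the abstract's finding that 57% of unassembled reads remain unique at this threshold is what makes the diversity claim striking.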
Structure-Based Design of Non-Natural Amino Acid Inhibitors of Amyloid Fibrillation
Many globular and natively disordered proteins can convert into amyloid fibers. These fibers are associated with numerous pathologies [1] as well as with normal cellular functions [2,3], and frequently form during protein denaturation [4,5]. Inhibitors of pathological amyloid fibers could serve as leads for therapeutics, provided the inhibitors were specific enough to avoid interfering with normal processes. Here we show that computer-aided, structure-based design can yield highly specific peptide inhibitors of amyloid formation. Using known atomic structures of segments of amyloid fibers as templates, we have designed and characterized an all D-amino acid inhibitor of fibrillation of the tau protein found in Alzheimer's disease, and a non-natural L-amino acid inhibitor of an amyloid fiber that enhances sexual transmission of HIV. Our results indicate that peptides from structure-based designs can disrupt the fibrillation of full-length proteins, including those like tau that lack fully ordered native structures.
Acknowledgements: We thank M.I. Ivanova, J. Corn, T. Kortemme, D. Anderson, M.R. Sawaya, M. Phillips, S. Sambashivan, J. Park, M. Landau, Q. Zhang, R. Clubb, F. Guo, T. Yeates, J. Nowick, J. Zheng, and M.J. Thompson for discussions; HHMI, NIH, NSF, the Gates Foundation, and the Joint Center for Translational Medicine for support; R. Peterson for help with NMR experiments; E. Mandelkow for providing tau constructs; R. Riek for providing amyloid beta; and J. Stroud for amyloid beta preparation. Support for JK was from the Damon Runyon Cancer Research Foundation; for HWC, from the Ruth L. Kirschstein National Research Service Award; for JM, from the programme for junior professors of the ministry of science, Baden-Württemberg; and for SAS, from a UCLA-IGERT bioinformatics traineeship.
Sensitivity and Bias in Decision-Making under Risk: Evaluating the Perception of Reward, Its Probability and Value
BACKGROUND: There are few clinical tools that assess decision-making under risk. Tests that characterize sensitivity and bias in decisions between prospects varying in magnitude and probability of gain may provide insights in conditions with anomalous reward-related behaviour. OBJECTIVE: We designed a simple test of how subjects integrate information about the magnitude and the probability of reward, which can determine discriminative thresholds and choice bias in decisions under risk. DESIGN/METHODS: Twenty subjects were required to choose between two explicitly described prospects, one with higher probability but lower magnitude of reward than the other, with the difference in expected value between the two prospects varying from 3 to 23%. RESULTS: Subjects showed a mean threshold sensitivity of 43% difference in expected value. Regarding choice bias, there was a 'risk premium' of 38%, indicating a tendency to choose higher probability over higher reward. An analysis using prospect theory showed that this risk premium is the predicted outcome of hypothesized non-linearities in the subjective perception of reward value and probability. CONCLUSIONS: This simple test provides a robust measure of discriminative value thresholds and biases in decisions under risk. Prospect theory can also make predictions about decisions when subjective perception of reward or probability is anomalous, as may occur in populations with dopaminergic or striatal dysfunction, such as Parkinson's disease and schizophrenia.
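The prospect-theory account of the risk premium can be sketched numerically. The prospects and parameter values below (the classic Tversky-Kahneman 1992 estimates for the value and weighting functions) are illustrative assumptions, not the study's stimuli or fitted values; the point is only that the higher-probability prospect can have the larger subjective value even when the riskier prospect has the larger expected value.

```python
def expected_value(p, x):
    """Objective expected value of a single-gain prospect."""
    return p * x

def prospect_value(p, x, alpha=0.88, gamma=0.61):
    """Subjective value of a single gain under prospect theory:
    concave value function x**alpha and the Tversky-Kahneman
    probability weighting.  Parameters are their 1992 estimates,
    used here purely for illustration."""
    w = p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)
    return w * x**alpha

safe = (0.95, 30)    # high probability, low reward
risky = (0.50, 60)   # lower probability, higher reward (higher EV)

ev_safe, ev_risky = expected_value(*safe), expected_value(*risky)
pv_safe, pv_risky = prospect_value(*safe), prospect_value(*risky)
# ev_risky exceeds ev_safe, yet pv_safe exceeds pv_risky: the model
# predicts choosing the higher-probability prospect -- a risk premium.
```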