
    The Lotic Intersite Nitrogen Experiments: an example of successful ecological research collaboration

    Collaboration is an essential skill for modern ecologists because it brings together diverse expertise, viewpoints, and study systems. The Lotic Intersite Nitrogen eXperiments (LINX I and II), a 17-y research endeavor involving scores of early- to late-career stream ecologists, is an example of the benefits, challenges, and approaches of successful collaborative research in ecology. The scientific success of LINX reflected tangible attributes including clear scientific goals (hypothesis-driven research), coordinated research methods, a team of cooperative scientists, excellent leadership, extensive communication, and a philosophy of respect for input from all collaborators. Intangible aspects of the collaboration included camaraderie and strong team chemistry. LINX further benefited from being part of a discipline in which collaboration is a tradition, from clear data-sharing and authorship guidelines, from an approach that melded field experiments and modeling, and from a shared collaborative goal in the form of a universal commitment to see the project and resulting data products through to completion.

    An examination of sex differences in associations between cord blood adipokines and childhood adiposity


    Significant regional differences in antibiotic use across 576 US hospitals and 11 701 326 adult admissions, 2016-2017

    BACKGROUND: Quantifying the amount and diversity of antibiotic use in United States hospitals assists antibiotic stewardship efforts but is hampered by limited national surveillance. Our study aimed to address this knowledge gap by examining adult antibiotic use across 576 hospitals and nearly 12 million encounters in 2016-2017. METHODS: We conducted a retrospective study of patients aged ≥ 18 years discharged from hospitals in the Premier Healthcare Database between 1 January 2016 and 31 December 2017. Using daily antibiotic charge data, we mapped antibiotics to mutually exclusive classes and to spectrum of activity categories. We evaluated relationships between facility and case-mix characteristics and antibiotic use in negative binomial regression models. RESULTS: The study included 11 701 326 admissions, totaling 64 064 632 patient-days, across 576 hospitals. Overall, patients received antibiotics in 65% of hospitalizations, at a crude rate of 870 days of therapy (DOT) per 1000 patient-days. By class, use was highest among β-lactam/β-lactamase inhibitor combinations, third- and fourth-generation cephalosporins, and glycopeptides. Teaching hospitals averaged lower rates of total antibiotic use than nonteaching hospitals (834 vs 957 DOT per 1000 patient-days; P < .001). In adjusted models, teaching hospitals remained associated with lower use of third- and fourth-generation cephalosporins and antipseudomonal agents (adjusted incidence rate ratio [95% confidence interval], 0.92 [.86-.97] and 0.91 [.85-.98], respectively). Significant regional differences in total and class-specific antibiotic use also persisted in adjusted models. CONCLUSIONS: Adult inpatient antibiotic use remains high, driven predominantly by broad-spectrum agents. Better understanding reasons for interhospital usage differences, including by region and teaching status, may inform efforts to reduce inappropriate antibiotic prescribing.
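The "days of therapy" rate used above is a simple normalized count: total DOT divided by patient-days, scaled to 1000. A minimal sketch; the total-DOT figure below is back-derived from the reported crude rate of ~870 DOT per 1000 patient-days and is illustrative, not a number taken from the study:

```python
def dot_rate(total_dot, patient_days, per=1000):
    """Crude antibiotic use rate: days of therapy (DOT) per `per` patient-days."""
    return total_dot / patient_days * per

# Patient-days from the abstract; total DOT back-derived from the reported
# crude rate (~870 DOT per 1000 patient-days), for illustration only.
rate = dot_rate(total_dot=55_736_229, patient_days=64_064_632)
print(round(rate))  # 870
```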

    Electronically available patient claims data improve models for comparing antibiotic use across hospitals: Results from 576 US facilities

    BACKGROUND: The Centers for Disease Control and Prevention (CDC) uses standardized antimicrobial administration ratios (SAARs)-that is, observed-to-predicted ratios-to compare antibiotic use across facilities. CDC models adjust for facility characteristics when predicting antibiotic use but do not include patient diagnoses and comorbidities that may also affect utilization. This study aimed to identify comorbidities causally related to appropriate antibiotic use and to compare models that include these comorbidities and other patient-level claims variables to a facility model for risk-adjusting inpatient antibiotic utilization. METHODS: The study included adults discharged from Premier Database hospitals in 2016-2017. For each admission, we extracted facility, claims, and antibiotic data. We evaluated 7 models to predict an admission's antibiotic days of therapy (DOTs): a CDC facility model, models that added patient clinical constructs in varying layers of complexity, and an external validation of a published patient-variable model. We calculated hospital-specific SAARs to quantify effects on hospital rankings. Separately, we used Delphi Consensus methodology to identify Elixhauser comorbidities associated with appropriate antibiotic use. RESULTS: The study included 11 701 326 admissions across 576 hospitals. Compared to the CDC facility model, a model that added Delphi-selected comorbidities and a bacterial infection indicator was more accurate for all antibiotic outcomes. For total antibiotic use, it was 24% more accurate (respective mean absolute errors: 3.11 vs 2.35 DOTs), resulting in 31-33% more hospitals moving into bottom or top usage quartiles postadjustment. CONCLUSIONS: Adding electronically available patient claims data to facility models consistently improved antibiotic utilization predictions and yielded substantial movement in hospitals' utilization rankings.
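The SAAR described above is an observed-to-predicted ratio, and the model comparison rests on mean absolute error of predicted days of therapy. A minimal sketch with invented hospital figures (none of these values come from the study):

```python
def saar(observed_dot, predicted_dot):
    """Standardized antimicrobial administration ratio: >1 means more
    antibiotic use than the risk-adjusted model predicts, <1 means less."""
    return observed_dot / predicted_dot

def mean_absolute_error(observed, predicted):
    """Average |observed - predicted| DOT per admission; lower means the
    prediction model fits utilization more closely."""
    return sum(abs(o - p) for o, p in zip(observed, predicted)) / len(observed)

# Invented example: two hospitals' observed vs model-predicted DOT.
print(saar(12_500, 10_000))  # 1.25 -> flagged as higher-than-expected use
print(saar(9_000, 10_000))   # 0.9  -> lower-than-expected use
```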

    A modified Delphi approach to develop a trial protocol for antibiotic de-escalation in patients with suspected sepsis

    Background: Early administration of antibiotics in sepsis is associated with improved patient outcomes, but safe and generalizable approaches to de-escalate or discontinue antibiotics after suspected sepsis events are unknown. Methods: We used a modified Delphi approach to identify safety criteria for an opt-out protocol to guide de-escalation or discontinuation of antibiotic therapy after 72 hours in non-ICU patients with suspected sepsis. An expert panel with expertise in antimicrobial stewardship and hospital epidemiology rated 48 unique criteria across 3 electronic survey rating tools. Criteria were rated anonymously, primarily based on their impact on patient safety and feasibility for extraction from electronic health record review, and the results were fed back to the expert panel participants. Consensus was achieved to either retain or remove each criterion. Results: After 3 rounds, 22 unique criteria remained as part of the opt-out safety checklist. These criteria included high-risk comorbidities, signs of severe illness, lack of cultures during sepsis work-up or antibiotic use prior to blood cultures, or ongoing signs and symptoms of infection. Conclusions: The modified Delphi approach is a useful method to achieve expert-level consensus in the absence of evidence sufficient to provide validated guidance. The Delphi approach allowed for flexibility in development of an opt-out trial protocol for sepsis antibiotic de-escalation. The utility of this protocol should be evaluated in a randomized controlled trial.

    Assessing exposure in epidemiologic studies to disinfection by-products in drinking water: report from an international workshop.

    The inability to accurately assess exposure has been one of the major shortcomings of epidemiologic studies of disinfection by-products (DBPs) in drinking water. Contributing factors include a) limited information on the identity, occurrence, toxicity, and pharmacokinetics of the many DBPs that can be formed from chlorine, chloramine, ozone, and chlorine dioxide disinfection; b) the complex chemical interrelationships between DBPs and other parameters within a municipal water distribution system; and c) difficulties obtaining accurate and reliable information on personal activity and water consumption patterns. In May 2000, an international workshop was held to bring together various disciplines to develop better approaches for measuring DBP exposure for epidemiologic studies. The workshop reached consensus about the clear need to involve relevant disciplines (e.g., chemists, engineers, toxicologists, biostatisticians, and epidemiologists) as partners in developing epidemiologic studies of DBPs in drinking water. The workshop concluded that greater collaboration of epidemiologists with water utilities and regulators should be encouraged in order to make regulatory monitoring data more useful for epidemiologic studies. Similarly, exposure classification categories in epidemiologic studies should be chosen to make results useful for regulatory or policy decision making.

    Gradients of anthropogenic nutrient enrichment alter N Composition and DOM stoichiometry in freshwater ecosystems

    Plain language summary: Ammonium and nitrate in freshwaters have received considerable attention due to their clear ecological and health effects. A comprehensive assessment of N in freshwaters that includes DON is lacking. Including DON in studies of surface water chemistry is important because it can cause eutrophication and certain forms can be rapidly removed by microbial communities. Here, we document how elevated levels of TDN impact the concentrations and relative proportions of all three forms of dissolved N and the stoichiometry of DOM. Our results suggest that human activities fundamentally alter the composition of the dissolved nitrogen pool and the stoichiometry of DOM. Results also highlight feedbacks between the C and N cycles in freshwater ecosystems that are poorly studied.

    A comprehensive cross-biome assessment of major nitrogen (N) species that includes dissolved organic N (DON) is central to understanding interactions between inorganic nutrients and organic matter in running waters. Here, we synthesize stream water N chemistry across biomes and find that the composition of the dissolved N pool shifts from highly heterogeneous to primarily composed of inorganic N, in tandem with dissolved organic matter (DOM) becoming more N-rich, in response to nutrient enrichment from human disturbances. We identify two critical thresholds of total dissolved N (TDN) concentrations where the proportions of organic and inorganic N shift. With low TDN concentrations (0–1.3 mg/L N), the dominant form of N is highly variable, and DON ranges from 0% to 100% of TDN. At TDN concentrations above 2.8 mg/L, inorganic N dominates the N pool and DON rarely exceeds 25% of TDN. This transition to inorganic N dominance coincides with a shift in the stoichiometry of the DOM pool, where DOM becomes progressively enriched in N and DON concentrations are less tightly associated with concentrations of dissolved organic carbon (DOC). This shift in DOM stoichiometry (defined as DOC:DON ratios) suggests that fundamental changes in the biogeochemical cycles of C and N in freshwater ecosystems are occurring across the globe as human activity alters inorganic N and DOM sources and availability. Alterations to DOM stoichiometry are likely to have important implications for both the fate of DOM and its role as a source of N as it is transported downstream to the coastal ocean.
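The two TDN thresholds and the DOC:DON stoichiometry described above can be sketched as a small classifier. The function names, the molar-ratio convention, and the sample values are illustrative assumptions, not the authors' code:

```python
def classify_n_pool(tdn_mg_l):
    """Classify the dissolved N pool by total dissolved N (mg/L N),
    using the two thresholds reported above (1.3 and 2.8 mg/L N)."""
    if tdn_mg_l <= 1.3:
        return "variable: DON may be 0-100% of TDN"
    if tdn_mg_l < 2.8:
        return "transitional"
    return "inorganic-N dominated: DON rarely exceeds 25% of TDN"

def doc_don_ratio(doc_mg_c_l, don_mg_n_l):
    """DOM stoichiometry as a molar C:N ratio (assumes 12 g/mol C,
    14 g/mol N); lower values indicate N-richer DOM."""
    return (doc_mg_c_l / 12.0) / (don_mg_n_l / 14.0)

print(classify_n_pool(0.5))                # variable composition
print(classify_n_pool(5.0))                # inorganic-N dominated
print(round(doc_don_ratio(5.0, 0.25), 1))  # 23.3
```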

    Stream denitrification across biomes and its response to anthropogenic nitrate loading

    Author Posting. © The Author(s), 2008. This is the author's version of the work. It is posted here by permission of Nature Publishing Group for personal use, not for redistribution. The definitive version was published in Nature 452 (2008): 202-205, doi:10.1038/nature06686.

    Worldwide, anthropogenic addition of bioavailable nitrogen (N) to the biosphere is increasing and terrestrial ecosystems are becoming increasingly N saturated, causing more bioavailable N to enter groundwater and surface waters. Large-scale N budgets show that an average of about 20-25% of the N added to the biosphere is exported from rivers to the ocean or inland basins, indicating that substantial sinks for N must exist in the landscape. Streams and rivers may be important sinks for bioavailable N owing to their hydrologic connections with terrestrial systems, high rates of biological activity, and streambed sediment environments that favor microbial denitrification. Here, using data from 15N tracer experiments replicated across 72 streams and 8 regions representing several biomes, we show that total biotic uptake and denitrification of nitrate increase with stream nitrate concentration, but that the efficiency of biotic uptake and denitrification declines as concentration increases, reducing the proportion of in-stream nitrate that is removed from transport. Total uptake of nitrate was related to ecosystem photosynthesis, and denitrification was related to ecosystem respiration. Additionally, we use a stream network model to demonstrate that excess nitrate in streams elicits a disproportionate increase in the fraction of nitrate that is exported to receiving waters and reduces the relative role of small versus large streams as nitrate sinks.

    Funding for this research was provided by the National Science Foundation.
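The efficiency loss described above (total uptake rises with nitrate concentration while the fraction removed falls) follows whenever areal uptake saturates with concentration. A sketch under that assumption; the saturating-kinetics form and all parameter values are illustrative, not the authors' model:

```python
def areal_uptake(c, u_max=100.0, k_m=50.0):
    """Assumed saturating areal nitrate uptake at concentration c.
    u_max and k_m are invented parameters for illustration only."""
    return u_max * c / (k_m + c)

# Total uptake rises with concentration, but efficiency (uptake per unit
# concentration) falls, so a smaller fraction of in-stream nitrate is
# removed from transport -- the pattern the abstract describes.
for c in (10.0, 100.0, 1000.0):
    print(c, round(areal_uptake(c), 1), round(areal_uptake(c) / c, 3))
```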

    Meta-analysis of pharmacogenetic interactions in amyotrophic lateral sclerosis clinical trials

    OBJECTIVE: To assess whether genetic subgroups in recent amyotrophic lateral sclerosis (ALS) trials responded to treatment with lithium carbonate even though the treatment effect was lost in the larger cohort of nonresponders. METHODS: Individual participant data were obtained from 3 randomized trials investigating the efficacy of lithium carbonate. We matched clinical data with data regarding the UNC13A and C9orf72 genotypes. Our primary outcome was survival at 12 months. On an exploratory basis, we assessed whether the effect of lithium depended on the genotype. RESULTS: Clinical data were available for 518 of the 606 participants. Overall, treatment with lithium carbonate did not improve 12-month survival (hazard ratio [HR] 1.0, 95% confidence interval [CI] 0.7-1.4; p = 0.96). Both the UNC13A and C9orf72 genotypes were independent predictors of survival (HR 2.4, 95% CI 1.3-4.3; p = 0.006 and HR 2.5, 95% CI 1.1-5.2; p = 0.032, respectively). The effect of lithium was different for UNC13A carriers (p = 0.027), but not for C9orf72 carriers (p = 0.22). The 12-month survival probability for UNC13A carriers treated with lithium carbonate improved from 40.1% (95% CI 23.2-69.1) to 69.7% (95% CI 50.4-96.3). CONCLUSIONS: This study incorporated genetic data into past ALS trials to determine treatment effects in a genetic post hoc analysis. Our results suggest that we should reorient our strategies toward finding treatments for ALS, start focusing on genotype-targeted treatments, and standardize genotyping in order to optimize randomization and analysis for future clinical trials.