
    Spatial changes in soil stable isotopic composition in response to carrion decomposition

    Decomposition provides a critical mechanism for returning nutrients to the surrounding environment. In terrestrial systems, the decomposition of animal carcasses, or carrion, results in a cascade of biogeochemical changes: soil microbial communities are stimulated, transforming carbon (C) and nitrogen (N) sourced from the decaying soft tissues and altering soil pH, electrical conductivity, and oxygen availability as microbes release CO2 and mineralize organic N. While many of the rapid changes to soil biogeochemistry observed during carrion decomposition return to background conditions shortly after soft tissues are degraded, some biogeochemical parameters, particularly bulk soil stable δ15N isotopic composition, can exhibit prolonged perturbations extending for several years. The goal of this study was to evaluate the lateral and vertical changes to soil stable isotopic composition 1 year after carrion decomposition in a forest ecosystem. Lateral transects extending 140 cm from three decomposition “hotspots” were sampled at 20 cm intervals, and subsurface cores were collected beneath each hotspot to a depth of 50 cm. Bulk soil stable isotopic composition (δ15N and δ13C) indicated that 1 year after complete soft tissue removal and decay, soils were significantly enriched in 15N by 7.5±1.0 ‰ compared to control soils up to 60 cm from the hotspot center, and enrichment extended to a depth of 10 cm. Hotspot soils also contained 10 % more N than control soils, indicating that decomposition perturbs N pools. Our results demonstrate that carrion decomposition can produce long-term changes to soil biogeochemistry, persisting at least 1 year after soft tissue degradation, and contribute lasting signals to bulk soil stable isotopic composition.
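    The δ15N values above use standard delta notation: the per mil (‰) deviation of a sample's 15N/14N ratio from that of a reference standard (atmospheric N2 for nitrogen). As an illustrative sketch only (the function and the hotspot ratio below are hypothetical, not taken from the study's data), the conversion is:

```python
def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """Stable isotope delta value in per mil (‰) relative to a standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Atmospheric N2 is the conventional standard for delta-15N (15N/14N ~ 0.003676).
R_AIR = 0.003676

# A hypothetical hotspot soil ratio enriched by 0.75% relative to air,
# which corresponds to the ~7.5 ‰ enrichment reported in the abstract:
r_hotspot = R_AIR * 1.0075
print(round(delta_per_mil(r_hotspot, R_AIR), 1))  # 7.5
```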

    Effects of biodegradable plastic film mulching on soil microbial communities in two agroecosystems

    Plastic mulch films are used globally in crop production but incur considerable disposal and environmental pollution issues. Biodegradable plastic mulch films (BDMs), an alternative to polyethylene (PE)-based films, are designed to be tilled into the soil, where they are expected to be mineralized to carbon dioxide, water, and microbial biomass. However, insufficient research regarding the impacts of repeated soil incorporation of BDMs on soil microbial communities has contributed in part to their limited adoption. In this study, we evaluated the effects of BDM incorporation on soil microbial community structure and function over two years in two geographical locations: Knoxville, TN, and Mount Vernon, WA, USA. Treatments included four plastic BDMs (three commercially available and one experimental film), a biodegradable cellulose paper mulch, a non-biodegradable PE mulch, and a no-mulch plot. Bacterial community structure determined using 16S rRNA gene amplicon sequencing revealed significant differences by location and season. Differences in bacterial communities by mulch treatment were not significant for any season in either location, except for Fall 2015 in WA, where differences were observed between BDMs and no-mulch plots. Extracellular enzyme assays were used to characterize communities functionally, revealing significant differences by location and sampling season in both TN and WA but minimal differences between BDM and PE treatments. Overall, BDMs had influences on soil microbial communities comparable to those of PE mulch films.

    Bacillus pumilus B12 Degrades Polylactic Acid and Degradation Is Affected by Changing Nutrient Conditions

    Polylactic acid (PLA) is increasingly used as a biodegradable alternative to traditional petroleum-based plastics. In this study, we identify a novel agricultural soil isolate of Bacillus pumilus (B12) that is capable of degrading high molecular weight PLA films. This degradation can be detected on a short timescale, with significant degradation detected within 48 h by the release of L-lactate monomers, allowing rapid detection well suited to testing experimental variables. The validity of using L-lactate as a proxy for degradation of PLA films is corroborated by the loss of rigidity and the appearance of fractures in PLA films, as measured by atomic force microscopy and scanning electron microscopy (SEM), respectively. Furthermore, we observed a dose-dependent decrease in PLA degradation in response to an amino acid/nucleotide supplement mix, driven mainly by the nucleotide base adenine. In addition, amending the media with specific carbon sources increased the rate of PLA degradation, while phosphate and potassium additions decreased the rate of PLA degradation by B. pumilus B12. These results suggest that B. pumilus B12 adapts its enzymatic expression to environmental conditions and that these conditions can be used to study the regulation of this process. Together, this work lays a foundation for studying the bacterial degradation of biodegradable plastics.

    Effect of Phosphorus Amendments on Present Day Plankton Communities in Pelagic Lake Erie

    To address questions regarding the potential impact of elevated total phosphorus (TP) inputs (due to relaxed regulation of TP loading), a series of TP enrichment experiments was conducted at pelagic stations in the three hydrologically distinct basins of Lake Erie. Results of nutrient assimilation measurements and assays for nutrient bioavailability suggest that the chemical speciation, rather than the concentration, of nitrogenous compounds may influence phytoplankton community structure; this in turn may lead to the selective proliferation of cyanobacteria in the eastern basin of the lake. Assays with cyanobacterial bioluminescent reporter systems for P and N availability, as well as N-tot:P-tot assimilation ratios from on-deck incubation experiments, support this work. Considered in the context of a microbial food web relative to a grazing food web, the results imply that alterations in current TP loading controls may lead to changes in phytoplankton community structure in the different basins of the Lake Erie system.

    Microbial control of diatom bloom dynamics in the open ocean

    Diatom blooms play a central role in supporting food webs and sequestering biogenic carbon to depth. Oceanic conditions set bloom initiation, whereas both environmental and ecological factors determine bloom magnitude and longevity. Our study reveals another fundamental determinant of bloom dynamics. A diatom spring bloom in offshore New Zealand waters was likely terminated by iron limitation, even though diatoms consumed <1/3 of the mixed-layer dissolved iron inventory. Thus, bloom duration and magnitude were primarily set by competition for dissolved iron between microbes and small phytoplankton versus diatoms. Significantly, such a microbial mode of control probably relies upon both out-competing diatoms for iron (i.e., a K-strategy) and having high iron requirements (i.e., an r-strategy). Such resource competition for iron has implications for carbon biogeochemistry, as blooming diatoms fixed three-fold more carbon per unit iron than resident non-blooming microbes. Microbial sequestration of iron has major ramifications for determining the biogeochemical imprint of oceanic diatom blooms. Citation: Boyd, P. W., et al. (2012), Microbial control of diatom bloom dynamics in the open ocean, Geophys. Res. Lett., 39, L18601.

    A Machine Learning Approach for Using the Postmortem Skin Microbiome to Estimate the Postmortem Interval

    Research on the human microbiome, the microbiota that live in, on, and around the human body, has revolutionized our understanding of the complex interactions between microbial life and human health and disease. The microbiome may also provide a valuable tool in forensic death investigations by helping to reveal the postmortem interval (PMI) of a decedent discovered after an unknown amount of time since death. Current methods of estimating PMI for cadavers discovered in uncontrolled, unstudied environments have substantial limitations, some of which may be overcome through the use of microbial indicators. In this project, we sampled the microbiomes of decomposing human cadavers, focusing on the skin microbiota found in the nasal and ear canals. We then developed several statistical regression models to establish an algorithm for predicting the PMI of microbial samples. We found that the complete data set, rather than a curated list of indicator species, was preferred for training the regressor. We further found that genus and family, rather than species, are the most informative taxonomic levels. Finally, we developed a k-nearest-neighbor regressor, tuned with the entire data set from all nasal and ear samples, that predicts the PMI of unknown samples with an average error of ±55 accumulated degree days (ADD). This study outlines a machine learning approach for the use of necrobiome data in the prediction of the PMI and thereby provides a successful proof-of-concept that skin microbiota are a promising tool in forensic death investigations.
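    A k-nearest-neighbor regressor of the kind described predicts the PMI of a new sample as the average label of the most similar training samples in microbial feature space. A minimal sketch with hypothetical taxon abundances and ADD labels (the study's actual feature set, distance metric, and tuning are not reproduced here):

```python
import math

def knn_predict(train_X, train_y, query, k=3):
    """Predict PMI (accumulated degree days) as the mean label of the k
    training samples nearest to `query` by Euclidean distance."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train_X, train_y))
    nearest = [y for _, y in dists[:k]]
    return sum(nearest) / len(nearest)

# Hypothetical relative abundances of three taxa per sample, with ADD labels:
train_X = [[0.8, 0.1, 0.1], [0.6, 0.3, 0.1], [0.3, 0.5, 0.2], [0.1, 0.6, 0.3]]
train_y = [50.0, 150.0, 300.0, 450.0]

# The two nearest training samples (150 and 300 ADD) are averaged:
print(knn_predict(train_X, train_y, [0.55, 0.35, 0.10], k=2))  # 225.0
```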

    Field-Grown Transgenic Switchgrass (Panicum virgatum L.) with Altered Lignin Does Not Affect Soil Chemistry, Microbiology, and Carbon Storage Potential

    Cell wall recalcitrance poses a major challenge for cellulosic biofuel production from feedstocks such as switchgrass (Panicum virgatum L.). As lignin is a known contributor to recalcitrance, transgenic switchgrass plants with altered lignin have been produced by downregulation of caffeic acid O-methyltransferase (COMT). Field trials of COMT-downregulated plants previously demonstrated improved ethanol conversion with no adverse agronomic effects. However, the rhizosphere impacts of altering lignin in plants are unknown. We hypothesized that changing plant lignin composition may affect residue degradation in soils, ultimately altering soil processes. The objective of this study was to evaluate the effects of two independent lines of COMT-downregulated switchgrass plants on soils in terms of chemistry, microbiology, and carbon cycling when grown in the field. Over the first two years of establishment, we observed no significant differences between transgenic and control plants in terms of soil pH or the total concentrations of 19 elements. An analysis of soil bacterial communities via high-throughput 16S rRNA gene amplicon sequencing revealed no effects of transgenic plants on bacterial diversity, richness, or community composition. We also did not observe a change in the capacity for soil carbon storage: there was no significant effect on soil respiration or soil organic matter. After five years of establishment, δ13C of plant roots, leaves, and soils was measured, and an isotopic mixing model was used to estimate that 11.2 to 14.5% of soil carbon originated from switchgrass. Switchgrass-contributed carbon was not significantly different between transgenic and control plants. Overall, our results indicate that over the short term (two and five years), lignin modification in switchgrass through manipulation of COMT expression does not adversely affect soils in terms of total elemental composition, bacterial community structure and diversity, or capacity for carbon storage.
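    A two-end-member δ13C mixing model of the kind mentioned partitions soil carbon between a C4 switchgrass source and the pre-existing C3-derived soil pool, exploiting the distinct δ13C signatures of C3 and C4 plants. A minimal sketch with hypothetical end-member values (not the study's measurements):

```python
def switchgrass_fraction(d13c_soil, d13c_switchgrass, d13c_background):
    """Two-end-member mixing model: fraction of soil carbon derived from
    the switchgrass end-member, given delta-13C values in per mil."""
    return (d13c_soil - d13c_background) / (d13c_switchgrass - d13c_background)

# Hypothetical values: C4 switchgrass tissue ~ -13 ‰, C3-derived background
# soil ~ -27 ‰, and a measured plot soil of -25.2 ‰.
f = switchgrass_fraction(d13c_soil=-25.2, d13c_switchgrass=-13.0,
                         d13c_background=-27.0)
print(round(f * 100, 1))  # percent of soil carbon from switchgrass
```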

    Colectomy is a risk factor for venous thromboembolism in ulcerative colitis

    AIM: To compare venous thromboembolism (VTE) in hospitalized ulcerative colitis (UC) patients who respond to medical management with VTE in patients requiring colectomy. METHODS: Population-based surveillance from 1997 to 2009 was used to identify all adults admitted to hospital for a flare of UC and those patients who underwent colectomy. All medical charts were reviewed to confirm the diagnosis and extract clinically relevant information. UC patients were stratified by: (1) responsive to inpatient medical therapy (n = 382); (2) medically refractory requiring emergent colectomy (n = 309); and (3) elective colectomy (n = 329). The primary outcome was the development of VTE during hospitalization or within 6 mo of discharge. Heparin prophylaxis to prevent VTE was assessed. Logistic regression analysis determined the effect of disease course (i.e., responsive to medical therapy, medically refractory, and elective colectomy) on VTE after adjusting for confounders including age, sex, smoking, disease activity, comorbidities, extent of disease, and IBD medications (i.e., corticosteroids, mesalamine, azathioprine, and infliximab). Point estimates are presented as odds ratios (OR) with 95%CI. RESULTS: The prevalence of VTE among patients with UC who responded to medical therapy was 1.3%, and only 16% of these patients received heparin prophylaxis. In contrast, VTE was more frequent among patients who underwent an emergent (8.7%) or elective (4.9%) colectomy, despite more than 90% of these patients receiving postoperative heparin prophylaxis. The most common site of VTE was intra-abdominal (45.8%), followed by lower extremity (19.6%). VTE was diagnosed after discharge from hospital in 16.7% of cases. Elective (adjusted OR = 3.69; 95%CI: 1.30-10.44) and emergent colectomy (adjusted OR = 5.28; 95%CI: 1.93-14.45) were significant risk factors for VTE as compared to medically responsive UC patients. Furthermore, the odds of VTE increased significantly over time (adjusted OR = 1.10; 95%CI: 1.01-1.20). Age, sex, comorbidities, disease extent, disease activity, smoking, corticosteroids, mesalamine, azathioprine, and infliximab were not independently associated with the development of VTE. CONCLUSION: VTE was associated with colectomy, particularly among UC patients who failed medical management. VTE prophylaxis may not be sufficient to prevent VTE in patients undergoing colectomy.
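    Adjusted odds ratios and their confidence intervals in a logistic regression like this one come from exponentiating the model coefficient and its Wald interval. A minimal sketch, using a hypothetical coefficient and standard error chosen only so the result lands near the reported emergent-colectomy estimate (these are not the study's fitted values):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% Wald CI from a logistic-regression coefficient
    (log-odds scale) and its standard error."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical beta and SE; exp(1.664) is close to the reported OR of 5.28.
or_, lo, hi = odds_ratio_ci(beta=1.664, se=0.514)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```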

    Allied Health Professional Support in Pediatric Inflammatory Bowel Disease: A Survey from the Canadian Children Inflammatory Bowel Disease Network—A Joint Partnership of CIHR and the CH.I.L.D. Foundation

    Objectives. The number of healthcare providers (HCP) caring for children with inflammatory bowel disease (IBD) across Canadian tertiary-care centres has not been well characterized. The aim of this survey was to assess the number of HCP providing ambulatory pediatric IBD care across Canadian tertiary-care centres. Methods. Using a self-administered questionnaire, we examined available resources in academic pediatric centres within the Canadian Children IBD Network. The survey evaluated the number of HCP providing ambulatory care for children with IBD. Results. All 12 tertiary pediatric gastroenterology centres participating in the network responded. The median full-time equivalent (FTE) of allied health professionals providing IBD care at each site was 1.0 (interquartile range (IQR) 0.6–1.0) nurse, 0.5 (IQR 0.2–0.8) dietitian, 0.3 (IQR 0.2–0.8) social worker, and 0.1 (IQR 0.02–0.3) clinical psychologist. The ratio of IBD patients to IBD physicians was 114 : 1 (range 31 : 1–537 : 1); to nurses/physician assistants, 324 : 1 (range 150 : 1–900 : 1); to dietitians, 670 : 1 (range 250 : 1–4500 : 1); to social workers, 1558 : 1 (range 250 : 1–16000 : 1); and to clinical psychologists, 2910 : 1 (range 626 : 1–3200 : 1). Conclusions. There was wide variation in HCP support among Canadian centres. Future work will examine variation in care, including patient outcomes and satisfaction, across Canadian centres.

    Diagnostic Delay Is Associated with Complicated Disease and Growth Impairment in Paediatric Crohn's Disease

    Background: Paediatric data on the association between diagnostic delay and inflammatory bowel disease [IBD] complications are lacking. We aimed to determine the effect of diagnostic delay on stricturing/fistulising complications, surgery, and growth impairment in a large paediatric cohort, and to identify predictors of diagnostic delay. Methods: We conducted a national, prospective, multicentre IBD inception cohort study including 1399 children. Diagnostic delay was defined as time from symptom onset to diagnosis >75th percentile. Multivariable proportional hazards [PH] regression was used to examine the association between diagnostic delay and stricturing/fistulising complications and surgery, and multivariable linear regression to examine the association between diagnostic delay and growth. Predictors of diagnostic delay were identified using Cox PH regression. Results: Overall (64% Crohn's disease [CD]; 36% ulcerative colitis/IBD unclassified [UC/IBD-U]; 57% male), median time to diagnosis was 4.2 (interquartile range [IQR] 2.0-9.2) months. For the overall cohort, diagnostic delay was >9.2 months; in CD, >10.8 months; and in UC/IBD-U, >6.6 months. In CD, diagnostic delay was associated with a 2.5-fold higher rate of strictures/internal fistulae (hazard ratio [HR] 2.53, 95% confidence interval [CI] 1.41-4.56). Every additional month of diagnostic delay was associated with a decrease in height-for-age z-score of 0.013 standard deviations [95% CI 0.005-0.021]. Associations persisted after adjusting for disease location and therapy. No independent association was observed between diagnostic delay and surgery in CD or UC/IBD-U. Diagnostic delay was more common in CD, particularly small bowel CD. Abdominal pain, including isolated abdominal pain in CD, was associated with diagnostic delay. Conclusions: Diagnostic delay represents a risk factor for stricturing/internal fistulising complications and growth impairment in paediatric CD.
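    The reported linear effect implies that the expected growth deficit scales with the length of delay. A minimal sketch of that arithmetic, assuming the per-month estimate of -0.013 SD applies linearly across the whole delay (a simplification of the adjusted model, for illustration only):

```python
def height_z_deficit(delay_months: float, slope_per_month: float = -0.013) -> float:
    """Expected change in height-for-age z-score attributable to diagnostic
    delay, assuming the reported linear effect of -0.013 SD per month."""
    return slope_per_month * delay_months

# A child whose CD diagnosis is delayed 10.8 months (the CD delay threshold):
print(height_z_deficit(10.8))  # about -0.14 SD
```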