Development and validation of a chemostat gut model to study both planktonic and biofilm modes of growth of Clostridium difficile and human microbiota
Copyright: 2014 Crowther et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. The human gastrointestinal tract harbours a complex microbial community which exists in planktonic and sessile forms. The degree to which the composition and function of faecal and mucosal microbiota differ remains unclear. We describe the development and characterisation of an in vitro human gut model that can be used to facilitate the formation and longitudinal analysis of mature mixed-species biofilms. This enables the investigation of the role of biofilms in Clostridium difficile infection (CDI). A well-established and validated human gut model of simulated CDI was adapted to incorporate glass rods that create a solid-gaseous-liquid interface for biofilm formation. The continuous chemostat model was inoculated with a pooled human faecal emulsion and controlled to mimic colonic conditions in vivo. Planktonic and sessile bacterial populations were enumerated for up to 46 days. Biofilm consistently formed macroscopic structures on all glass rods over extended periods of time, providing a framework to sample and analyse biofilm structures independently. Whilst variation in biofilm biomass was evident between rods, populations of sessile bacterial groups (log10 cfu/g of biofilm) remained relatively consistent between rods at each sampling point. All bacterial groups enumerated within the planktonic communities were also present within biofilm structures. The planktonic mode of growth of C. difficile and gut microbiota closely reflected observations within the original gut model. However, distinct differences were observed in the behaviour of sessile and planktonic C. difficile populations, with C. difficile spores preferentially persisting within biofilm structures.
The redesigned biofilm chemostat model has been validated for reproducible and consistent formation of mixed-species intestinal biofilms. This model can be utilised for the longitudinal analysis of sessile mixed-species communities, potentially providing information on the role of biofilms in CDI. Peer reviewed
Application of a Geomorphic and Temporal Perspective to Wetland Management
The failure of managed wetlands to provide a broad suite of ecosystem services (e.g., carbon storage, wildlife habitat, ground-water recharge, storm-water retention) valuable to society is primarily the result of a lack of consideration of ecosystem processes that maintain productive wetland ecosystems or physical and social forces that restrict a manager’s ability to apply actions that allow those processes to occur. Therefore, we outline a course of action that considers restoration of ecosystem processes in those systems where off-site land use or physical alterations restrict local management. Upon considering a wetland system, or examining a particular management regime, there are several factors that will allow successful restoration of wetland services. An initial step is examination of the political/social factors that have structured the current ecological condition and whether those realities can be addressed. Most successful restorations of wetland ecosystem services involve cooperation among multiple agencies, acquisition of funds from non-traditional sources, seeking of scientific advice on ecosystem processes, and cultivation of good working relationships among biologists, managers, and maintenance staff. Beyond that, in on-site wetland situations, management should examine the existing hydrogeomorphic situation and processes (e.g., climatic variation, tides, riverine flood-pulse events) responsible for maintenance of ecosystem services within a given temporal framework appropriate for that wetland’s hydrologic pattern. We discuss these processes for five major wetland types (depressional, lacustrine, estuarine, riverine, and man-made impoundments) and then provide two case histories in which this approach was applied: Seney National Wildlife Refuge with a restored fen system and Bosque del Apache National Wildlife Refuge where riverine processes have been simulated to restore native habitat. 
With adequate partnerships and administrative and political support, managers faced with degraded and/or disconnected wetland processes will be able to restore ecosystem services for society in our highly altered landscape by considering wetlands in their given hydrogeomorphic setting and temporal stage.
Short-term genome stability of serial Clostridium difficile ribotype 027 isolates in an experimental gut model and recurrent human disease
Copyright: © 2013 Eyre et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Clostridium difficile whole genome sequencing has the potential to identify related isolates, even among otherwise indistinguishable strains, but interpretation depends on understanding genomic variation within isolates and individuals. Serial isolates from two scenarios were whole genome sequenced. Firstly, 62 isolates from 29 timepoints from three in vitro gut models, inoculated with a NAP1/027 strain. Secondly, 122 isolates from 44 patients (2–8 samples/patient) with mostly recurrent/ongoing symptomatic NAP1/027 C. difficile infection. Reference-based mapping was used to identify single nucleotide variants (SNVs). Across three gut model inductions, two with antibiotic treatment, totalling 137 days, only two new SNVs became established. Pre-existing minority SNVs became dominant in two models. Several SNVs were detected that were present in only a minority of colonies at one or two timepoints. The median (inter-quartile range) [range] time between patients' first and last samples was 60 (29.5–118.5) [0–561] days. Within-patient C. difficile evolution was 0.45 SNVs/called genome/year (95%CI 0.00–1.28) and within-host diversity was 0.28 SNVs/called genome (0.05–0.53). 26/28 gut model and patient SNVs were non-synonymous, affecting a range of gene targets. The consistency of whole genome sequencing data from gut model C. difficile isolates, and the high stability of genomic sequences in isolates from patients, supports the use of whole genome sequencing in detailed transmission investigations. Peer reviewed
Risk factors for Clostridium difficile infection in hospitalized patients with community-acquired pneumonia
Objectives: Clostridium difficile infection (CDI) is strongly associated with antibiotic treatment, and community-acquired pneumonia (CAP) is the leading indication for antibiotic prescription in hospitals. This study assessed the incidence of and risk factors for CDI in a cohort of patients hospitalized with CAP. Methods: We analysed data from a prospective, observational cohort of patients with CAP in Edinburgh, UK. Patients with diarrhoea were systematically screened for CDI, and risk factors were determined through time-dependent survival analysis. Results: Overall, 1883 patients with CAP were included, 365 developed diarrhoea and 61 had laboratory-confirmed CDI. The risk factors for CDI were: age (hazard ratio [HR], 1.06 per year; 95% confidence interval [CI], 1.03-1.08), total number of antibiotic classes received (HR, 3.01 per class; 95% CI, 2.32-3.91), duration of antibiotic therapy (HR, 1.09 per day; 95% CI, 1.00-1.19) and hospitalization status (HR, 13.1; 95% CI, 6.0-28.7). Antibiotic class was not an independent predictor of CDI when adjusted for these risk factors (P > 0.05 by interaction testing). Conclusions: These data suggest that reducing the overall antibiotic burden, duration of antibiotic treatment and duration of hospital stay may reduce the incidence of CDI in patients with CAP.
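The hazard ratios reported above compose multiplicatively under the proportional-hazards framework used for the survival analysis. A minimal sketch of that standard interpretation, using only the point estimates quoted in the abstract (the interpretation step is standard Cox-model reading, not taken from the paper itself):

```python
# Illustrative only: how per-unit hazard ratios (HRs) from the abstract
# compound across units of a covariate under a Cox proportional-hazards model.

def combined_hr(per_unit_hr: float, units: float) -> float:
    """A per-unit HR compounds multiplicatively across covariate units."""
    return per_unit_hr ** units

# HR 1.06 per year of age -> relative hazard for a patient 10 years older
age_effect = combined_hr(1.06, 10)

# HR 3.01 per antibiotic class -> relative hazard after two classes vs none
abx_effect = combined_hr(3.01, 2)

print(f"10 extra years of age:   HR = {age_effect:.2f}")
print(f"two antibiotic classes:  HR = {abx_effect:.2f}")
```

Read this way, each additional decade of age raises the hazard of CDI by roughly 80%, while receiving two antibiotic classes multiplies it roughly ninefold relative to none.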
Impact of interventions to reduce nosocomial transmission of SARS-CoV-2 in English NHS Trusts: a computational modelling study
Background: Prior to September 2021, 55,000–90,000 hospital inpatients in England were identified as having a potentially nosocomial SARS-CoV-2 infection. This includes cases that were likely missed due to pauci- or asymptomatic infection. Further, high numbers of healthcare workers (HCWs) are thought to have been infected, and there is evidence that some of these cases may also have been nosocomially linked, with both HCW-to-HCW and patient-to-HCW transmission being reported. From the start of the SARS-CoV-2 pandemic, interventions in hospitals, such as testing patients on admission and universal mask wearing, were introduced to stop spread within and between patient and HCW populations, the effectiveness of which is largely unknown. Materials/methods: Using an individual-based model of within-hospital transmission, we estimated the contribution of individual interventions (alone and in combination) to the effectiveness of the overall package of interventions implemented in English hospitals during the COVID-19 pandemic. A panel of experts in infection prevention and control informed intervention choice and helped ensure the model reflected implementation in practice. Model parameters and associated uncertainty were derived using national and local data, literature review and formal elicitation of expert opinion. We simulated scenarios to explore how many nosocomial infections might have been seen in patients and HCWs if interventions had not been implemented. We simulated the time period from March 2020 to July 2022, encompassing different strains and multiple doses of vaccination. Results: Modelling results suggest that in a scenario without inpatient testing, infection prevention and control measures, and reductions in occupancy and visitors, the number of patients developing a nosocomial SARS-CoV-2 infection could have been twice as high over the course of the pandemic, and over 600,000 HCWs could have been infected in the first wave alone.
Isolation of symptomatic HCWs and universal masking by HCWs were the most effective interventions for preventing infections in both patient and HCW populations. Model findings suggest that collectively the interventions introduced over the SARS-CoV-2 pandemic in England averted 400,000 (240,000–500,000) infections in inpatients and 410,000 (370,000–450,000) HCW infections. Conclusions: Interventions to reduce the spread of nosocomial infections have varying impact, but the package of interventions implemented in England significantly reduced nosocomial transmission to both patients and HCWs over the SARS-CoV-2 pandemic.
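The study's individual-based model tracks patients and HCWs separately and layers many interventions; as a much-simplified, deterministic stand-in, the core effect of scaling transmission by an intervention package can be sketched as a discrete-time compartment model on a single ward. All parameter values below are illustrative assumptions, not estimates from the paper:

```python
# Toy deterministic SIR on one hospital ward (NOT the paper's individual-based
# model). `multiplier` scales transmission to represent an intervention
# package (masking, admission testing, isolation); beta and gamma are assumed.

def ward_outbreak(beta: float, multiplier: float, days: int = 60,
                  n: int = 100, i0: int = 1) -> float:
    """Return cumulative ever-infected after `days` of ward transmission."""
    s, i, r = float(n - i0), float(i0), 0.0
    gamma = 0.1  # assumed daily recovery/discharge rate
    for _ in range(days):
        new_inf = beta * multiplier * s * i / n  # frequency-dependent mixing
        new_rec = gamma * i
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return r + i

baseline = ward_outbreak(beta=0.3, multiplier=1.0)   # no interventions
with_ipc = ward_outbreak(beta=0.3, multiplier=0.5)   # assumed 50% reduction
print(f"no interventions:   {baseline:.0f} ever infected")
print(f"with interventions: {with_ipc:.0f} ever infected")
```

Even this crude sketch reproduces the qualitative finding: halving effective transmission substantially shrinks the cumulative outbreak size, which is the counterfactual comparison the study makes at national scale.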
High prevalence of subclass-specific binding and neutralizing antibodies against Clostridium difficile toxins in adult cystic fibrosis sera: possible mode of immunoprotection against symptomatic C. difficile infection
Objectives: Despite multiple risk factors and a high rate of colonization for Clostridium difficile, the occurrence of C. difficile infection in patients with cystic fibrosis is rare. The aim of this study was to compare the prevalence of binding C. difficile toxin-specific immunoglobulin (Ig)A, IgG and anti-toxin neutralizing antibodies in the sera of adults with cystic fibrosis, symptomatic C. difficile infection (without cystic fibrosis) and healthy controls.
Methods: Subclass-specific IgA and IgG responses to highly purified whole C. difficile toxins A and B (toxinotype 0, strain VPI 10463, ribotype 087), toxin B from a C. difficile toxin-B only expressing strain (CCUG 20309) and precursor form of B fragment of binary toxin, pCDTb, were determined by protein microarray. Neutralizing antibodies to C. difficile toxins A and B were evaluated using a Caco-2 cell-based neutralization assay.
Results: Serum IgA anti-toxin A and B levels and neutralizing antibodies against toxin A were significantly higher in adult cystic fibrosis patients (n=16) compared with healthy controls (n=17) and patients with symptomatic C. difficile infection (n=16); p≤0.05. The same pattern of response prevailed for IgG, except that there was no difference in anti-toxin A IgG levels between the groups. Compared with healthy controls (toxins A and B) and patients with C. difficile infection (toxin A), sera from cystic fibrosis patients exhibited significantly stronger protective anti-toxin neutralizing antibody responses.
Conclusion: A superior ability to generate robust humoral immunity to C. difficile toxins in the cystic fibrosis population is likely to confer protection against symptomatic C. difficile infection. This protection may be lost in the post-transplantation setting, where monitoring of serum anti-C. difficile toxin antibody titers may be of clinical value.
Profiling humoral immune responses to Clostridium difficile-specific antigens by protein microarray analysis
Clostridium difficile is an anaerobic, Gram-positive, and spore-forming bacterium that is the leading worldwide infective cause of hospital-acquired and antibiotic-associated diarrhea. Several studies have reported associations between humoral immunity and the clinical course of C. difficile infection (CDI). Host humoral immune responses are determined using conventional enzyme-linked immunosorbent assay (ELISA) techniques. Herein, we report the first use of a novel protein microarray assay to determine systemic IgG antibody responses against a panel of highly purified C. difficile-specific antigens, including native toxins A and B (TcdA and TcdB, respectively), recombinant fragments of toxins A and B (TxA4 and TxB4, respectively), ribotype-specific surface layer proteins (SLPs; 001, 002, 027), and control proteins (tetanus toxoid and Candida albicans). Microarrays were probed with sera from a total of 327 individuals with CDI, cystic fibrosis without diarrhea, and healthy controls. For all antigens, precision profiles demonstrated <10% coefficient of variation (CV). Significant correlation was observed between microarray and ELISA in the quantification of antitoxin A and antitoxin B IgG. These results indicate that microarray is a suitable assay for defining humoral immune responses to C. difficile protein antigens and may have potential advantages in throughput, convenience, and cost.
Role of cephalosporins in the era of Clostridium difficile infection
The incidence of Clostridium difficile infection (CDI) in Europe has increased markedly since 2000. Previous meta-analyses have suggested a strong association between cephalosporin use and CDI, and many national programmes on CDI control have focused on reducing cephalosporin usage. Despite reductions in cephalosporin use, however, rates of CDI have continued to rise. This review examines the potential association of CDI with cephalosporins, and considers other factors that influence CDI risk. EUCLID (the EUropean, multicentre, prospective biannual point prevalence study of CLostridium difficile Infection in hospitalized patients with Diarrhoea) reported an increase in the annual incidence of CDI from 6.6 to 7.3 cases per 10 000 patient bed-days from 2011–12 to 2012–13. While CDI incidence and cephalosporin usage varied widely across the countries studied, there was no clear association between overall cephalosporin prescribing (or the use of any particular cephalosporin) and CDI incidence. Moreover, variations in the pharmacokinetic and pharmacodynamic properties of cephalosporins of the same generation make categorization by generation insufficient for predicting impact on gut microbiota. A multitude of additional factors can affect the risk of CDI. Antibiotic choice is an important consideration; however, CDI risk is associated with a range of antibiotic classes. Prescription of multiple antibiotics and a long duration of treatment are key risk factors for CDI, and risk also differs across patient populations. We propose that all of these are factors that should be taken into account when selecting an antibiotic, rather than focusing on the exclusion of individual drug classes.
Susceptibility testing and reporting of new antibiotics with a focus on tedizolid: an international working group report
Inappropriate use and overuse of antibiotics are among the most important factors in resistance development, and effective antibiotic stewardship measures are needed to optimize outcomes. Selection of appropriate antimicrobials relies on accurate and timely antimicrobial susceptibility testing. However, the availability of clinical breakpoints and in vitro susceptibility testing often lags behind regulatory approval by several years for new antimicrobials. A Working Group of clinical/medical microbiologists from Brazil, Canada, Mexico, Saudi Arabia, Russia and the UK recently examined issues surrounding antimicrobial susceptibility testing for novel antibiotics. While commercially available tests are being developed, potential surrogate antibiotics may be used as markers of susceptibility. Using tedizolid as an example of a new antibiotic, this special report makes recommendations to optimize routine susceptibility reporting.
Impact of recurrent Clostridium difficile infection: hospitalization and patient quality of life
Objectives: Data quantifying outcomes of recurrent Clostridium difficile infection (rCDI) are lacking. We sought to determine the UK hospital resource use and health-related quality of life (HrQoL) associated with rCDI hospitalisations.
Patients and methods: A non-interventional study in six UK acute hospitals collected retrospective clinical and resource use data from medical records of 64 adults hospitalised for rCDI and 64 matched inpatient controls with a first episode of CDI only (fCDI). Patients were observed from the index event (date rCDI/fCDI confirmed) for 28 days (or death, if sooner); UK-specific reference costs were applied. HrQoL was assessed prospectively in a separate cohort of 30 patients hospitalised with CDI, who completed the EQ-5D-3L questionnaire during their illness.
Results: The median total management cost (post-index) was £7,539 and £6,294 for rCDI and fCDI, respectively (cost difference, p=0.075); median length of stay (LOS) was 21 days and 15.5 days, respectively (p=0.269). The median cost difference between matched rCDI and fCDI cases was £689 (IQR, -£1,873 to £3,954). Subgroup analysis demonstrated the highest median costs (£8,542/patient) in severe rCDI cases. CDI management costs were driven primarily by hospital LOS, which accounted for >85% of costs in both groups. Mean EQ-5D index values were 46% lower in CDI patients compared with UK population values (0.42 and 0.78, respectively); EQ-VAS scores were 38% lower (47.82 and 77.3, respectively).
Conclusions: CDI has considerable impact on patients and healthcare resources. This multicentre study provides a contemporaneous estimate of the real-world UK costs associated with rCDI management, which are substantial and comparable to fCDI costs.
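The quality-of-life percentages quoted in the results follow directly from the reported means; a quick arithmetic check (the computation is standard relative difference, not taken from the paper):

```python
# Verifying the quoted HrQoL reductions: EQ-5D index 0.42 in CDI patients vs
# UK population norm 0.78, and EQ-VAS 47.82 vs 77.3.

def pct_lower(observed: float, reference: float) -> float:
    """Percentage by which `observed` falls below `reference`."""
    return (reference - observed) / reference * 100

eq5d_drop = pct_lower(0.42, 0.78)    # EQ-5D index
eqvas_drop = pct_lower(47.82, 77.3)  # EQ-VAS

print(f"EQ-5D index: {eq5d_drop:.0f}% lower")  # ~46%
print(f"EQ-VAS:      {eqvas_drop:.0f}% lower") # ~38%
```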