5 research outputs found

    Mathematical Model of the Impact of a Nonantibiotic Treatment for Clostridium difficile on the Endemic Prevalence of Vancomycin-Resistant Enterococci in a Hospital Setting

    Introduction. Clostridium difficile-associated disease (CDAD) is treated using antibiotics, which often leads to the emergence of antibiotic-resistant bacteria such as vancomycin-resistant enterococci (VRE). This study estimated the impact of a nonantibiotic treatment for CDAD on VRE prevalence. Methods. A previously published model describing the impact of in-hospital antibiotic use on VRE prevalence was adapted to include CDAD treatment. Simulations compared the prevalence of VRE when nonantibiotic versus antibiotic therapy was used. Results. Nonantibiotic treatment in 50% of CDAD patients resulted in an 18% relative reduction in the prevalence of VRE colonization compared with antibiotic use only. Sensitivity analysis found the model to be most sensitive to rates of antibiotic initiation and discontinuation, prevalence of VRE in admitted patients, length of stay of colonized patients, probability of CDAD acquisition, and hand-washing compliance. Conclusion. Nonantibiotic treatment of patients hospitalized with CDAD may significantly reduce the incidence of VRE colonization.
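    The abstract gives the model's structure only in outline, but the approach can be illustrated. The sketch below sets up a minimal two-compartment colonization model and runs the kind of one-way sensitivity analysis the Results describe; every parameter name, value, and range is a hypothetical placeholder, not one of the study's fitted estimates.

```python
# Minimal sketch: hypothetical two-compartment model of VRE colonization
# in a ward, loosely patterned on the study's description.
import numpy as np
from scipy.integrate import odeint

def vre_model(y, t, beta, mu, admit_prev, handwash):
    """Fractions of uncolonized (U) and VRE-colonized (C) patients."""
    U, C = y
    transmission = beta * (1 - handwash) * U * C  # contact spread, cut by hand hygiene
    dU = mu * (1 - admit_prev) - transmission - mu * U  # admissions in, discharges out
    dC = mu * admit_prev + transmission - mu * C
    return [dU, dC]

def endemic_prevalence(beta=0.3, mu=0.1, admit_prev=0.05, handwash=0.4):
    t = np.linspace(0, 720, 2000)  # run long enough to settle near steady state
    traj = odeint(vre_model, [0.95, 0.05], t, args=(beta, mu, admit_prev, handwash))
    return traj[-1, 1]  # colonized fraction at the end of the run

# One-way sensitivity analysis: vary one parameter at a time around baseline.
baseline = endemic_prevalence()
for name, lo, hi in [("admit_prev", 0.01, 0.10), ("handwash", 0.2, 0.6), ("mu", 0.05, 0.2)]:
    print(f"{name}: {endemic_prevalence(**{name: lo}):.3f} to "
          f"{endemic_prevalence(**{name: hi}):.3f} (baseline {baseline:.3f})")
```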

    The Role of Mathematical Modeling in Designing and Evaluating Antimicrobial Stewardship Programs

    Antimicrobial agent effectiveness continues to be threatened by the rise and spread of pathogen strains that exhibit drug resistance. This challenge is most acute in healthcare facilities where the well-established connection between resistance and suboptimal antimicrobial use has prompted the creation of antimicrobial stewardship programs (ASPs). Mathematical models offer tremendous potential for serving as an alternative to controlled human experimentation for assessing the effectiveness of ASPs. Models can simulate controlled randomized experiments between groups of virtual patients, some treated with the ASP measure under investigation, and some without. By removing the limitations inherent in human experimentation, including health risks, study cohort size, possible number of replicates, and effective study duration, model simulations can provide valuable information to inform decisions regarding the design of new ASPs, as well as evaluation and improvement of existing ASPs. To date, the potential of mathematical modeling methods in evaluating ASPs is largely untapped, and much work remains to be done to leverage this potential.
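    As a concrete illustration of the virtual-patient idea, the sketch below runs a hypothetical in-silico randomized comparison: two simulated cohorts, with the ASP measure modeled simply as a lower daily probability of acquiring a resistant organism. Every number here is an illustrative assumption, not a value from the paper.

```python
# Hypothetical in-silico randomized experiment: control arm vs ASP arm.
import random

def simulate_arm(n_patients, daily_risk, mean_stay=7.0, seed=None):
    """Fraction of virtual patients acquiring a resistant organism in hospital."""
    rng = random.Random(seed)
    acquired = 0
    for _ in range(n_patients):
        stay_days = max(1, round(rng.expovariate(1.0 / mean_stay)))  # length of stay
        if any(rng.random() < daily_risk for _ in range(stay_days)):
            acquired += 1
    return acquired / n_patients

control = simulate_arm(10_000, daily_risk=0.020, seed=1)
asp = simulate_arm(10_000, daily_risk=0.012, seed=2)  # ASP assumed to cut daily risk
print(f"acquisition fraction: control {control:.3f} vs ASP {asp:.3f}")
```

    Unlike a human trial, cohort size, replicate count, and study duration here are limited only by computing time, which is exactly the advantage the authors highlight.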

    Estimation of 24-hour Urinary Creatinine Excretion through the Development of a Model and Its Relationship to Outcomes in Hospitalized Critically Ill Veterans

    Background: Muscle mass is highly correlated with patient outcomes. Techniques to identify patients with low muscularity include computed tomography (CT) and bioelectrical impedance analysis (BIA); however, disadvantages of cost, radiation exposure, and limited access make these measurements unavailable to the average dietitian. Urinary creatinine excretion (UCE) and the estimated creatinine height index (CHI) are strongly associated with muscularity and outcomes but require a 24-hour urine collection. Predicting UCE from routine patient variables through mathematical modeling would avoid the need for a 24-hour urine collection and may be clinically useful. Methods: Input variables of age, height, weight, gender, plasma creatinine, urea nitrogen, glucose, sodium, potassium, chloride, and carbon dioxide from a deidentified data set of 967 patients who had UCE measured were used to develop models to predict UCE. The model with the best predictive ability was validated using four-fold cross-validation and a separate data set not used to construct the model. Model-predicted UCE and CHI were compared to measures of muscularity. The model was then retrospectively applied to a convenience sample of 120 critically ill veterans to examine the prevalence of low muscle mass and whether UCE and CHI were associated with outcomes. Results: A model estimating UCE from plasma creatinine, plasma BUN, age, and weight was identified; it was highly correlated with measured UCE, moderately predictive, and statistically significant. Model-predicted UCE was highly correlated with accepted measures of muscularity. Applying the model to this cohort identified severe sarcopenia in 44.2% of subjects. Subjects with model-estimated CHI ≤ 60% had significantly lower body weight, BMI, plasma creatinine, albumin, and prealbumin levels. Subjects with CHI ≤ 60% were 8.0 times more likely to be diagnosed with malnutrition and 2.6 times more likely to be readmitted within 6 months. Subjects with low CHI trended toward longer hospital and ICU lengths of stay, although these differences did not reach statistical significance. Conclusion: The development of a model which predicts UCE and correlates with muscle mass offers a novel method for the RDN to identify patients with sarcopenia on hospital admission. This method could allow the RDN to quickly screen new admissions without the use of CT or DEXA scans and without the inconvenience of a 24-hour urine collection, using readily available patient variables.
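    The abstract does not publish the final regression equation, so the sketch below only illustrates the workflow it describes: fit a model predicting UCE from the four retained inputs and check it with four-fold cross-validation. The file and column names (uce_cohort.csv, plasma_creatinine, and so on) are hypothetical stand-ins, and the linear form is an assumption for illustration.

```python
# Illustrative workflow: predict 24-hour urinary creatinine excretion (UCE)
# from routine labs and demographics, with four-fold cross-validation.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("uce_cohort.csv")  # hypothetical de-identified development set
predictors = ["plasma_creatinine", "plasma_bun", "age", "weight_kg"]
X, y = df[predictors], df["uce_mg_per_24h"]

model = LinearRegression()
# Four-fold cross-validation, matching the validation scheme in the abstract.
r2_scores = cross_val_score(model, X, y, cv=4, scoring="r2")
print("cross-validated R^2 per fold:", r2_scores.round(3))

model.fit(X, y)  # final fit on the full development set
```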

    Mathematical modeling of Clostridium difficile transmission in healthcare settings

    Clostridium difficile is a frequent source of healthcare-associated infection, especially among patients on antibiotics or proton pump inhibitors (PPIs). The rate of C. difficile infection (CDI) has been steadily rising since 2000 and now represents a major burden on the healthcare system in terms of both morbidity and mortality. However, despite its public health importance, there are few mathematical models of C. difficile which might be used to evaluate our current evidence base or new control measures. Three different data sources were analyzed to provide parameters for a mathematical model: a cohort of incident CDI cases in the Duke Infection Control Outreach Network (DICON), a hospital-level surveillance time series, also from DICON, and inpatient records from UNC Healthcare, all from 7/1/2009 to 12/31/2010. Using estimates from these data, as well as from the literature, a pair of compartmental transmission models, one deterministic and the other stochastic, were created to evaluate the potential effect of fecal transplantation as a treatment to prevent CDI. The analysis of the cohort of incident cases suggested that ICU patients experience a greater burden of mortality while infected with C. difficile and have longer lengths of stay and times until death, marking this population as one of special interest. Two interventions were simulated using the stochastic model: the use of fecal transplantation to treat CDI and prevent recurrent cases, and the use of fecal transplantation after treatment with antibiotics or PPIs to prevent the development of CDI. Simulation results showed that treating patients with CDI was effective in preventing recurrence but not in reducing the overall number of incident cases of CDI. Transplantation after treatment with antibiotics or PPIs had no effect on preventing recurrence and produced a statistically significant reduction in incident cases that did not reach clinical significance. These results suggest that routine fecal transplantation for patients with CDI may be an effective treatment to prevent recurrence. Mathematical models such as the one described in this dissertation are powerful tools to evaluate potential interventions, suggest new directions for study, and understand the dynamics of infection on a population level.
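    The dissertation's model structure is not reproduced in the abstract, but the general shape of such a compartmental model can be sketched. Below is a minimal deterministic susceptible/at-risk/infected system with a fecal-transplantation rate fmt; the compartment definitions and all parameter values are illustrative assumptions, not the fitted DICON/UNC estimates.

```python
# Minimal deterministic sketch: S (susceptible), R (at-risk, e.g. on
# antibiotics or PPIs), I (CDI). `fmt` is a hypothetical rate of fecal
# transplantation returning infected patients to the susceptible pool.
import numpy as np
from scipy.integrate import odeint

def cdi_model(y, t, beta, alpha, gamma, fmt):
    S, R, I = y
    dS = -alpha * S + (gamma + fmt) * I   # recovery and FMT restore the flora
    dR = alpha * S - beta * R * I         # antibiotic/PPI exposure, then acquisition
    dI = beta * R * I - (gamma + fmt) * I
    return [dS, dR, dI]

t = np.linspace(0, 180, 720)
for fmt in (0.0, 0.05):  # compare no FMT vs routine FMT for CDI patients
    S, R, I = odeint(cdi_model, [0.90, 0.09, 0.01], t,
                     args=(2.0, 0.1, 0.1, fmt)).T
    print(f"fmt={fmt}: CDI prevalence at day 180 = {I[-1]:.4f}")
```

    A stochastic counterpart, as the dissertation uses, would replace these rates with event probabilities so that run-to-run variation can be compared statistically.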

    The acquisition of Clostridium difficile-associated disease: the role of nutritional care

    In 2013, the Centers for Disease Control and Prevention (CDC) classified Clostridium difficile, the bacterium responsible for Clostridium difficile-associated disease (CDAD), as one of the three most threatening microorganisms to human health. C. difficile has outpaced methicillin-resistant Staphylococcus aureus as the most common healthcare pathogen, and is currently the leading cause of antibiotic-associated diarrhea and gastroenteritis-related deaths in the United States. Advances in research, national campaigns for healthcare safety, mandated disease reporting, legislation for hospital accountability, and the creation of new antibiotics have all proven ineffective. The current treatment standards, which have been utilized since the discovery of C. difficile pathogenesis in pseudomembranous colitis in the 1970s, have shown waning effectiveness and may actually increase disease recurrence and treatment failure. Interest in non-antibiotic alternatives, the evolving understanding of the microbiome as a critical defense against pathogens, and the efficacy of oral nutrition supplements in hospitalized populations with gastrointestinal disease suggest that targeted nutritional therapy may provide a clinical benefit. The intestinal microbiota can be modified by diet, and is critical for immunity, metabolism, synthesis of vitamins and other bioactive substances, and resistance against pathogens; the microbiota prevents pathogen adherence directly through physical competition, and indirectly through the proliferation of anti-inflammatory and antibiotic-like substances. Any event that disturbs the microbiome may allow opportunistic pathogens, like C. difficile, to adhere, colonize, and produce disease. Therefore, the ability to protect or reestablish the microbiota could have vast therapeutic implications. Preliminary data suggest that nutritional status may relate directly to CDAD susceptibility. CDAD patients show markedly reduced gastrointestinal microbial diversity, which may promote pathological colonization and disease recurrence. Possible methods of repopulating the GI tract with healthful bacteria include fecal microbiota transplants and the administration of probiotics or prebiotics. The recent popularity of fecal transplants for treatment of GI infections is promising, but the process is expensive, unregulated, and aesthetically unappealing. Similarly, the ingestion of probiotics has generally shown potential in patients with recurrent CDAD, but premature degradation in the upper GI tract can be problematic, and caution is advised for use in critically ill or immunocompromised patients. Although prebiotics have the unique advantage of being well-tolerated, stable, commercially available, and easily incorporated into the diet, clinical studies in this area have been diverse, small, and scarce. Improvements in human studies have been attributed to measurable elevations in butyrate, an anti-inflammatory short-chain fatty acid byproduct of prebiotic fermentation, as well as an attenuated pro-inflammatory cytokine response. The first aim of this research was to summarize the existing clinical literature regarding prebiotic administration and CDAD with a systematic review. The systematic review search identified five studies, yet only three were suitable for inclusion in a meta-analysis.
Studies were heterogeneous but appeared to slightly, though not significantly, favor prebiotics: 35/374 (9.36%) supplemented patients experienced CDAD compared with 64/393 (16.28%) patients in the control groups (OR 0.43, P=0.05). Neither side effects nor mortality differed between treatments, and further research is needed to determine whether prebiotics may provide a clinical benefit for either current or potential CDAD patients. We next evaluated medical records from Carle Foundation Hospital (CFH) in two separate time periods to identify risk factors for CDAD and to determine whether malnutrition was related to CDAD prevalence and patient outcomes. A month-long preliminary study identified six risk factors (advanced age, admission from another healthcare unit or facility, recent hospitalization, and a history of diarrhea or documented CDAD diagnosis within the previous year) correlated with CDAD prevalence. These risk factors were then used to separate 1,277 patients from 2014, and then 973 patients from 2016, into high-risk groups for primary studies. Initial analysis revealed that advanced age, previous diarrhea, previous CDAD, malnutrition, nutrition consultation requests, and admission from a healthcare facility were individually associated with CDAD diagnosis in both 2014 and 2016. However, when multiple regression analysis was used to identify predictor variables for CDAD, only previous CDAD (OR 111.49, P<0.0001), age ≥65 years (OR 0.43, P=0.004), nutrition consultation requests (OR 1.70, P=0.04), and BMI (OR 0.96, P=0.02) retained significance in 2014, and only previous CDAD (OR 52.95, P<0.0001) and nutrition consultation requests (OR 1.96, P=0.004) in 2016. Although malnutrition was not independently associated with CDAD, we believe that it may more accurately mirror critical overlooked factors, such as frailty or comorbidity. CDAD prevalence did not change between 2014 and 2016 (18.6% vs 17.7%, P=0.57), although both malnutrition (10.4% vs 14.7%, P=0.002) and mortality (14.9% vs 18.9%, P=0.01) increased within the same time period. While our retrospective studies showed many consistencies between the two years and appeared to successfully identify high-risk patients within our sample, data restrictions prevented assessment of disease severity. The prevalence of both malnutrition and CDAD was unusually low in comparison with national averages for hospitalized patients. Although the exclusive use of a high-risk group prevents comparison with current literature, we intend to use the results of these projects to better direct interventions to prevent CDAD in at-risk individuals, and thereby mitigate bacterial transmission and lessen the overall CDAD disease burden.
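    For readers unfamiliar with how such adjusted odds ratios are obtained, the sketch below runs a multivariable logistic regression of CDAD diagnosis on the abstract's retained predictors and exponentiates the coefficients. The file and column names are hypothetical stand-ins for the chart-review variables.

```python
# Illustrative multivariable analysis: logistic regression of CDAD
# diagnosis on candidate risk factors, reported as odds ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cdad_2014_highrisk.csv")  # hypothetical high-risk extract
predictors = ["previous_cdad", "age_ge_65", "nutrition_consult", "bmi"]
X = sm.add_constant(df[predictors])
fit = sm.Logit(df["cdad_diagnosis"], X).fit()

# Exponentiated coefficients give odds ratios with 95% confidence intervals.
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```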