
    Impact of Change to Molecular Testing for Clostridium difficile Infection on Healthcare Facility–Associated Incidence Rates

    Background. Change from nonmolecular to molecular testing techniques is thought to contribute to the increasing trend in incidence of Clostridium difficile infection (CDI); however, the degree of effect attributable to this change versus other time-related epidemiologic factors is unclear. Methods. We compared the relative change in incidence rate (incidence rate ratio [IRR]) of healthcare facility–associated (HCFA) CDI among hospitals in the Duke Infection Control Outreach Network before and after the date of the switch from nonmolecular tests to polymerase chain reaction (PCR), using prospectively collected surveillance data from July 2009 to December 2011. Data from 10 hospitals that switched and 22 control hospitals were included. Individual hospital estimates were determined using Poisson regression. We used an interrupted time series approach to develop a Poisson mixed-effects model. Additional regression adjustments were made for clustering and the proportion of intensive care unit patient-days. The variable for PCR was treated as a fixed effect; other modeled variables were random effects. Results. For hospitals that switched to PCR, the mean incidence rate of HCFA CDI was 6.0 CDIs per 10,000 patient-days before the switch compared with 9.6 CDIs per 10,000 patient-days after the switch. Hospital-specific IRR estimates comparing the period after the switch with the period before ranged from 0.89 (95% confidence interval [CI], 0.32–2.44) to 6.91 (95% CI, 1.12–42.54). After adjustment in the mixed-effects model, the overall IRR comparing CDI incidence after the switch to before the switch was 1.56 (95% CI, 1.28–1.90). Time-trend variables did not reach statistical significance. Conclusion. Hospitals that switched from nonmolecular to molecular tests experienced an approximate 56% increase in the rate of HCFA CDI after the testing change.
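
    The interrupted time series design described above can be sketched in a few lines. The fragment below is a minimal single-hospital simplification, assuming a monthly surveillance table with illustrative column names (cdi_count, patient_days, months_elapsed, post_pcr); the paper's mixed-effects structure across 32 hospitals is not reproduced here.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Synthetic monthly surveillance data; the switch to PCR occurs after month 4.
    df = pd.DataFrame({
        "cdi_count":      [5, 4, 6, 5, 9, 11, 10, 12],
        "patient_days":   [9800, 9900, 10100, 10000, 9700, 10200, 9900, 10050],
        "months_elapsed": range(8),
        "post_pcr":       [0, 0, 0, 0, 1, 1, 1, 1],
    })

    # Poisson regression with a log(patient-days) offset models the rate;
    # exp(coefficient on post_pcr) is the incidence rate ratio (IRR).
    X = sm.add_constant(df[["months_elapsed", "post_pcr"]])
    fit = sm.GLM(df["cdi_count"], X,
                 family=sm.families.Poisson(),
                 offset=np.log(df["patient_days"])).fit()
    print(f"IRR after vs. before the switch: {np.exp(fit.params['post_pcr']):.2f}")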

    A Mathematical Model to Evaluate the Routine Use of Fecal Microbiota Transplantation to Prevent Incident and Recurrent Clostridium difficile Infection

    Objective. Fecal microbiota transplantation (FMT) has been suggested as a new treatment to manage Clostridium difficile infection (CDI). Using a mathematical model of C. difficile transmission within an intensive care unit (ICU), we examined the potential impact of routine FMT. Design, Setting, and Patients. A mathematical model of C. difficile transmission, supplemented with prospective cohort, surveillance, and billing data from hospitals in the southeastern United States. Methods. Cohort, surveillance, and billing data as well as data from the literature were used to construct a compartmental model of CDI within an ICU. Patients were defined as being in 1 of 6 potential health states: uncolonized and at low risk; uncolonized and at high risk; colonized and at low risk; colonized and at high risk; having CDI; or treated with FMT. Results. The use of FMT to treat patients after CDI was associated with a statistically significant reduction in recurrence but not with a reduction in incident cases. Treatment after administration of high-risk medications, such as antibiotics, did not result in a decrease in recurrence but did result in a statistically significant difference in incident cases across treatment groups, although whether this difference was clinically relevant was questionable. Conclusions. Our study presents a novel mathematical model examining the effect of FMT on the prevention of recurrent and incident CDI. The routine use of FMT represents a promising approach to reduce complex recurrent cases, but a reduction in CDI incidence will require the use of other methods to prevent transmission.
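
    A compartmental model of this kind can be written as a small system of ordinary differential equations. The sketch below uses the six health states named above, with invented rate parameters purely for illustration; it is not the paper's calibrated model.

    import numpy as np
    from scipy.integrate import solve_ivp

    beta  = 0.002  # colonization pressure per colonized/infected patient
    rho   = 0.05   # progression from high-risk colonized to CDI
    gamma = 0.10   # CDI treatment/recovery rate
    phi   = 0.30   # fraction of treated CDI cases routed to FMT
    mu    = 0.15   # return from the FMT compartment to uncolonized low risk

    def deriv(t, y):
        # States: uncolonized low/high risk, colonized low/high risk, CDI, FMT.
        U_l, U_h, C_l, C_h, D, F = y
        force = beta * (C_l + C_h + D)        # force of colonization
        dU_l = -force * U_l + mu * F
        dU_h = -force * U_h + (1 - phi) * gamma * D
        dC_l = force * U_l
        dC_h = force * U_h - rho * C_h
        dD   = rho * C_h - gamma * D
        dF   = phi * gamma * D - mu * F
        return [dU_l, dU_h, dC_l, dC_h, dD, dF]

    y0 = [80, 15, 3, 1, 1, 0]                 # illustrative ICU census of 100
    sol = solve_ivp(deriv, (0, 180), y0, t_eval=np.linspace(0, 180, 181))
    print("CDI prevalence at day 180:", round(sol.y[4][-1], 2))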

    Hospital-Acquired Clostridium difficile Infections: Estimating All-Cause Mortality and Length of Stay

    Clostridium difficile is a health care–associated infection of increasing importance. The purpose of this study was to estimate the time until death from any cause and the time until release among patients with C. difficile, comparing the burden of those in the intensive care unit (ICU) with those in the general hospital population.
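
    As a hedged illustration of the time-to-event framing, the sketch below fits Kaplan-Meier curves for time until death from any cause, stratified by ICU versus general ward; all durations and event indicators are fabricated, not study data.

    from lifelines import KaplanMeierFitter

    # Fabricated durations (days) and death indicators (1 = died, 0 = censored).
    icu_days,  icu_died  = [3, 8, 15, 21, 30], [1, 1, 0, 1, 1]
    ward_days, ward_died = [5, 12, 20, 28, 40], [0, 1, 0, 1, 1]

    kmf = KaplanMeierFitter()
    for label, days, died in [("ICU", icu_days, icu_died),
                              ("general ward", ward_days, ward_died)]:
        kmf.fit(days, event_observed=died, label=label)
        print(label, "median survival (days):", kmf.median_survival_time_)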

    Epidemiology of Surgical Site Infection in a Community Hospital Network

    OBJECTIVE To describe the epidemiology of complex surgical site infection (SSI) following commonly performed surgical procedures in community hospitals and to characterize trends of SSI prevalence rates over time for MRSA and other common pathogens. METHODS We prospectively collected SSI data at 29 community hospitals in the southeastern United States from 2008 through 2012. We determined the overall prevalence rates of SSI for commonly performed procedures during this 5-year study period. For each year of the study, we then calculated prevalence rates of SSI stratified by causative organism. We created log-binomial regression models to analyze trends of SSI prevalence over time for all pathogens combined and specifically for MRSA. RESULTS A total of 3,988 complex SSIs occurred following 532,694 procedures (prevalence rate, 0.7 infections per 100 procedures). SSIs occurred most frequently after small bowel surgery, peripheral vascular bypass surgery, and colon surgery. Staphylococcus aureus was the most common pathogen. The prevalence rate of SSI decreased from 0.76 infections per 100 procedures in 2008 to 0.69 infections per 100 procedures in 2012 (prevalence rate ratio [PRR], 0.90; 95% confidence interval [CI], 0.82–1.00). A more substantial decrease in MRSA SSI (PRR, 0.69; 95% CI, 0.54–0.89) was largely responsible for this overall trend. CONCLUSIONS The prevalence of MRSA SSI decreased from 2008 to 2012 in our network of community hospitals. This decrease in MRSA SSI prevalence led to an overall decrease in SSI prevalence over the study period. Infect Control Hosp Epidemiol 2016;37:519–52
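
    The trend analysis described can be approximated with a log-binomial model, in which the exponentiated year coefficient is the prevalence rate ratio (PRR) per year. The sketch below uses synthetic per-procedure data; note that log-binomial fits can fail to converge, in which case Poisson regression with robust errors is a common fallback.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 50_000
    year = rng.integers(0, 5, n)                  # years since 2008 (2008-2012)
    ssi = rng.binomial(1, 0.0076 * 0.90 ** year)  # ~10% relative decline/year

    # Binomial family with a log link gives risk (prevalence) ratios directly.
    X = sm.add_constant(pd.Series(year, name="year"))
    fit = sm.GLM(ssi, X,
                 family=sm.families.Binomial(link=sm.families.links.Log())).fit()
    print("PRR per year:", round(np.exp(fit.params["year"]), 2))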

    A modified Delphi approach to develop a trial protocol for antibiotic de-escalation in patients with suspected sepsis

    Background: Early administration of antibiotics in sepsis is associated with improved patient outcomes, but safe and generalizable approaches to de-escalate or discontinue antibiotics after suspected sepsis events are unknown. Methods: We used a modified Delphi approach to identify safety criteria for an opt-out protocol to guide de-escalation or discontinuation of antibiotic therapy after 72 hours in non-ICU patients with suspected sepsis. An expert panel with expertise in antimicrobial stewardship and hospital epidemiology rated 48 unique criteria across 3 rounds of anonymous electronic surveys, with results fed back to the panel between rounds. Criteria were rated primarily on their impact on patient safety and the feasibility of extracting them from electronic health record review. Consensus was achieved to either retain or remove each criterion. Results: After 3 rounds, 22 unique criteria remained as part of the opt-out safety checklist. These criteria included high-risk comorbidities, signs of severe illness, lack of cultures during sepsis work-up or antibiotic use prior to blood cultures, and ongoing signs and symptoms of infection. Conclusions: The modified Delphi approach is a useful method to achieve expert-level consensus in the absence of evidence sufficient to provide validated guidance. The Delphi approach allowed for flexibility in developing an opt-out trial protocol for sepsis antibiotic de-escalation. The utility of this protocol should be evaluated in a randomized controlled trial.
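
    A round of this kind of consensus process reduces to simple arithmetic over panel ratings. The sketch below tallies hypothetical 1-9 ratings per criterion against invented retain/remove thresholds; the trial's actual consensus rules and criteria wording are not specified here.

    import numpy as np

    ratings = {                      # criterion -> panelist ratings (1-9 scale)
        "high-risk comorbidity":  [8, 9, 7, 8, 9],
        "positive blood culture": [9, 9, 8, 9, 7],
        "afebrile for 24 hours":  [4, 5, 3, 6, 4],
    }

    for criterion, scores in ratings.items():
        med = np.median(scores)
        # Hypothetical rule: retain if median >= 7, remove if <= 3, else re-rate.
        verdict = "retain" if med >= 7 else "remove" if med <= 3 else "re-rate"
        print(f"{criterion}: median {med:.0f} -> {verdict}")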

    Short Operative Duration and Surgical Site Infection Risk in Hip and Knee Arthroplasty Procedures

    OBJECTIVE To determine the association (1) between shorter operative duration and surgical site infection (SSI) and (2) between surgeon median operative duration and SSI risk among first-time hip and knee arthroplasties. DESIGN Retrospective cohort study. SETTING A total of 43 community hospitals located in the southeastern United States. PATIENTS Adults who developed SSIs according to National Healthcare Safety Network criteria within 365 days of first-time knee or hip arthroplasties performed between January 1, 2008, and December 31, 2012. METHODS Log-binomial regression models estimated the association (1) between operative duration and SSI outcome and (2) between surgeon median operative duration and SSI outcome. Hip and knee arthroplasties were evaluated in separate models. Each model was adjusted for American Society of Anesthesiology score and patient age. RESULTS A total of 25,531 hip arthroplasties and 42,187 knee arthroplasties were included in the study. The risk of SSI in knee arthroplasties with an operative duration shorter than the 25th percentile was 0.40 times the risk of SSI in knee arthroplasties with an operative duration between the 25th and 75th percentiles (risk ratio [RR], 0.40; 95% confidence interval [CI], 0.38–0.56; P < .01). Short operative duration did not demonstrate a significant association with SSI for hip arthroplasties (RR, 1.04; 95% CI, 0.79–1.37; P = .36). Knee arthroplasty surgeons with shorter median operative durations had a lower risk of SSI than surgeons with typical median operative durations (RR, 0.52; 95% CI, 0.43–0.64; P < .01). CONCLUSIONS Short operative durations were not associated with a higher SSI risk for knee or hip arthroplasty procedures in our analysis. Infect. Control Hosp. Epidemiol. 2015;36(12):1431–143
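
    A minimal sketch of the exposure definition and model, assuming synthetic data and illustrative column names: flag procedures below the 25th percentile of operative duration, then fit a log-binomial model for SSI adjusted for ASA score and age.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 20_000
    df = pd.DataFrame({
        "duration_min": rng.normal(100, 20, n),     # operative duration (minutes)
        "asa":          rng.integers(1, 4, n),      # ASA score
        "age":          rng.normal(67, 10, n),
        "ssi":          rng.binomial(1, 0.009, n),  # placeholder outcome
    })
    q25 = np.percentile(df["duration_min"], 25)
    df["short_duration"] = (df["duration_min"] < q25).astype(int)

    # exp(coefficient) is the risk ratio for short vs. typical duration.
    fit = smf.glm("ssi ~ short_duration + asa + age", data=df,
                  family=sm.families.Binomial(link=sm.families.links.Log())).fit()
    print("RR:", round(np.exp(fit.params["short_duration"]), 2))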

    Delays in Appropriate Antibiotic Therapy for Gram-Negative Bloodstream Infections: A Multicenter, Community Hospital Study

    Background. Gram-negative bacterial bloodstream infection (BSI) is a serious condition with an estimated 30% mortality. Clinical outcomes for patients with severe infections improve when antibiotics are appropriately chosen and given early. The objective of this study was to estimate the association of prior healthcare exposure with time to appropriate antibiotic therapy in patients with gram-negative BSI. Methods. We performed a multicenter cohort study of adult, hospitalized patients with gram-negative BSI using time-to-event analysis in nine community hospitals from 2003 to 2006. Event time was defined as the first administration of an antibiotic with in vitro activity against the infecting organism. Healthcare exposure status was categorized as community-acquired, healthcare-associated, or hospital-acquired. Time to appropriate therapy among groups of patients with differing healthcare exposure status was assessed using Kaplan-Meier analyses and multivariable Cox proportional hazards models. Results. The cohort included 578 patients with gram-negative BSI, including 320 (55%) healthcare-associated, 217 (38%) community-acquired, and 41 (7%) hospital-acquired infections. Overall, 529 (92%) patients received an appropriate antibiotic during their hospitalization. Time to appropriate therapy differed significantly among the healthcare exposure groups (log-rank P = .02). Time to first antibiotic administration, regardless of drug appropriateness, did not differ between groups (P = .3). The unadjusted hazard ratios (HRs) (95% confidence interval) were 0.80 (0.65–0.98) for healthcare-associated and 0.72 (0.63–0.82) for hospital-acquired infections, relative to patients with community-acquired BSI. In multivariable analysis, an interaction was found between the main effect and baseline Charlson comorbidity index. When the Charlson index was 3, adjusted HRs were 0.66 (0.48–0.92) for healthcare-associated and 0.57 (0.44–0.75) for hospital-acquired infections, relative to patients with community-acquired infections. Conclusions. Patients with healthcare-associated or hospital-acquired BSI experienced delays in receipt of appropriate antibiotics compared with patients with community-acquired BSI. This difference was not due to delayed initiation of antibiotic therapy but to the inappropriate initial choice of antibiotic.
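
    The interaction reported above means the hazard ratio varies with the Charlson index: with coefficients b_exposure and b_interaction, the HR at Charlson index c is exp(b_exposure + c * b_interaction). A sketch with synthetic data and illustrative column names:

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    n = 500
    df = pd.DataFrame({
        "hours_to_appropriate_abx": rng.exponential(24, n),
        "received_appropriate":     rng.binomial(1, 0.92, n),  # 1 = event, 0 = censored
        "healthcare_associated":    rng.binomial(1, 0.55, n),
        "charlson":                 rng.integers(0, 8, n),
    })
    df["hca_x_charlson"] = df["healthcare_associated"] * df["charlson"]

    cph = CoxPHFitter()
    cph.fit(df, duration_col="hours_to_appropriate_abx",
            event_col="received_appropriate")
    b = cph.params_
    hr_at_3 = np.exp(b["healthcare_associated"] + 3 * b["hca_x_charlson"])
    print("HR (healthcare-associated vs. community) at Charlson = 3:", round(hr_at_3, 2))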

    The Antimicrobial Scrub Contamination and Transmission (ASCOT) Trial: A Three-Arm, Blinded, Randomized Controlled Trial With Crossover Design to Determine the Efficacy of Antimicrobial-Impregnated Scrubs in Preventing Healthcare Provider Contamination

    OBJECTIVE To determine whether antimicrobial-impregnated textiles decrease the acquisition of pathogens by healthcare provider (HCP) clothing. DESIGN We completed a 3-arm randomized controlled trial to test the efficacy of 2 types of antimicrobial-impregnated clothing compared to standard HCP clothing. Cultures were obtained from each nurse participant, the healthcare environment, and patients during each shift. The primary outcome was the change in total contamination on nurse scrubs, measured as the sum of colony-forming units (CFU) of bacteria. PARTICIPANTS AND SETTING Nurses working in medical and surgical ICUs in a 936-bed tertiary-care hospital. INTERVENTION Nurse subjects wore standard cotton-polyester surgical scrubs (control), scrubs containing a complex element compound with a silver alloy embedded in the fibers (Scrub 1), or scrubs impregnated with an organosilane-based quaternary ammonium and a hydrophobic fluoroacrylate copolymer emulsion (Scrub 2). Nurse participants were blinded to scrub type and participated in all 3 arms, in random order, during 3 consecutive 12-hour shifts in the intensive care unit. RESULTS In total, 40 nurses were enrolled and completed 3 shifts each. Analyses of 2,919 cultures from the environment and 2,185 from HCP clothing showed that scrub type was not associated with a change in HCP clothing contamination (P = .70). Mean difference estimates were 0.118 for the Scrub 1 arm (95% confidence interval [CI], −0.206 to 0.441; P = .48) and 0.009 for the Scrub 2 arm (95% CI, −0.323 to 0.342; P = .96) compared to the control. HCP became newly contaminated with important pathogens during 19 of the 120 shifts (16%). CONCLUSIONS Antimicrobial-impregnated scrubs were not effective at reducing HCP clothing contamination. However, the environment is an important source of HCP clothing contamination. TRIAL REGISTRATION Clinicaltrials.gov identifier: NCT02645214. Infect Control Hosp Epidemiol 2017;38:1147–115
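
    Because every nurse wore all three scrub types, a natural analysis is a linear mixed model with scrub type as a fixed effect and a random intercept per nurse. The sketch below fabricates per-shift changes in contamination; column names and effect sizes are illustrative, not the trial's data.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    df = pd.DataFrame({
        "nurse": np.repeat(np.arange(40), 3),            # 40 nurses x 3 shifts
        "arm":   np.tile(["control", "scrub1", "scrub2"], 40),
        "cfu_change": rng.normal(1.0, 0.5, 120),         # no true arm effect
    })

    # Random intercept per nurse accounts for the crossover (repeated-measures) design.
    fit = smf.mixedlm("cfu_change ~ C(arm, Treatment('control'))",
                      df, groups=df["nurse"]).fit()
    print(fit.summary())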

    Rising Rates of Carbapenem-Resistant Enterobacteriaceae in Community Hospitals: A Mixed-Methods Review of Epidemiology and Microbiology Practices in a Network of Community Hospitals in the Southeastern United States

    (See the commentary by Pfeiffer and Beldavs on pages 984–986.) Objective Describe the epidemiology of carbapenem-resistant Enterobacteriaceae (CRE) and examine the effect of lower carbapenem breakpoints on CRE detection. Design Retrospective cohort. Setting Inpatient care at community hospitals. Patients All patients with CRE-positive cultures were included. Methods CRE isolated from 25 community hospitals were prospectively entered into a centralized database from January 2008 through December 2012. Microbiology laboratory practices were assessed using questionnaires. Results A total of 305 CRE isolates were detected at 16 hospitals (64%). Patients with CRE had symptomatic infection in 180 cases (59%) and asymptomatic colonization in the remainder (125 cases; 41%). Klebsiella pneumoniae (277 isolates; 91%) was the most prevalent species. The majority of cases were healthcare associated (288 cases; 94%). The rate of CRE detection increased more than fivefold from 2008 (0.26 cases per 100,000 patient-days) to 2012 (1.4 cases per 100,000 patient-days; incidence rate ratio [IRR], 5.3 [95% confidence interval (CI), 1.22–22.7]; P = .01). Only 5 hospitals (20%) had adopted the 2010 Clinical and Laboratory Standards Institute (CLSI) carbapenem breakpoints. The 5 hospitals that adopted the lower carbapenem breakpoints were more likely to detect CRE after implementation of the breakpoints than before (4.1 vs 0.5 cases per 100,000 patient-days; P < .001; IRR, 8.1 [95% CI, 2.7–24.6]). Hospitals that implemented the lower carbapenem breakpoints were also more likely to detect CRE than hospitals that did not (3.3 vs 1.1 cases per 100,000 patient-days; P = .01). Conclusions The rate of CRE detection increased fivefold in community hospitals in the southeastern United States from 2008 to 2012. Even so, these figures likely underestimate the true rate of CRE, given the low adoption of the carbapenem breakpoints recommended in the 2010 CLSI guidelines.
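
    The rate comparisons above follow standard incidence rate ratio arithmetic: IRR = (a / T_a) / (b / T_b), with an approximate 95% CI of exp(ln IRR ± 1.96 * sqrt(1/a + 1/b)). A worked example with made-up counts, not the study's data:

    import math

    a, T_a = 41, 1_000_000   # CRE cases and patient-days after breakpoint change
    b, T_b = 5,  1_000_000   # CRE cases and patient-days before breakpoint change

    irr = (a / T_a) / (b / T_b)
    se = math.sqrt(1 / a + 1 / b)              # SE of log(IRR), large-sample Wald
    lo, hi = (irr * math.exp(s * 1.96 * se) for s in (-1, 1))
    print(f"IRR {irr:.1f} (95% CI {lo:.1f}-{hi:.1f})")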