Optimal urine culture diagnostic stewardship practice: Results from an expert modified-Delphi procedure
BACKGROUND: Urine cultures are nonspecific and often lead to misdiagnosis of urinary tract infection and unnecessary antibiotics. Diagnostic stewardship is a set of procedures that modifies test ordering, processing, and reporting in order to optimize diagnosis and downstream treatment. In this study, we aimed to develop expert guidance on best practices for urine culture diagnostic stewardship.
METHODS: A RAND-modified Delphi approach with a multidisciplinary expert panel was used to ascertain diagnostic stewardship best practices. Clinical questions to guide recommendations were grouped into three thematic areas (ordering, processing, reporting) in practice settings of emergency department, inpatient, ambulatory, and long-term care. Fifteen experts ranked recommendations on a 9-point Likert scale. Recommendations on which the panel did not reach agreement were discussed during a virtual meeting, after which a second round of ranking was completed by email. After secondary review of results and panel discussion, a series of guidance statements was developed.
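The abstract does not state the agreement rule used to score the 9-point rankings. A minimal sketch, assuming the common RAND/UCLA convention (median 7-9 without disagreement counts as agreement to recommend); the thresholds and function name are illustrative assumptions, not the study's protocol:

```python
# Hypothetical RAND/UCLA-style agreement scoring for 9-point Likert panel ratings.
# The median bands and the "one-third" disagreement rule are assumptions, not from the study.
from statistics import median

def classify_recommendation(ratings):
    """Classify one recommendation from a panel's 1-9 appropriateness ratings."""
    med = median(ratings)
    third = len(ratings) / 3
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    # Assumed disagreement rule: at least a third of panelists in each extreme band.
    if low >= third and high >= third:
        return "no agreement"
    if med >= 7:
        return "appropriate"
    if med <= 3:
        return "inappropriate"
    return "uncertain"

# Example: 15 panelists rating a single ordering recommendation
print(classify_recommendation([8, 9, 7, 8, 6, 9, 8, 7, 9, 8, 7, 8, 9, 7, 8]))  # appropriate
```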
RESULTS: One hundred and sixty-five questions were reviewed. The panel reached agreement on 104, leading to 18 overarching guidance statements. The following strategies were recommended to optimize ordering of urine cultures: requiring documentation of symptoms, sending alerts to discourage ordering in the absence of symptoms, and cancelling repeat cultures. For urine culture processing, conditional urine cultures and the use of urine white blood cell count as a criterion were supported. For urine culture reporting, appropriate practices included nudges to discourage treatment under specific conditions and selective reporting of antibiotics to guide therapy decisions.
CONCLUSIONS: These 18 guidance statements can optimize the use of urine cultures for better patient outcomes.
Real-world, Multicenter Experience With Meropenem-Vaborbactam for Gram-Negative Bacterial Infections Including Carbapenem-Resistant Enterobacterales and Pseudomonas aeruginosa
Background: We aimed to describe the clinical characteristics and outcomes of patients treated with meropenem-vaborbactam (MEV) for a variety of gram-negative infections (GNIs), primarily including carbapenem-resistant Enterobacterales (CRE).
Methods: This was a real-world, multicenter, retrospective cohort study conducted in the United States between 2017 and 2020. Adult patients who received MEV for ≥72 hours were eligible for inclusion. The primary outcome was 30-day mortality. Classification and regression tree (CART) analysis was used to identify the time breakpoint (BP) that delineated the risk of negative clinical outcomes (NCOs); this breakpoint was then examined by multivariable logistic regression analysis (MLR).
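As a rough illustration of how a CART analysis can locate a single time breakpoint, a minimal sketch using a depth-1 decision tree on simulated data; the column names (`hours_to_mev`, `nco`) and data are hypothetical, and the study's actual CART settings are not reported in the abstract:

```python
# Sketch: locating one time breakpoint with CART (a depth-1 classification tree).
# Data and variable names are simulated/hypothetical, not from the study.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
hours_to_mev = rng.uniform(0, 120, size=126)          # time to MEV initiation (hours)
nco = (hours_to_mev > 48) & (rng.random(126) < 0.5)   # simulated negative clinical outcome

tree = DecisionTreeClassifier(max_depth=1).fit(hours_to_mev.reshape(-1, 1), nco)
breakpoint_hours = tree.tree_.threshold[0]            # the single split chosen by CART
print(f"CART breakpoint ~ {breakpoint_hours:.1f} h")
```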
Results: Overall, 126 patients were evaluated from 13 medical centers in 10 states. The most common infection sources were the respiratory tract (38.1%) and intra-abdominal (19.0%), and the most commonly isolated pathogens were CRE (78.6%). Thirty-day mortality and recurrence occurred in 18.3% and 11.9%, respectively. Adverse events occurred in 4 patients: nephrotoxicity (n = 2), hepatotoxicity (n = 1), and rash (n = 1). The CART-BP between early and delayed treatment was 48 hours (P = .04). MEV initiation within 48 hours was independently associated with reduced NCO on MLR (adjusted odds ratio, 0.277; 95% CI, 0.081-0.941).
Conclusions: Our results support current evidence of favorable clinical and safety outcomes with MEV in GNIs, including CRE. We suggest that delaying appropriate therapy for CRE significantly increases the risk of NCOs.
Time Is of the Essence: The Impact of Delayed Antibiotic Therapy on Patient Outcomes in Hospital-Onset Enterococcal Bloodstream Infections
BACKGROUND: With increasing prevalence of vancomycin-resistant enterococci (VRE), appropriate antibiotic therapy for enterococcal bloodstream infections (EBSI) can be delayed. Data regarding the impact of delayed therapy on EBSI outcomes are conflicting, and the time delay most strongly associated with poor outcomes has not been defined.
METHODS: This was a single-center, retrospective cohort study of adult, nonneutropenic patients with hospital-onset EBSI from 2010 to 2014. Classification and regression tree (CART) analysis was used to determine the delay in appropriate therapy most predictive of 30-day mortality. Appropriate therapy was defined as antibiotic therapy to which the enterococci and copathogen, where applicable, were susceptible. Outcomes and clinical characteristics were compared between patients receiving early or delayed therapy, defined by the CART timepoint. Poisson regression was employed to determine the independent association of delayed therapy with 30-day mortality and the predictors of delayed therapy.
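Poisson regression with robust variance (a "modified Poisson" model) is a standard way to obtain adjusted risk ratios for a binary outcome. A minimal sketch under that assumption, with hypothetical covariate names (`delayed_48h`, `apache_ii`, `charlson`) and simulated data; the study's actual covariates are not listed in the abstract:

```python
# Sketch: modified Poisson model (robust SEs) for a 30-day mortality risk ratio.
# Variable names and data are hypothetical, not from the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 190
df = pd.DataFrame({
    "delayed_48h": rng.integers(0, 2, n),   # appropriate therapy delayed beyond the CART breakpoint
    "apache_ii": rng.normal(15, 5, n),      # severity of illness (assumed covariate)
    "charlson": rng.integers(0, 8, n),      # comorbidity (assumed covariate)
})
# Simulated mortality, higher with delayed therapy (illustrative only).
df["mortality_30d"] = (rng.random(n) < 0.12 + 0.25 * df["delayed_48h"]).astype(int)

model = smf.glm(
    "mortality_30d ~ delayed_48h + apache_ii + charlson",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")  # robust (sandwich) variance makes the Poisson model valid for a binary outcome
print(np.exp(model.params["delayed_48h"]))  # adjusted 30-day mortality risk ratio
```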
RESULTS: Overall, 190 patients were included. A breakpoint in time to appropriate therapy was identified at 48.1 hours, where 30-day mortality was substantially increased (14.6% vs 45.3%; P < .001). Patients receiving appropriate therapy after 48.1 hours also experienced higher in-hospital mortality and longer EBSI duration. After adjustment for severity of illness and comorbidity, delayed therapy ≥48.1 hours was associated with a 3-fold increase in 30-day mortality (risk ratio, 3.16 [95% confidence interval, 1.96-5.09]). Vancomycin resistance was the only independent predictor of delayed therapy.
CONCLUSIONS: In patients with hospital-onset EBSI, receipt of appropriate therapy within the first 48 hours was associated with reduced mortality, underscoring the potential role of rapid diagnostic testing for early identification of VRE.
Novel application of published risk factors for methicillin-resistant S aureus in acute bacterial skin and skin structure infections
Methicillin-resistant Staphylococcus aureus acute bacterial skin and skin structure infections (MRSA ABSSSIs) are associated with a significant clinical and economic burden; however, rapid identification of MRSA remains a clinical challenge. This study aimed to use a novel method of predictive modeling to determine those at highest risk of MRSA ABSSSI. Risk factors for MRSA ABSSSI were derived from a combination of previously published literature and multivariable logistic regression of individual patient data (IPD) using the 'adaptation method.' A risk-scoring tool was derived from weight-proportional, integer-adjusted coefficients of the predictive model. Likelihood ratios were used to adjust the posterior probability of MRSA. Risk factors were identified from 12 previously published studies and adapted based on IPD (n = 231). Risk factors were: history of diabetes with obesity (adapted odds ratio [aOR] = 1.1), prior antibiotics (90 days) (aOR = 2.6), chronic kidney disease/hemodialysis (aOR = 1.4), intravenous drug use (aOR = 2.8), previous MRSA exposure/infection (12 months) (aOR = 2.8), previous hospitalization (12 months) (aOR = 7.5), and HIV/AIDS (aOR = 4.0). Baseline prevalence of MRSA was 42.7%. Scores ranged from 0 to 8 points. Post-test probability of MRSA: score 0 = 35.0%; score 1-2 = 45.0%; score 3 = 63.0%. The newly derived risk-scoring tool is a proof of concept of the adaptation method. This study is hypothesis generating, and such a tool remains to be validated for clinical use.
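The post-test probabilities by score follow from Bayes' rule on the odds scale: pre-test odds multiplied by the likelihood ratio, converted back to a probability. A minimal sketch using the reported 42.7% baseline prevalence; the per-score likelihood ratios below are illustrative assumptions, not the study's values:

```python
# Sketch: updating the probability of MRSA with a likelihood ratio (Bayes on the odds scale).
# The per-score likelihood ratios are hypothetical; only the 42.7% baseline is from the abstract.
def post_test_probability(pre_test_prob, likelihood_ratio):
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

baseline = 0.427  # reported baseline MRSA prevalence
for score, lr in [(0, 0.7), (2, 1.1), (3, 2.3)]:  # hypothetical likelihood ratios per score band
    print(f"score {score}: post-test probability = {post_test_probability(baseline, lr):.3f}")
```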