Optimal Dosing of Enoxaparin in Critically Ill Patients with Venous Thromboembolism
Background: Evidence suggests that goal anti-Xa levels are achieved in only 33% of critically ill patients receiving standard prophylactic enoxaparin dosing. There has been limited focus on the potential suboptimal anticoagulation effect on medical intensive care unit (MICU) patients receiving therapeutic enoxaparin dosing for venous thromboembolism (VTE).
Methods: MICU patients receiving enoxaparin 1 mg/kg twice daily or 1.5 mg/kg daily for VTE treatment in a 350-bed community teaching hospital between 2013 and 2019 with at least one peak anti-Xa level measured were included. The primary outcome was the proportion who achieved therapeutic anti-Xa levels with standard dosing. Secondary outcomes included types of dose-adjustments required and the proportion requiring subsequent dose-adjustments. Descriptive statistics were presented for all outcomes.
Results: Fifty-three patients were evaluated, including those receiving either twice-daily or once-daily standard therapeutic dosing. Optimal anti-Xa levels at the first measurement after initiation of enoxaparin were recorded in 26.4% (n=14) of patients. Dose adjustments, to appropriately increase or decrease the enoxaparin dose, were required in 70.7% (n=29) of patients receiving twice-daily dosing and in 83.3% (n=10) of those receiving once-daily dosing (P=0.97). By the third anti-Xa level measurement, 3 patients remained outside the therapeutic range.
Conclusions: Standard therapeutic enoxaparin dosing did not result in optimal anti-Xa levels for the majority of MICU patients, regardless of the dosing regimen used or patient-specific factors. Future studies should identify patient factors associated with the requirement for higher or lower enoxaparin dosing.
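The dose-adjustment workflow the abstract describes (classify each peak anti-Xa level, then increase or decrease the dose accordingly) can be sketched as follows. The therapeutic ranges used here are commonly cited peak anti-Xa targets for treatment-dose enoxaparin, not values reported in this study, and should be treated as illustrative assumptions.

```python
# Sketch of the anti-Xa classification logic implied by the abstract.
# The ranges below are commonly cited peak anti-Xa targets, NOT values
# reported in the study: ~0.5-1.0 IU/mL for 1 mg/kg twice daily and
# ~1.0-2.0 IU/mL for 1.5 mg/kg once daily (assumed for illustration).

THERAPEUTIC_RANGES = {
    "bid": (0.5, 1.0),    # enoxaparin 1 mg/kg twice daily (assumed range)
    "daily": (1.0, 2.0),  # enoxaparin 1.5 mg/kg once daily (assumed range)
}

def classify_anti_xa(peak_level: float, regimen: str) -> str:
    """Return whether a peak anti-Xa level is below, within, or above range."""
    low, high = THERAPEUTIC_RANGES[regimen]
    if peak_level < low:
        return "subtherapeutic"    # would prompt a dose increase
    if peak_level > high:
        return "supratherapeutic"  # would prompt a dose decrease
    return "therapeutic"

def proportion_therapeutic(levels: list[tuple[float, str]]) -> float:
    """Descriptive statistic: fraction of first peak levels in range."""
    in_range = sum(
        classify_anti_xa(lvl, reg) == "therapeutic" for lvl, reg in levels
    )
    return in_range / len(levels)
```

For example, `classify_anti_xa(0.35, "bid")` returns `"subtherapeutic"`, and `proportion_therapeutic` over all first measurements corresponds to the study's primary outcome (26.4% in this cohort).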
Multicenter, Observational Cohort Study Evaluating Third-Generation Cephalosporin Therapy for Bloodstream Infections Secondary to Enterobacter, Serratia, and Citrobacter Species
Objectives: There is debate on whether the use of third-generation cephalosporins (3GC) increases the risk of clinical failure in bloodstream infections (BSIs) caused by chromosomally mediated AmpC-producing Enterobacterales (CAE). This study evaluates the impact of definitive 3GC therapy versus other antibiotics on clinical outcomes in BSIs due to Enterobacter, Serratia, or Citrobacter species.
Methods: This multicenter, retrospective cohort study evaluated adult hospitalized patients with BSIs secondary to Enterobacter, Serratia, or Citrobacter species from 1 January 2006 to 1 September 2014. Definitive 3GC therapy was compared to definitive therapy with other, non-3GC antibiotics. Multivariable Cox proportional hazards regression evaluated the impact of definitive 3GC therapy on overall treatment failure (OTF), a composite of in-hospital mortality, 30-day hospital readmission, or 90-day reinfection.
Results: A total of 381 patients from 18 institutions in the southeastern United States were enrolled. The most common sources of BSIs were the urinary tract and central venous catheters (78 patients [20.5%] each). Definitive 3GC therapy was utilized in 65 (17.1%) patients. OTF occurred in 22/65 patients (33.9%) in the definitive 3GC group vs. 94/316 (29.8%) in the non-3GC group (p = 0.51). Individual components of OTF were comparable between groups. In multivariable Cox proportional hazards regression analysis, the risk of OTF was comparable with definitive 3GC therapy vs. definitive non-3GC therapy (aHR 0.93, 95% CI 0.51–1.72).
Conclusions: These outcomes suggest that definitive 3GC therapy does not significantly alter the risk of poor clinical outcomes in the treatment of BSIs secondary to Enterobacter, Serratia, or Citrobacter species compared to other antimicrobial agents.
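The unadjusted OTF comparison above (22/65 vs. 94/316, p = 0.51) can be reproduced with a plain Pearson chi-squared test on the 2×2 table. The sketch below uses only the standard library; for one degree of freedom, the chi-squared survival function reduces to erfc(√(x/2)). The abstract does not state which test produced its p-value, so treat this as a plausible reconstruction rather than the authors' exact analysis.

```python
import math

def chi2_2x2(a: int, b: int, c: int, d: int) -> tuple[float, float]:
    """Pearson chi-squared test (no continuity correction) for the 2x2
    table [[a, b], [c, d]]; returns (statistic, p-value), df = 1."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For df = 1, P(chi2 > x) = erfc(sqrt(x / 2)).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# OTF events / non-events from the abstract:
# 3GC group 22/65, non-3GC group 94/316.
stat, p = chi2_2x2(22, 65 - 22, 94, 316 - 94)
print(f"chi2 = {stat:.2f}, p = {p:.2f}")  # p rounds to the reported 0.51
```

The wide confidence interval on the adjusted hazard ratio (0.51–1.72) tells the same story as this non-significant crude comparison.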
Outcomes of Intravenous Push versus Intermittent Infusion Administration of Cefepime in Critically Ill Patients
The equivalence of intravenous push (IVP) and intravenous piggyback (IVPB) administration has not been evaluated in the critically ill population for most medications, but it is especially relevant for antibiotics, such as cefepime, that exhibit time-dependent bactericidal activity. A single-center, retrospective, observational pre/post-protocol-change study included critically ill adults who received cefepime as empiric therapy between August 2015 and 2021. The primary outcome was treatment failure, defined as a composite of escalation of the antibiotic regimen or all-cause mortality. Secondary outcomes included adverse drug events, days of cefepime therapy, total days of antibiotic therapy, and ICU and hospital length of stay. Outcomes were compared using Chi-squared, Mann–Whitney U, and binary logistic regression tests as appropriate. A total of 285 patients were included: 87 IVPB and 198 IVP. Treatment failure occurred in 18% (n = 16) of the IVPB group and 27% (n = 54) of the IVP group (p = 0.109). There were no significant differences in secondary outcomes. Longer duration of antibiotics (odds ratio [OR] 1.057, 95% confidence interval [CI] 1.013–1.103), SOFA score (OR 1.269, 95% CI 1.154–1.397), and IVP administration of cefepime (OR 2.370, 95% CI 1.143–4.914) were independently associated with treatment failure. Critically ill patients who received IVP cefepime were more likely to experience treatment failure in an adjusted analysis. The current practice of IVP cefepime should be reevaluated, as it may not provide similar clinical outcomes in the critically ill population.
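The gap between the non-significant crude comparison (p = 0.109) and the significant adjusted OR of 2.370 can be made concrete by computing the unadjusted odds ratio with a Wald confidence interval. The sketch below is a standard 2×2 calculation on the reported counts, not the study's logistic model, which additionally adjusted for covariates such as antibiotic duration and SOFA score.

```python
import math

def odds_ratio_ci(e1: int, n1: int, e0: int, n0: int,
                  z: float = 1.96) -> tuple[float, float, float]:
    """Crude odds ratio and Wald 95% CI for events e1/n1 vs. e0/n0."""
    a, b = e1, n1 - e1  # exposed group: events / non-events
    c, d = e0, n0 - e0  # comparator group: events / non-events
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Treatment failure from the abstract: IVP 54/198 vs. IVPB 16/87
or_, lo, hi = odds_ratio_ci(54, 198, 16, 87)
```

The crude OR (~1.66) has a 95% CI that crosses 1, consistent with p = 0.109; the adjusted OR of 2.370 emerges only after covariate adjustment in the logistic regression.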