The effect of adding comorbidities to current Centers for Disease Control and Prevention central-line–associated bloodstream infection risk-adjustment methodology
BACKGROUND: Risk adjustment is needed to fairly compare central-line–associated bloodstream infection (CLABSI) rates between hospitals. Until 2017, the Centers for Disease Control and Prevention (CDC) methodology adjusted CLABSI rates only by type of intensive care unit (ICU). The 2017 CDC models also adjust for hospital size and medical school affiliation. We hypothesized that risk adjustment would be improved by including patient demographics and comorbidities from electronically available hospital discharge codes.
METHODS: Using a cohort design across 22 hospitals, we analyzed data from ICU patients admitted between January 2012 and December 2013. Demographics and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) discharge codes were obtained for each patient, and CLABSIs were identified by trained infection preventionists. Models adjusting only for ICU type and for ICU type plus patient case mix were built and compared using discrimination and the standardized infection ratio (SIR). Hospitals were ranked by SIR for each model to examine and compare changes in rank.
RESULTS: Overall, 85,849 ICU patients were analyzed and 162 (0.2%) developed CLABSI. The significant variables added to the ICU model were coagulopathy, paralysis, renal failure, malnutrition, and age. The C statistics were 0.55 (95% CI, 0.51–0.59) for the ICU-type model and 0.64 (95% CI, 0.60–0.69) for the ICU-type plus patient case-mix model. When the hospitals were ranked by adjusted SIRs, 10 hospitals (45%) changed rank when comorbidity was added to the ICU-type model.
CONCLUSIONS: Our risk-adjustment model for CLABSI using electronically available comorbidities demonstrated better discrimination than did the CDC model. The CDC should strongly consider comorbidity-based risk adjustment to more accurately compare CLABSI rates across hospitals.
Infect Control Hosp Epidemiol 2017;38:1019–1024
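The comparison above rests on two quantities: model discrimination (the C statistic, which for a binary outcome equals the area under the ROC curve) and the standardized infection ratio (observed CLABSIs divided by the number expected under the model). The following is a minimal sketch of both calculations on synthetic data; the column names and coefficients are illustrative, not the study's actual variables or estimates.

```python
# Sketch: compare discrimination (C statistic) of two CLABSI risk-adjustment
# models, then compute standardized infection ratios (SIRs) per hospital.
# Synthetic data stand in for the patient-level cohort.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 20_000
df = pd.DataFrame({
    "icu_medical":   rng.integers(0, 2, n),   # hypothetical ICU-type indicator
    "coagulopathy":  rng.integers(0, 2, n),   # hypothetical comorbidity flags
    "renal_failure": rng.integers(0, 2, n),
    "age":           rng.integers(18, 95, n),
    "hospital":      rng.integers(0, 22, n),
})
logit = -6 + 0.4 * df["icu_medical"] + 0.8 * df["coagulopathy"] + 0.02 * df["age"]
df["clabsi"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

def fit_and_score(cols):
    """Fit a logistic model; return predicted CLABSI probabilities and the C statistic (AUC)."""
    p = LogisticRegression(max_iter=1000).fit(df[cols], df["clabsi"]).predict_proba(df[cols])[:, 1]
    return p, roc_auc_score(df["clabsi"], p)

p_icu, c_icu = fit_and_score(["icu_medical"])
p_mix, c_mix = fit_and_score(["icu_medical", "coagulopathy", "renal_failure", "age"])
print(f"C statistic: ICU-type {c_icu:.2f} vs ICU-type + case mix {c_mix:.2f}")

# SIR per hospital = observed CLABSIs / expected CLABSIs (sum of model-predicted probabilities)
by_hosp = df.assign(expected=p_mix).groupby("hospital")[["clabsi", "expected"]].sum()
sir = by_hosp["clabsi"] / by_hosp["expected"]
print(sir.rank().astype(int).sort_values())  # hospital ranking by adjusted SIR
```

Ranking hospitals by the SIR from each model, as in the study, is then a matter of repeating the last step with each model's predicted probabilities and comparing the resulting orderings.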
Which comorbid conditions should we be analyzing as risk factors for healthcare-associated infections?
OBJECTIVE: To determine which comorbid conditions are considered causally related to central-line–associated bloodstream infection (CLABSI) and surgical-site infection (SSI) based on expert consensus.
DESIGN: Using the Delphi method, we administered an iterative, 2-round survey to 9 infectious disease and infection control experts from the United States.
METHODS: Based on our selection of components from the Charlson and Elixhauser comorbidity indices, 35 different comorbid conditions were rated from 1 (not at all related) to 5 (strongly related) by each expert separately for CLABSI and SSI, based on perceived relatedness to the outcome. To assign expert consensus on causal relatedness for each comorbid condition, all 3 of the following criteria had to be met at the end of the second round: (1) a majority (>50%) of experts rating the condition at 3 (somewhat related) or higher, (2) an interquartile range (IQR) ≤ 1, and (3) a standard deviation (SD) ≤ 1.
RESULTS: From round 1 to round 2, the IQR and SD, respectively, decreased for ratings of 21 of 35 (60%) and 33 of 35 (94%) comorbid conditions for CLABSI, and for 17 of 35 (49%) and 32 of 35 (91%) comorbid conditions for SSI, suggesting improvement in consensus among this group of experts. At the end of round 2, 13 of 35 (37%) and 17 of 35 (49%) comorbid conditions were perceived as causally related to CLABSI and SSI, respectively.
CONCLUSIONS: Our results have produced a list of comorbid conditions that should be analyzed as risk factors for, and further explored for risk adjustment of, CLABSI and SSI.
Infect Control Hosp Epidemiol 2017;38:449–454
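The consensus rule combines three checks on each condition's ratings. A minimal sketch of that rule, assuming ratings are collected as a list of integers from 1 to 5 per comorbid condition; the thresholds come from the abstract, while the helper name and the use of the sample standard deviation are assumptions.

```python
# Sketch of the Delphi consensus rule from the abstract: a condition is flagged
# as causally related when (1) a majority (>50%) of experts rate it 3 or higher,
# (2) the interquartile range of ratings is <= 1, and (3) the SD is <= 1.
import numpy as np

def consensus_reached(ratings):
    """Return True if a condition's expert ratings (integers 1-5) meet all three criteria."""
    r = np.asarray(ratings, dtype=float)
    majority_related = np.mean(r >= 3) > 0.5
    iqr = np.percentile(r, 75) - np.percentile(r, 25)
    sd = np.std(r, ddof=1)          # sample SD; the ddof choice is an assumption
    return bool(majority_related and iqr <= 1 and sd <= 1)

# Example: 9 experts rating one comorbid condition
print(consensus_reached([4, 4, 5, 4, 3, 4, 4, 5, 4]))   # True: tight agreement around "related"
print(consensus_reached([1, 2, 5, 4, 3, 2, 5, 1, 4]))   # False: ratings too dispersed (SD > 1)
```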
Electronically available patient claims data improve models for comparing antibiotic use across hospitals: Results from 576 US facilities
BACKGROUND: The Centers for Disease Control and Prevention (CDC) uses standardized antimicrobial administration ratios (SAARs), that is, observed-to-predicted ratios, to compare antibiotic use across facilities. CDC models adjust for facility characteristics when predicting antibiotic use but do not include patient diagnoses and comorbidities that may also affect utilization. This study aimed to identify comorbidities causally related to appropriate antibiotic use and to compare models that include these comorbidities and other patient-level claims variables to a facility model for risk-adjusting inpatient antibiotic utilization.
METHODS: The study included adults discharged from Premier Database hospitals in 2016-2017. For each admission, we extracted facility, claims, and antibiotic data. We evaluated 7 models to predict an admission's antibiotic days of therapy (DOTs): a CDC facility model, models that added patient clinical constructs in varying layers of complexity, and an external validation of a published patient-variable model. We calculated hospital-specific SAARs to quantify effects on hospital rankings. Separately, we used Delphi consensus methodology to identify Elixhauser comorbidities associated with appropriate antibiotic use.
RESULTS: The study included 11 701 326 admissions across 576 hospitals. Compared to the CDC facility model, a model that added Delphi-selected comorbidities and a bacterial infection indicator was more accurate for all antibiotic outcomes. For total antibiotic use, it was 24% more accurate (respective mean absolute errors: 3.11 vs 2.35 DOTs), resulting in 31-33% more hospitals moving into bottom or top usage quartiles postadjustment.
CONCLUSIONS: Adding electronically available patient claims data to facility models consistently improved antibiotic utilization predictions and yielded substantial movement in hospitals' utilization rankings.
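Both headline comparisons above reduce to simple ratios once each model produces a predicted days-of-therapy count per admission: model accuracy is the mean absolute error between observed and predicted DOT, and a hospital's SAAR is its observed DOT divided by its predicted DOT. A minimal sketch under those assumptions; the column names and placeholder values are illustrative, not Premier or CDC field names.

```python
# Sketch: summarize two risk-adjustment models by mean absolute error (MAE) on
# admission-level days of therapy (DOT), then roll predictions up to
# hospital-level standardized antimicrobial administration ratios (SAARs).
# Predicted columns would come from fitted models; values here are placeholders.
import numpy as np
import pandas as pd

admissions = pd.DataFrame({
    "hospital":      ["A", "A", "B", "B", "C", "C"],
    "observed_dot":  [4, 0, 7, 2, 10, 3],
    "pred_facility": [3.0, 3.0, 3.2, 3.2, 4.1, 4.1],   # facility-only model
    "pred_casemix":  [4.5, 0.8, 6.0, 2.5, 8.7, 3.4],   # facility + patient-claims model
})

for col in ["pred_facility", "pred_casemix"]:
    mae = np.mean(np.abs(admissions["observed_dot"] - admissions[col]))
    print(f"MAE ({col}): {mae:.2f} DOT per admission")

# SAAR per hospital = sum(observed DOT) / sum(predicted DOT)
by_hosp = admissions.groupby("hospital")[["observed_dot", "pred_casemix"]].sum()
by_hosp["saar"] = by_hosp["observed_dot"] / by_hosp["pred_casemix"]
print(by_hosp["saar"].round(2))   # >1: more use than predicted; <1: less
```

Shifts in hospital quartile rankings, as reported in the results, follow from recomputing the SAARs with each model's predictions and comparing where each hospital falls in the two distributions.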
Significant regional differences in antibiotic use across 576 US hospitals and 11 701 326 adult admissions, 2016-2017
BACKGROUND: Quantifying the amount and diversity of antibiotic use in United States hospitals assists antibiotic stewardship efforts but is hampered by limited national surveillance. Our study aimed to address this knowledge gap by examining adult antibiotic use across 576 hospitals and nearly 12 million encounters in 2016-2017.
METHODS: We conducted a retrospective study of patients aged ≥ 18 years discharged from hospitals in the Premier Healthcare Database between 1 January 2016 and 31 December 2017. Using daily antibiotic charge data, we mapped antibiotics to mutually exclusive classes and to spectrum of activity categories. We evaluated relationships between facility and case-mix characteristics and antibiotic use in negative binomial regression models.
RESULTS: The study included 11 701 326 admissions, totaling 64 064 632 patient-days, across 576 hospitals. Overall, patients received antibiotics in 65% of hospitalizations, at a crude rate of 870 days of therapy (DOT) per 1000 patient-days. By class, use was highest among β-lactam/β-lactamase inhibitor combinations, third- and fourth-generation cephalosporins, and glycopeptides. Teaching hospitals averaged lower rates of total antibiotic use than nonteaching hospitals (834 vs 957 DOT per 1000 patient-days; P < .001). In adjusted models, teaching hospitals remained associated with lower use of third- and fourth-generation cephalosporins and antipseudomonal agents (adjusted incidence rate ratio [95% confidence interval], 0.92 [.86-.97] and 0.91 [.85-.98], respectively). Significant regional differences in total and class-specific antibiotic use also persisted in adjusted models.
CONCLUSIONS: Adult inpatient antibiotic use remains high, driven predominantly by broad-spectrum agents. A better understanding of the reasons for interhospital usage differences, including differences by region and teaching status, may inform efforts to reduce inappropriate antibiotic prescribing.
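Two calculations underlie these results: the crude utilization rate (total antibiotic days of therapy divided by patient-days, scaled to 1000) and negative binomial regression of DOT on facility characteristics. Using log patient-days as an exposure offset is a standard way to model such a rate, though the abstract does not spell out that detail. A minimal sketch on synthetic data; column names and coefficients are illustrative.

```python
# Sketch: crude antibiotic use rate (DOT per 1000 patient-days) and a negative
# binomial model of DOT with log(patient-days) as the exposure offset.
# Data are synthetic; the teaching-hospital effect size is made up.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5_000
enc = pd.DataFrame({
    "teaching":     rng.integers(0, 2, n),     # encounter at a teaching hospital?
    "patient_days": rng.integers(1, 15, n),
})
rate = 0.9 * np.exp(-0.1 * enc["teaching"])     # assumed lower use at teaching hospitals
enc["dot"] = rng.poisson(rate * enc["patient_days"])

# Crude rate: DOT per 1000 patient-days
crude = 1000 * enc["dot"].sum() / enc["patient_days"].sum()
print(f"{crude:.0f} DOT per 1000 patient-days")

# Negative binomial regression with patient-days as the exposure
X = sm.add_constant(enc[["teaching"]])
fit = sm.GLM(enc["dot"], X,
             family=sm.families.NegativeBinomial(alpha=1.0),
             offset=np.log(enc["patient_days"])).fit()
print(np.exp(fit.params["teaching"]))   # adjusted incidence rate ratio for teaching status
```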
Copper Chaperone for Cu/Zn Superoxide Dismutase is a sensitive biomarker of mild copper deficiency induced by moderately high intakes of zinc
BACKGROUND: Small increases in zinc (Zn) consumption above recommended amounts have been shown to reduce copper (Cu) status in experimental animals and humans. Recently, we have reported that copper chaperone for Cu/Zn superoxide dismutase (CCS) protein level is increased in tissues of overtly Cu-deficient rats and proposed CCS as a novel biomarker of Cu status. METHODS: Weanling male Wistar rats were fed one of four diets normal in Cu and containing normal (30 mg Zn/kg diet) or moderately high (60, 120 or 240 mg Zn/kg diet) amounts of Zn for 5 weeks. To begin to examine the clinical relevance of CCS, we compared the sensitivity of CCS to mild Cu deficiency, induced by moderately high intakes of Zn, with conventional indices of Cu status. RESULTS: Liver and erythrocyte CCS expression was significantly (P < 0.05) increased in rats fed the Zn-60 and/or Zn-120 diet compared to rats fed normal levels of Zn (Zn-30). Erythrocyte CCS expression was the most sensitive measure of reduced Cu status and was able to detect a decrease in Cu nutriture in rats fed only twice the recommended amount of Zn. Liver, erythrocyte and white blood cell CCS expression showed a significant (P < 0.05) inverse correlation with plasma and liver Cu concentrations and caeruloplasmin activity. Unexpectedly, rats fed the highest level of Zn (Zn-240) showed overall better Cu status than rats fed a lower level of elevated Zn (Zn-120). Improved Cu status in these rats correlated with increased duodenal mRNA expression of several Zn-trafficking proteins (i.e. MT-1, ZnT-1, ZnT-2 and ZnT-4). CONCLUSION: Collectively, these data show that CCS is a sensitive measure of Zn-induced mild Cu deficiency and demonstrate a dose-dependent biphasic response for reduced Cu status by moderately high intakes of Zn
Sparing effects of selenium and ascorbic acid on vitamin C and E in guinea pig tissues
BACKGROUND: Selenium (Se), vitamin C and vitamin E function as antioxidants within the body. In this study, we investigated the effects of reduced dietary Se and L-ascorbic acid (AA) on vitamin C and α-tocopherol (AT) status in guinea pig tissues. METHODS: Male Hartley guinea pigs were orally dosed with a marginal amount of AA and fed a diet deficient (Se-D/MC), marginal (Se-M/MC) or normal (Se-N/MC) in Se. An additional diet group (Se-N/NC) was fed normal Se and dosed with a normal amount of AA. Guinea pigs were killed after 5 or 12 weeks on the experimental diets at 24 and 48 hours post AA dosing. RESULTS: Liver Se-dependent glutathione peroxidase activity was decreased (P < 0.05) in guinea pigs fed Se or AA restricted diets. Plasma total glutathione concentrations were unaffected (P > 0.05) by reduction in dietary Se or AA. All tissues examined showed a decrease (P < 0.05) in AA content in Se-N/MC compared to Se-N/NC guinea pigs. Kidney, testis, muscle and spleen showed a decreasing trend (P < 0.05) in AA content with decreasing Se in the diet. Dehydroascorbic acid concentrations were decreased (P < 0.05) in several tissues with reduction in dietary Se (heart and spleen) or AA (liver, heart, kidney, muscle and spleen). At week 12, combined dietary restriction of Se and AA decreased AT concentrations in most tissues. In addition, restriction of Se (liver, heart and spleen) and AA (liver, kidney and spleen) separately also reduced AT in tissues. CONCLUSION: Together, these data demonstrate sparing effects of Se and AA on vitamin C and AT in guinea pig tissues
