7 research outputs found

    Supplementary Material for: Warfarin Use and Increased Mortality in End-Stage Renal Disease

    Background: Controversy exists regarding the benefits and risks of warfarin therapy in chronic kidney disease (CKD) and end-stage renal disease (ESRD) patients. In this study, we assessed mortality and cardiovascular outcomes associated with warfarin treatment in patients with stages 3-5 CKD and ESRD admitted to the University of California-Irvine Medical Center.
    Methods: In a retrospective matched cohort study, we identified 59 adult patients with stages 3-6 CKD initiated on warfarin during the period 2011-2013, and 144 patients with stages 3-6 CKD who had indications for anticoagulation therapy but were not initiated on warfarin. All-cause mortality risk associated with warfarin treatment was estimated using Cox proportional hazards regression, and the risks of significant bleeding and major adverse cardiovascular events were analyzed with Poisson regression. Adjustment models accounted for age, gender, diabetes mellitus, use of antiplatelet agents, and preexisting cardiovascular disease, and were stratified by pre-dialysis CKD stages 3-5 vs. ESRD.
    Findings: During 5.8 years of follow-up, unadjusted mortality risk was higher in CKD patients on warfarin therapy (hazard ratio [HR] 2.34, 95% CI 1.25-4.39; p < 0.01). After multivariate adjustment and stratification by CKD stage, the mortality risk remained significant in ESRD patients receiving warfarin (HR 6.62, 95% CI 2.56-17.16; p < 0.001). Furthermore, adjusted rates of significant bleeding (incidence rate ratio [IRR] 3.57, 95% CI 1.51-8.45; p < 0.01) and myocardial infarction (IRR 4.20, 95% CI 1.78-9.91; p < 0.01) were higher among warfarin users. No differences in rates of ischemic or hemorrhagic stroke were found between the two groups.
    Conclusions: Warfarin use was associated with a several-fold higher risk of death, bleeding, and myocardial infarction in dialysis patients. If additional studies suggest similar associations, the use of warfarin in dialysis patients warrants immediate reconsideration.
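
    A minimal sketch of the type of analysis described above: a Cox proportional hazards model for all-cause mortality stratified by pre-dialysis CKD vs. ESRD, and a Poisson model with a person-time offset for bleeding rates (yielding incidence rate ratios). This is not the authors' code; the file name, column names (e.g. "warfarin", "followup_years", "died", "bleeding_events"), and data layout are assumptions for illustration.

```python
# Sketch only: assumed one-row-per-patient file with exposure, covariates,
# follow-up time, and event indicators. All names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

df = pd.read_csv("ckd_warfarin_cohort.csv")  # hypothetical analysis file

covariates = ["warfarin", "age", "male", "diabetes", "antiplatelet", "cvd"]

# All-cause mortality: Cox model, stratified by pre-dialysis CKD vs. ESRD
cph = CoxPHFitter()
cph.fit(
    df[covariates + ["followup_years", "died", "esrd"]],
    duration_col="followup_years",
    event_col="died",
    strata=["esrd"],
)
cph.print_summary()  # exp(coef) column gives the adjusted hazard ratios

# Significant bleeding: Poisson regression with log person-time offset
X = sm.add_constant(df[covariates])
poisson = sm.GLM(
    df["bleeding_events"],
    X,
    family=sm.families.Poisson(),
    offset=np.log(df["followup_years"]),
).fit()
print(np.exp(poisson.params))  # incidence rate ratios (IRR)
```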

    Supplementary Material for: Lymphocyte Cell Ratios and Mortality among Incident Hemodialysis Patients

    Background: The neutrophil-to-lymphocyte ratio (NLR) and platelet-to-lymphocyte ratio (PLR) have previously been suggested as prognostic markers in oncology. Both are associated with malnutrition and inflammation and hence may be useful for predicting mortality among hemodialysis patients.
    Methods: Among 108,548 incident hemodialysis patients in a large U.S. dialysis organization (2007–2011), we compared the mortality predictability of NLR and PLR in baseline and time-varying covariate Cox models using the area under the receiver operating characteristic curve (AUROC), net reclassification index (NRI), and adjusted R².
    Results: During the median follow-up period of 1.4 years, 28,618 patients died. Median (IQR) NLR and PLR at baseline were 3.64 (2.68–5.00) and 179 (136–248), respectively. NLR was associated with higher mortality, and the association appeared stronger in the time-varying than in the baseline model. PLR exhibited a J-shaped association with mortality in both models. NLR improved mortality prediction beyond demographics, comorbidities, and serum albumin; ΔAUROC and NRI for 1-year mortality (95% CI) were 0.010 (0.009–0.012) and 6.4% (5.5–7.3%), respectively. Additionally, the adjusted R² (95% CI) for the Cox model increased from 0.269 (0.262–0.276) to 0.283 (0.276–0.290) in the non-time-varying model and from 0.467 (0.461–0.472) to 0.505 (0.500–0.512) in the time-varying model. There was little to no benefit of adding PLR to predict mortality.
    Conclusions: High NLR in incident hemodialysis patients predicted mortality, especially in the short term. NLR, but not PLR, added modest benefit in predicting mortality along with demographics, comorbidities, and serum albumin, and should be included in prognostication approaches.
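
    The incremental-prediction comparison above (base model vs. base model plus NLR, summarized by ΔAUROC) can be sketched as follows. The paper uses baseline and time-varying Cox models; this simplified illustration uses logistic regression for a fixed 1-year mortality outcome, and the file name and column names ("neutrophils", "lymphocytes", "died_1yr", etc.) are assumptions.

```python
# Sketch only: compare discrimination (AUROC) of a base model vs. base + NLR
# for 1-year mortality. Hypothetical column names throughout.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("incident_hd_labs.csv")  # hypothetical analysis file
df["nlr"] = df["neutrophils"] / df["lymphocytes"]

base_cols = ["age", "male", "diabetes", "albumin"]  # demographics, comorbidity, albumin
X_train, X_test, y_train, y_test = train_test_split(
    df[base_cols + ["nlr"]], df["died_1yr"], test_size=0.3, random_state=0
)

auc = {}
for label, cols in [("base", base_cols), ("base+NLR", base_cols + ["nlr"])]:
    model = LogisticRegression(max_iter=1000).fit(X_train[cols], y_train)
    auc[label] = roc_auc_score(y_test, model.predict_proba(X_test[cols])[:, 1])

print(auc, "ΔAUROC =", auc["base+NLR"] - auc["base"])
```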

    Supplementary Material for: Association of Ultrafiltration Rate with Mortality in Incident Hemodialysis Patients

    Background/Aims: Ultrafiltration rate (UFR) appears to be associated with mortality in prevalent hemodialysis (HD) patients; however, the association of UFR with mortality in incident HD patients remains unknown.
    Methods: We examined a US cohort of 110,880 patients who initiated HD from 2007 to 2011. Baseline UFR was divided into 5 groups (<4, 4 to <6, 6 to <8, 8 to <10, and ≥10 mL/h/kg body weight [BW]). We examined predictors of higher baseline UFR using logistic regression, and the association of baseline UFR with all-cause and cardiovascular (CV) mortality using Cox proportional hazards models adjusted for demographics, comorbidities, and markers of malnutrition-inflammation-cachexia syndrome.
    Results: Patients were 63 ± 15 years old, 43% were women, 32% were African American, and the mean baseline UFR was 7.5 ± 3.1 mL/h/kg BW. In the fully adjusted logistic regression models, factors associated with higher UFR (≥7.5 mL/h/kg BW) included Hispanic ethnicity, diabetes, and higher dietary protein intake. There was a linear association between UFR and both all-cause and CV mortality, with UFR ≥10 mL/h/kg BW (reference: 6 to <8 mL/h/kg BW) conferring the highest risk in both unadjusted (HR 1.15 [95% CI 1.10–1.19]) and adjusted models (HR 1.23 [95% CI 1.16–1.31]). The linear association with all-cause mortality remained consistent across strata of age, urine volume, and treatment time.
    Conclusions: Higher UFR is independently associated with higher all-cause and CV mortality in incident HD patients. Clinical trials are warranted to examine the effects of lowering UFR on outcomes.
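
    A minimal sketch of the two analysis steps described above: a logistic model for predictors of a higher baseline UFR and a Cox model relating UFR categories (reference 6 to <8 mL/h/kg BW) to mortality. The UFR derivation shown here (ultrafiltration volume / treatment hours / post-dialysis weight) and all file and column names (e.g. "npcr" as a stand-in for dietary protein intake) are assumptions, not the paper's exact definitions.

```python
# Sketch only: hypothetical columns; adjustment set abbreviated for brevity.
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

df = pd.read_csv("incident_hd_uf.csv")  # hypothetical analysis file
df["ufr"] = df["uf_volume_ml"] / df["treatment_hours"] / df["postdialysis_kg"]

# UFR categories matching the abstract (mL/h/kg BW), left-closed intervals
bins = [0, 4, 6, 8, 10, float("inf")]
labels = ["<4", "4-<6", "6-<8", "8-<10", ">=10"]
df["ufr_cat"] = pd.cut(df["ufr"], bins=bins, labels=labels, right=False)

# Predictors of higher UFR (>= 7.5 mL/h/kg BW)
X = sm.add_constant(df[["hispanic", "diabetes", "npcr"]])
logit = sm.Logit((df["ufr"] >= 7.5).astype(int), X).fit()
print(logit.summary())

# Mortality by UFR category, reference group 6-<8 dropped from the dummies
dummies = pd.get_dummies(df["ufr_cat"], prefix="ufr").drop(columns=["ufr_6-<8"]).astype(int)
cox_df = pd.concat([dummies, df[["age", "diabetes", "followup_years", "died"]]], axis=1)
CoxPHFitter().fit(cox_df, duration_col="followup_years", event_col="died").print_summary()
```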

    Supplementary Material for: Changes in Markers of Mineral and Bone Disorders and Mortality in Incident Hemodialysis Patients

    Background: Abnormalities in markers of mineral and bone disorder (MBD) are common in patients with chronic kidney disease. However, previous studies have not accounted for changes in these markers over time, and it is unclear whether such changes are associated with survival.
    Methods: We examined the association of change in MBD markers (serum phosphorus [Phos], albumin-corrected calcium [CaAlb], intact parathyroid hormone [iPTH], and alkaline phosphatase [ALP]) during the first 6 months of hemodialysis (HD) with all-cause mortality across baseline MBD strata, using survival models adjusted for clinical characteristics and laboratory measurements, in 102,754 incident HD patients treated in a large dialysis organization between 2007 and 2011.
    Results: Across all MBD markers (Phos, CaAlb, iPTH, and ALP), among patients whose baseline levels were higher than the reference range, an increase in the marker was associated with higher mortality (reference group: level within the reference range at baseline and no change at 6 months of follow-up). Conversely, among patients whose baseline Phos or iPTH was lower than the reference range, a decrease in that marker was associated with higher mortality. An increase in ALP was associated with higher mortality across baseline strata of ALP ≥80 U/L. However, patients with baseline ALP <80 U/L trended toward a lower risk of mortality irrespective of the direction of change at 6 months of follow-up.
    Conclusions: The association between changes in MBD markers and mortality differs across baseline levels in HD patients. Further study is needed to determine whether considering both baseline levels and longitudinal changes in the management of MBD derangements improves outcomes in this population.
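
    The exposure described above combines a baseline stratum (below, within, or above a reference range) with the direction of change over the first 6 months. A minimal sketch of that construction for a single marker (phosphorus) is shown below; the reference range, the ±0.5 mg/dL "stable" band, and all file and column names are illustrative assumptions rather than the paper's definitions.

```python
# Sketch only: baseline stratum x 6-month change for phosphorus, then a Cox
# model against the "within/stable" reference group. Hypothetical columns.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("incident_hd_mbd.csv")  # hypothetical analysis file

def stratum(value, low, high):
    """Classify a baseline value relative to an assumed reference range."""
    return "below" if value < low else ("within" if value <= high else "above")

# Assumed reference range for serum phosphorus: 3.5-5.5 mg/dL
df["phos_base"] = df["phos_q1"].apply(stratum, args=(3.5, 5.5))
delta = df["phos_q2"] - df["phos_q1"]
df["phos_change"] = pd.cut(delta, [-float("inf"), -0.5, 0.5, float("inf")],
                           labels=["decrease", "stable", "increase"])

# Combined exposure; "within/stable" serves as the reference group
df["exposure"] = df["phos_base"] + "/" + df["phos_change"].astype(str)
dummies = pd.get_dummies(df["exposure"]).drop(columns=["within/stable"]).astype(int)

cox_df = pd.concat([dummies, df[["age", "diabetes", "albumin",
                                 "followup_years", "died"]]], axis=1)
CoxPHFitter().fit(cox_df, duration_col="followup_years",
                  event_col="died").print_summary()
```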

    Supplementary Material for: Serum Ferritin Variations and Mortality in Incident Hemodialysis Patients

    Background: Higher serum ferritin levels may be influenced by iron use and inflammation and are associated with higher mortality in hemodialysis (HD) patients. We hypothesized that a major rise in serum ferritin is associated with a higher risk of mortality in incident HD patients, irrespective of baseline serum ferritin.
    Methods: In a cohort of 93,979 incident HD patients between 2007 and 2011, we examined the association of the change in serum ferritin from the baseline patient quarter (first 91 days from dialysis start) to the subsequent quarter with mortality. Multivariable adjustments were made for case-mix, markers of the malnutrition-inflammation complex, and intravenous iron dose. Change in serum ferritin was stratified into 5 groups: <-400, -400 to <-100, -100 to <100, 100 to <400, and ≥400 ng/mL/quarter.
    Results: The median change in serum ferritin was 89 ng/mL/quarter (interquartile range -55 to 266 ng/mL/quarter). Compared to stable serum ferritin (-100 to <100 ng/mL/quarter), a major rise (≥400 ng/mL/quarter) was associated with higher all-cause mortality in adjusted models (hazard ratio [95% CI] 1.07 [0.99-1.15], 1.17 [1.09-1.24], 1.26 [1.12-1.41], and 1.49 [1.27-1.76] for baseline serum ferritin <200, 200 to <500, 500 to <800, and ≥800 ng/mL, respectively). The mortality risk associated with a rise in serum ferritin was robust irrespective of intravenous iron use.
    Conclusions: During the first 6 months after HD initiation, a major rise in serum ferritin in patients with baseline ferritin ≥200 ng/mL, and even a slight rise in those with baseline ferritin ≥800 ng/mL, was associated with higher mortality.
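
    A minimal sketch of the stratified analysis above: categorize the quarter-to-quarter change in ferritin, then fit a Cox model within each baseline-ferritin stratum so that hazard ratios are reported per stratum, as in the abstract. The file name, column names, and abbreviated adjustment set are assumptions; on real data, strata with sparse change categories would need additional handling.

```python
# Sketch only: hypothetical columns ("ferritin_q1", "ferritin_q2", "died", ...).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("incident_hd_ferritin.csv")  # hypothetical analysis file

# Change in ferritin (ng/mL/quarter), reference group -100 to <100
change_bins = [-float("inf"), -400, -100, 100, 400, float("inf")]
change_labels = ["<-400", "-400 to <-100", "-100 to <100", "100 to <400", ">=400"]
df["dferr"] = pd.cut(df["ferritin_q2"] - df["ferritin_q1"],
                     bins=change_bins, labels=change_labels, right=False)

# Baseline ferritin strata (ng/mL)
base_bins = [0, 200, 500, 800, float("inf")]
base_labels = ["<200", "200-<500", "500-<800", ">=800"]
df["base_cat"] = pd.cut(df["ferritin_q1"], bins=base_bins,
                        labels=base_labels, right=False)

for stratum, sub in df.groupby("base_cat", observed=True):
    dummies = (pd.get_dummies(sub["dferr"].astype(str))
                 .drop(columns=["-100 to <100"], errors="ignore")  # reference
                 .astype(int))
    cox_df = pd.concat([dummies, sub[["age", "albumin", "iv_iron_dose",
                                      "followup_years", "died"]]], axis=1)
    print("Baseline ferritin:", stratum)
    CoxPHFitter().fit(cox_df, duration_col="followup_years",
                      event_col="died").print_summary()
```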

    Supplementary Material for: Racial and Ethnic Differences in Mortality Associated with Serum Potassium in a Large Hemodialysis Cohort

    Background: Hyperkalemia is observed in chronic kidney disease patients and may be a risk factor for life-threatening arrhythmias and death. Race/ethnicity may be an important modifier of the potassium-mortality relationship in maintenance hemodialysis (MHD) patients, given that potassium intake and excretion vary among minorities.
    Methods: We examined racial/ethnic differences in baseline serum potassium levels and in their associations with all-cause and cardiovascular mortality using Cox proportional hazards models and restricted cubic splines in a cohort of 102,241 incident MHD patients. Serum potassium was categorized into 6 groups: ≤3.6, >3.6 to ≤4.0, >4.0 to ≤4.5 (reference), >4.5 to ≤5.0, >5.0 to ≤5.5, and >5.5 mEq/L. Models were adjusted for case-mix and malnutrition-inflammation cachexia syndrome (MICS) covariates.
    Results: The cohort was composed of 50% whites, 34% African-Americans, and 16% Hispanics. Hispanics tended to have the highest baseline serum potassium levels (mean ± SD 4.58 ± 0.55 mEq/L). Patients were followed for a median of 1.3 years (interquartile range 0.6-2.5). In models adjusted for case-mix and MICS covariates, higher potassium (>5.5 mEq/L) was associated with higher mortality risk in African-American and white patients, but not in Hispanic patients. In Hispanics only, lower serum potassium levels (<3.6 mEq/L) were associated with higher mortality risk. Similar trends were observed for cardiovascular mortality.
    Conclusions: Higher potassium levels were associated with higher mortality risk in white and African-American MHD patients, whereas lower potassium levels were associated with higher death risk in Hispanics. Further studies are needed to determine the mechanisms underlying the differential association between potassium and mortality across race/ethnicity.
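
    A minimal sketch of fitting the potassium-mortality relationship flexibly within each racial/ethnic group. The paper reports restricted cubic splines; the sketch below substitutes patsy's cubic B-spline basis (bs) as a convenient stand-in, and the file name, column names, and 4-df basis are assumptions.

```python
# Sketch only: per-group Cox model with a cubic spline basis for potassium.
# Hypothetical columns; the spline here is a B-spline, not the paper's
# restricted (natural) cubic spline.
import pandas as pd
from lifelines import CoxPHFitter
from patsy import dmatrix

df = pd.read_csv("incident_mhd_potassium.csv")  # hypothetical analysis file

for group, sub in df.groupby("race_ethnicity"):
    spline = dmatrix("bs(potassium, df=4, degree=3) - 1", sub,
                     return_type="dataframe")
    spline.columns = [f"k_spline_{i}" for i in range(spline.shape[1])]
    cox_df = pd.concat(
        [spline.reset_index(drop=True),
         sub[["age", "diabetes", "albumin", "followup_years",
              "died"]].reset_index(drop=True)],
        axis=1,
    )
    print("Group:", group)
    CoxPHFitter().fit(cox_df, duration_col="followup_years",
                      event_col="died").print_summary()
```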

    Supplementary Material for: Association of Pre-End-Stage Renal Disease Hemoglobin with Early Dialysis Outcomes

    Background: Incident hemodialysis patients have a high mortality risk within the first months after dialysis initiation. Pre-end-stage renal disease (ESRD) factors such as anemia management may impact early post-ESRD outcomes. We therefore evaluated the association of pre-ESRD hemoglobin (Hgb) level and pre-ESRD Hgb slope with post-ESRD mortality and hospitalization outcomes.
    Methods: The study included 31,472 veterans transitioning to ESRD. Using Cox and negative binomial regression models, we evaluated the association of pre-ESRD Hgb and Hgb slope with 12-month post-ESRD all-cause and cardiovascular mortality and hospitalization rates, using 4 levels of hierarchical multivariable adjustment, including erythropoietin use and kidney function decline in the slope models.
    Results: The cohort was 2% female, 30% African-American, and on average 68 ± 11 years old. Compared to Hgb 10 to <11 g/dL, both low (<10 g/dL) and high (≥12 g/dL) levels were associated with higher all-cause mortality after full adjustment (HR 1.25 [95% CI 1.15–1.35] and 1.09 [95% CI 1.02–1.18], respectively). Similarly, Hgb exhibited a U-shaped association with CV mortality, while only lower Hgb was associated with a higher hospitalization rate. After adjustment for kidney function decline, neither an annual pre-ESRD decline nor an increase in Hgb was associated with higher post-ESRD mortality risk. However, we observed a modest J-shaped association between the pre-ESRD Hgb slope and the post-ESRD hospitalization rate.
    Conclusions: Lower and higher pre-ESRD Hgb levels are associated with a higher risk of early post-ESRD mortality, whereas the pre-ESRD Hgb slope was not associated with mortality. An increase in the pre-ESRD Hgb slope was associated with a higher risk of post-ESRD hospitalization. Additional studies aimed at anemia management prior to the ESRD transition are warranted.
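
    A minimal sketch of the two outcome models described above: Hgb categories (reference 10 to <11 g/dL) related to 12-month post-ESRD mortality with a Cox model, and to hospitalization counts with a negative binomial model using a log follow-up-time offset. The category cut-points come from the abstract; the file name, column names, and abbreviated adjustment set are assumptions.

```python
# Sketch only: hypothetical columns ("pre_esrd_hgb", "died_12m", ...).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

df = pd.read_csv("veteran_esrd_transition.csv")  # hypothetical analysis file

# Hgb categories (g/dL), reference 10 to <11
bins = [0, 10, 11, 12, float("inf")]
labels = ["<10", "10-<11", "11-<12", ">=12"]
df["hgb_cat"] = pd.cut(df["pre_esrd_hgb"], bins=bins, labels=labels, right=False)
dummies = pd.get_dummies(df["hgb_cat"]).drop(columns=["10-<11"]).astype(int)

adjusters = ["age", "diabetes", "epo_use"]

# 12-month all-cause mortality (Cox)
cox_df = pd.concat([dummies, df[adjusters + ["followup_months", "died_12m"]]], axis=1)
CoxPHFitter().fit(cox_df, duration_col="followup_months",
                  event_col="died_12m").print_summary()

# Hospitalization rate (negative binomial with log follow-up offset)
X = sm.add_constant(pd.concat([dummies, df[adjusters]], axis=1))
nb = sm.GLM(df["hospitalizations"], X,
            family=sm.families.NegativeBinomial(),  # fixed dispersion for the sketch
            offset=np.log(df["followup_months"])).fit()
print(np.exp(nb.params))  # incidence rate ratios
```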