
    Plasma Cystatin C in decompensated cirrhosis

    Introduction: Cystatin C (CysC) is a biomarker for early detection of acute kidney injury (AKI). However, there is limited evidence in decompensated cirrhotic patients without AKI at admission. This study aimed to assess CysC as a predictor of 90-day mortality. Methods: Decompensated cirrhotic patients without AKI were prospectively enrolled. CysC and creatinine were measured within 24 hours of admission and compared between patients with in-hospital complications (AKI, hepatorenal syndrome (HRS), acute-on-chronic liver failure (ACLF)) and those without, and between survivors and non-survivors. The AUROC and cut-off point of CysC for predicting 90-day mortality were determined. Results: Of 137 decompensated cirrhotic patients, 46 without AKI at admission were included (58.7% male, age 60.8 ± 11.2 years, MELD 13.1 ± 5.1, Child A/B/C 43.5%/39.1%/17.4%). The mean CysC level tended to be higher in patients with ACLF (1.52 ± 0.60 vs. 1.11 ± 0.28, p = 0.05) and was significantly higher in non-survivors than in survivors (1.61 ± 0.53 vs. 1.08 ± 0.28, p = 0.013). The 90-day mortality rate was 21.7%. After adjusting for age and bacterial infection on admission, a CysC level ≥ 1.25 mg/L was significantly associated with 90-day mortality, and this cut-off provided 80% sensitivity and 75% specificity for predicting 90-day mortality. Conclusion: Plasma CysC measured within 24 hours of admission could serve as a predictor of 90-day mortality and of ACLF development in decompensated cirrhotic patients.
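The sensitivity/specificity evaluation of a biomarker cutoff described above can be sketched as follows. This is a minimal illustration, not the study's code: the CysC values and outcomes below are invented, and only the 1.25 mg/L threshold comes from the abstract.

```python
def sensitivity_specificity(values, labels, cutoff):
    """Classify value >= cutoff as predicted death; labels: 1 = non-survivor."""
    tp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y == 1)
    fn = sum(1 for v, y in zip(values, labels) if v < cutoff and y == 1)
    tn = sum(1 for v, y in zip(values, labels) if v < cutoff and y == 0)
    fp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical plasma CysC measurements (mg/L) and 90-day outcomes (1 = died).
cysc = [0.9, 1.0, 1.35, 1.3, 1.4, 1.6, 1.2, 1.1]
died = [0,   0,   0,    1,   1,   1,   0,   1]

sens, spec = sensitivity_specificity(cysc, died, cutoff=1.25)
```

With these invented data the 1.25 mg/L cutoff happens to give 75% sensitivity and 75% specificity; the study's reported figures come from its own cohort.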

    Changes in clinical indicators related to the transition from dialysis to kidney transplantation-data from the ERA-EDTA Registry

    Background. Kidney transplantation should improve abnormalities that are common during dialysis treatment, such as anaemia and mineral and bone disorder. However, its impact is incompletely understood. We therefore aimed to assess changes in clinical indicators after the transition from chronic dialysis to kidney transplantation. Methods. We used European Renal Association-European Dialysis and Transplant Association (ERA-EDTA) Registry data and included adult dialysis patients for whom data on clinical indicators before and after transplantation (2005-15) were available. Linear mixed models were used to quantify the effect of transplantation and of time after transplantation for each indicator. Results. In total, 16,312 patients were included. The mean age at transplantation was 50.1 (standard deviation 14.2) years, 62.9% were male and 70.2% were on haemodialysis before transplantation. Total, low-density lipoprotein (LDL) and high-density lipoprotein (HDL) cholesterol and triglycerides increased right after transplantation but decreased thereafter. All other indicators normalized or approached the target range soon after transplantation, and these improvements were sustained over the first 4 years of follow-up. In patients with higher estimated glomerular filtration rate (eGFR) levels (30-60 and >60 mL/min/1.73 m²), the improvement in haemoglobin, ferritin, ionized calcium, phosphate, parathyroid hormone, HDL cholesterol, triglycerides, albumin and C-reactive protein levels was more pronounced than in patients with a lower eGFR (<30 mL/min/1.73 m²). Conclusions. Except for total cholesterol, LDL cholesterol and triglycerides, all clinical indicators improved after transplantation. These improvements were related to eGFR. Nevertheless, values remained out of range in a considerable proportion of patients, and anaemia and hyperparathyroidism were still common problems. Further research is needed to understand the complex relationship between eGFR and the different clinical indicators.

    The Association of Beta-Blocker Use and Bone Mineral Density Level in Hemodialysis Patients: A Cross-Sectional Study

    Background and Objectives: Osteoporosis increases morbidity and mortality in hemodialysis patients, and treatment options are limited. There is evidence that beta-blockers can increase bone mineral density (BMD) and reduce the risk of fracture in non-dialysis patients; however, such a study in hemodialysis patients has not been conducted. This study aimed to determine the association between beta-blocker use and BMD levels in hemodialysis patients. Materials and Methods: We conducted a cross-sectional study in hemodialysis patients at Thammasat University Hospital from January 2018 to December 2020. A patient receiving a beta-blocker for ≥ 20 weeks was defined as a beta-blocker user. The association between beta-blocker use and BMD levels was determined by univariate and multivariate linear regression analysis. Results: Of the 128 patients receiving hemodialysis, 71 were beta-blocker users and 57 were non-users (control group). The incidence of osteoporosis in hemodialysis patients was 50%. There was no significant difference in median BMD between the control and beta-blocker groups at the lumbar spine (0.93 vs. 0.91, p = 0.88), femoral neck (0.59 vs. 0.57, p = 0.21), total hip (0.73 vs. 0.70, p = 0.38), or 1/3 radius (0.68 vs. 0.64, p = 0.40). Univariate and multivariate linear regression analyses showed that beta-blocker use was not associated with BMD. In the subgroup analysis, beta-1 selective blocker use was associated with lower BMD at the femoral neck but not at the total spine, total hip, or 1/3 radius. Multivariate logistic regression showed that age ≥ 65 years (aOR 3.31 (1.25–8.80), p = 0.02), female sex (aOR 4.13 (1.68–10.14), p = 0.002), lower BMI (aOR 0.89 (0.81–0.98), p = 0.02), and ALP > 120 U/L (aOR 3.88 (1.33–11.32), p = 0.01) were independently associated with osteoporosis in hemodialysis patients.
Conclusions: In hemodialysis patients, beta-blocker use was not associated with BMD levels; however, beta-1 selective blocker use was associated with lower BMD at the femoral neck.

    Dialysate White Blood Cell Change after Initial Antibiotic Treatment Represented the Patterns of Response in Peritoneal Dialysis-Related Peritonitis

    Background. Patients with peritoneal dialysis-related peritonitis usually respond differently to initial antibiotic treatment. This study aimed to explore the patterns of response using the changes in dialysate white blood cell (WBC) count over the first five days of initial antibiotic treatment. Materials and Methods. A retrospective cohort study was conducted. All peritoneal dialysis-related peritonitis episodes from January 2014 to December 2015 were reviewed. We categorized the patterns of antibiotic response into three groups: early response, delayed response, and failure. The changes in dialysate WBC count for each pattern were determined by multilevel regression analysis. Results. There were 644 episodes in 455 patients: 378 (58.7%) early response, 122 (18.9%) delayed response, and 144 (22.3%) failure episodes. The early, delayed, and failure patterns were characterized by average reductions in dialysate WBC of 68.4%, 34.0%, and 14.2% per day, respectively (p < 0.001 for all comparisons). Conclusion. The three response patterns showed distinct rates of WBC decline. Clinicians should focus on the delayed response and failure patterns when deciding whether to continue medical therapy or to aggressively remove the peritoneal catheter.
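The "average rate reduction per day" used above to separate the response patterns can be illustrated with a short sketch. The dialysate WBC counts below are hypothetical; their roughly 70% and 35% daily drops merely mimic the early- and delayed-response figures reported in the abstract, and the real study used multilevel regression rather than this simple average.

```python
def mean_daily_reduction(counts):
    """Average fractional day-to-day drop in dialysate WBC count."""
    drops = [1 - b / a for a, b in zip(counts, counts[1:])]
    return sum(drops) / len(drops)

# Hypothetical dialysate WBC counts (cells/mm3) over the first five days.
early   = [2000, 600, 180, 54, 16]      # fast decline -> early response
delayed = [2000, 1300, 845, 549, 357]   # slower decline -> delayed response

early_rate = mean_daily_reduction(early)      # ~0.70 per day
delayed_rate = mean_daily_reduction(delayed)  # ~0.35 per day
```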

    Predictors for Unsuccessful Reductions in Hemodialysis Frequency during the Pandemic

    Background and Objectives: Patients receiving in-center hemodialysis are at high risk of coronavirus disease 2019 (COVID-19) infection. Reducing hemodialysis frequency is one of the proposed measures for preventing COVID-19 infection; however, predictors of an unsuccessful reduction in hemodialysis frequency are still lacking. Materials and Methods: This retrospective observational study enrolled patients who were receiving long-term thrice-weekly hemodialysis at Thammasat University Hospital in 2021 and who decreased their dialysis frequency to twice weekly during the COVID-19 outbreak. The outcomes were the predictors of, and a prediction model for, an unsuccessful reduction in dialysis frequency at 4 weeks. Bootstrapping was performed for internal validation. Results: Of the 161 patients, 83 achieved a dialysis frequency reduction. Overall, 33% and 82% of the patients failed to reduce their dialysis frequency at 4 and 8 weeks, respectively. The predictors of unsuccessful reduction were diabetes, congestive heart failure (CHF), pre-dialysis overhydration, set dry weight (DW), DW from bioelectrical impedance analysis, and the mean pre- and post-dialysis body weight. The final model including these predictors demonstrated an AUROC of 0.763 (95% CI 0.654–0.866) for the prediction of an unsuccessful reduction. Conclusions: The prediction score involving diabetes, CHF, pre-dialysis overhydration, DW difference, and net ultrafiltration demonstrated good performance in predicting an unsuccessful reduction in hemodialysis frequency at 4 weeks.
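Bootstrap internal validation of a prediction model's AUROC, as mentioned in the Methods above, can be sketched generically as below. The risk scores and outcomes are invented, and this is a plain rank-based AUROC with a percentile bootstrap interval, not the study's actual model or data.

```python
import random

def auroc(scores, labels):
    """Mann-Whitney AUROC: probability that a random positive outranks a
    random negative, counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_auroc_ci(scores, labels, n_boot=2000, seed=1):
    """Percentile bootstrap 95% CI for the AUROC."""
    rng = random.Random(seed)
    n, stats = len(scores), []
    while len(stats) < n_boot:
        idx = [rng.randrange(n) for _ in range(n)]
        ys = [labels[i] for i in idx]
        if len(set(ys)) < 2:          # resample must contain both outcomes
            continue
        stats.append(auroc([scores[i] for i in idx], ys))
    stats.sort()
    return stats[int(0.025 * n_boot)], stats[int(0.975 * n_boot)]

# Hypothetical risk scores and unsuccessful-reduction outcomes (1 = failed).
scores = [0.2, 0.3, 0.35, 0.5, 0.55, 0.6, 0.7, 0.8, 0.4, 0.65]
failed = [0,   0,   0,    1,   0,    1,   1,   1,   0,   0]
lo, hi = bootstrap_auroc_ci(scores, failed)
```

The bootstrap interval reflects how much the apparent AUROC varies across resamples of the same cohort, which is what internal validation is meant to quantify.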

    Real-life effectiveness of COVID-19 vaccine during the Omicron variant-dominant pandemic: How many booster doses do we need?

    The surge in coronavirus disease 2019 (COVID-19) caused by the Omicron variants of severe acute respiratory syndrome coronavirus 2 necessitates research to inform vaccine effectiveness (VE) and other preventive measures to halt the pandemic. A test-negative case-control study was conducted among adults (age ≥ 18 years) at risk for COVID-19 who presented for nasopharyngeal real-time polymerase chain reaction testing during the Omicron variant-dominant period in Thailand (1 January 2022 to 15 June 2022). All participants were prospectively followed up for COVID-19 development for 14 days after enrollment. VE was estimated and adjusted for characteristics associated with COVID-19. Of the 7,971 included individuals, there were 3,104 cases and 4,867 controls. The adjusted VE of the 2-dose, 3-dose, and 4-dose vaccine regimens was 33%, 48%, and 62% for preventing infection and 60%, 74%, and 76% for preventing moderate-to-critical disease, respectively. VE was generally higher among those who received the last dose within 90 days of enrollment than among those who received it more than 90 days before enrollment. The highest VE was observed with the 4-dose regimen CoronaVac-CoronaVac-ChAdOx1 nCoV-19-BNT162b2, for both preventing infection (65%) and preventing moderate-to-critical disease (82%). Our study demonstrated that VE increased with the number of vaccine doses received. Current vaccination programs should focus on reducing COVID-19 severity and mandate at least one booster dose. Heterologous boosters with viral vector and mRNA vaccines were highly effective and can be used in individuals who previously received a primary series of inactivated vaccine.
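In a test-negative design like the one above, VE is derived from an odds ratio: VE = (1 − OR) × 100, where the OR compares the odds of vaccination among test-positive cases versus test-negative controls. The sketch below uses invented 2×2 counts and a crude (unadjusted) OR; the study itself adjusted the OR for covariates, which this simplified version omits.

```python
def vaccine_effectiveness(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """VE (%) = (1 - OR) * 100, with OR = odds of vaccination among cases
    divided by odds of vaccination among controls (test-negative design)."""
    odds_ratio = (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)
    return (1 - odds_ratio) * 100

# Invented counts for a hypothetical booster regimen (not the study's data).
ve = vaccine_effectiveness(vacc_cases=200, unvacc_cases=100,
                           vacc_controls=500, unvacc_controls=130)
# With these counts OR = 0.52, so VE is about 48%.
```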