
    Can transplant renal scintigraphy predict the duration of delayed graft function? A dual center retrospective study

    Introduction: This study focused on the value of quantitatively analyzed and qualitatively graded renal scintigraphy in relation to the expected duration of delayed graft function after kidney transplantation. A more reliable prediction of delayed graft function duration may result in a more tailored and patient-specific treatment regimen post-transplantation. Methods: From 2000 to 2014, patients with early transplant dysfunction and a Tc-99m MAG3 renal scintigraphy within 3 days post-transplantation were included in a dual center retrospective study. Time-activity curves of renal scintigraphy procedures were qualitatively graded, and various quantitative indices (R20/3, TFS, cTER, MUC10) were combined with a new index (Average upslope). Delayed graft function duration was defined as the number of days of dialysis-based/functional delayed graft function. Results: A total of 377 patients were included, with a mean age (±SD) of 52 ± 14 years; 58% were male. A total of 274 (73%) patients experienced delayed graft function ≥7 days. Qualitative grading for the prediction of delayed graft function ≥7 days had a sensitivity of 87% and a specificity of 65%. The best-performing quantitative indices were cTER (76% sensitivity, 72% specificity) and Average upslope (75% sensitivity, 73% specificity). Conclusions: Qualitative renal scintigraphy grading and the quantitative indices cTER and Average upslope predict delayed graft function ≥7 days with high sensitivity. This finding may help to support both clinicians and patients in managing early post-operative expectations. However, the specificity is limited, and renal scintigraphy therefore does not reliably identify patients in whom the course of delayed graft function is longer than anticipated.
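
    To make the reported metrics concrete, the sketch below computes sensitivity and specificity for a binary predictor of delayed graft function ≥7 days. The arrays are hypothetical toy labels, not data from the study.

```python
# Toy illustration of the reported metrics; these arrays are hypothetical
# and do not come from the study.
import numpy as np

y_true = np.array([1, 1, 0, 1, 0, 1, 0, 0])  # 1 = observed DGF >= 7 days
y_pred = np.array([1, 0, 0, 1, 1, 1, 0, 0])  # 1 = index above its cutoff

tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives
fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
tn = np.sum((y_pred == 0) & (y_true == 0))  # true negatives

sensitivity = tp / (tp + fn)  # share of long DGF courses correctly flagged
specificity = tn / (tn + fp)  # share of short courses correctly cleared
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```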

    Association of Hepcidin-25 with survival after kidney transplantation

    Background: Hepcidin is considered the master regulator of iron homoeostasis. Novel hepcidin antagonists have recently been introduced as a potential treatment for iron-restricted anaemia. Meanwhile, serum hepcidin has been shown to be positively associated with cardiovascular disease and inversely with acute kidney injury. These properties may lead to contrasting effects, especially in renal transplant recipients (RTR), who are prone to cardiovascular disease and graft failure. To date, the role of serum hepcidin in RTR is unknown. We therefore prospectively determined the association of serum hepcidin with risk of graft failure, cardiovascular mortality and all-cause mortality in RTR. Materials and methods: Serum hepcidin was assessed in an extensively phenotyped RTR cohort using a specific dual-monoclonal sandwich ELISA. Statistical analyses were performed using univariate linear regression followed by stepwise backward linear regression, and Cox proportional hazards regression models were used to determine prospective associations. Results: We included 561 RTR (age 51 ± 12 years). Mean haemoglobin (Hb) was 8.6 ± 1.0 mM. Median [IQR] serum hepcidin was 7.2 [3.2-13.4] ng/mL. Mean estimated glomerular filtration rate was 47 ± 16 mL/min/1.73 m². In univariate Cox regression analyses, serum hepcidin was not associated with risk of graft failure, cardiovascular mortality or all-cause mortality. Notably, after adjustment for high-sensitivity C-reactive protein and ferritin, serum hepcidin became negatively associated with all-cause mortality (hazard ratio 0.89; 95% confidence interval 0.80-0.99; P = 0.03). Conclusions: In this study, we did not find an association between serum hepcidin and the outcomes of interest, that is, graft failure, cardiovascular mortality or all-cause mortality. Based on our results, it is questionable whether serum hepcidin can be used to predict a beneficial effect of hepcidin antagonists.
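
    A minimal sketch of the kind of adjusted Cox proportional hazards model described above, using the lifelines library; the data frame, column names and values are illustrative assumptions, not the study's variables.

```python
# Sketch of an adjusted Cox proportional hazards model with lifelines.
# The cohort below is toy data; column names and values are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_years": [5.1, 3.2, 8.0, 2.4, 6.7, 1.1, 7.5, 4.0, 9.2, 2.9],
    "died":           [0,   1,   0,   1,   0,   1,   0,   1,   0,   1],
    "hepcidin":       [12.2, 13.4, 3.2, 4.8, 5.5, 15.0, 9.1, 11.2, 2.8, 3.6],
    "hs_crp":         [1.2, 4.5, 0.8, 6.1, 5.0, 1.3, 1.0, 3.9, 2.5, 5.2],
    "ferritin":       [320, 340, 80, 110, 150, 500, 95, 280, 260, 370],
})

# The hepcidin hazard ratio is adjusted for hs-CRP and ferritin, analogous
# to the adjustment described in the abstract.
cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()  # hazard ratios with 95% confidence intervals
```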

    Repurposing HLA genotype data of renal transplant patients to prevent severe drug hypersensitivity reactions

    Introduction: Specific alleles in human leukocyte antigens (HLAs) are associated with an increased risk of developing drug hypersensitivity reactions induced by abacavir, allopurinol, carbamazepine, oxcarbazepine, phenytoin, lamotrigine, or flucloxacillin. Transplant patients are routinely genotyped for HLA to match a potential donor to a recipient. This study aims to investigate the feasibility and potential impact of repurposing these HLA genotype data from kidney transplant patients to prevent drug hypersensitivity reactions. Methods: A cohort of 1347 kidney transplant recipients was genotyped at the Leiden University Medical Center (LUMC) using next-generation sequencing (NGS). The risk alleles HLA-A*31:01, HLA-B*15:02, HLA-B*15:11, HLA-B*57:01, and HLA-B*58:01 were retrieved from the NGS data. Medical history, medication use, and allergic reactions were obtained from the patients' medical records. Carrier frequencies were compared with those of an LUMC blood donor population. Results: A total of 13.1% of the transplant cohort carried at least one of the five HLA risk alleles and therefore had an increased risk of drug-induced hypersensitivity for specific drugs. HLA-A*31:01, HLA-B*15:02, HLA-B*57:01, and HLA-B*58:01 were found at carrier frequencies of 4.61%, 1.19%, 4.46%, and 3.35%, respectively. No HLA-B*15:11 carrier was found. Within our cohort, nine HLA-B*57:01 carriers received flucloxacillin and seven HLA-B*58:01 carriers received allopurinol. Discussion: Our study shows that repurposing HLA genotype data from transplant patients for the assignment of HLA risk alleles associated with drug hypersensitivity is feasible. The use of these data by physicians when prescribing drugs, or by pharmacists when dispensing them, holds the potential to prevent drug hypersensitivity reactions. The utility of this method is highlighted by the fact that 13.1% of the transplant cohort carried an actionable HLA allele.
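
    A minimal sketch of how HLA typings could be screened against the risk alleles named above. The patient record is hypothetical, and the allele-to-drug mapping is a simplification of published associations rather than the LUMC implementation.

```python
# Sketch of screening HLA typings against hypersensitivity risk alleles.
# The allele-to-drug mapping is a simplification of published associations,
# and the patient record is hypothetical.
RISK_ALLELES = {
    "HLA-A*31:01": ["carbamazepine"],
    "HLA-B*15:02": ["carbamazepine", "oxcarbazepine", "phenytoin", "lamotrigine"],
    "HLA-B*15:11": ["carbamazepine"],
    "HLA-B*57:01": ["abacavir", "flucloxacillin"],
    "HLA-B*58:01": ["allopurinol"],
}

def flag_risk_alleles(hla_typing: list[str]) -> dict[str, list[str]]:
    """Return the risk alleles carried and the drugs they flag."""
    return {a: RISK_ALLELES[a] for a in hla_typing if a in RISK_ALLELES}

patient = ["HLA-A*02:01", "HLA-B*57:01", "HLA-B*08:01"]  # hypothetical typing
print(flag_risk_alleles(patient))
# {'HLA-B*57:01': ['abacavir', 'flucloxacillin']}
```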

    Drug-induced Fanconi syndrome associated with fumaric acid esters treatment for psoriasis: A case series

    Background: Fumaric acid esters (FAEs), an oral immunomodulating treatment for psoriasis and multiple sclerosis, have been anecdotally associated with proximal renal tubular dysfunction due to a drug-induced Fanconi syndrome. Few data are available on clinical outcomes of FAE-induced Fanconi syndrome. Methods: Descriptive case series comprising two cases of Fanconi syndrome associated with FAE treatment diagnosed at two Dutch university nephrology departments, three cases reported to the Dutch and German national pharmacovigilance databases, and six previously published cases. Results: All 11 cases involved female patients with psoriasis. The median age at onset was 38 years [interquartile range (IQR) 37-46]. Patients received long-term FAE treatment, with a median treatment duration of 60 months (IQR 28-111). Laboratory tests were typically notable for low serum levels of phosphate and uric acid, while urinalysis showed glycosuria and proteinuria. Eight (73%) patients had developed hypophosphataemic osteomalacia and three (27%) had pathological bone fractures. All patients discontinued FAEs, and four (36%) were treated with phosphate and/or vitamin D supplementation. Five (45%) patients had persisting symptoms despite FAE discontinuation. Conclusions: FAE treatment can cause a drug-induced Fanconi syndrome, but the association has been reported infrequently. Female patients with psoriasis on long-term FAE treatment seem to be particularly at risk. Physicians treating patients with FAEs should be vigilant and monitor for the potential occurrence of Fanconi syndrome. Measurement of the urinary albumin:total protein ratio is suggested as a screening tool for the tubular proteinuria of Fanconi syndrome.

    European Society for Organ Transplantation (ESOT)-TLJ 3.0 Consensus on Histopathological Analysis of Pre-Implantation Donor Kidney Biopsy: Redefining the Role in the Process of Graft Assessment

    The ESOT TLJ 3.0 consensus conference brought together leading experts in transplantation to develop evidence-based guidance on the standardization and clinical utility of pre-implantation kidney biopsy in the assessment of grafts from Expanded Criteria Donors (ECD). Seven themes were selected and underwent in-depth analysis after formulation of PICO (patient/population, intervention, comparison, outcomes) questions. After a literature search, statements for each key question were produced and rated according to the GRADE approach [Quality of evidence: High (A), Moderate (B), Low (C); Strength of Recommendation: Strong (1), Weak (2)]. The statements were subsequently presented in person at the Prague kick-off meeting, discussed, and voted on. After two rounds of discussion and voting, all 7 statements reached an overall agreement of 100% on the following issues: representativeness of needle core, wedge and punch techniques [B,1]; reliability of frozen versus paraffin-embedded sections [B,2]; reproducibility and accuracy of the histological report between experienced and non-experienced on-call renal pathologists [A,1]; reproducibility of glomerulosclerosis versus other parameters [C,2]; digital pathology versus light microscopy in the measurement of histological variables [A,1]; special stains versus haematoxylin and eosin alone [A,1]; and reliability of glomerulosclerosis versus other histological parameters in predicting graft survival, graft function and primary non-function [B,1]. This methodology allowed a full consensus to be reached among European experts on important technical topics regarding pre-implantation biopsy in ECD graft assessment.

    Improving outcomes for donation after circulatory death kidney transplantation: Science of the times

    The use of kidneys donated after circulatory death (DCD) remains controversial due to concerns about high incidences of early graft loss, delayed graft function (DGF), and impaired graft survival. As these concerns are mainly based on data from historical cohorts, they are prone to time-related effects and may not apply to the current timeframe. To assess the impact of time on outcomes, we performed a time-dependent comparative analysis of outcomes of DCD and donation after brain death (DBD) kidney transplantations. Data on all 11,415 deceased-donor kidney transplantations performed in The Netherlands between 1990 and 2018 were collected. Based on the incidences of early graft loss, two eras were defined (1998-2008 [n = 3,499] and 2008-2018 [n = 3,781]), and potential time-related effects on outcomes were evaluated. Multivariate analyses were applied to examine associations between donor type and outcomes, and interaction tests were used to explore the presence of effect modification. The results show clear time-related effects on posttransplant outcomes. The 1998-2008 interval showed compromised outcomes for DCD procedures (higher incidences of DGF and early graft loss, impaired 1-year renal function, and inferior graft survival), whereas DBD and DCD outcome equivalence was observed for the 2008-2018 interval. This occurred despite persistently high incidences of DGF in DCD grafts and more adverse recipient and donor risk profiles (recipients were 6 years older, and the KDRI increased from 1.23 to 1.39 for DBD donors and from 1.35 to 1.49 for DCD donors). In contrast, the median cold ischaemic period decreased from 20 to 15 hours. This national study shows major improvements in outcomes of transplanted DCD kidneys over time. The time-dependent shift underpins that kidney transplantation has come of age and that DCD results are nowadays comparable to DBD transplants. It also calls for careful interpretation of conclusions based on historical cohorts and emphasises that retrospective studies should correct for time-related effects.
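
    A minimal sketch of how effect modification (donor type × era) can be tested with an interaction term in a Cox model, as the interaction tests above describe; the variable names and toy data are assumptions, not the registry data.

```python
# Sketch of an interaction test for effect modification (donor type x era)
# in a Cox model. Toy data; variable names and values are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "graft_years": [1.0, 4.5, 0.2, 7.3, 3.1, 6.8, 2.2, 5.5, 0.9, 8.1],
    "graft_loss":  [1,   0,   1,   0,   0,   1,   0,   1,   1,   0],
    "dcd":         [1,   0,   1,   0,   1,   0,   1,   0,   1,   0],  # 1 = DCD donor
    "era_late":    [0,   0,   1,   1,   0,   1,   1,   0,   1,   1],  # 1 = 2008-2018
})
# A significant coefficient on the product term indicates that the effect
# of donor type on graft loss differs between the two eras.
df["dcd_x_era"] = df["dcd"] * df["era_late"]

cph = CoxPHFitter()
cph.fit(df, duration_col="graft_years", event_col="graft_loss")
cph.print_summary()
```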

    A nationwide evaluation of deceased donor kidney transplantation indicates detrimental consequences of early graft loss

    Early graft loss (EGL) is a feared outcome of kidney transplantation, and kidneys with an anticipated risk of EGL are consequently declined for transplantation. In the most favorable scenario, with optimal use of available donor kidneys, donor pool size is balanced against the risk of EGL, with the tradeoff dictated by the consequences of EGL. To gauge these consequences, we systematically evaluated the impact of EGL in an observational study that included all 10,307 deceased-donor kidney transplantations performed in The Netherlands between 1990 and 2018. The incidence of EGL, defined as graft loss within 90 days, in primary transplantation was 8.2% (699/8,511). The main causes were graft rejection (30%), primary nonfunction (25%), and thrombosis or infarction (20%). EGL profoundly impacted short- and long-term patient survival (adjusted hazard ratios 8.2 [95% confidence interval 5.1-13.2] and 1.7 [1.3-2.1], respectively). Of the 699 EGL recipients, 617 survived 90 days after transplantation, and only 440 of these were relisted for re-transplantation. Of those relisted, only 298 were ultimately re-transplanted, yielding an actual re-transplantation rate of 43%. Notably, re-transplantation was associated with a doubled incidence of EGL but similar long-term graft survival (adjusted hazard ratio 1.1 [0.6-1.8]). Thus, EGL after kidney transplantation is a medical catastrophe, with high mortality rates, low relisting rates and an increased risk of recurrent EGL following re-transplantation. This implies that detrimental outcomes also involve a convergence of risk factors in recipients with EGL. The 8.2% incidence of EGL had minimal impact on population mortality, indicating that this incidence is acceptable.
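
    A quick check of the rates quoted above, using only the counts from the abstract; it also makes explicit that the 43% re-transplantation rate is taken over all 699 EGL cases rather than the 617 ninety-day survivors.

```python
# Re-deriving the rates quoted above from the counts in the abstract.
primary_tx, egl = 8511, 699
survivors_90d, relisted, retransplanted = 617, 440, 298

print(f"EGL incidence: {egl / primary_tx:.1%}")                    # 8.2%
print(f"Relisted among 90-day survivors: {relisted / survivors_90d:.1%}")
print(f"Re-transplanted among all EGL cases: {retransplanted / egl:.1%}")  # 42.6%, reported as 43%
```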

    Impact of immunosuppressive treatment and type of SARS-CoV-2 vaccine on antibody levels after three vaccinations in patients with chronic kidney disease or kidney replacement therapy

    Background. Patients with chronic kidney disease (CKD) or kidney replacement therapy demonstrate lower antibody levels after severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) vaccination compared with healthy controls. In a prospective cohort, we analysed the impact of immunosuppressive treatment and type of vaccine on antibody levels after three SARS-CoV-2 vaccinations. Methods. Control subjects (n = 186), patients with CKD G4/5 (n = 400), dialysis patients (n = 480) and kidney transplant recipients (KTR) (n = 2468) were vaccinated with either mRNA-1273 (Moderna), BNT162b2 (Pfizer-BioNTech) or AZD1222 (Oxford/AstraZeneca) in the Dutch SARS-CoV-2 vaccination programme. Third vaccination data were available in a subgroup of patients (n = 1829). Blood samples and questionnaires were obtained 1 month after the second and third vaccination. The primary endpoint was the antibody level in relation to immunosuppressive treatment and type of vaccine. The secondary endpoint was the occurrence of adverse events after vaccination. Results. Antibody levels after two and three vaccinations were lower in patients with CKD G4/5 and dialysis patients on immunosuppressive treatment compared with those without immunosuppressive treatment. After two vaccinations, we observed lower antibody levels in KTR using mycophenolate mofetil (MMF) compared with KTR not using MMF [20 binding antibody units (BAU)/mL (3-113) vs 340 BAU/mL (50-1492), P < .001]. Seroconversion was observed in 35% of KTR using MMF, compared with 75% of KTR not using MMF. Of the KTR who used MMF and did not seroconvert, 46% eventually seroconverted after a third vaccination. mRNA-1273 induced higher antibody levels as well as a higher frequency of adverse events compared with BNT162b2 in all patient groups. Conclusions. Immunosuppressive treatment adversely affects antibody levels after SARS-CoV-2 vaccination in patients with CKD G4/5, dialysis patients and KTR. The mRNA-1273 vaccine induces higher antibody levels but also a higher frequency of adverse events.
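
    A minimal sketch of a nonparametric comparison of antibody levels between KTR with and without MMF, in the spirit of the group comparison reported above; the values and the seroconversion cutoff are assumptions, not study data, and the abstract does not specify this particular test.

```python
# Sketch of a nonparametric comparison of antibody levels between KTR with
# and without MMF. Values and the seroconversion cutoff are assumptions;
# the abstract reports medians of 20 vs 340 BAU/mL.
import numpy as np
from scipy.stats import mannwhitneyu

mmf    = np.array([3, 20, 45, 8, 113, 15, 25, 6])          # BAU/mL, hypothetical
no_mmf = np.array([50, 340, 800, 120, 1492, 260, 430, 95])  # BAU/mL, hypothetical

stat, p = mannwhitneyu(mmf, no_mmf, alternative="two-sided")

threshold = 10  # assumed seroconversion cutoff, not taken from the study
print(f"p = {p:.4f}; MMF-group seroconversion = {np.mean(mmf >= threshold):.0%}")
```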

    ABO-incompatible kidney transplantation in perspective of deceased donor transplantation and induction strategies: a propensity-matched analysis

    BACKGROUND: Kidney transplant candidates are blood group incompatible with roughly one out of three potential living donors. We compared outcomes after ABO-incompatible (ABOi) kidney transplantation with matched ABO-compatible (ABOc) living and deceased donor transplantation and analyzed different induction regimens. METHODS: We performed a retrospective study with propensity matching, comparing patient and death-censored graft survival after ABOi versus ABOc living donor and deceased donor kidney transplantation in a nationwide registry from 2006 to 2019. RESULTS: A total of 296 ABOi recipients were compared with 1184 center- and propensity-matched ABOc living donor recipients and 1184 deceased donor recipients (matched on recipient age, sex, blood group and PRA). Patient survival was better than after deceased donor transplantation (hazard ratio [HR] for death 0.69 [0.49-0.96]) and not significantly different from that of ABOc living donor recipients (HR 1.28 [0.90-1.81]). The rate of graft failure was higher than after ABOc living donor transplantation (HR 2.63 [1.72-4.01]). Rejection occurred in 47% of 140 rituximab-treated, 22% of 50 rituximab/basiliximab-treated and 4% of 92 alemtuzumab-treated recipients (p < 0.001). CONCLUSIONS: ABOi kidney transplantation is superior to deceased donor transplantation. Rejection and graft failure rates are higher than after matched ABOc living donor transplantation, underscoring the need for further studies into risk stratification and induction therapy.
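
    A minimal sketch of propensity score estimation and 1:4 nearest-neighbour matching on the variables named above; the simulated data and model are assumptions, and blood group (matched exactly in the study) is omitted for brevity.

```python
# Sketch of propensity score estimation and 1:4 nearest-neighbour matching.
# Simulated data; blood group (matched exactly in the study) is omitted.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "aboi": rng.integers(0, 2, n),   # 1 = ABO-incompatible recipient
    "age":  rng.normal(52, 13, n),
    "male": rng.integers(0, 2, n),
    "pra":  rng.uniform(0, 100, n),
})

covariates = ["age", "male", "pra"]
ps_model = LogisticRegression().fit(df[covariates], df["aboi"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]  # propensity score

treated = df[df["aboi"] == 1]
controls = df[df["aboi"] == 0]

# Each ABOi recipient gets its 4 closest ABOc controls by propensity score,
# mirroring the 296 vs 1184 ratio in the abstract (matching with replacement).
nn = NearestNeighbors(n_neighbors=4).fit(controls[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = controls.iloc[idx.ravel()]
```

    A registry analysis would additionally match within center and typically without replacement; the sketch only conveys the matching mechanics.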