116 research outputs found

    Can transplant renal scintigraphy predict the duration of delayed graft function? A dual center retrospective study

    Introduction: This study focused on the value of quantitatively analyzed and qualitatively graded renal scintigraphy in relation to the expected duration of delayed graft function after kidney transplantation. A more reliable prediction of delayed graft function duration may result in a more tailored, patient-specific treatment regimen post-transplantation. Methods: From 2000 to 2014, patients with early transplant dysfunction and a Tc-99m MAG3 renal scintigraphy within 3 days post-transplantation were included in a dual center retrospective study. Time-activity curves of the renal scintigraphy procedures were qualitatively graded, and various quantitative indices (R20/3, TFS, cTER, MUC10) were combined with a new index (Average upslope). Delayed graft function duration was defined as the number of days of dialysis-based/functional delayed graft function. Results: A total of 377 patients were included, with a mean age (± SD) of 52 ± 14 years; 58% were male. A total of 274 (73%) patients experienced delayed graft function ≥7 days. Qualitative grading for the prediction of delayed graft function ≥7 days had a sensitivity of 87% and a specificity of 65%. The quantitative indices with the best results were cTER (76% sensitivity, 72% specificity) and Average upslope (75% sensitivity, 73% specificity). Conclusions: Qualitative renal scintigraphy grading and the quantitative indices cTER and Average upslope predict delayed graft function ≥7 days with high sensitivity. This finding may help to support both clinicians and patients in managing early post-operative expectations. However, the specificity is limited, and renal scintigraphy therefore does not reliably identify patients in whom the course of delayed graft function will be longer than anticipated.
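
    The reported accuracies reduce to simple confusion-matrix arithmetic. A minimal sketch, using hypothetical patient counts (not the study's actual data) chosen only to reproduce the reported 87%/65% figures:

```python
# Hypothetical 2x2 confusion matrix for qualitative grading as a
# predictor of delayed graft function (DGF) lasting >= 7 days.
# Counts are illustrative assumptions, not the study data.
tp = 174  # graded positive, DGF >= 7 days
fn = 26   # graded negative, DGF >= 7 days
tn = 65   # graded negative, DGF < 7 days
fp = 35   # graded positive, DGF < 7 days

sensitivity = tp / (tp + fn)  # 174 / 200 = 0.87
specificity = tn / (tn + fp)  # 65 / 100 = 0.65
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```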

    Association of Hepcidin-25 with survival after kidney transplantation

    Background Hepcidin is considered the master regulator of iron homoeostasis. Novel hepcidin antagonists have recently been introduced as a potential treatment for iron-restricted anaemia. Meanwhile, serum hepcidin has been shown to be positively associated with cardiovascular disease and inversely associated with acute kidney injury. These properties may lead to contrasting effects, especially in renal transplant recipients (RTR), who are prone to cardiovascular disease and graft failure. To date, the role of serum hepcidin in RTR is unknown. We therefore prospectively determined the association of serum hepcidin with risk of graft failure, cardiovascular mortality and all-cause mortality in RTR. Materials and methods Serum hepcidin was assessed in an extensively phenotyped RTR cohort using a specific dual-monoclonal sandwich ELISA immunoassay. Statistical analyses were performed using univariate linear regression followed by stepwise backward linear regression. Cox proportional hazards regression models were used to determine prospective associations. Results We included 561 RTR (age 51 ± 12 years). Mean haemoglobin (Hb) was 8.6 ± 1.0 mM. Median [IQR] serum hepcidin was 7.2 [3.2-13.4] ng/mL. Mean estimated glomerular filtration rate was 47 ± 16 mL/min/1.73 m². In univariate Cox regression analyses, serum hepcidin was not associated with risk of graft failure, cardiovascular mortality or all-cause mortality. Notably, after adjustment for high-sensitivity C-reactive protein and ferritin, serum hepcidin became inversely associated with all-cause mortality (hazard ratio 0.89; 95% confidence interval 0.80-0.99, P = 0.03). Conclusions In this study, we did not find an association between serum hepcidin and outcomes, that is, graft failure, cardiovascular mortality or all-cause mortality. Based on our results, it is questionable whether serum hepcidin can be used to predict a beneficial effect of hepcidin antagonists.
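
    As a sketch of the survival analysis described above (univariate Cox models, then adjustment for hs-CRP and ferritin), the following is a minimal illustration using Python's lifelines library; the data frame and all column names and values are assumptions, not the study's actual variables:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical follow-up data, one row per renal transplant recipient.
# All column names and values are illustrative assumptions.
df = pd.DataFrame({
    "followup_years": [5.2, 3.1, 7.8, 2.4, 6.0, 4.3, 8.1, 1.9],
    "died":           [0,   1,   0,   1,   0,   0,   1,   1],
    "hepcidin":       [7.2, 13.4, 3.2, 9.8, 5.1, 11.0, 2.8, 15.6],  # ng/mL
    "hs_crp":         [1.8, 4.2, 0.9, 6.1, 2.3, 3.0, 0.7, 5.5],     # mg/L
    "ferritin":       [120, 310, 85, 270, 140, 220, 95, 330],       # ug/L
})

# Univariate model: all-cause mortality versus serum hepcidin alone.
CoxPHFitter().fit(df[["followup_years", "died", "hepcidin"]],
                  duration_col="followup_years", event_col="died").print_summary()

# Adjusted model: hepcidin plus hs-CRP and ferritin as covariates.
CoxPHFitter().fit(df, duration_col="followup_years",
                  event_col="died").print_summary()  # hazard ratios with 95% CIs
```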

    Repurposing HLA genotype data of renal transplant patients to prevent severe drug hypersensitivity reactions

    Introduction: Specific alleles in human leukocyte antigens (HLAs) are associated with an increased risk of developing drug hypersensitivity reactions induced by abacavir, allopurinol, carbamazepine, oxcarbazepine, phenytoin, lamotrigine, or flucloxacillin. Transplant patients are genotyped for HLA as routine practice to match a potential donor to a recipient. This study aims to investigate the feasibility and potential impact of repurposing these HLA genotype data from kidney transplant patients to prevent drug hypersensitivity reactions. Methods: A cohort of 1347 kidney transplant recipients was genotyped at the Leiden University Medical Center (LUMC) using next-generation sequencing (NGS). The risk alleles HLA-A*31:01, HLA-B*15:02, HLA-B*15:11, HLA-B*57:01, and HLA-B*58:01 were retrieved from the NGS data. Medical history, medication use, and allergic reactions were obtained from the patients' medical records. The carrier frequencies found were compared to those of a LUMC blood donor population. Results: A total of 13.1% of the transplant cohort patients carried at least one of the five HLA risk alleles and therefore had an increased risk of drug-induced hypersensitivity for specific drugs. HLA-A*31:01, HLA-B*15:02, HLA-B*57:01, and HLA-B*58:01 were found at carrier frequencies of 4.61%, 1.19%, 4.46%, and 3.35%, respectively. No HLA-B*15:11 carrier was found. In total, nine HLA-B*57:01 carriers within our cohort received flucloxacillin and seven HLA-B*58:01 carriers received allopurinol. Discussion: Our study shows that repurposing HLA genotype data from transplant patients for the assignment of HLA risk alleles associated with drug hypersensitivity is feasible. The use of these data by physicians when prescribing drugs, or by the pharmacist when dispensing drugs, holds the potential to prevent drug hypersensitivity reactions. The utility of this method is highlighted by the 13.1% of transplant cohort patients carrying an actionable HLA allele.
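
    As a sketch of the repurposing step described, the snippet below screens a stored HLA genotype against the five risk alleles. The data structures are assumptions, and the allele-to-drug mapping is a simplified rendering of established pharmacogenetic associations, not taken verbatim from the study:

```python
# The five risk alleles from the study, mapped to the drugs they flag.
# Mapping is a simplified illustration of known pharmacogenetic associations.
RISK_ALLELES = {
    "HLA-A*31:01": ["carbamazepine"],
    "HLA-B*15:02": ["carbamazepine", "oxcarbazepine", "phenytoin", "lamotrigine"],
    "HLA-B*15:11": ["carbamazepine"],
    "HLA-B*57:01": ["abacavir", "flucloxacillin"],
    "HLA-B*58:01": ["allopurinol"],
}

def actionable_alerts(patient_alleles: set) -> dict:
    """Return the risk alleles a patient carries and the drugs to flag."""
    return {a: drugs for a, drugs in RISK_ALLELES.items() if a in patient_alleles}

# Hypothetical genotype retrieved from NGS-based HLA typing.
patient = {"HLA-A*02:01", "HLA-B*44:02", "HLA-B*57:01"}
print(actionable_alerts(patient))
# {'HLA-B*57:01': ['abacavir', 'flucloxacillin']}
```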

    Drug-induced Fanconi syndrome associated with fumaric acid esters treatment for psoriasis: A case series

    Background: Fumaric acid esters (FAEs), an oral immunomodulating treatment for psoriasis and multiple sclerosis, have been anecdotally associated with proximal renal tubular dysfunction due to a drug-induced Fanconi syndrome. Few data are available on the clinical outcomes of FAE-induced Fanconi syndrome. Methods: Descriptive case series comprising two cases of Fanconi syndrome associated with FAE treatment diagnosed at two Dutch university nephrology departments, three cases reported to the Dutch and German national pharmacovigilance databases, and six previously reported cases. Results: All 11 cases involved female patients with psoriasis. The median age at the time of onset was 38 years [interquartile range (IQR) 37-46]. Patients received long-term FAE treatment, with a median treatment duration of 60 months (IQR 28-111). Laboratory tests were typically significant for low serum levels of phosphate and uric acid, while urinalysis showed glycosuria and proteinuria. Eight (73%) patients had developed hypophosphataemic osteomalacia and three (27%) had pathological bone fractures. All patients discontinued FAEs, while four (36%) patients were treated with supplementation of phosphate and/or vitamin D. Five (45%) patients had persisting symptoms despite FAE discontinuation. Conclusions: FAE treatment can cause drug-induced Fanconi syndrome, but the association has been reported infrequently. Female patients with psoriasis treated long term with FAEs seem to be particularly at risk. Physicians treating patients with FAEs should be vigilant and monitor for the potential occurrence of Fanconi syndrome. Measurement of the urinary albumin:total protein ratio is a suggested screening tool for the tubular proteinuria seen in Fanconi syndrome.

    European Society for Organ Transplantation (ESOT)-TLJ 3.0 Consensus on Histopathological Analysis of Pre-Implantation Donor Kidney Biopsy: Redefining the Role in the Process of Graft Assessment

    The ESOT TLJ 3.0 consensus conference brought together leading experts in transplantation to develop evidence-based guidance on the standardization and clinical utility of pre-implantation kidney biopsy in the assessment of grafts from Expanded Criteria Donors (ECD). Seven themes were selected and underwent in-depth analysis after formulation of PICO (patient/population, intervention, comparison, outcomes) questions. After a literature search, statements for each key question were produced and rated according to the GRADE approach [Quality of evidence: High (A), Moderate (B), Low (C); Strength of Recommendation: Strong (1), Weak (2)]. The statements were subsequently presented in person at the Prague kick-off meeting, discussed, and voted on. After two rounds of discussion and voting, all 7 statements reached an overall agreement of 100% on the following issues: representativeness of the needle core/wedge/punch technique [B,1], reliability of frozen versus paraffin-embedded sections [B,2], reproducibility/accuracy of the histological report between experienced and non-experienced on-call renal pathologists [A,1], reproducibility of glomerulosclerosis and other parameters [C,2], digital pathology versus light microscopy in the measurement of histological variables [A,1], comparison of special stainings versus Haematoxylin and Eosin alone [A,1], and reliability of glomerulosclerosis versus other histological parameters to predict graft survival, graft function, and primary non-function [B,1]. This methodology made it possible to reach full consensus among European experts on important technical topics regarding pre-implantation biopsy in ECD graft assessment.

    Impact of a Public Health Emergency on Behavior, Stress, Anxiety and Glycemic Control in Patients With Pancreas or Islet Transplantation for Type 1 Diabetes

    A public health emergency such as the COVID-19 pandemic has behavioral, mental and physical implications for patients with type 1 diabetes (T1D). To what extent the presence of a transplant further increases this burden is not known. Therefore, we compared T1D patients with an islet or pancreas transplant (β-cell Tx; n = 51) to control T1D patients (n = 272). Fear of coronavirus infection was higher in those with β-cell Tx than without (Visual Analogue Scale 5.0 (3.0–7.0) vs. 3.0 (2.0–5.0), p = 0.004) and social isolation behavior was more stringent (45.8% vs. 14.0% reported not leaving the house, p < 0.001). A previous β-cell Tx was the most important predictor of at-home isolation. Glycemic control worsened in patients with β-cell Tx but improved in control patients (ΔHbA1c +1.67 ± 8.74 vs. −1.72 ± 6.15 mmol/mol, p = 0.006; ΔTime-In-Range during continuous glucose monitoring −4.5% (−6.0% to +1.5%) vs. +3.0% (−2.0% to +6.0%), p = 0.038). Fewer patients with β-cell Tx reported easier glycemic control during lockdown (10.4% vs. 22.6%, p = 0.015). All T1D patients, regardless of transplantation status, experienced stress (33.4%), anxiety (27.9%), decreased physical activity (42.0%), weight gain (40.5%), and increased insulin requirements (29.7%). In conclusion, T1D patients with β-cell Tx are more severely affected by a viral pandemic lockdown, with higher fear of infection, more stringent social isolation behavior, and deterioration of glycemic control. This trial has been registered in the clinicaltrials.gov registry under identifying number NCT05977205 (URL: https://clinicaltrials.gov/study/NCT05977205).

    Improving outcomes for donation after circulatory death kidney transplantation: Science of the times

    The use of kidneys donated after circulatory death (DCD) remains controversial due to concerns about high incidences of early graft loss, delayed graft function (DGF), and impaired graft survival. As these concerns are mainly based on data from historical cohorts, they are prone to time-related effects and may therefore not apply to the current timeframe. To assess the impact of time on outcomes, we performed a time-dependent comparative analysis of outcomes of DCD and donation after brain death (DBD) kidney transplantations. Data on all 11,415 deceased-donor kidney transplantations performed in The Netherlands between 1990 and 2018 were collected. Based on the incidences of early graft loss, two eras were defined (1998-2008 [n = 3,499] and 2008-2018 [n = 3,781]), and potential time-related effects on outcomes were evaluated. Multivariate analyses were applied to examine associations between donor type and outcomes. Interaction tests were used to explore the presence of effect modification. The results show clear time-related effects on posttransplant outcomes. The 1998-2008 interval showed compromised outcomes for DCD procedures (higher incidences of DGF and early graft loss, impaired 1-year renal function, and inferior graft survival), whereas DBD and DCD outcome equivalence was observed for the 2008-2018 interval. This occurred despite persistently high incidences of DGF in DCD grafts and more adverse recipient and donor risk profiles (recipients were 6 years older, and the KDRI increased from 1.23 to 1.39 for DBD donors and from 1.35 to 1.49 for DCD donors). In contrast, the median cold ischaemic period decreased from 20 to 15 hours. This national study shows major improvements in outcomes of transplanted DCD kidneys over time. The time-dependent shift underpins that kidney transplantation has come of age and that DCD results are nowadays comparable to DBD transplants. It also calls for careful interpretation of conclusions based on historical cohorts, and emphasises that retrospective studies should correct for time-related effects.
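
    The interaction tests mentioned above can be sketched as a Cox model with a donor-type-by-era product term. A minimal illustration using Python's lifelines library on simulated data; all column names are assumptions, not the study's variables:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500

# Simulated cohort: donor type (DCD = 1, DBD = 0) and era (2008-2018 = 1).
df = pd.DataFrame({
    "years_to_graft_loss": rng.exponential(10.0, n),
    "graft_loss": rng.integers(0, 2, n),
    "dcd": rng.integers(0, 2, n),
    "late_era": rng.integers(0, 2, n),
})
df["dcd_x_era"] = df["dcd"] * df["late_era"]  # interaction (effect modification) term

cph = CoxPHFitter().fit(df, duration_col="years_to_graft_loss",
                        event_col="graft_loss")
# A significant coefficient on dcd_x_era would indicate that the association
# of DCD donation with graft loss differs between the two eras.
cph.print_summary()
```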

    A nationwide evaluation of deceased donor kidney transplantation indicates detrimental consequences of early graft loss

    Early graft loss (EGL) is a feared outcome of kidney transplantation. Consequently, kidneys with an anticipated risk of EGL are declined for transplantation. In the most favorable scenario, with optimal use of available donor kidneys, the donor pool size is balanced against the risk of EGL, with a tradeoff dictated by the consequences of EGL. To gauge these consequences, we systematically evaluated the impact of EGL in an observational study that included all 10,307 deceased-donor kidney transplantations performed in The Netherlands between 1990 and 2018. The incidence of EGL, defined as graft loss within 90 days, in primary transplantation was 8.2% (699/8,511). The main causes were graft rejection (30%), primary nonfunction (25%), and thrombosis or infarction (20%). EGL profoundly impacted short- and long-term patient survival (adjusted hazard ratios 8.2 [95% confidence interval 5.1-13.2] and 1.7 [1.3-2.1], respectively). Of the EGL recipients who survived 90 days after transplantation (617/699), only 440 were relisted for re-transplantation. Of those relisted, only 298 were ultimately re-transplanted, leading to an actual re-transplantation rate of 43%. Notably, re-transplantation was associated with a doubled incidence of EGL but similar long-term graft survival (adjusted hazard ratio 1.1; 0.6-1.8). Thus, EGL after kidney transplantation is a medical catastrophe, with high mortality rates, low relisting rates, and an increased risk of recurrent EGL following re-transplantation. This implies that detrimental outcomes also involve a convergence of risk factors in recipients with EGL. The 8.2% incidence of EGL minimally impacted population mortality, indicating that this incidence is acceptable.
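
    The cascade of counts reported in this abstract makes the quoted rates easy to verify; the short worked calculation below uses only figures taken directly from the text:

```python
# Counts quoted in the abstract.
primary_tx = 8511     # primary deceased-donor transplantations
egl = 699             # early graft losses (within 90 days)
survived_90d = 617    # EGL recipients alive at 90 days
relisted = 440        # survivors relisted for re-transplantation
retransplanted = 298  # relisted patients ultimately re-transplanted

print(f"EGL incidence:                 {egl / primary_tx:.1%}")          # 8.2%
print(f"relisted, of 90-day survivors: {relisted / survived_90d:.1%}")   # ~71%
print(f"re-transplanted, of relisted:  {retransplanted / relisted:.1%}") # ~68%
print(f"re-transplanted, of all EGL:   {retransplanted / egl:.1%}")      # ~43%
```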

    SARS-CoV-2-specific immune responses converge in kidney disease patients and controls with hybrid immunity

    Healthy individuals with hybrid immunity, due to a SARS-CoV-2 infection prior to first vaccination, have stronger immune responses than those who were exclusively vaccinated. However, little is known about the characteristics of antibody, B-cell and T-cell responses in kidney disease patients with hybrid immunity. Here, we explored differences between kidney disease patients and controls with hybrid immunity after asymptomatic or mild coronavirus disease-2019 (COVID-19). We studied the kinetics, magnitude, breadth and phenotype of SARS-CoV-2-specific immune responses to primary mRNA-1273 vaccination in patients with chronic kidney disease or on dialysis, kidney transplant recipients, and controls with hybrid immunity. Although vaccination alone is less immunogenic in kidney disease patients, mRNA-1273 induced a robust immune response in patients with prior SARS-CoV-2 infection. In contrast, kidney disease patients with hybrid immunity develop SARS-CoV-2 antibody, B-cell and T-cell responses that are as strong as or stronger than those of controls. Phenotypic analysis showed that Spike (S)-specific B-cells varied between groups in lymph node-homing and memory phenotypes, yet S-specific T-cell responses were phenotypically consistent across groups. The heterogeneity amongst immune responses in hybrid-immune kidney disease patients warrants further studies in larger cohorts to unravel markers of long-term protection that can be used for the design of targeted vaccine regimens.