
    Evaluation of spelt germplasm for polyphenol oxidase activity and aluminium resistance

    Kidney transplantation is the best treatment option for patients with end-stage renal failure. At present, approximately 800 Dutch patients are registered on the active waiting list of Eurotransplant. The waiting time in the Netherlands for a kidney from a deceased donor averages between 3 and 4 years. During this period, patients are fully dependent on dialysis, which only partly replaces renal function and limits quality of life. Mortality among patients on the waiting list is high. To increase the number of kidney donors, the Dutch Kidney Foundation has undertaken several initiatives, including national calls for donor registration and the provision of information on organ donation and kidney transplantation. The aim of the national PROCARE consortium is to develop improved matching algorithms that will prolong the survival of transplanted donor kidneys and reduce HLA immunization; the latter will positively affect the waiting time for retransplantation. The present allocation algorithm is based, among other factors, on matching for HLA antigens, which were originally defined by antibodies using serological typing techniques. However, several studies suggest that this algorithm needs adaptation and that other immune parameters currently not included may help improve graft survival rates. We will perform a multicenter evaluation of 5429 patients transplanted between 1995 and 2005 in the Netherlands. The association between key clinical endpoints and selected laboratory-defined parameters will be examined, including Luminex-defined HLA antibody specificities, T- and B-cell epitopes recognized on the mismatched HLA antigens, non-HLA antibodies, and polymorphisms in complement and Fc receptors functionally associated with the effector functions of anti-graft antibodies.
    From these data, key parameters determining the success of kidney transplantation will be identified, leading to additional parameters to be included in future matching algorithms that aim to extend the survival of transplanted kidneys and to diminish HLA immunization. Computer simulation studies will reveal the number of patients who would directly benefit from improved matching, the effect on shortening the waiting list, and the decrease in waiting time.
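As a hypothetical illustration of the antigen-level comparison that classical HLA-based allocation performs, the sketch below counts donor antigens at the HLA-A, -B, and -DR loci that are absent in the recipient. The antigen names, the locus set, and the function itself are illustrative assumptions, not taken from the PROCARE protocol.

```python
# Illustrative sketch (not the study's algorithm): count donor HLA antigens,
# per locus, that the recipient does not carry.

def hla_mismatches(donor, recipient):
    """Count donor antigens at HLA-A, -B, and -DR not present in the recipient."""
    total = 0
    for locus in ("A", "B", "DR"):
        donor_ags = set(donor.get(locus, ()))
        recipient_ags = set(recipient.get(locus, ()))
        total += len(donor_ags - recipient_ags)
    return total

donor = {"A": {"A1", "A2"}, "B": {"B7", "B8"}, "DR": {"DR15", "DR4"}}
recipient = {"A": {"A1", "A3"}, "B": {"B7", "B44"}, "DR": {"DR15"}}  # DR-homozygous
print(hla_mismatches(donor, recipient))  # A2, B8, DR4 -> 3
```

Real allocation schemes weight loci differently and use finer typing resolution; this only shows the basic mismatch tally.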

    Allocation to highly sensitized patients based on acceptable mismatches results in low rejection rates comparable to nonsensitized patients

    Whereas regular allocation avoids unacceptable mismatches on the donor organ, allocation to highly sensitized patients within the Eurotransplant Acceptable Mismatch (AM) program is based on the patient's HLA phenotype plus acceptable antigens. These are HLA antigens to which the patient has never made antibodies, as determined by extensive laboratory testing. AM patients have superior long-term graft survival compared with highly sensitized patients in regular allocation. Here, we asked whether the AM program also results in lower rejection rates. From the PROCARE cohort, consisting of all Dutch kidney transplants performed in 1995-2005, we selected deceased-donor single transplants with a minimum of 1 HLA mismatch and determined the cumulative 6-month rejection incidence for patients in AM or regular allocation. Additionally, we determined the effect of minimal matching criteria of 1 HLA-B plus 1 HLA-DR, or 2 HLA-DR antigens, on rejection incidence. AM patients showed significantly lower rejection rates than highly immunized patients in regular allocation, comparable to nonsensitized patients and independent of other risk factors for rejection. In contrast to highly sensitized patients in regular allocation, minimal matching criteria did not affect rejection rates in AM patients. Allocation based on acceptable antigens leads to relatively low-risk transplants for highly sensitized patients, with rejection rates similar to those of nonimmunized individuals.
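A minimal sketch of the outcome measure used here, cumulative 6-month rejection incidence, computed as the fraction of transplants with a first rejection episode within 180 days. The event times below are fabricated for illustration, and a real analysis would additionally handle censoring and competing risks.

```python
# Simplified cumulative incidence: events within a time window divided by
# the number at risk. Ignores censoring, which a real analysis must handle.

def cumulative_incidence(rejection_days, n_patients, window=180):
    """rejection_days: day of first rejection for each patient who rejected."""
    events = sum(1 for d in rejection_days if d <= window)
    return events / n_patients

# Fabricated example: 12 first-rejection episodes among 100 transplants,
# 9 of them within 6 months (180 days).
days = [10, 25, 40, 60, 75, 90, 120, 150, 170, 200, 250, 300]
print(cumulative_incidence(days, 100))  # -> 0.09
```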

    Recovery of dialysis patients with COVID-19 : health outcomes 3 months after diagnosis in ERACODA

    Background. Coronavirus disease 2019 (COVID-19)-related short-term mortality is high in dialysis patients, but longer-term outcomes are largely unknown. We therefore assessed patient recovery in a large cohort of dialysis patients 3 months after their COVID-19 diagnosis. Methods. We analyzed data on dialysis patients diagnosed with COVID-19 from 1 February 2020 to 31 March 2021 from the European Renal Association COVID-19 Database (ERACODA). The outcomes studied were patient survival, residence, and functional and mental health status (estimated by the treating physician) 3 months after COVID-19 diagnosis. Complete follow-up data were available for 854 surviving patients. Patient characteristics associated with recovery were analyzed using logistic regression. Results. In 2449 hemodialysis patients (mean ± SD age 67.5 ± 14.4 years, 62% male), survival probabilities at 3 months after COVID-19 diagnosis were 90% for nonhospitalized patients (n = 1087), 73% for patients admitted to the hospital but not to an intensive care unit (ICU) (n = 1165), and 40% for those admitted to an ICU (n = 197). Patient survival hardly decreased between 28 days and 3 months after COVID-19 diagnosis. At 3 months, 87% of patients functioned at their pre-existing functional level and 94% at their pre-existing mental level. Only a few of the surviving patients were still admitted to the hospital (0.8-6.3%) or a nursing home (∼5%). Higher age, a higher frailty score at presentation, and ICU admission were associated with worse functional outcome. Conclusions. Mortality between 28 days and 3 months after COVID-19 diagnosis was low, and the majority of patients who survived COVID-19 recovered to their pre-existing functional and mental health level at 3 months after diagnosis.
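As a rough arithmetic check of the stratum-level figures quoted above, the snippet below combines the three admission strata into an overall 3-month survival estimate, weighting each stratum's survival probability by its patient count. The counts and probabilities come from the abstract; the pooling itself is just a sanity-check calculation, not an analysis performed in the study.

```python
# Weighted overall 3-month survival across the three admission strata
# reported in the abstract (counts and probabilities as quoted there).
strata = {
    "not hospitalized": (1087, 0.90),
    "hospital, no ICU": (1165, 0.73),
    "ICU":              (197, 0.40),
}

total = sum(n for n, _ in strata.values())          # 1087 + 1165 + 197 = 2449
overall = sum(n * p for n, p in strata.values()) / total
print(total)               # -> 2449, matching the cohort size in the abstract
print(round(overall, 2))   # -> 0.78, the pooled 3-month survival estimate
```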

    Hospital specific factors affect quality of blood pressure treatment in chronic kidney disease

    BACKGROUND: Blood pressure (BP) is the most important modifiable risk factor for cardiovascular (CV) disease and for progression of kidney dysfunction in patients with chronic kidney disease. Despite extensive antihypertensive treatment options, adequate control is notoriously hard to achieve. Several determinants that affect BP control have been identified. In the current analysis we evaluated differences between hospitals in achieved BP and in achievement of the BP goal, and explored possible explanations. METHODS: At baseline, BP was measured in the supine position with an oscillometric device in 788 patients participating in the MASTERPLAN study. We also retrieved the last measured office BP from the patient records. Additional baseline characteristics were derived from the study database. Univariate and multivariate analyses were performed with general linear modelling, using hospital as a random factor. RESULTS: In univariate analysis, hospital was a determinant of the level of systolic and diastolic BP at baseline. Adjustment for patient, kidney disease, treatment, or hospital characteristics affected this relation. Yet, in a fully adjusted model, differences between centres persisted, with a range of 15 mmHg for systolic BP and 11 mmHg for diastolic BP. CONCLUSION: Despite extensive adjustments, a clinically relevant and statistically significant difference between hospitals was found in standardised BP measurements at baseline of a randomised controlled study. We hypothesise that differences in the approach to BP control exist at the physician level and that these explain the differences between hospitals.

    Nurse Practitioner Care Improves Renal Outcome in Patients with CKD

    Treatment goals for patients with CKD often go unrealized for many reasons, but support by nurse practitioners may improve risk factor levels in these patients. Here, we analyzed renal endpoints of the Multifactorial Approach and Superior Treatment Efficacy in Renal Patients with the Aid of Nurse Practitioners (MASTERPLAN) study after extended follow-up to determine whether strict implementation of current CKD guidelines with the aid of nurse practitioners improves renal outcome. In total, 788 patients with moderate to severe CKD were randomized to receive nurse practitioner support added to physician care (intervention group) or physician care alone (control group). Median follow-up was 5.7 years. Renal outcome was a secondary endpoint of the MASTERPLAN study. We used a composite renal endpoint of death, ESRD, and a 50% increase in serum creatinine. Event rates were compared with adjustment for baseline serum creatinine concentration, and changes in estimated GFR were determined. During the randomized phase, there were small but significant differences between the groups in BP, proteinuria, LDL cholesterol, and use of aspirin, statins, active vitamin D, and antihypertensive medications, in favor of the intervention group. The intervention reduced the incidence of the composite renal endpoint by 20% (hazard ratio, 0.80; 95% confidence interval, 0.66 to 0.98; P=0.03). In the intervention group, the decrease in estimated GFR was 0.45 ml/min per 1.73 m² per year less than in the control group (P=0.01). In conclusion, additional support by nurse practitioners attenuated the decline of kidney function and improved renal outcome in patients with CKD.

    Toward a Sensible Single-antigen Bead Cutoff Based on Kidney Graft Survival

    BACKGROUND: There is no consensus in the literature on when a single-antigen bead should be considered positive for a specific HLA antibody. METHODS: To inform the debate, we studied the relationship between various single-antigen bead positivity algorithms and the impact of the resulting donor-specific HLA antibody (DSA) positivity on long-term kidney graft survival in 3237 deceased-donor transplants. RESULTS: First, we showed that interassay variability can be greatly reduced by working with signal-to-background ratios instead of absolute median fluorescence intensities (MFIs). Next, we determined pretransplant DSA using various MFI cutoffs, signal-to-background ratios, and combinations thereof. The impact of the various cutoffs was studied by comparing graft survival between the DSA-positive and DSA-negative groups. We did not observe a strong impact of the various cutoff levels on 10-year graft survival. A stronger relationship between the cutoff level and 1-year graft survival of DSA-positive transplants was found when using signal-to-background ratios, most pronounced when the bead of the same HLA locus with the lowest MFI was taken as background. CONCLUSIONS: For pretransplant risk stratification, we propose a signal-to-background ratio-6 cutoff (using the bead of the same HLA locus with the lowest MFI as background) of 15, combined with an MFI cutoff of 500, resulting in 8% and 21% lower 1- and 10-year graft survival, respectively, for the 8% of transplants that are DSA-positive.
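A minimal sketch of the proposed decision rule, assuming the abstract's description: a bead is called DSA-positive when its signal-to-background ratio (with the lowest-MFI bead of the same HLA locus as background) is at least 15 and its absolute MFI is at least 500. The bead data and function name are made up for illustration; the study's actual assay processing is more involved.

```python
# Hypothetical implementation of the cutoff proposed in the abstract:
# ratio-to-background >= 15 AND absolute MFI >= 500.

def dsa_positive(beads, locus, mfi, ratio_cutoff=15, mfi_cutoff=500):
    """beads: list of (locus, MFI) pairs for all tested beads.
    Background = lowest MFI among beads of the same locus."""
    same_locus = [m for l, m in beads if l == locus]
    background = min(same_locus)
    ratio = mfi / background if background > 0 else float("inf")
    return ratio >= ratio_cutoff and mfi >= mfi_cutoff

# Fabricated bead panel: (locus, MFI)
beads = [("A", 40), ("A", 900), ("A", 120), ("B", 60), ("B", 2500)]
print(dsa_positive(beads, "A", 900))  # 900/40 = 22.5 and 900 >= 500 -> True
print(dsa_positive(beads, "A", 120))  # 120/40 = 3.0 < 15 -> False
```

Using a within-locus background rather than a fixed MFI threshold is what reduces interassay variability, per the abstract's first result.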