131 research outputs found

    Dietary protein intake and kidney function decline after myocardial infarction: the Alpha Omega Cohort

    BACKGROUND: Post-myocardial infarction (MI) patients have a doubled rate of kidney function decline compared with the general population. We investigated the extent to which high intakes of total, animal and plant protein are risk factors for accelerated kidney function decline in older stable post-MI patients. METHODS: We analysed 2255 post-MI patients (aged 60-80 years, 80% men) of the Alpha Omega Cohort. Dietary data were collected with a biomarker-validated 203-item food frequency questionnaire. At baseline and 41 months, we estimated the glomerular filtration rate with the Chronic Kidney Disease Epidemiology Collaboration equations based on serum cystatin C alone [estimated glomerular filtration rate (eGFRcysC)] and on both creatinine and cystatin C (eGFRcr-cysC). RESULTS: Mean [standard deviation (SD)] baseline eGFRcysC and eGFRcr-cysC were 82 (20) and 79 (19) mL/min/1.73 m2, respectively. Of all patients, 16% were current smokers and 19% had diabetes. Mean (SD) total protein intake was 71 (19) g/day, of which two-thirds was animal and one-third plant protein. After multivariable adjustment, including age, sex, total energy intake, smoking, diabetes, systolic blood pressure, renin-angiotensin system blocking drugs and fat intake, each incremental total daily protein intake of 0.1 g/kg ideal body weight was associated with an additional annual eGFRcysC decline of -0.12 (95% confidence interval -0.19 to -0.04) mL/min/1.73 m2; associations were similar for animal and plant protein. Patients with a daily total protein intake of ≥1.20 compared with <0.80 g/kg ideal body weight had a 2-fold faster annual eGFRcysC decline of -1.60 versus -0.84 mL/min/1.73 m2. Taking eGFRcr-cysC as the outcome showed similar results, and strong linear associations were confirmed by restricted cubic spline analyses. CONCLUSION: A higher protein intake was significantly associated with a more rapid kidney function decline in post-MI patients.
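
    The CKD-EPI cystatin C equation cited in the Methods has a published closed form (the 2012 version); a minimal sketch, assuming serum cystatin C is given in mg/L:

```python
def egfr_cys_c(scys_mg_l: float, age: float, female: bool) -> float:
    """CKD-EPI 2012 cystatin C equation, returning eGFR in mL/min/1.73 m2."""
    ratio = scys_mg_l / 0.8
    egfr = 133.0 * min(ratio, 1.0) ** -0.499 * max(ratio, 1.0) ** -1.328 * 0.996 ** age
    return egfr * 0.932 if female else egfr
```

    The combined creatinine-cystatin C equation (eGFRcr-cysC) follows the same min/max knot pattern with additional creatinine and sex-specific terms.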

    Peripheral Blood Immune Cell Composition After Autologous MSC Infusion in Kidney Transplantation Recipients

    Tacrolimus is the backbone of immunosuppressive regimens to prevent transplant rejection. Paradoxically, tacrolimus is nephrotoxic, causing irreversible tubulointerstitial damage. Therefore, infusion of mesenchymal stromal cells (MSC) at 6 and 7 weeks post-transplantation was assessed to facilitate withdrawal of tacrolimus in the randomized phase II TRITON trial. Here, we performed a detailed analysis of the peripheral blood immune composition using mass cytometry to assess potential effects of MSC therapy on the immune system. We developed two metal-conjugated antibody panels containing 40 antibodies each. PBMC samples from 21 MSC-treated patients and 13 controls, obtained pre-transplant and at 24 and 52 weeks post-transplantation, were analyzed. In the MSC group at 24 weeks, 17 CD4+ T cell clusters were increased, of which 14 were Th2-like and three were Th1/Th2-like, as were CD4+FoxP3+ Tregs. Additionally, five B cell clusters were increased, representing either class-switched memory B cells or proliferating B cells. At 52 weeks, CCR7+CD38+ mature B cells were decreased. Finally, eight Tc1 (effector) memory cytotoxic T cell clusters were increased. Our work provides a comprehensive account of the peripheral blood immune cell composition in kidney transplant recipients after MSC therapy and tacrolimus withdrawal. These results may help improve therapeutic strategies using MSCs with the aim of reducing the use of calcineurin inhibitors. Clinical Trial Registration: ClinicalTrials.gov, identifier NCT02057965.

    Serum bile acids associate with liver volume in polycystic liver disease and decrease upon treatment with lanreotide

    Background: Polycystic liver disease (PLD) is a common extrarenal manifestation of autosomal dominant polycystic kidney disease (ADPKD). Bile acids may play a role in PLD pathogenesis. We performed a post-hoc exploratory analysis of bile acids in ADPKD patients who had participated in a trial on the effect of a somatostatin analogue. Our hypothesis was that serum bile acid levels increase in PLD, and that lanreotide, which reduces liver growth, may also reduce bile acid levels. Furthermore, in PLD, urinary excretion of bile acids might contribute to renal disease. Methods: With liquid chromatography-mass spectrometry, 11 bile acids in serum and 6 in urine were quantified in 105 PLD ADPKD patients and 52 age-, sex-, mutation- and eGFR-matched non-PLD ADPKD patients. Sampling was done at baseline and after 120 weeks of either lanreotide or standard care. Results: Baseline serum levels of taurine- and glycine-conjugated bile acids were higher in patients with larger livers. In PLD patients, multiple bile acids decreased upon treatment with lanreotide but remained stable in untreated subjects. Changes over time did not correlate with changes in liver volume. Urine bile acid levels did not change and did not correlate with renal disease progression. Conclusion: In ADPKD patients with PLD, baseline serum bile acids were associated with liver volume. Lanreotide reduced bile acid levels and has previously been shown to reduce liver volume. However, in this study, the decrease in bile acids was not associated with the change in liver volume.

    AUC-guided dosing of tacrolimus prevents progressive systemic overexposure in renal transplant recipients

    Background: Tacrolimus has a narrow therapeutic window, and bioavailability is known to vary considerably between renal transplant recipients. Most centers still rely on measurement of trough levels, but there are conflicting reports on the correlation between tacrolimus trough levels and systemic exposure, as measured by the area under the concentration-over-time curve (AUC(0-12h)). Methods: We developed and validated a two-compartmental population-based pharmacokinetic model with Bayesian estimation of tacrolimus systemic exposure. Subsequently, we used this model to apply prospectively AUC-guided dosing of tacrolimus in 15 consecutive renal transplant recipients. The main objective was to study intrapatient variability over time. Results: Bayesian forecasting with a two-point sampling strategy, a trough level and a second sample obtained between two and four hours post-dose, significantly improved the squared correlation with the AUC(0-12h) (r2 = 0.94). Compared with trough level monitoring only, this approach reduced the 95% prediction interval by 50%. The Bayesian approach proved to be feasible in clinical practice and provided accurate information about systemic tacrolimus exposure in individual patients. In the AUC-guided dosing cohort, the apparent clearance of tacrolimus decreased gradually over time, which was not reflected in corresponding trough levels. Conclusion: This simple, flexible method provides the opportunity to tailor immunosuppression and should help minimize tacrolimus-related toxicity, such as nephrotoxicity and post-transplant diabetes mellitus.
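
    The Bayesian two-compartment model itself is beyond the scope of an abstract, but the reference exposure measure it targets, AUC(0-12h), is conventionally computed from a measured concentration-time profile with the linear trapezoidal rule; a generic sketch with hypothetical sampling times and concentrations:

```python
def auc_trapezoidal(times_h, conc_ng_ml):
    """Linear trapezoidal AUC over the sampled interval, in ng*h/mL."""
    auc = 0.0
    for i in range(1, len(times_h)):
        auc += (times_h[i] - times_h[i - 1]) * (conc_ng_ml[i] + conc_ng_ml[i - 1]) / 2.0
    return auc

# hypothetical 12-hour tacrolimus profile (h, ng/mL)
times = [0, 1, 2, 4, 6, 8, 12]
concs = [5.0, 18.0, 14.0, 10.0, 8.0, 7.0, 5.5]
auc_0_12 = auc_trapezoidal(times, concs)  # 109.5 ng*h/mL
```

    The limited-sampling strategy in the study replaces such a full profile with two samples plus a population model, which is what makes AUC-guided dosing practical in the clinic.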

    Hepatic Cyst Infection During Use of the Somatostatin Analog Lanreotide in Autosomal Dominant Polycystic Kidney Disease: An Interim Analysis of the Randomized Open-Label Multicenter DIPAK-1 Study

    Introduction and Aims: The DIPAK-1 Study investigates the reno- and hepatoprotective efficacy of the somatostatin analog lanreotide compared with standard care in patients with later stage autosomal dominant polycystic kidney disease (ADPKD). During this trial, we witnessed several episodes of hepatic cyst infection, all during lanreotide treatment. We describe these events and provide a review of the literature. Methods: The DIPAK-1 Study is an ongoing investigator-driven, randomized, controlled, open-label multicenter trial. Patients (ADPKD, ages 18–60 years, estimated glomerular filtration rate 30–60 mL/min/1.73 m2) were randomized 1:1 to receive lanreotide 120 mg subcutaneously every 28 days or standard care during 120 weeks. Hepatic cyst infection was diagnosed by local physicians. Results: We included 309 ADPKD patients, of whom seven (median age 53 years [interquartile range: 48–55], 71% female, median estimated glomerular filtration rate 42 mL/min/1.73 m2 [interquartile range: 41–58]) developed eight episodes of hepatic cyst infection during 342 patient-years of lanreotide use (0.23 cases per 10 patient-years). These events were limited to patients receiving lanreotide (p < 0.001 vs. standard care). Baseline characteristics were similar between subjects who did or did not develop a hepatic cyst infection during lanreotide use, except for a history of hepatic cyst infection (29 vs. 0.7%, p < 0.001). Previous studies with somatostatin analogs reported cyst infections, but did not identify a causal relationship. Conclusions: These data suggest an increased risk for hepatic cyst infection during use of somatostatin analogs, especially in ADPKD patients with a history of hepatic cyst infection. The main results are still awaited to fully appreciate the risk–benefit ratio. ClinicalTrials.gov identifier: NCT01616927

    Salt, but not protein intake, is associated with accelerated disease progression in autosomal dominant polycystic kidney disease

    In autosomal dominant polycystic kidney disease (ADPKD), there are only scarce data on the effect of salt and protein intake on disease progression. Here we studied the association of these dietary factors with the rate of disease progression in ADPKD, and which factors mediate it, by analyzing an observational cohort of 589 patients with ADPKD. Salt and protein intake were estimated from 24-hour urine samples, and the plasma copeptin concentration was measured as a surrogate for vasopressin. The association of dietary intake with annual change in the estimated glomerular filtration rate (eGFR) and height-adjusted total kidney volume (htTKV) growth was analyzed with mixed models. In case of significant associations, mediation analyses were performed to elucidate potential mechanisms. These patients (59% female) had a mean baseline age of 47 years, a mean eGFR of 64 mL/min/1.73m2 and a median htTKV of 880 mL. The mean estimated salt intake was 9.1 g/day and protein intake 84 g/day. During a median follow-up of 4.0 years, eGFR was assessed a median of six times and 24-hour urine was collected a median of five times. Salt intake was significantly associated with an annual change in eGFR of -0.11 (95% confidence interval -0.20 to -0.02) mL/min/1.73m2 per gram of salt, whereas protein intake was not (-0.00001 (95% confidence interval -0.01 to 0.01) mL/min/1.73m2 per gram of protein). The effect of salt intake on eGFR slope was significantly mediated by plasma copeptin (crude analysis: 77% mediation; adjusted analysis: 45% mediation), but not by systolic blood pressure. Thus, higher salt, but not higher protein, intake may be detrimental in ADPKD. The substantial mediation by plasma copeptin suggests that this effect is primarily a consequence of a salt-induced rise in vasopressin.
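
    The percentage mediation reported above is, in the standard difference method, the indirect effect divided by the total effect; a toy illustration in which the total effect is the abstract's -0.11 mL/min/1.73m2 per gram of salt and the direct effect is a hypothetical value chosen only for the example:

```python
def proportion_mediated(total_effect: float, direct_effect: float) -> float:
    """Difference-method mediation share: (total - direct) / total."""
    return (total_effect - direct_effect) / total_effect

# total salt effect from the abstract; direct effect here is hypothetical
share = proportion_mediated(-0.11, -0.025)  # ~0.77, i.e. ~77% mediated
```

    A share near 1 means the mediator (here, copeptin) accounts for nearly all of the exposure-outcome association.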

    Conversion from calcineurin inhibitor to belatacept-based maintenance immunosuppression in renal transplant recipients:A randomized phase 3b Trial

    Significance Statement This randomized trial demonstrates the safety and efficacy of conversion from calcineurin inhibitor (CNI)- to belatacept-based maintenance immunosuppression in renal transplant recipients 6-60 months post-transplant. Patients converted to belatacept showed sustained improvement in renal function associated with an acceptable safety profile consistent with prior experience, and a smaller treatment difference in acute rejection postconversion compared with that observed in earlier studies in de novo renal allograft recipients. These results favor the use of belatacept as an alternative to continued long-term CNI-based maintenance immunosuppression, which is particularly relevant for CNI-intolerant patients, including those who experience nephrotoxicity. These data help inform clinical practice guidelines regarding the conversion of such patients to an alternative immunosuppressive drug regimen. Background: Calcineurin inhibitors (CNIs) are standard of care after kidney transplantation, but they are associated with nephrotoxicity and reduced long-term graft survival. Belatacept, a selective T cell costimulation blocker, is approved for the prophylaxis of kidney transplant rejection. This phase 3 trial evaluated the efficacy and safety of conversion from CNI-based to belatacept-based maintenance immunosuppression in kidney transplant recipients. Methods: Stable adult kidney transplant recipients 6-60 months post-transplantation under CNI-based immunosuppression were randomized (1:1) to switch to belatacept or continue treatment with their established CNI. The primary end point was the percentage of patients surviving with a functioning graft at 24 months. Results: Overall, 446 renal transplant recipients were randomized to belatacept conversion (n=223) or CNI continuation (n=223). The 24-month rates of survival with graft function were 98% and 97% in the belatacept and CNI groups, respectively (adjusted difference, 0.8; 95.1% CI, -2.1 to 3.7).
In the belatacept conversion versus CNI continuation groups, 8% versus 4% of patients experienced biopsy-proven acute rejection (BPAR), respectively, and 1% versus 7% developed de novo donor-specific antibodies (dnDSAs), respectively. The 24-month eGFR was higher with belatacept (55.5 versus 48.5 mL/min per 1.73 m2 with CNI). Both groups had similar rates of serious adverse events, infections, and discontinuations, with no unexpected adverse events. One patient in the belatacept group had post-transplant lymphoproliferative disorder. Conclusions: Switching stable renal transplant recipients from CNI-based to belatacept-based immunosuppression was associated with a similar rate of death or graft loss, improved renal function, and a numerically higher BPAR rate but a lower incidence of dnDSA. Clinical trial registry name and registration number: A Study in Maintenance Kidney Transplant Recipients Following Conversion to Nulojix (Belatacept)-Based Immunosuppression, NCT01820572.

    Body-fat indicators and kidney function decline in older post-myocardial infarction patients:The Alpha Omega Cohort Study

    Background: Obesity increases the risk of hypertension and diabetes, the leading causes of end-stage renal disease. The effect of obesity on kidney function decline in stable post-myocardial infarction patients is poorly documented. This relation was investigated in a large cohort of older post-myocardial infarction patients. Design: Data were analysed from 2410 post-myocardial infarction patients in the Alpha Omega Trial, aged 60–80 years and receiving optimal pharmacotherapy treatment (79% men, 18% diabetes). Methods: Cystatin C based estimated glomerular filtration rate (eGFRcysC) was calculated at baseline and after 41 months, using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation. Obesity was defined as a body mass index ≥30 kg/m2, and high waist circumference as ≥102 cm for men and ≥88 cm for women. The relation between body mass index, waist circumference and annual eGFRcysC decline was evaluated by linear regression. Results: At baseline, mean (standard deviation) eGFRcysC was 81.5 (19.6) mL/min/1.73 m2, and 23% of all patients were obese. After multivariable adjustment, the annual mean (95% confidence interval) eGFRcysC decline in men and women was –1.45 (–1.59 to –1.31) and –0.92 (–1.20 to –0.63) mL/min/1.73 m2, respectively (p = 0.001). Obese versus non-obese patients, and patients with high versus normal waist circumference, experienced greater annual eGFRcysC decline. Men and women showed an additional annual eGFRcysC decline of –0.35 (–0.56 to –0.14) and –0.21 (–0.55 to 0.14) mL/min/1.73 m2 per 5 kg/m2 body mass index increment (p for interaction 0.3). Conclusions: A high compared with normal body mass index or waist circumference was associated with more rapid kidney function decline in older stable post-myocardial infarction patients receiving optimal drug therapy.