198 research outputs found

    Discrepancy between self-perceived mycophenolic acid-associated diarrhea and stool water content after kidney transplantation

    BACKGROUND: Diarrhea is a well-known side effect of mycophenolic acid (MPA) use in kidney transplant recipients (KTRs). It is unknown whether self-reported diarrhea using the Modified Transplant Symptom Occurrence and Symptom Distress Scale (MTSOSD-59R) corresponds to stool water content and how both relate to MPA usage. METHODS: MTSOSD-59R questionnaires filled out by 700 KTRs from the TransplantLines Biobank and Cohort Study (NCT03272841) were analyzed and compared with stool water content. Stool samples (N=345) were freeze-dried, and a water content ≥80% was considered diarrhea. RESULTS: Self-perceived diarrhea was reported by 46% of KTRs, while stool water content ≥80% was present in 23%. MPA use was not associated with self-perceived diarrhea (odds ratio (OR) 1.32; 95% confidence interval (CI), 0.87-1.99; P=0.2), while it was associated with stool water content ≥80% (OR 2.88; 95% CI, 1.41-5.89; P=0.004), independent of potential confounders. Adjustment for prior MPA discontinuation because of severe diarrhea uncovered an association between MPA use and self-perceived diarrhea (OR 1.80; 95% CI, 1.13-2.89; P=0.01). CONCLUSIONS: These results suggest that reporting bias could contribute to the discrepancy between the two methods of diarrhea assessment. We recommend the use of objective biomarkers, or more extensive questionnaires that capture stool frequency and consistency, to investigate post-transplantation diarrhea.
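
    As a rough illustration of the objective endpoint above, stool water content can be derived from sample mass before and after freeze-drying and then dichotomized at the ≥80% threshold. The sketch below uses made-up masses, not study data.

```python
# Illustrative sketch (not the study's code): stool water content from
# freeze-drying, dichotomized at the >=80% threshold used in the abstract.
# The sample masses below are hypothetical.

def stool_water_content(wet_mass_g: float, dry_mass_g: float) -> float:
    """Water content as a percentage of the wet sample mass."""
    return 100.0 * (wet_mass_g - dry_mass_g) / wet_mass_g

wet, dry = 1.250, 0.230           # grams before and after freeze-drying (made up)
water_pct = stool_water_content(wet, dry)
has_diarrhea = water_pct >= 80.0  # objective diarrhea definition from the abstract

print(f"water content: {water_pct:.1f}% -> diarrhea: {has_diarrhea}")
```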

    Dietary lithium intake, graft failure and mortality in kidney transplant recipients

    BACKGROUND &amp; AIMS: Long-term high dose lithium therapy in bipolar disorder is known to adversely affect kidney function. However, recent animal studies revealed that low amounts of lithium are beneficial for the kidney when it is damaged by exposure to nephrotoxic compounds, inflammation, or oxidative stress. This study aimed to investigate whether urinary lithium excretion, reflecting dietary lithium intake, is associated with adverse long-term kidney graft outcomes and patient survival.METHODS: Urinary lithium concentration was measured using inductively coupled plasma-mass-spectrometry in 642 stable kidney transplant recipients. Graft failure was defined as start of dialysis or re-transplantation, and kidney function decline was defined as doubling of serum creatinine.RESULTS: Median [interquartile range] urinary lithium excretion was 3.03 [2.31-4.01] μmol/24 h. Urinary lithium excretion was associated with energy, plant protein and water intake. During a median follow-up of 5.3 [4.5-6.0] years, 79 (12%) KTR developed graft failure and 127 (20%) KTR developed kidney function decline. Higher urinary lithium excretion was associated with lower risk of graft failure (hazard ratio [95% confidence interval]: 0.54 [0.38-0.79] per log2 μmol/24 h) and kidney function decline (HR [95% CI]: 0.73 [0.54-0.99] per log2 μmol/24 h). These associations remained independent of adjustment for potential confounders and in sensitivity analyses. There was significant effect modification by use of proliferation inhibitors (P = 0.05) and baseline eGFR (P &lt; 0.001), with higher urinary lithium excretion being more protective in KTR not using proliferation inhibitors and in KTR with lower baseline eGFR. Furthermore, higher urinary lithium excretion was associated with reduced risk of all-cause mortality (HR [95% CI]: 0.64 [0.49-0.83]; P = 0.001).CONCLUSION: Dietary lithium intake may be a potentially modifiable-yet rather overlooked-risk factor for adverse long-term kidney graft outcomes and patient survival.</p

    Macroorchidism in FMR1 knockout mice is caused by increased Sertoli cell proliferation during testicular development

    The fragile X syndrome is the most frequent hereditary form of mental retardation. This X-linked disorder is, in most cases, caused by an unstable and expanding trinucleotide CGG repeat located in the 5'-untranslated region of the gene involved, the fragile X mental retardation 1 (FMR1) gene. Expansion of the CGG repeat to a length of more than 200 trinucleotides results in silencing of the FMR1 gene promoter and, thus, in an inactive gene. The clinical features of male fragile X patients include mental retardation and macroorchidism.

    Urinary 3-hydroxyisovaleryl carnitine excretion, protein energy malnutrition and risk of all-cause mortality in kidney transplant recipients: Results from the TransplantLines cohort studies

    Background: Leucine is an essential amino acid and a potent stimulator of muscle protein synthesis. Since muscle wasting is a major risk factor for mortality in kidney transplant recipients (KTR), dietary leucine intake might be linked to long-term mortality. Urinary 3-hydroxyisovaleryl carnitine (3-HIC) excretion, a functional marker of marginal biotin deficiency, may also serve as a marker for dietary leucine intake. Objective: In this study we aimed to investigate the cross-sectional determinants of urinary 3-HIC excretion and to prospectively investigate the association of urinary 3-HIC excretion with all-cause mortality in KTR. Design: Urinary 3-HIC excretion and plasma biotin were measured in a longitudinal cohort of 694 stable KTR. Cross-sectional and prospective analyses were performed using ordinary least squares linear regression analyses and Cox regression analyses, respectively. Results: In KTR (57% male, 53 ± 13 years, estimated glomerular filtration rate 45 ± 19 mL/min/1.73 m²), urinary 3-HIC excretion (0.80 [0.57-1.16] µmol/24 h) was significantly associated with plasma biotin (std. beta = -0.17; P 45%. During a median follow-up of 5.4 [4.8-6.1] years, 150 (22%) patients died. Log2-transformed urinary 3-HIC excretion was inversely associated with all-cause mortality (HR: 0.52 [0.43-0.63]; P < 0.001). This association was independent of potential confounders. Conclusions: Urinary 3-HIC excretion more strongly serves as a marker of leucine intake than of biotin status. A higher urinary 3-HIC excretion is associated with a lower risk of all-cause mortality. Future studies are warranted to explore the underlying mechanism. © 2020 The Authors. Published by Elsevier Ltd.
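
    The cross-sectional association with plasma biotin above is reported as a standardized beta. The sketch below shows, on simulated data, how such a coefficient can be obtained by z-scoring both predictor and outcome before an ordinary least squares fit; the variable names and values are hypothetical.

```python
# Minimal sketch (not the study's code) of a standardized beta: an OLS slope
# after z-scoring predictor and outcome, so the coefficient is in SD units.
# Simulated data only; no real measurements are used.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
hic_excretion = rng.lognormal(mean=-0.2, sigma=0.4, size=200)  # made-up µmol/24 h
plasma_biotin = 1.2 - 0.1 * np.log2(hic_excretion) + rng.normal(0, 0.3, 200)

def zscore(a):
    return (a - a.mean()) / a.std()

X = sm.add_constant(zscore(np.log2(hic_excretion)))
res = sm.OLS(zscore(plasma_biotin), X).fit()
print(f"std. beta = {res.params[1]:.2f}")  # slope in SD units of the outcome
```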

    Urinary Carnosinase-1 Excretion is Associated with Urinary Carnosine Depletion and Risk of Graft Failure in Kidney Transplant Recipients: Results of the TransplantLines Cohort Study

    Carnosine affords protection against oxidative and carbonyl stress, yet high concentrations of the carnosinase-1 enzyme may limit this. We recently reported that high urinary carnosinase-1 is associated with kidney function decline and albuminuria in patients with chronic kidney disease. We prospectively investigated whether urinary carnosinase-1 is associated with a high risk for development of late graft failure in kidney transplant recipients (KTRs). Carnosine and carnosinase-1 were measured in 24-h urine in a longitudinal cohort of 703 stable KTRs and 257 healthy controls. Cox regression was used to analyze the prospective data. Urinary carnosine excretions were significantly decreased in KTRs (26.5 [IQR 21.4-33.3] µmol/24 h versus 34.8 [IQR 25.6-46.8] µmol/24 h; p < 0.001). In KTRs, high urinary carnosinase-1 concentrations were associated with increased risk of undetectable urinary carnosine (OR 1.24, 95% CI [1.06-1.45]; p = 0.007). During a median follow-up of 5.3 [4.5-6.0] years, 84 (12%) KTRs developed graft failure. In Cox regression analyses, high urinary carnosinase-1 excretions were associated with increased risk of graft failure (HR 1.73, 95% CI [1.44-2.08]; p < 0.001) independent of potential confounders. Since urinary carnosine is depleted and urinary carnosinase-1 imparts a higher risk for graft failure in KTRs, future studies determining the potential of carnosine supplementation in these patients are warranted.
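
    The link between urinary carnosinase-1 and undetectable urinary carnosine above is reported as an odds ratio. The sketch below shows, on simulated data, how such an odds ratio and its confidence interval can be obtained from a logistic regression; the variable names, units and values are hypothetical.

```python
# Minimal sketch (not the study's code): logistic regression giving an odds
# ratio per unit of urinary carnosinase-1 for undetectable urinary carnosine.
# Simulated data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
carnosinase = rng.gamma(shape=2.0, scale=1.5, size=300)  # made-up units
logit_p = -2.0 + 0.25 * carnosinase
carnosine_undetectable = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(carnosinase)
res = sm.Logit(carnosine_undetectable, X).fit(disp=0)

or_per_unit = np.exp(res.params[1])
ci_low, ci_high = np.exp(res.conf_int()[1])
print(f"OR {or_per_unit:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```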

    Comparison of two methods for the assessment of intra-erythrocyte magnesium and its determinants: Results from the LifeLines cohort study

    BACKGROUND: Direct methods for the assessment of intra-erythrocyte magnesium (dIEM) require extensive sample preparation, making them labor intensive. An alternative, less labor-intensive method is indirect calculation of intra-erythrocyte magnesium (iIEM). We compared dIEM and iIEM and studied determinants of dIEM and iIEM, plasma magnesium and 24-h urinary magnesium excretion in a large population-based cohort study. METHODS: dIEM and iIEM were measured using a validated inductively coupled plasma mass spectrometry (ICP-MS) method in 1669 individuals from the second screening of the LifeLines Cohort Study. We used linear regression analyses to study the determinants of IEM, plasma magnesium and 24-h urinary magnesium excretion. RESULTS: Mean dIEM and iIEM were 0.20 ± 0.04 mmol/10¹² cells and 0.25 ± 0.04 mmol/10¹² cells, respectively. We found a strong correlation between dIEM and iIEM (r = 0.75). Passing-Bablok regression analyses showed an intercept of 0.015 (95% CI: 0.005; 0.023) and a slope of 1.157 (95% CI: 1.109; 1.210). In linear regression analyses, plasma levels of total and LDL cholesterol and triglycerides were positively associated with dIEM, iIEM, and plasma magnesium, while glucose and HbA1c were inversely associated with plasma magnesium. CONCLUSIONS: We observed a strong correlation between dIEM and iIEM, suggesting that iIEM is a reliable alternative to the labor-intensive dIEM method.
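
    Passing-Bablok regression has no implementation in scipy, so the sketch below uses the closely related Theil-Sen estimator (median of pairwise slopes) together with Pearson correlation as a stand-in for the method-comparison step. The paired dIEM/iIEM values are simulated to roughly mimic the slope and intercept reported above.

```python
# Illustrative method-comparison sketch (not the study's code). Theil-Sen is
# used here as a robust stand-in for Passing-Bablok regression; the data are
# simulated around the slope (1.157) and intercept (0.015) reported above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
diem = rng.normal(0.20, 0.04, size=150)                  # mmol/10^12 cells, made up
iiem = 0.015 + 1.157 * diem + rng.normal(0, 0.02, 150)   # mimics the reported fit

r, _ = stats.pearsonr(diem, iiem)
slope, intercept, lo_slope, hi_slope = stats.theilslopes(iiem, diem)
print(f"r = {r:.2f}; slope = {slope:.3f} (95% CI {lo_slope:.3f}-{hi_slope:.3f}), "
      f"intercept = {intercept:.3f}")
```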

    Amino Acid Homeostasis and Fatigue in Chronic Hemodialysis Patients

    Patients dependent on chronic hemodialysis treatment are prone to malnutrition, at least in part due to insufficient nutrient intake, metabolic derangements, and chronic inflammation. Losses of amino acids during hemodialysis may be an important additional contributor. In this study, we assessed changes in plasma amino acid concentrations during hemodialysis, quantified intradialytic amino acid losses, and investigated whether plasma amino acid concentrations and amino acid losses by hemodialysis and urinary excretion are associated with fatigue. The study included a total of 59 hemodialysis patients (65 ± 15 years, 63% male) and 33 healthy kidney donors as controls (54 ± 10 years, 45% male). Total plasma essential amino acid concentration before hemodialysis was lower in hemodialysis patients compared with controls (p = 0.006), while total non-essential amino acid concentration did not differ. Daily amino acid losses were 4.0 ± 1.3 g/24 h for hemodialysis patients and 0.6 ± 0.3 g/24 h for controls. Expressed as a proportion of protein intake, daily amino acid losses of hemodialysis patients were 6.7 ± 2.4% of the total protein intake, compared to 0.7 ± 0.3% for controls (p < 0.001). Multivariable regression analyses demonstrated that hemodialysis efficacy (Kt/V) was the primary determinant of amino acid losses (Std. beta = 0.51; p < 0.001). In logistic regression analyses, higher plasma proline concentrations were associated with higher odds of severe fatigue (OR (95% CI) per SD increment: 3.0 (1.3; 9.3); p = 0.03), while higher taurine concentrations were associated with lower odds of severe fatigue (OR (95% CI) per log2 increment: 0.3 (0.1; 0.7); p = 0.01). Similarly, higher daily taurine losses were also associated with lower odds of severe fatigue (OR (95% CI) per log2 increment: 0.64 (0.42; 0.93); p = 0.03). Lastly, a higher protein intake was associated with lower odds of severe fatigue (OR (95% CI) per SD increment: 0.2 (0.04; 0.5); p = 0.007). Future studies are warranted to investigate the mechanisms underlying these associations and to investigate the potential of taurine supplementation.
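
    As a back-of-the-envelope check on the figures above, expressing the mean intradialytic amino acid loss as a share of protein intake needs only the reported 4.0 g/24 h loss and a daily protein intake; the intake below is a hypothetical 60 g/day, chosen so the result lands near the reported 6.7%.

```python
# Back-of-the-envelope sketch: intradialytic amino acid losses as a share of
# daily protein intake. The 4.0 g/24 h loss is the cohort mean reported above;
# the protein intake is a hypothetical value.
daily_aa_loss_g = 4.0        # mean amino acid loss in hemodialysis patients
protein_intake_g = 60.0      # hypothetical daily protein intake
loss_pct = 100.0 * daily_aa_loss_g / protein_intake_g
print(f"amino acid losses: {loss_pct:.1f}% of protein intake")  # ~6.7%
```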