23 research outputs found

    In situ immobilization of cadmium and zinc in contaminated soils: fiction or fixation?

    Keywords: beringite, cadmium, DOC, DOM, earthworms, immobilization, leaching, lime, manganese oxides, metal binding, metal uptake, organic matter partitioning, pH, soil contamination, remediation, sorption, Swiss chard, zeolites, zinc.

    It is generally assumed that a decrease in the metal concentration in the soil solution reduces metal leaching, as well as metal uptake by and toxicity to plants and soil organisms. In situ immobilization is a soil remediation technique that aims to reduce the metal concentration in the soil solution by adding a binding material to the soil. Application of this technique requires an understanding of the underlying mechanisms and potential side effects. Both laboratory experiments and model calculations were performed to gain insight into the immobilizing processes. It is essential to quantify metal binding to natural organic matter. The NICA-Donnan model was designed to calculate metal binding by organic materials, but specific Zn parameters were not available due to a lack of analytical data. The Wageningen Donnan Membrane Technique (WDMT) was therefore further developed to measure free Zn concentrations in humic acid solutions. Many immobilizing materials increase the soil pH. This results in an increased negative charge of soil particles and hence a decreased metal mobility. In some cases, the addition of alkaline materials simultaneously increases the dissolved organic matter (DOM) concentration in the soil solution, resulting in increased leaching of metal-DOM complexes. We showed that alkaline soil amendments need to contain enough Ca to suppress the dispersion of organic matter induced by a pH increase. We also quantified the dispersion of organic matter: the Donnan potential of the organic matter, as calculated by the NICA-Donnan model, correlated very well with the DOM concentration. The addition of alkaline materials strongly decreased the metal concentration in Swiss chard (Beta vulgaris L. var. cicla).
In contrast, the uptake of Cd and Zn by earthworms (Lumbricus rubellus and Eisenia veneta) was hardly influenced by the addition of alkaline materials. Another experiment showed that the addition of MnO2, which did not affect soil pH, resulted in a decreased Cd concentration in earthworm tissue. Apparently, in addition to dermal uptake, pH-independent Cd uptake via the intestine was an important uptake route. Cd uptake by earthworms could be estimated by a soil extraction with 0.1 M triethanolamine and 0.01 M CaCl2 adjusted to pH 7.2.
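
    The abstract above relies on the NICA-Donnan model for metal binding to humic substances. As a rough illustration of the kind of isotherm involved (the parameters below are hypothetical, not the thesis's fitted Zn values), the one-component, non-competitive limit of the NICA equation reduces to a Langmuir-Freundlich isotherm:

```python
def langmuir_freundlich(c, q_max, k, n):
    """Amount bound q as a function of free metal concentration c.

    One-component Langmuir-Freundlich isotherm, the non-competitive
    limit of the NICA equation. q_max (site density), k (median
    affinity), and n (heterogeneity, 0 < n <= 1) are illustrative
    parameters, not fitted Zn/humic-acid values.
    """
    x = (k * c) ** n
    return q_max * x / (1.0 + x)

# Binding rises with free metal concentration and saturates at q_max;
# at k*c = 1 exactly half of the sites are occupied.
q_low = langmuir_freundlich(1e-7, q_max=1.0, k=1e5, n=0.7)
q_high = langmuir_freundlich(1e-2, q_max=1.0, k=1e5, n=0.7)
```

    The full NICA-Donnan model adds competition between ions (e.g. Zn, Cd, Ca, H) and a Donnan electrostatic term; this sketch shows only the affinity/heterogeneity part.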

    Association of Circulating Trimethylamine N-Oxide and Its Dietary Determinants with the Risk of Kidney Graft Failure: Results of the TransplantLines Cohort Study

    BACKGROUND: Due to the critical shortage of kidneys for transplantation, the identification of modifiable factors related to graft failure is highly desirable. The role of trimethylamine-N-oxide (TMAO) in graft failure remains undetermined. Here, we investigated the clinical utility of TMAO and its dietary determinants for graft failure prediction in renal transplant recipients (RTRs). METHODS: We included 448 RTRs who participated in the TransplantLines Cohort Study. Cox proportional-hazards regression analyses were performed to study the association of plasma TMAO with graft failure. Net Benefit, a decision-analysis method, was used to evaluate the clinical utility of TMAO and dietary information in the prediction of graft failure. RESULTS: Among RTRs (age 52.7 ± 13.1 years; 53% male), the baseline median TMAO was 5.6 (interquartile range [IQR], 3.0-10.2) µmol/L. In multivariable regression analysis, the most important dietary determinants of TMAO were egg intake (Std. β = 0.09 [95% CI, 0.01-0.18]; p = 0.03), fiber intake (Std. β = -0.14 [95% CI, -0.22 to -0.05]; p = 0.002), and fish and seafood intake (Std. β = 0.12 [95% CI, 0.03-0.21]; p = 0.01). After a median follow-up of 5.3 (IQR, 4.5-6.0) years, graft failure was observed in 58 subjects. TMAO was associated with an increased risk of graft failure, independent of age, sex, body mass index (BMI), blood pressure, lipids, albuminuria, and estimated glomerular filtration rate (eGFR) (hazard ratio per 1-SD increase in TMAO, 1.62; 95% confidence interval [CI], 1.22-2.14; p < 0.001). A prediction model enhanced with TMAO and dietary information offered approximately double the Net Benefit of a previously reported, validated prediction model for future graft failure, allowing the detection of 21 RTRs per 100 RTRs tested with no false positives, versus 10 RTRs, respectively.
CONCLUSIONS: A predictive model for graft failure enriched with TMAO and its dietary determinants yielded a higher Net Benefit than an already validated model. This study suggests that TMAO and its dietary determinants are associated with an increased risk of graft failure and that this information is clinically meaningful.
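
    Net Benefit, the decision-analysis measure used above, weighs true positives against false positives at a chosen risk threshold p_t: NB = TP/n - (FP/n) * p_t / (1 - p_t). A minimal sketch with illustrative numbers (the threshold and counts are assumptions, not the study's data):

```python
def net_benefit(tp, fp, n, threshold):
    """Decision-curve Net Benefit at risk threshold `threshold`.

    NB = TP/n - (FP/n) * threshold / (1 - threshold),
    i.e. false positives are discounted by the odds at the threshold.
    """
    return tp / n - (fp / n) * threshold / (1.0 - threshold)

# Illustrative counts echoing the abstract (threshold of 0.2 assumed):
# detecting 21 true positives per 100 tested with no false positives
# gives roughly double the Net Benefit of detecting 10.
nb_enhanced = net_benefit(tp=21, fp=0, n=100, threshold=0.2)
nb_reference = net_benefit(tp=10, fp=0, n=100, threshold=0.2)
```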

    Post-transplant obesity impacts long-term survival after liver transplantation

    Background: Short-term survival after orthotopic liver transplantation (OLT) has improved over the past decades, but long-term survival remains impaired. The effects of obesity on long-term survival after OLT are controversial. Because pre-transplant body mass index (BMI) can be confounded by ascites, we hypothesized that post-transplant BMI at 1 year could predict long-term survival. Methods: A post-hoc analysis was performed of an observational cohort study of adult recipients of a first OLT between 1993 and 2010. Baseline BMI was measured at 1 year post-transplantation to represent a stable condition. Recipients were stratified into normal weight (BMI < 25 kg/m2), overweight (BMI 25-30 kg/m2), and obese (BMI > 30 kg/m2). Kaplan-Meier survival analyses were performed with log-rank testing, followed by multivariable Cox proportional hazards regression analysis. Results: Of 370 included recipients, 184 had normal weight, 136 were overweight, and 50 were obese at 1 year post-transplantation. After a median follow-up of 12.3 years, 107 recipients had died, of whom 46 (25%) had normal weight, 39 (29%) were overweight, and 22 (44%) were obese (log-rank P = 0.020). Obese recipients had a significantly increased mortality risk compared to normal-weight recipients (HR 2.00, 95% CI 1.08-3.68, P = 0.027). BMI was inversely associated with 15-year patient survival (HR 1.08 per kg/m2, 95% CI 1.03-1.14, P = 0.001), independent of age, gender, muscle mass, transplant characteristics, cardiovascular risk factors, and kidney and liver function. Conclusion: Obesity at 1 year post-transplantation conveys a 2-fold increased mortality risk, which may offer potential for interventional strategies (i.e., dietary advice, lifestyle modification, or bariatric surgery) to improve long-term survival after OLT.
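
    The Kaplan-Meier analysis used above estimates survival as a product over event times, S(t) = prod_i (1 - d_i / n_i), where d_i deaths occur among n_i subjects still at risk. A minimal, self-contained sketch on toy data (not the study cohort):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve from follow-up times and event flags.

    times: follow-up time per subject; events: 1 = died, 0 = censored.
    Returns a list of (event time, survival probability) steps.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        n_t = 0
        # Group all subjects sharing this time point.
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            n_t += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= n_t  # deaths and censorings both leave the risk set
    return curve

# Toy data: one death at t=2 (1 of 4 still at risk) and one at t=4
# (1 of 2 at risk); the other subjects are censored.
curve = kaplan_meier([1, 2, 3, 4, 5], [0, 1, 0, 1, 0])
```

    The log-rank test then compares such curves between the BMI strata; libraries like lifelines or R's survival package implement both.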

    High-density lipoprotein particles and their relationship to posttransplantation diabetes mellitus in renal transplant recipients

    High concentrations of high-density lipoprotein (HDL) cholesterol are likely associated with a lower risk of posttransplantation diabetes mellitus (PTDM). However, HDL particles vary in size and density, with as yet unestablished associations with PTDM risk. The aim of our study was to determine the association between different HDL particles and the development of PTDM in renal transplant recipients (RTRs). We included 351 stable outpatient adult RTRs without diabetes at baseline evaluation. HDL particle characteristics and size were measured by nuclear magnetic resonance (NMR) spectroscopy. During 5.2 (IQR, 4.1-5.8) years of follow-up, 39 (11%) RTRs developed PTDM. In multivariable Cox regression analysis, levels of HDL cholesterol (hazard ratio [HR] 0.61, 95% confidence interval [CI] 0.40-0.94 per 1-SD increase; p = 0.024) and of large HDL particles (HR 0.68, 95% CI 0.50-0.93 per log 1-SD increase; p = 0.017), as well as larger HDL size (HR 0.58, 95% CI 0.36-0.93 per 1-SD increase; p = 0.025), were inversely associated with PTDM development, independently of relevant covariates including age, sex, body mass index, medication use, transplantation-specific parameters, blood pressure, triglycerides, and glucose. In conclusion, higher concentrations of HDL cholesterol and of large HDL particles and greater HDL size were associated with a lower risk of PTDM development in RTRs, independently of established risk factors for PTDM development.

    Metabolic syndrome-related dietary pattern and risk of mortality in kidney transplant recipients

    Background and aims: The presence of the metabolic syndrome (MetS) contributes importantly to excess mortality in kidney transplant recipients (KTRs). However, it is unclear which dietary factors drive the adverse role of MetS in KTRs. We aimed to define a dietary pattern that maximally explained the variation in MetS components, and to investigate the association between this MetS-related dietary pattern (MetS-DP) and all-cause mortality in KTRs. Methods and results: We included 429 adult KTRs with a functioning graft ≥1 year. A MetS-DP was constructed using habitual dietary intake derived from a 177-item food frequency questionnaire. We used reduced rank regression (RRR), with the six components of MetS (waist circumference, systolic blood pressure, diastolic blood pressure, serum triglycerides, HbA1c, and HDL cholesterol) as response variables and 48 food groups as predictor variables. We evaluated the association between the MetS-DP and all-cause mortality using multivariable Cox regression analysis. The MetS-DP was characterized by high intakes of processed meat and desserts, and low intakes of vegetables, tea, rice, fruits, milk, and meat substitutes. During a mean follow-up of 5.3 ± 1.2 years, 63 KTRs (14.7%) died. Compared to the lowest tertile of the MetS-DP score, those with the greatest adherence had a more than 3-fold higher risk of all-cause mortality (hazard ratio [HR] = 3.63; 95% confidence interval [CI], 1.70-7.74; P < 0.001), independent of potential confounders. Conclusions: We identified a MetS-related dietary pattern that was independently associated with all-cause mortality in KTRs. The association between this dietary pattern and all-cause mortality was mediated by MetS. Clinical trial reg. no. NCT02811835.
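
    Reduced rank regression, as used above, finds linear combinations of the predictors (food groups) that best explain the responses (MetS components) by constraining the rank of the coefficient matrix. A minimal identity-weighted sketch on synthetic data (the dimensions mirror the abstract, but the data and helper are illustrative, not the study's pipeline):

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Rank-constrained least squares: Y ~ X @ B with rank(B) <= rank.

    Fits ordinary least squares, then projects the fitted values onto
    their leading right singular vectors (identity-weighted RRR).
    """
    b_ols = np.linalg.pinv(X) @ Y
    _, _, vt = np.linalg.svd(X @ b_ols, full_matrices=False)
    v = vt[:rank].T          # leading response-space directions
    return b_ols @ v @ v.T   # coefficient matrix of rank <= `rank`

# Toy example: 48 "food groups" predicting 6 "MetS components",
# reduced to a single dietary-pattern dimension (rank 1).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 48))
B_true = np.outer(rng.normal(size=48), rng.normal(size=6))
Y = X @ B_true + 0.1 * rng.normal(size=(200, 6))
B1 = reduced_rank_regression(X, Y, rank=1)
```

    The subject-level dietary-pattern score is then the projection of each diet onto the rank-1 predictor direction, which is what enters the Cox model as the MetS-DP score.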

    Consumption of fruits and vegetables and cardiovascular mortality in renal transplant recipients: A prospective cohort study

    Background: It currently remains understudied whether low consumption of fruits and vegetables after kidney transplantation is a modifiable cardiovascular risk factor. We aimed to investigate the associations between consumption of fruits and vegetables and cardiovascular mortality in renal transplant recipients (RTRs). Methods: Consumption of fruits and vegetables was assessed in an extensively phenotyped cohort of RTRs. Multivariable-adjusted Cox proportional hazards regression analyses were performed to assess the risk of cardiovascular mortality. Results: We included 400 RTRs (age 52 ± 12 years, 54% male). At a median follow-up of 7.2 years, 23% of RTRs had died (53% of deaths due to cardiovascular causes). Overall, fruit consumption was not associated with cardiovascular mortality [hazard ratio (HR) 0.82 (95% confidence interval (CI) 0.60-1.14); P = 0.24], whereas vegetable consumption was inversely associated with cardiovascular mortality [HR 0.49 (95% CI 0.34-0.71); P < 0.001]. Fruit consumption was associated with lower cardiovascular mortality in RTRs with an eGFR > 45 mL/min/1.73 m2 [HR 0.56 (95% CI 0.35-0.92); P = 0.02] or an absence of proteinuria [HR 0.62 (95% CI 0.41-0.92); P = 0.02]. Conclusions: In RTRs, a relatively higher vegetable consumption is independently and strongly associated with lower cardiovascular mortality. A relatively higher fruit consumption is also associated with lower cardiovascular mortality, although particularly in RTRs with eGFR > 45 mL/min/1.73 m2 or an absence of proteinuria. Further studies seem warranted to investigate whether increasing consumption of fruits and vegetables may open opportunities for potential interventional pathways to decrease the burden of cardiovascular mortality in RTRs. Funding: Dutch Kidney Foundation C00.187.

    Net Endogenous Acid Excretion and Kidney Allograft Outcomes

    BACKGROUND AND OBJECTIVES: A high dietary acid load may accelerate the decline in kidney function. We prospectively investigated whether dietary acid load is associated with graft outcomes in kidney transplant recipients, and whether venous bicarbonate mediates this association. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: We used data from 642 kidney transplant recipients with a functioning graft ≥1 year after transplantation. Net endogenous acid production was estimated using food frequency questionnaires and, alternatively, from 24-hour urinary urea and potassium excretion. We defined the composite kidney end point as a doubling of plasma creatinine or graft failure. Multivariable Cox regression analyses, adjusted for potential confounders, were used to study the associations of dietary acid load with the kidney end point. We evaluated potential mediation effects of venous bicarbonate, urinary bicarbonate excretion, urinary ammonium excretion, titratable acid excretion, and net acid excretion on the association between net endogenous acid production and the kidney end point. RESULTS: The median net endogenous acid production estimated from food frequency questionnaires and from urinary excretion was 40 (interquartile range, 35-45) and 54 (interquartile range, 44-66) mEq/day, respectively. During a median follow-up of 5.3 years (interquartile range, 4.1-6.0), 121 (19%) participants reached the kidney end point. After multivariable adjustment, net endogenous acid production estimated from food frequency questionnaires and from urinary excretion (per SD higher) were each independently associated with a higher risk of reaching the kidney end point (hazard ratio, 1.33; 95% confidence interval, 1.12 to 1.57; P=0.001 and hazard ratio, 1.44; 95% confidence interval, 1.24 to 1.69; P<0.001, respectively).
Baseline venous bicarbonate mediated 20% of the association between net endogenous acid production estimated from food frequency questionnaires and the kidney end point. Baseline venous bicarbonate, urinary ammonium excretion, and net acid excretion mediated 25%, -14%, and -18%, respectively, of the association between net endogenous acid production estimated from urinary excretion and the kidney end point. CONCLUSIONS: A higher dietary acid load was associated with a higher risk of doubling of plasma creatinine or graft failure, and this association was partly mediated by venous bicarbonate, urinary ammonium, and net acid excretion.
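
    Net endogenous acid production from food-frequency data is commonly estimated with the Frassetto formula, NEAP (mEq/day) = 54.5 × protein (g/day) / potassium (mEq/day) − 10.2; whether this exact equation was used in the study is not stated in the abstract. A minimal sketch with assumed intakes:

```python
def neap_frassetto(protein_g_per_day, potassium_meq_per_day):
    """Estimated net endogenous acid production (mEq/day).

    Frassetto estimate: NEAP = 54.5 * protein/potassium - 10.2.
    This is one common way to derive NEAP from food-frequency data;
    the study's exact estimating equation is an assumption here.
    """
    return 54.5 * protein_g_per_day / potassium_meq_per_day - 10.2

# Illustrative intakes: 80 g/day protein, 90 mEq/day potassium,
# which lands near the abstract's questionnaire-based median of 40.
neap = neap_frassetto(80.0, 90.0)
```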

    Circulating Arsenic is Associated with Long-Term Risk of Graft Failure in Kidney Transplant Recipients: A Prospective Cohort Study

    Arsenic is toxic to many organ systems, the kidney being the most sensitive target organ. We aimed to investigate whether, in kidney transplant recipients (KTRs), nephrotoxic exposure to arsenic could represent an overlooked hazard for graft survival. We performed a prospective cohort study of 665 KTRs with a functioning graft ≥1 year, recruited in a university setting (2008-2011) in The Netherlands. Plasma arsenic was measured by ICP-MS, and dietary intake was comprehensively assessed using a validated 177-item food-frequency questionnaire. The endpoint, graft failure, was defined as restart of dialysis or re-transplantation. The median arsenic concentration was 1.26 (IQR, 1.04-2.04) µg/L. In backwards linear regression analyses we found that fish consumption (Std. β = 0.26; p < 0.001) was the major independent determinant of plasma arsenic. During 5 years of follow-up, 72 KTRs developed graft failure. In Cox proportional-hazards regression analyses, we found that arsenic was associated with an increased risk of graft failure (HR 1.80; 95% CI 1.28-2.53; p = 0.001). This association remained materially unaltered after adjustment for donor and recipient characteristics, immunosuppressive therapy, eGFR, primary renal disease, and proteinuria. In conclusion, in KTRs, plasma arsenic is independently associated with an increased risk of late graft failure. Funding: Top Institute Food and Nutrition of the Netherlands (A-1003); Comisión Nacional de Investigación Científica y Tecnológica (CONICYT, F 7219011).

    Outlook on use of bioavailability in assessment of soil and sediment: Opportunities for metals

    To determine whether the quality of a soil is suitable for (re)use, a risk assessment is carried out, which evaluates, among other things, whether the metals present pose a risk to humans, plants, and animals. Currently, the total concentration of the metals present is measured, although it is known that not all of the metal present causes harmful effects. By determining the fraction of metals that can actually cause effects, the risk assessment of metals in soil and sediment can be improved. We recommend a measurement method based on diluted nitric acid for this purpose. RIVM has drawn up a vision document describing where, how, and why this is possible in soil and sediment policy. Preferably, the new approach is used in the first step of the risk assessment for both terrestrial soil and sediment, because the proposed method more accurately determines the fraction of the metals responsible for effects on organisms. For mercury, the method currently appears unsuitable in both soil and sediment. It is advised to work out which concentrations are permissible for soil and sediment, and to map out the practical consequences of applying the proposed method in the first step of the risk assessment before implementation into policy. The new approach can, however, be used immediately as an additional (second-step) risk assessment of soil and sediment, and within the policy for deep freshwater pools under the upcoming Environment & Planning Act. Commissioned by DGMI-Directie Duurzaamheid and DGRW-Directie Water en Bodem.