
    Potential accumulation of toxic trace elements in soils during enhanced rock weathering

    Terrestrial enhanced rock weathering (ERW) is a carbon dioxide removal technology that aims to accelerate one of the most powerful negative feedbacks on Earth's climate: the chemical weathering of silicates. To achieve this, ERW proposes to spread ground silicate rock on agricultural soils. According to many models, global application rates of 40 tonnes of ground basaltic rock per hectare per year would be necessary to sequester a significant amount of CO2, representing up to 24% of the current net annual increase in atmospheric CO2. When assessing the viability of ERW as a global geo-engineering strategy, a pivotal but overlooked question is whether ERW may lead to toxic trace element accumulation in soils at unauthorized and potentially harmful levels. This study evaluates the legal sustainability of ERW with regard to trace element contents in soils. We compare different trace element accumulation scenarios considering a range of rock sources, application rates and national regulatory limits. The results indicate that, at the suggested annual application rate of 40 tonnes per hectare, the first regulatory limits would be exceeded after 6 and 10 years for copper and nickel, respectively. This study argues in favour of close tailoring of ERW deployment to local conditions in order to tap into its climate mitigation potential while preserving long-term soil uses.
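    A back-of-envelope sketch of the accumulation logic described above. All numbers (plough-layer soil mass, rock nickel content, baseline soil concentration, regulatory limit) are illustrative assumptions, not values taken from the study:

    ```python
    # Years of annual rock spreading until a soil trace-element limit is reached,
    # assuming the element stays in the 0-20 cm plough layer (no leaching or uptake).
    # All figures below are illustrative assumptions, not data from the study.

    SOIL_MASS_KG_PER_HA = 2.6e6  # ~0-20 cm layer at a bulk density of 1.3 t/m3

    def years_to_limit(application_t_ha_yr, element_mg_per_kg_rock,
                       baseline_soil_mg_per_kg, limit_soil_mg_per_kg):
        added_per_year_mg = application_t_ha_yr * 1000 * element_mg_per_kg_rock  # mg/ha/yr
        increment_mg_per_kg_soil = added_per_year_mg / SOIL_MASS_KG_PER_HA
        return (limit_soil_mg_per_kg - baseline_soil_mg_per_kg) / increment_mg_per_kg_soil

    # Hypothetical example: 40 t/ha/yr of basalt with 150 mg/kg Ni, soil starting at
    # 20 mg/kg Ni against a 50 mg/kg regulatory limit.
    print(round(years_to_limit(40, 150, 20, 50), 1), "years")
    ```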

    A baseline metabolomic signature is associated with immunological CD4+ T-cell recovery after 36 months of ART in HIV-infected patients

    Poor immunological recovery in treated HIV-infected patients is associated with greater morbidity and mortality. To date, predictive biomarkers of this incomplete immune reconstitution have not been established. We aimed to identify a baseline metabolomic signature associated with poor immunological recovery after ART in order to envisage the underlying mechanistic pathways that influence the treatment response.

    Clinical characteristics and outcome of Spanish patients with ANCA-associated vasculitides: impact of the vasculitis type, ANCA specificity, and treatment on mortality and morbidity

    The aim of this study was to describe the clinical characteristics of ANCA-associated vasculitides (AAV) at presentation in a wide cohort of Spanish patients, and to analyze the impact of the vasculitis type, ANCA specificity, prognostic factors, and treatments administered at diagnosis on the outcome. A total of 450 patients diagnosed between January 1990 and January 2014 in 20 hospitals in Spain were included. Altogether, 40.9% had granulomatosis with polyangiitis (GPA), 37.1% microscopic polyangiitis (MPA), and 22% eosinophilic granulomatosis with polyangiitis (EGPA). The mean age at diagnosis was 55.6±17.3 years, patients with MPA being significantly older (P<0.001). Fever, arthralgia, weight loss, and respiratory and ear-nose-throat (ENT) symptoms were the most common at disease onset. ANCAs tested positive in 86.4% of cases: 36.2% C-ANCA-PR3 and 50.2% P-ANCA-MPO. P-ANCA-MPO was significantly associated with an increased risk for renal disease (OR 2.6, P<0.001) and alveolar hemorrhage (OR 2, P=0.010), while C-ANCA-PR3 was significantly associated with an increased risk for ENT (OR 3.4, P<0.001) and ocular involvement (OR 2.3, P=0.002). All patients received corticosteroids (CS) and 74.9% cyclophosphamide (CYC). The median follow-up was 82 months (IQR 100.4). Over this period, 39.9% of patients suffered bacterial infections and 14.6% opportunistic infections, both being most prevalent in patients with high cumulated doses of CYC and CS (P<0.001). Relapses were recorded in 36.4% of cases, with a mean rate of 2.5±2.3, and were more frequent in patients with C-ANCA-PR3 (P=0.012). The initial disease severity was significantly associated with mortality but not with the occurrence of relapses. One hundred twenty-nine (28.7%) patients (74 MPA, 41 GPA, 14 EGPA) died. The mean survival was 58 months (IQR 105) and was significantly lower for patients with MPA (P<0.001). Factors independently related to death were renal involvement (P=0.010), cardiac failure (P=0.029) and age over 65 years (P<0.001) at disease onset, and bacterial infections (P<0.001). An improved outcome, with a significant decrease in mortality and treatment-related morbidity, was observed in patients diagnosed after 2000 and was related to the implementation of less toxic regimens adapted to the disease activity and stage, and to a drastic reduction in the cumulated CYC and CS dose.
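    As a purely illustrative sketch of how odds ratios such as those reported above are obtained, the 2x2 table below relates a hypothetical organ manifestation to ANCA specificity; the counts are invented and are not the cohort's data:

    ```python
    # Odds ratio and exact p-value from a 2x2 table (invented counts, for illustration).
    from scipy.stats import fisher_exact

    #            renal disease   no renal disease
    table = [[120,            80],   # P-ANCA-MPO positive
             [ 60,           110]]   # other ANCA status

    odds_ratio, p_value = fisher_exact(table)
    print(f"OR = {odds_ratio:.1f}, p = {p_value:.4f}")
    ```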

    A baseline metabolomic signature is associated with immunological CD4+ T-cell recovery after 36 months of antiretroviral therapy in HIV-infected patients

    Poor immunological recovery in treated HIV-infected patients is associated with greater morbidity and mortality. To date, predictive biomarkers of this incomplete immune reconstitution have not been established. We aimed to identify a baseline metabolomic signature associated with poor immunological recovery after antiretroviral therapy (ART) in order to envisage the underlying mechanistic pathways that influence the treatment response. This was a multicentre, prospective cohort study of ART-naive HIV-infected patients with a low pre-ART nadir (<200 cells/μl) (n = 64). We obtained clinical data and metabolomic profiles for each individual, in which low molecular weight metabolites, lipids and lipoproteins (including particle concentrations and sizes) were measured by NMR spectroscopy. Immunological recovery was defined as reaching a CD4+ T-cell count of at least 250 cells/μl after 36 months of virologically successful ART. We used univariate comparisons, Random Forest analysis and receiver-operating characteristic curves to identify and evaluate the predictive factors of immunological recovery after treatment. HIV-infected patients with a baseline metabolic pattern characterized by high levels of large high-density lipoprotein (HDL) particles, HDL cholesterol and larger low-density lipoprotein particle sizes had a better immunological recovery after treatment. Conversely, patients with high ratios of non-HDL lipoprotein particles did not experience this full recovery. Medium very-low-density lipoprotein particles and glucose increased the classification power of the multivariate model despite not showing significant differences between the two groups. In HIV-infected patients, a healthier baseline metabolomic profile is related to a better response to ART, in which the lipoprotein profile, mainly large HDL particles, may play a key role.
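    A minimal sketch of the kind of analysis described, assuming synthetic data and placeholder features standing in for the NMR-derived variables (they are not the study's measurements): a Random Forest classifier over baseline features, evaluated with a cross-validated ROC AUC.

    ```python
    # Random Forest + ROC evaluation on synthetic baseline "metabolomic" features.
    # The data and feature meanings are placeholders, not the study's variables.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 64                                  # cohort size reported in the abstract
    X = rng.normal(size=(n, 5))             # e.g. large HDL-P, HDL-C, LDL size, medium VLDL-P, glucose
    recovery = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)
    # 1 = CD4+ count >= 250 cells/uL at 36 months of ART, 0 = poor recovery

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    probs = cross_val_predict(clf, X, recovery, cv=5, method="predict_proba")[:, 1]
    print("cross-validated ROC AUC:", round(roc_auc_score(recovery, probs), 2))
    ```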

    Spatiotemporal Characteristics of the Largest HIV-1 CRF02_AG Outbreak in Spain: Evidence for Onward Transmissions

    Background and Aim: The circulating recombinant form 02_AG (CRF02_AG) is the predominant clade among human immunodeficiency virus type 1 (HIV-1) non-B subtypes in Spain, with a prevalence of 5.97% (95% confidence interval, CI: 5.41–6.57%). Our aim was to estimate the levels of regional clustering for CRF02_AG and the spatiotemporal characteristics of the largest CRF02_AG subepidemic in Spain. Methods: We studied 396 CRF02_AG sequences obtained from HIV-1-diagnosed patients during 2000–2014 from 10 autonomous communities of Spain. Phylogenetic analysis was performed on the 391 CRF02_AG sequences along with all globally sampled CRF02_AG sequences (N = 3,302) as references. Phylodynamic and phylogeographic analyses were performed on the largest CRF02_AG monophyletic cluster, using a Bayesian method in BEAST v1.8.0 and by reconstructing ancestral states under the parsimony criterion in Mesquite v3.4, respectively. Results: The HIV-1 CRF02_AG prevalence differed across the Spanish autonomous communities we sampled from (p < 0.001). Phylogenetic analysis revealed that 52.7% of the CRF02_AG sequences formed 56 monophyletic clusters of 2–79 sequences. Monophyletic clustering suggested that CRF02_AG regional dispersal differed across Spain (p = 0.003). In the largest monophyletic cluster (subepidemic) (N = 79), 49.4% of the clustered sequences originated from Madrid, and most sequences (51.9%) had been obtained from men who have sex with men (MSM). Molecular clock analysis suggested that the origin (tMRCA) of the CRF02_AG subepidemic was in 2002 (median estimate; 95% Highest Posterior Density, HPD, interval: 1999–2004). Additionally, we found significant clustering within the CRF02_AG subepidemic according to ethnic origin. Conclusion: CRF02_AG entered Spain through multiple introductions, followed by regional dispersal in several cases; CRF02_AG transmissions were mostly due to regional dispersal within Spain. The hotspot of the largest CRF02_AG regional subepidemic was Madrid, associated with the MSM transmission risk group. The existence of subepidemics suggests that several spillovers occurred from Madrid to other areas. CRF02_AG sequences from Hispanics clustered in a separate subclade, suggesting no linkage between the local and Hispanic subepidemics.
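    The reported difference in regional dispersal (p = 0.003) is the kind of result a contingency-table test yields. A hedged illustration with an invented table of clustered versus non-clustered sequences per region (not the study's counts):

    ```python
    # Chi-square test of independence between region and clustering status.
    # The contingency table is invented for illustration.
    from scipy.stats import chi2_contingency

    #           clustered   not clustered
    table = [[60, 25],   # community A (e.g. Madrid)
             [30, 40],   # community B
             [20, 35]]   # remaining communities

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
    ```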

    Changes in topsoil organic carbon content in the Swiss Leman region cropland from 1993 to present: insights from a large-scale on-farm study

    Increasing cropland topsoil organic carbon (SOC) content is a key goal for improving soil quality and adapting soils to climate change. Moreover, the short-term climate mitigation potential of carbon sequestration is mostly attributed to increasing topsoil SOC content (Balesdent and Arrouays, 1999; Chambers et al., 2016; Minasny et al., 2017; Balesdent et al., 2018). However, the possibility of increasing SOC content is highly disputed in the current literature, which is mostly based on field experiments. We quantified the on-farm SOC content deficit and the SOC content change rate of cropland topsoil (0–20 cm) from western Switzerland using the databases of the Geneva and Vaud cantons, which contain more than 30,000 topsoil analyses performed every ten years on every cultivated field of the region since 1993. The SOC deficit was estimated as the amount of SOC necessary to reach a SOC:clay ratio of 0.1, considered the minimum SOC level required for acceptable soil quality. Cropland topsoils of the Vaud and Geneva cantons displayed a 20% and 70% SOC content deficit, respectively. In both cantons, the range of observed rates of change in SOC content from 1993 to present was very large, from −30 to +30‰ per year, with a median value of 0. However, the time trends showed a highly significant linear increase in rates, from −5‰ per year on average in 1995 to +6‰ per year in 2015, with the zero-change point reached around 2005–2007. These trends were attributed to the Swiss agri-environmental schemes introduced at the end of the 20th century, namely mandatory cover crops and minimum rotations of four crops. Furthermore, the SOC content increase was consistent with the continuing adoption of minimum tillage, conservation agriculture and intense multi-species cover crops. These findings contrast with those obtained in Swiss long-term experiments, which emphasizes the need to use on-farm information when addressing agricultural policy, climate mitigation or soil quality management issues.
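    A small sketch of the two quantities used in the study, assuming the deficit is measured against the 0.1 SOC:clay threshold and the change rate is expressed relative to the initial SOC content; the input values are illustrative, not measured data:

    ```python
    # SOC deficit relative to the SOC:clay = 0.1 threshold, and annual change rate in permille.
    # Input values are illustrative, not measurements from the cantonal databases.

    def soc_deficit(soc_gkg, clay_gkg, threshold=0.10):
        """SOC (g/kg) missing to reach the threshold SOC:clay ratio; 0 if already above it."""
        return max(0.0, threshold * clay_gkg - soc_gkg)

    def annual_change_permille(soc_start_gkg, soc_end_gkg, years):
        """Mean relative SOC change over the period, in permille per year."""
        return (soc_end_gkg - soc_start_gkg) / soc_start_gkg / years * 1000

    # Example: 14 g/kg SOC and 250 g/kg clay, resampled ten years later at 15 g/kg SOC.
    print("deficit:", soc_deficit(14, 250), "g/kg")                              # 11.0 g/kg
    print("rate:", round(annual_change_permille(14, 15, 10), 1), "permille/yr")  # ~7.1
    ```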

    Soil conservation agriculture (ACS) and soil organic matter content: some lessons from the Lake Geneva region

    Soil conservation agriculture (ACS) was not born with the discovery of climate change. It arose from farmers' determination to preserve and restore their soils, and it developed on the margins of public policy. The 4 per 1000 initiative (https://www.4p1000.org/fr) promotes ACS as a leading solution for sequestering carbon in soils (Figure 1), which has generated a large body of scientific articles and position statements, both supporting this option and highlighting its weaknesses. The result is a set of controversies and claims that are probably unreadable for most stakeholders. Drawing on the results of our research, this article aims to clarify the issues related to soil organic matter content in agriculture, particularly the climate issues, and the contributions of ACS, through large-scale feedback from the Lake Geneva region (the Swiss cantons of Vaud and Geneva).

    On-farm relationships between agricultural practices and annual changes in organic carbon content at a regional scale

    Both soil quality degradation and climate change mitigation issues emphasize the need to increase, or at least stabilize, the topsoil organic carbon content (wt%) in arable land. This on-farm study aimed at measuring the impact of agricultural practices on changes in soil organic carbon (SOC) content over 10 years. A total of 120 fields belonging to 120 farms, representative of the cropping systems and soil properties of Western Switzerland (Lake Geneva region), were randomly selected. The 0–20 cm topsoil of each field was sampled at a 10-year interval, and the corresponding cropping practices were gathered through farmer interviews and the mandatory field-level records of yearly practices on Swiss farms. Only 1) organic matter inputs and 2) cover-crop intensity were significantly correlated with SOC increase, while 3) soil tillage intensity and 4) soil carbon saturation, expressed as the SOC to clay content ratio, were correlated with SOC decrease. Notably, temporary meadows were not correlated with changes in SOC content, mainly due to increased tillage and decreased cover crops between meadows. Organic farming did not correlate with SOC changes either, owing to the high tillage intensity applied for weed control. The observed SOC content changes ranged from −56‰ to +74‰ and were well explained by a linear regression model with additive effects of the four identified SOC change factors. The additivity of these factors means that farmers can emphasize the methods of their choice when regenerating their soils. This study indicates that strict no-till is not required at low carbon saturation levels (small SOC:clay ratio); however, as carbon saturation increases, conservation tillage and then no-till practices become necessary to further increase SOC contents. These findings are in accordance with previous studies showing that, since 2015, SOC has been increasing at more than +4‰ per year on average in the region, and they provide practical insights to further manage the transition of farming systems towards soil regeneration.
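    A minimal sketch of an additive linear model of annual SOC change on the four identified factors, fitted here to synthetic data with arbitrary coefficients; it is not the study's fitted regression:

    ```python
    # Additive ordinary least squares model of annual SOC change (permille/yr)
    # on four explanatory factors. Data are synthetic; coefficients are arbitrary.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 120                                   # number of fields in the study
    organic_inputs = rng.uniform(0, 3, n)     # e.g. organic matter inputs (t/ha/yr)
    cover_crops = rng.uniform(0, 1, n)        # cover-crop intensity index
    tillage = rng.uniform(0, 1, n)            # tillage intensity index
    soc_clay = rng.uniform(0.05, 0.15, n)     # SOC:clay ratio (carbon saturation)

    soc_change = (4 * organic_inputs + 10 * cover_crops
                  - 8 * tillage - 100 * soc_clay
                  + rng.normal(scale=5, size=n))

    X = sm.add_constant(np.column_stack([organic_inputs, cover_crops, tillage, soc_clay]))
    fit = sm.OLS(soc_change, X).fit()
    print(fit.summary(xname=["const", "organic_inputs", "cover_crops", "tillage", "SOC:clay"]))
    ```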
