What does the French REIN registry tell us about Stage 4-5 CKD care in older adults?
The aim of this paper is to present the clinical epidemiology research conducted within the French REIN network to improve care for older adults with stage 4-5 CKD. We summarize studies describing clinical practice, care organization, prognosis and health-economic evaluation, carried out in order to develop personalized care plans and decision-making tools. In France, various databases have been mobilized for 20 years now, including the national REIN registry, which includes all patients receiving dialysis or transplantation. REIN data are indirectly linked to the French administrative healthcare database. They are also pooled with data from the PSPA cohort, a multicenter prospective cohort study of patients aged 75 or over with advanced CKD followed for 5 years, and the CKD-REIN clinic-based prospective cohort, which included 3033 patients with CKD stage 3-4 from 2013 to 2016. Across this research, we identified heterogeneous trajectories specific to this growing older population, raising ethical, organizational and economic issues. Renal registries will help clinicians, health providers and policy-makers if suitable decision-making tools are developed and validated.
Drainage of shallow organic soils by drainage trench
The organic soils of the Montérégie region support productive vegetable cropping. However, bringing them into cultivation leads to soil degradation. A degraded organic soil can exhibit problematic drainage for two main reasons. First, it is composed of very fine soil particles that migrate through the profile and form a compact layer of very low permeability relatively close to the surface, typically at a depth of 25 to 40 centimeters. This layer substantially limits the drawdown of surface water. Second, as the soil decomposes it subsides under its own weight, and the underlying layer, composed mainly of low-permeability clay or coprogenous material, plays an increasingly important role in slowing drainage. One solution to this widespread drainage problem could be to install the drains in drainage trenches backfilled with a more permeable material, such as a mix of organic soil and shredded biomass. The first objective of this thesis is to assess the stability of different mixes through the change in their saturated hydraulic conductivity during continuous drainage in soil columns. The second objective is to model the comparative effect of a drainage trench versus conventional drainage on the drainage of a real soil profile with a shallow organic layer. Five different mixes were tested in 60 cm high columns (three mixes of organic soil with shredded biomass of willow, miscanthus or defibered pine, a braided coconut-fiber rope acting as a vertical drain running from the drain to 20 cm below the surface, and a control composed solely of organic soil). The columns were kept saturated under a constant head of water for 36 days, during which the saturated hydraulic conductivity was measured several times. Drainage was modeled with Hydrus. The Van Genuchten retention-curve parameters, measured at several points, were determined for the different drainage-trench mixes. Model parameters were also obtained for the real field profile using water table depth and drain discharge data. The results show that the most stable mix was the defibered pine; it had the highest conductivity even though a larger amount of water had passed through the column. In the modeling of the interventions, the water table drawdown in drainage trenches with biomass mixes was approximately 1.5 to 2 times that of conventional drainage. This study allows us to conclude that drainage trenches with biomass mixes should be favored in shallow organic soils instead of, or together with, reducing drain spacing.
In Montérégie, histosols enable productive vegetable cropping. Once drained and cultivated, organic soils start decomposing and subside. This leads to problematic drainage for two main reasons. Firstly, histosols are composed of very fine organic material which migrates downward in the soil profile to form a moorsh layer of very low permeability, typically between 25 and 40 cm below the root zone.
Secondly, soil subsidence becomes a problem when the depth of the soil profile is reduced to less than a meter, because the soil composition below that depth is dominated by low-permeability clays or impermeable coprogenic material. A solution is to fill the drainage trench with a more permeable material, such as a mix of organic soil and shredded biomass. However, if organic, such material will also start decomposing and compacting. The first objective of this study was to assess the stability of different mixes by measuring changes in the saturated hydraulic conductivity during continuous drainage in soil columns containing different organic soil mixtures. The second objective was to model the effect of a drainage trench as compared with conventional drainage in a shallow organic soil profile, based on the parameters of a real field. Five different trench-filling treatments were tested (three mixes of degraded organic soil with shredded biomass of willow, miscanthus or defibered pine, a coarse coconut-fiber rope acting as a vertical drain, and a control of degraded organic soil). The mixes were packed into 60 cm high PVC columns and then fully saturated. Steady-state saturated flow was imposed, and the saturated hydraulic conductivity was measured at different time intervals. The drainage situation was modeled in Hydrus. The Van Genuchten parameters were determined for the different drainage-trench mixes from points on the retention curve, and for the real field profile from water table height and drain flow data. The mix with defibered pine was the most stable. The difference in hydraulic conductivity between the defibered pine and the other two biomass mixes was not significant, but the amount of water that passed through the defibered pine columns was twice that of the other two. In the modeling of the drainage process, the water table drawdown in the drainage trench with biomass mixes was approximately 1.5 to 2 times that of conventional drainage. This study enables us to conclude that drainage trenches are effective in shallow organic soils, with or without decreasing drain spacing.
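For readers unfamiliar with the retention model named above, the sketch below implements the standard van Genuchten retention curve and the Mualem conductivity function commonly paired with it in Hydrus. It is only an illustration of the parameterization the study fits; the parameter values, function names, and units are assumptions, not the fitted values from the thesis.

```python
import numpy as np

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content theta(h) for pressure head h (h < 0 when unsaturated)."""
    m = 1.0 - 1.0 / n
    h = np.asarray(h, dtype=float)
    Se = np.where(h < 0, (1.0 + (alpha * np.abs(h)) ** n) ** (-m), 1.0)  # effective saturation
    return theta_r + (theta_s - theta_r) * Se

def mualem_K(h, Ks, alpha, n, l=0.5):
    """Unsaturated hydraulic conductivity K(h) from the Mualem model."""
    m = 1.0 - 1.0 / n
    h = np.asarray(h, dtype=float)
    Se = np.where(h < 0, (1.0 + (alpha * np.abs(h)) ** n) ** (-m), 1.0)
    return Ks * Se ** l * (1.0 - (1.0 - Se ** (1.0 / m)) ** m) ** 2

# Illustrative placeholder parameters only (not the study's fitted values):
# theta_r, theta_s [-], alpha [1/cm], n [-], Ks [cm/day]
print(van_genuchten_theta(-100.0, 0.10, 0.80, 0.02, 1.4))
print(mualem_K(-100.0, 50.0, 0.02, 1.4))
```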
Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study
Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with ≥3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR ≤ 60 ml/min/1.73 m2. Poisson regression was used to develop a risk score, externally validated on two independent cohorts. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7–6.7; median follow-up 6.1 y, range 0.3–9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was −2 (interquartile range −4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group (risk score < 0, 33 events), rising to 1:47 and 1:6 in the medium (risk score 0–4, 103 events) and high risk groups (risk score ≥ 5, 505 events), respectively. The number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166–3,367); NNTH was 202 (95% CI 159–278) and 21 (95% CI 19–23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506–1,462), 88 (95% CI 69–121), and 9 (95% CI 8–10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3–12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6–8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians in weighing the benefits of certain antiretrovirals against the risk of CKD and in identifying those at greatest risk of CKD.
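The abstract reports the risk-group cut-offs and the observed 5-year CKD risks, but not the per-variable point values, which are given in the full paper. The minimal Python sketch below, with an assumed function name, only maps an already-computed D:A:D risk score to the group-level 5-year risks quoted above; it is not the scoring algorithm itself.

```python
# Hedged sketch: map a D:A:D CKD risk score (already computed from the nine
# published variables, whose point values are in the paper and not reproduced
# here) to the risk groups and observed 5-year CKD risks reported in the abstract.

def ckd_risk_group(score: int) -> tuple[str, float]:
    """Return (risk group, observed 5-year CKD risk) for a given D:A:D risk score."""
    if score < 0:
        return "low", 1 / 393      # risk score < 0
    if score <= 4:
        return "medium", 1 / 47    # risk score 0-4
    return "high", 1 / 6           # risk score >= 5

group, five_year_risk = ckd_risk_group(-2)   # median baseline score in the study was -2
print(group, round(five_year_risk * 100, 2), "% over 5 years")
```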
Development of a definition for Rapid Progression (RP) of renal function in HIV-positive persons: the D:A:D study
Background No consensus exists on how to define abnormally rapid deterioration in renal function (rapid progression, RP). We developed an operational definition of RP in HIV-positive persons with baseline estimated glomerular filtration rate (eGFR) >90 ml/min/1.73 m2 (using Cockcroft-Gault) in the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study from 2004 to 2011. Methods Two definitions were evaluated. RP definition A: an average eGFR decline (slope) ≥5 ml/min/1.73 m2/year over four years of follow-up with ≥3 eGFR measurements/year, a last eGFR <90 ml/min/1.73 m2, and an absolute decline ≥5 ml/min/1.73 m2/year in two consecutive years. RP definition B: an absolute annual decline ≥5 ml/min/1.73 m2/year in each year and a last eGFR <90 ml/min/1.73 m2. Sensitivity analyses were performed considering two and three years' follow-up. The percentages with and without RP who subsequently developed incident chronic kidney disease (CKD; 2 consecutive eGFRs <60 ml/min/1.73 m2, at least 3 months apart) were calculated. Results 22,603 individuals had baseline eGFR ≥90 ml/min/1.73 m2. 108/3655 (3.0%) individuals with ≥4 years' follow-up and ≥3 measurements/year experienced RP under definition A; similar proportions were observed when considering follow-up periods of three (n=195/6375; 3.1%) and two years (n=355/10756; 3.3%). In contrast, under RP definition B, greater proportions experienced RP when considering two years (n=476/10756; 4.4%) instead of three (n=48/6375; 0.8%) or four (n=15/3655; 0.4%) years' follow-up. For RP definition A, 13 (12%) individuals who experienced RP progressed to CKD, and only 21 (0.6%) of those without RP progressed to CKD (sensitivity 38.2% and specificity 97.4%); for RP definition B, fewer RP individuals progressed to CKD. Conclusions Our results suggest that using three years' follow-up and at least two eGFR measurements per year is most appropriate for an RP definition, as it allows inclusion of a reasonable number of individuals and is associated with the known risk factors. The definition does not necessarily identify all those who progress to incident CKD; however, it can be used alongside other renal measurements to identify and assess early those at risk of developing CKD. Future analyses will use this definition to identify other risk factors for RP, including the role of antiretrovirals.
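As a reading aid, the sketch below encodes RP definition B exactly as stated above (an absolute annual decline of at least 5 ml/min/1.73 m2 in each year and a last eGFR below 90). The input format (one eGFR value per year, baseline first) and the function name are assumptions for illustration.

```python
# Hedged sketch of RP definition B from the abstract: every year-on-year eGFR
# decline is >= 5 ml/min/1.73 m2 and the last eGFR is < 90 ml/min/1.73 m2.

def rapid_progression_definition_b(annual_egfr: list[float]) -> bool:
    """True if each annual eGFR decline is >= 5 and the last eGFR is < 90."""
    if len(annual_egfr) < 2:
        return False
    declines = [earlier - later for earlier, later in zip(annual_egfr, annual_egfr[1:])]
    return all(d >= 5 for d in declines) and annual_egfr[-1] < 90

print(rapid_progression_definition_b([102, 96, 89]))   # True: declines of 6 and 7, last eGFR < 90
print(rapid_progression_definition_b([102, 96, 93]))   # False: second annual decline is only 3
```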
Availability of assisted peritoneal dialysis in Europe: call for increased and equal access
Background Availability of assisted peritoneal dialysis (asPD) increases access to dialysis at home, particularly for the increasing number of older and frail people with advanced kidney disease. Although asPD has been widely used in some European countries for many years, it remains unavailable or poorly utilized in others. Leading European nephrologists have therefore formed a group to drive increased availability of asPD in Europe and in their own countries. Methods Members of the group filled in a proforma with the following headings: personal experience, country experience, who the assistants are, funding of asPD, barriers to growth, what is needed to grow, and their top three priorities. Results Only 5 of the 13 countries surveyed provided publicly funded reimbursement for asPD. The use of asPD depends on overall attitudes to PD, with all respondents mentioning the need for nephrology team education and/or patient education and involvement in dialysis modality decision making. Conclusions and call to action Many people with advanced kidney disease would prefer to have their dialysis at home, yet if a frail patient chooses PD, most healthcare systems cannot provide that choice. AsPD should be available in all countries in Europe and in all renal centres. The top priorities to make this happen are education of renal healthcare teams about the advantages of PD, education of and discussion with patients and their families as they approach the need for dialysis, and engagement with policymakers and healthcare providers to develop and support assistance for PD.
Risk Factors for Prognosis in Patients With Severely Decreased GFR
Introduction: Patients with chronic kidney disease (CKD) and estimated glomerular filtration rate (eGFR) < 30 ml/min per 1.73 m2 (corresponding to CKD stage G4+) comprise a minority of the overall CKD population but have the highest risk for adverse outcomes. Many CKD G4+ patients are older with multiple comorbidities, which may distort associations between risk factors and clinical outcomes. Methods: We undertook a meta-analysis of risk factors for kidney failure treated with kidney replacement therapy (KRT), cardiovascular disease (CVD) events, and death in participants with CKD G4+ from 28 cohorts (n = 185,024) across the world who were part of the CKD Prognosis Consortium. Results: In the fully adjusted meta-analysis, risk factors associated with KRT were time-varying CVD, male sex, black race, diabetes, lower eGFR, and higher albuminuria and systolic blood pressure. Age was associated with a lower risk of KRT (adjusted hazard ratio: 0.74; 95% confidence interval: 0.69–0.80) overall, and also in the subgroup of individuals younger than 65 years. The risk factors for CVD events included male sex, history of CVD, diabetes, lower eGFR, higher albuminuria, and the onset of KRT. Systolic blood pressure showed a U-shaped association with CVD events. Risk factors for mortality were similar to those for CVD events but also included smoking. Most risk factors had qualitatively consistent associations across cohorts. Conclusion: Traditional CVD risk factors are of prognostic value in individuals with an eGFR < 30 ml/min per 1.73 m2, although the risk estimates vary for kidney and CVD outcomes. These results should encourage interventional studies on correcting risk factors in this high-risk population.
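As a small worked example tied to the reported numbers, the snippet below back-computes the log-hazard coefficient and standard error implied by the adjusted hazard ratio for age and KRT, 0.74 (95% CI 0.69–0.80), assuming a standard Wald-type confidence interval on the log scale. It is a reading aid, not part of the study's analysis.

```python
# Hedged arithmetic check: recover beta = log(HR) and its standard error from a
# reported hazard ratio and 95% CI, assuming a normal (Wald) interval on the log scale.
import math

hr, lo, hi = 0.74, 0.69, 0.80
beta = math.log(hr)                                # log-hazard coefficient
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)    # half-width of the log-scale CI over 1.96
print(f"beta = {beta:.3f}, SE = {se:.3f}")
print(f"reconstructed CI: {math.exp(beta - 1.96 * se):.2f} to {math.exp(beta + 1.96 * se):.2f}")
```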
Follow-up of phase I trial of adalimumab and rosiglitazone in FSGS: III. Report of the FONT study group
Background Patients with resistant primary focal segmental glomerulosclerosis (FSGS) are at high risk of progression to chronic kidney disease stage V. Antifibrotic agents may slow or halt this process. We present outcomes of follow-up after a Phase I trial of adalimumab and rosiglitazone, antifibrotic drugs tested in the Novel Therapies in Resistant FSGS (FONT) study. Methods 21 patients (12 males and 9 females; age 16.0 ± 7.5 yr; estimated GFR (GFRe) 121 ± 56 mL/min/1.73 m2) received adalimumab (n = 10, 24 mg/m2 every 14 days) or rosiglitazone (n = 11, 3 mg/m2 per day) for 16 weeks. The change in GFRe per month prior to entry and after completion of the Phase I trial was compared. Results 19 patients completed the 16-week FONT treatment phase. The observation period was 18.3 ± 10.2 months before FONT and 16.1 ± 5.7 months after the study. A similar percentage of patients in the rosiglitazone and adalimumab cohorts (71% and 56%, respectively) had stabilization of GFRe, defined as a reduced negative slope of the line plotting GFRe versus time, without requiring renal replacement therapy after completion of the FONT treatment period (P = 0.63). Conclusion Nearly 50% of patients with resistant FSGS who receive novel antifibrotic agents may have a legacy effect, with delayed deterioration in kidney function after completion of therapy. Based on this proof-of-concept preliminary study, we recommend long-term follow-up of patients enrolled in clinical trials to obtain a more comprehensive assessment of the efficacy of experimental treatments.
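The comparison above rests on the slope of GFRe versus time before enrollment and after the treatment phase. The sketch below shows a minimal least-squares estimate of such a monthly slope; the time points, eGFR values, and function name are purely illustrative assumptions, not study data.

```python
# Hedged sketch: estimate a monthly eGFR slope by ordinary least squares, the kind
# of "GFRe versus time" slope compared before and after the FONT treatment phase.
import numpy as np

def egfr_slope_per_month(months: list[float], egfr: list[float]) -> float:
    """Least-squares slope of eGFR (mL/min/1.73 m2) per month of follow-up."""
    slope, _intercept = np.polyfit(months, egfr, deg=1)
    return slope

pre = egfr_slope_per_month([0, 6, 12, 18], [120, 112, 103, 96])   # steadily declining (illustrative)
post = egfr_slope_per_month([0, 6, 12, 16], [95, 94, 92, 91])     # flatter slope (illustrative)
print(f"pre-study slope:  {pre:.2f} mL/min/1.73 m2 per month")
print(f"post-study slope: {post:.2f} mL/min/1.73 m2 per month")
```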
- …