17 research outputs found

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the working groups DAD Study Grp; Royal Free Hosp Clin Cohort; INSIGHT Study Grp; SMART Study Grp; ESPRIT Study Grp.

    Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice.

    Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as a confirmed (> 3 mo apart) eGFR < 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with progressively higher chances in the medium and high risk groups (risk score >= 5; 505 events).
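A minimal sketch of the additive scoring the abstract describes: scaled adjusted incidence rate ratios become integer points, and a patient's score is the sum of the points for every predictor category that applies. All point values below are hypothetical placeholders, not the published D:A:D weights; only the high-risk cut-off (score >= 5) comes from the abstract, and the low/medium boundary is assumed.

```python
# HYPOTHETICAL point values -- the validated weights are in the published
# D:A:D risk score tables. The abstract's median score of -2 implies that
# some categories score below zero, illustrated here by "younger_age".
HYPOTHETICAL_POINTS = {
    "older_age": 4,
    "intravenous_drug_use": 2,
    "hepatitis_c_coinfection": 1,
    "lower_baseline_egfr": 4,
    "female_gender": 1,
    "lower_cd4_nadir": 2,
    "hypertension": 1,
    "diabetes": 2,
    "cardiovascular_disease": 2,
    "younger_age": -2,  # hypothetical negative-point category
}

def ckd_risk_score(factors):
    """Sum the points of every predictor category that applies."""
    return sum(HYPOTHETICAL_POINTS[f] for f in factors)

def risk_group(score):
    """Map a score to the groups used in the text (high: score >= 5)."""
    if score >= 5:
        return "high"
    return "medium" if score >= 0 else "low"  # assumed cut-off at 0
```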
    Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor.

    The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria.

    Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians in weighing the benefits of certain antiretrovirals against the risk of CKD and in identifying those at greatest risk of CKD.

    Peer reviewed
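The NNTH figures in the abstract above are reciprocals of absolute risk differences. A small sketch of that arithmetic, using illustrative risks rather than the study's actual event rates:

```python
# Number needed to harm (NNTH): how many patients must start the drug for
# one additional CKD event, i.e. the reciprocal of the absolute risk
# increase. The risks used below are ILLUSTRATIVE, not taken from the study.

def nnth(risk_with_drug, risk_without_drug):
    """NNTH = 1 / (risk with drug - risk without drug)."""
    absolute_risk_increase = risk_with_drug - risk_without_drug
    if absolute_risk_increase <= 0:
        raise ValueError("drug does not increase risk")
    return 1.0 / absolute_risk_increase

# Example: if a regimen doubled the low risk group's 1:393 baseline 5-y
# risk, 393 patients would need to start it for one extra CKD case.
print(round(nnth(2 / 393, 1 / 393)))  # 393
```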

    Reuse potential of laundry greywater for irrigation based on growth, water and nutrient use of tomato

    Greywater is considered a valuable resource with high reuse potential for irrigating household lawns and gardens. However, reuse of greywater may lead to surfactant and sodium accumulation in soil, which can adversely affect agricultural productivity and environmental sustainability. We conducted a glasshouse experiment to examine variation in growth, water and nutrient use of tomato (Lycopersicon esculentum Mill. cv. Grosse Lisse) using tap water (TW), laundry greywater (GW) and solutions of low and high concentrations of a detergent surfactant (LC and HC, respectively) as irrigation treatments. Each treatment was replicated five times using a randomised block design. Measurements throughout the experiment showed greywater to be significantly more alkaline and saline than the other types of irrigation water. Although all plants received sixteen irrigations over a period of nine weeks until flowering, there was little or no significant effect of the irrigation treatments on plant growth. Soil water retention following irrigation was significantly reduced by GW or surfactant irrigation on only three of twelve occasions. On one occasion, water use measured as evapotranspiration (ET) with GW irrigation was similar to that with TW, but significantly higher than that of plants receiving HC irrigation. At harvest, various components of plant biomass and leaf area for GW irrigated plants were similar to or significantly higher than those of the TW irrigated plants, with a common trend of GW >= TW > LC >= HC. Whole-plant concentrations were measured for twelve essential plant nutrients (N, P, K, Ca, Mg, S, Fe, Cu, Mn, Zn, Mo and B) and for Na (often considered a beneficial nutrient). Irrigation treatments significantly affected the concentration of four nutrients (P, Fe, Zn and Na) and the uptake of seven nutrients (P, K, Ca, Mg, Na, Fe and B). Uptake of these seven nutrients by tomato was generally in the order GW >= TW > HC >= LC.
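The randomised block design described above (four irrigation treatments, each replicated in five blocks) can be sketched as follows. Treatment labels follow the abstract; the layout generation itself is illustrative, not the authors' actual randomisation:

```python
# Randomised complete block design: every block contains all four
# treatments once, with the order randomised independently per block.
import random

TREATMENTS = ["TW", "GW", "LC", "HC"]  # labels from the abstract
N_BLOCKS = 5                           # five replicates per treatment

def rcbd_layout(seed=0):
    """Return {block: ordered treatments}, one permutation per block."""
    rng = random.Random(seed)
    return {block: rng.sample(TREATMENTS, k=len(TREATMENTS))
            for block in range(1, N_BLOCKS + 1)}

for block, order in rcbd_layout().items():
    print(f"block {block}: {order}")
```

Blocking this way lets treatment effects be compared within blocks, separating them from positional variation across the glasshouse bench.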
    GW irrigated plants had the highest concentrations of P, Na and Fe, which were 39-85% higher than in the TW irrigated plants. Compared with tap water irrigated plants, greywater irrigated plants removed only 6% more B, but substantially greater quantities of Na (83%) and Fe (86%). These results suggest that laundry greywater has promising potential for reuse as irrigation water to grow tomato.
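The "excess uptake" percentages quoted above express uptake under greywater irrigation relative to tap water as a percentage increase. A short sketch of that arithmetic, with hypothetical uptake values inserted only to show the calculation:

```python
# Percentage by which uptake under GW irrigation exceeds that under TW.
# The uptake figures in the example are HYPOTHETICAL, not measured data.

def percent_excess(gw_uptake, tw_uptake):
    """100 * (GW - TW) / TW, i.e. excess uptake relative to tap water."""
    return 100.0 * (gw_uptake - tw_uptake) / tw_uptake

# e.g. a hypothetical Na uptake of 91.5 mg/plant under GW vs 50 mg/plant
# under TW corresponds to the 83% excess reported for sodium.
print(round(percent_excess(91.5, 50.0)))  # 83
```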