9 research outputs found

    Biodegradation of Spent Automobile Engine Oil in Soil Microcosms Amended with Cow Dung

    The discharge of spent engine oil into terrestrial and aquatic environments constitutes a public health and socio-economic hazard. In this study, the potential of organic waste (cow dung) amendments as biostimulating agents for the indigenous microflora in hydrocarbon biodegradation was investigated over a period of 6 weeks in soil microcosms deliberately contaminated with spent engine oil (5% v/w). Physico-chemical and microbiological properties of the soil samples were determined using standard methods. The microcosm consisted of 8 trays, each containing 1 kg of soil artificially contaminated with 50 ml of spent engine oil and treated with 50 g, 100 g or 150 g of cow dung. Spent engine oil degradation was assessed gravimetrically at weekly intervals and chromatographically after 6 weeks of biodegradation treatment. Physico-chemical analysis showed that the soil pH was 6.56, while nitrate, moisture content, phosphate and total organic content were 0.82 mg/kg, 9.28%, 0.73 mg/kg and 3.60 mg/kg respectively. Microbiological analysis of the soil sample showed a total heterotrophic bacterial count of 3.6 × 10⁶ cfu/g, while total heterotrophic fungi and hydrocarbon-utilizing bacteria (HUB) were 2.2 × 10⁴ cfu/g and 7.9 × 10⁴ cfu/g respectively. The mean total viable count (TVC) of hydrocarbon utilizers was higher in the biostimulated soils (2.10 × 10⁵ to 5.30 × 10⁹ cfu/g) than in the control (1.20 × 10⁵ to 3.10 × 10⁸ cfu/g). Residual oil concentration decreased more markedly over the incubation period in the treatments (0.400-0.259 mg/g, 0.420-0.218 mg/g and 0.410-0.220 mg/g for treatments 1, 2 and 3 respectively) than in the control, which ranged from 0.400 to 0.304 mg/g. At the end of the 6-week microcosm biodegradation study, percentage degradation of the spent engine oil was 23.81%, 35.29%, 45.45% and 44.94% for CON, T1, T2 and T3 respectively. The results of this study show that cow dung can be used effectively as a biostimulant during bioremediation of spent-engine-oil-polluted sites to enhance the biodegradation ability of the indigenous microbial population.
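
    As a rough illustration of the degradation arithmetic (not the authors' exact gravimetric calculation), percentage degradation can be expressed as (initial − residual)/initial × 100. The sketch below applies this to the residual-oil concentrations quoted above; small differences from the published percentages are expected because those were derived gravimetrically from recovered oil mass.

```python
# Rough illustration of the percentage-degradation arithmetic:
#   % degradation = (initial - residual) / initial * 100
# Values are the residual-oil concentrations (mg/g) quoted in the abstract;
# the published percentages were obtained gravimetrically, so they differ slightly.
residual_oil = {
    "CON": (0.400, 0.304),
    "T1":  (0.400, 0.259),
    "T2":  (0.420, 0.218),
    "T3":  (0.410, 0.220),
}

for treatment, (initial, final) in residual_oil.items():
    pct = (initial - final) / initial * 100
    print(f"{treatment}: {pct:.1f}% of the oil removed over 6 weeks")
```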

    Supplementary Material for: Lymphocyte Cell Ratios and Mortality among Incident Hemodialysis Patients

    Background: Neutrophil-to-lymphocyte ratio (NLR) and platelet-to-lymphocyte ratio (PLR) have previously been suggested as oncologic prognostication markers. Both are associated with malnutrition and inflammation and may therefore help predict mortality among hemodialysis patients. Methods: Among 108,548 incident hemodialysis patients in a large U.S. dialysis organization (2007–2011), we compared the mortality predictability of NLR and PLR in baseline and time-varying covariate Cox models using the area under the receiver operating characteristic curve (AUROC), net reclassification index (NRI), and adjusted R². Results: During the median follow-up period of 1.4 years, 28,618 patients died. Median (IQR) NLR and PLR at baseline were 3.64 (2.68–5.00) and 179 (136–248), respectively. NLR was associated with higher mortality, and the association appeared stronger in the time-varying than in the baseline model. PLR exhibited a J-shaped association with mortality in both models. NLR improved mortality prediction beyond demographics, comorbidities, and serum albumin; ΔAUROC and NRI for 1-year mortality (95% CI) were 0.010 (0.009–0.012) and 6.4% (5.5–7.3%), respectively. Additionally, the adjusted R² (95% CI) of the Cox model increased from 0.269 (0.262–0.276) to 0.283 (0.276–0.290) in the non-time-varying model and from 0.467 (0.461–0.472) to 0.505 (0.500–0.512) in the time-varying model. There was little to no benefit of adding PLR to predict mortality. Conclusions: High NLR in incident hemodialysis patients predicted mortality, especially in the short term. NLR, but not PLR, added modest benefit in predicting mortality along with demographics, comorbidities, and serum albumin, and should be included in prognostication approaches.
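
    As a hedged illustration of how NLR and PLR are derived from a blood count and how an incremental AUROC gain might be estimated, the sketch below uses synthetic data and logistic regression as a simplified stand-in for the study's Cox models; all variable names and values are illustrative only.

```python
# Minimal sketch: derive NLR/PLR from a blood count and estimate the AUROC gain
# from adding NLR to a base model. Logistic regression on synthetic data is a
# simplified stand-in for the Cox models reported in the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
neutrophils = rng.gamma(4.0, 1.0, n)          # 10^3 cells/uL (synthetic)
lymphocytes = rng.gamma(2.0, 0.8, n) + 0.2
platelets = rng.normal(250, 60, n)
albumin = rng.normal(3.8, 0.4, n)

nlr = neutrophils / lymphocytes               # neutrophil-to-lymphocyte ratio
plr = platelets / lymphocytes                 # platelet-to-lymphocyte ratio

# Synthetic 1-year mortality depending on albumin and NLR
logit = -2.0 - 0.8 * (albumin - 3.8) + 0.15 * (nlr - nlr.mean())
died = rng.binomial(1, 1 / (1 + np.exp(-logit)))

base = albumin.reshape(-1, 1)
with_nlr = np.column_stack([albumin, nlr])

auc_base = roc_auc_score(died, LogisticRegression().fit(base, died).predict_proba(base)[:, 1])
auc_nlr = roc_auc_score(died, LogisticRegression().fit(with_nlr, died).predict_proba(with_nlr)[:, 1])
print(f"delta AUROC from adding NLR: {auc_nlr - auc_base:.3f}")
```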

    Supplementary Material for: Changes in Markers of Mineral and Bone Disorders and Mortality in Incident Hemodialysis Patients

    Background: Abnormalities in mineral and bone disorder (MBD) markers are common in patients with chronic kidney disease. However, previous studies have not accounted for changes in these markers over time, and it is unclear whether such changes are associated with survival. Methods: We examined the association of changes in MBD markers (serum phosphorus (Phos), albumin-corrected calcium (CaAlb), intact parathyroid hormone (iPTH) and alkaline phosphatase (ALP)) during the first 6 months of hemodialysis (HD) with all-cause mortality across baseline MBD strata, using survival models adjusted for clinical characteristics and laboratory measurements in 102,754 incident HD patients treated in a large dialysis organization between 2007 and 2011. Results: Across all MBD markers (Phos, CaAlb, iPTH and ALP), among patients whose baseline levels were above the reference range, an increase in the marker was associated with higher mortality (reference group: MBD level within the reference range at baseline and no change at 6 months of follow-up). Conversely, among patients whose baseline Phos or iPTH was below the reference range, a further decrease in that marker was associated with higher mortality. An increase in ALP was associated with higher mortality across baseline strata of ALP ≥80 U/l. However, patients with baseline ALP <80 U/l trended towards a lower risk of mortality irrespective of the direction of change at 6 months of follow-up. Conclusions: The association between changes in MBD markers and mortality differs across baseline levels in HD patients. Further study is needed to determine whether considering both baseline levels and longitudinal changes in the management of MBD derangements improves outcomes in this population.

    Supplementary Material for: Serum Ferritin Variations and Mortality in Incident Hemodialysis Patients

    Background: Higher serum ferritin levels may be influenced by iron use and inflammation, and are associated with higher mortality in hemodialysis (HD) patients. We hypothesized that a major rise in serum ferritin is associated with a higher risk of mortality, irrespective of baseline serum ferritin, in incident HD patients. Methods: In a cohort of 93,979 incident HD patients between 2007 and 2011, we examined the association of the change in serum ferritin from the baseline patient quarter (first 91 days from dialysis start) to the subsequent quarter with mortality. Multivariable adjustments were made for case-mix, markers of the malnutrition-inflammation complex, and intravenous iron dose. Change in serum ferritin was stratified into 5 groups: <-400, -400 to <-100, -100 to <100, 100 to <400, and ≥400 ng/mL/quarter. Results: The median change in serum ferritin was 89 ng/mL/quarter (interquartile range -55 to 266 ng/mL/quarter). Compared to stable serum ferritin (-100 to <100 ng/mL/quarter), a major rise (≥400 ng/mL/quarter) was associated with higher all-cause mortality in adjusted models (hazard ratio [95% CI] 1.07 [0.99-1.15], 1.17 [1.09-1.24], 1.26 [1.12-1.41], and 1.49 [1.27-1.76] for baseline serum ferritin <200, 200 to <500, 500 to <800, and ≥800 ng/mL, respectively). The mortality risk associated with a rise in serum ferritin was robust irrespective of intravenous iron use. Conclusions: During the first 6 months after HD initiation, a major rise in serum ferritin in patients with a baseline ferritin ≥200 ng/mL, and even a slight rise in serum ferritin in those with a baseline ferritin ≥800 ng/mL, are associated with higher mortality.
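
    A minimal sketch of the 5-category quarterly ferritin-change stratification described above, assuming each patient's baseline- and follow-up-quarter mean ferritin sits in a pandas DataFrame with hypothetical column names:

```python
# Sketch of the 5-category quarterly ferritin change used in the abstract.
# Assumes a DataFrame with hypothetical columns 'ferritin_q1' and 'ferritin_q2'
# holding each patient's mean serum ferritin (ng/mL) in the baseline and
# subsequent quarter.
import numpy as np
import pandas as pd

def categorize_ferritin_change(df: pd.DataFrame) -> pd.Series:
    change = df["ferritin_q2"] - df["ferritin_q1"]   # ng/mL per quarter
    bins = [-np.inf, -400, -100, 100, 400, np.inf]
    labels = ["<-400", "-400 to <-100", "-100 to <100 (stable)",
              "100 to <400", ">=400"]
    # right=False makes intervals left-closed, matching "-100 to <100" etc.
    return pd.cut(change, bins=bins, labels=labels, right=False)

example = pd.DataFrame({"ferritin_q1": [150, 600, 900],
                        "ferritin_q2": [620, 580, 450]})
print(categorize_ferritin_change(example))
```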

    Supplementary Material for: Racial and Ethnic Differences in Mortality Associated with Serum Potassium in a Large Hemodialysis Cohort

    Background: Hyperkalemia is observed in chronic kidney disease patients and may be a risk factor for life-threatening arrhythmias and death. Race/ethnicity may be an important modifier of the potassium-mortality relationship in maintenance hemodialysis (MHD) patients, given that potassium intake and excretion vary among minorities. Methods: We examined racial/ethnic differences in the association of baseline serum potassium with all-cause and cardiovascular mortality using Cox proportional hazards models and restricted cubic splines in a cohort of 102,241 incident MHD patients. Serum potassium was categorized into 6 groups: ≤3.6, >3.6 to ≤4.0, >4.0 to ≤4.5 (reference), >4.5 to ≤5.0, >5.0 to ≤5.5, and >5.5 mEq/L. Models were adjusted for case-mix and malnutrition-inflammation cachexia syndrome (MICS) covariates. Results: The cohort was composed of 50% whites, 34% African-Americans, and 16% Hispanics. Hispanics tended to have the highest baseline serum potassium levels (mean ± SD: 4.58 ± 0.55 mEq/L). Patients were followed for a median of 1.3 years (interquartile range 0.6-2.5). In models adjusted for case-mix and MICS covariates, associations between higher potassium (>5.5 mEq/L) and higher mortality risk were observed in African-American and white patients but not in Hispanic patients, whereas in Hispanics only, lower serum potassium (<3.6 mEq/L) was associated with higher mortality risk. Similar trends were observed for cardiovascular mortality. Conclusions: Higher potassium levels were associated with higher mortality risk in white and African-American MHD patients, whereas lower potassium levels were associated with higher death risk in Hispanics. Further studies are needed to determine the underlying mechanisms for the differential association between potassium and mortality across race/ethnicity.

    Supplementary Material for: Association of Pre-End-Stage Renal Disease Hemoglobin with Early Dialysis Outcomes

    Background: Incident hemodialysis patients have a high mortality risk within the first months after dialysis initiation. Pre-end-stage renal disease (ESRD) factors such as anemia management may impact early post-ESRD outcomes. We therefore evaluated the impact of pre-ESRD hemoglobin (Hgb) level and pre-ESRD Hgb slope on post-ESRD mortality and hospitalization outcomes. Methods: The study included 31,472 veterans transitioning to ESRD. Using Cox and negative binomial regression models, we evaluated the association of pre-ESRD Hgb and Hgb slope with 12-month post-ESRD all-cause and cardiovascular mortality and hospitalization rates, using 4 levels of hierarchical multivariable adjustment, including erythropoietin use and kidney function decline in the slope models. Results: The cohort was 2% female, 30% African-American, and on average 68 ± 11 years old. Compared to Hgb 10 to <11 g/dL, both low (<10 g/dL) and high (≥12 g/dL) levels were associated with higher all-cause mortality after full adjustment (HR 1.25 [95% CI 1.15–1.35] and 1.09 [95% CI 1.02–1.18], respectively). Similarly, Hgb exhibited a U-shaped association with cardiovascular mortality, while only lower Hgb was associated with a higher hospitalization rate. Neither an annual pre-ESRD decline nor an increase in Hgb was associated with higher post-ESRD mortality risk after adjustment for kidney function decline. However, we observed a modest J-shaped association between pre-ESRD Hgb slope and post-ESRD hospitalization rate. Conclusions: Lower and higher pre-ESRD Hgb levels are associated with a higher risk of early post-ESRD mortality, whereas the pre-ESRD Hgb slope was not associated with mortality. An increase in pre-ESRD Hgb slope was associated with a higher rate of post-ESRD hospitalization. Additional studies of anemia management prior to the ESRD transition are warranted.
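
    A minimal sketch of how an annual pre-ESRD Hgb slope could be estimated per patient from repeated laboratory measurements, assuming a long-format table with hypothetical column names; this illustrates the general slope idea, not the study's exact specification:

```python
# Illustration of estimating each patient's annual pre-ESRD hemoglobin slope
# by ordinary least squares over repeated measurements. Column names are
# hypothetical; the study's exact slope specification may differ.
import numpy as np
import pandas as pd

def annual_hgb_slope(group: pd.DataFrame) -> float:
    # Time in years relative to the ESRD transition (negative = before transition)
    t_years = -group["days_before_esrd"].to_numpy() / 365.25
    hgb = group["hgb"].to_numpy()
    if len(group) < 2:
        return np.nan
    slope, _intercept = np.polyfit(t_years, hgb, deg=1)  # g/dL per year
    return slope

labs = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 2],
    "days_before_esrd": [360, 180, 30, 365, 200, 10],
    "hgb": [11.8, 11.1, 10.2, 10.5, 10.6, 10.4],
})
slopes = labs.groupby("patient_id").apply(annual_hgb_slope)
print(slopes)
```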

    Temporal sensitivity analysis of erosivity estimations in a high rainfall tropical island environment

    The Erosivity Index (EI) and the Modified Fournier Index (MFI) are two commonly used methods for calculating the R factor of the universal soil loss equation / revised universal soil loss equation. Using Mauritius as a case study, the value of high-resolution data versus long-term totals in erosivity calculations is investigated. Four Mauritius Meteorological Services stations, located on the west coast and the Central Plateau, provided the study with detailed rainfall data for 6 years at 6-min intervals. Rainfall erosivity for erosive events was calculated using data aggregated at different time intervals. Within the EI, the use of 6-min rainfall intervals during erosive rainfall gave estimates of around 10% more erosivity than 30-min intervals and 33% more than 60-min measurements. When the MFI was used to determine erosivity from annual and monthly rainfall totals, substantially higher erosivity than with the EI method was calculated in both regions. This stems from the large amount of non-erosive rainfall that is generated on Mauritius. Even when the MFI was calculated from monthly and annual rainfall totals derived purely from erosive rainfall, erosivity estimates were not comparable to those from high-resolution data within the EI. We suggest that, for the computation of erosivity, rainfall data at the highest available resolution should be used, and that the application of annual and monthly rainfall totals to assess absolute soil erosion risk in a high-rainfall tropical environment must be treated with caution.
    http://onlinelibrary.wiley.com/journal/10.1111/(ISSN)1468-0459
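
    For reference, a rough sketch of the Modified Fournier Index arithmetic in its standard form, MFI = Σ pᵢ² / P, where pᵢ are the 12 monthly rainfall totals and P is the annual total; the study's EI30 procedure, which requires the 6-min intensity records, is not reproduced here. Monthly values below are illustrative only.

```python
# Rough sketch of the Modified Fournier Index: MFI = sum(p_i^2) / P, where p_i
# are the 12 monthly rainfall totals (mm) and P is the annual total. This is the
# standard textbook form; the study's exact EI30 procedure (which needs 6-min
# intensity data) is not reproduced here.
monthly_rain_mm = [220, 230, 210, 180, 90, 60, 55, 50, 40, 45, 80, 190]  # illustrative values

annual_total = sum(monthly_rain_mm)
mfi = sum(p ** 2 for p in monthly_rain_mm) / annual_total
print(f"Annual rainfall: {annual_total} mm, MFI: {mfi:.1f}")
```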