12 research outputs found

    Deep subsurface drip irrigation using coal-bed sodic water: Part II. Geochemistry

    Waters with low salinity and high sodium adsorption ratios (SARs) present a challenge to irrigation because they degrade soil structure and infiltration capacity. In the Powder River Basin of Wyoming, such low salinity (electrical conductivity, EC 2.1 mS cm−1) and high-SAR (54) waters are co-produced with coal-bed methane and some are used for subsurface drip irrigation (SDI). The SDI system studied mixes sulfuric acid with irrigation water and applies water year-round via drip tubing buried 92 cm deep. After six years of irrigation, SAR values between 0 and 30 cm depth (0.5–1.2) are only slightly increased over non-irrigated soils (0.1–0.5). Only 8–15% of added Na has accumulated above the drip tubing. Sodicity has increased in soil surrounding the drip tubing, and geochemical simulations show that two pathways can generate sodic conditions. In soil between 45-cm depth and the drip tubing, Na from the irrigation water accumulates as evapotranspiration concentrates solutes. SAR values >12, measured by 1:1 water–soil extracts, are caused by concentration of solutes by factors up to 13. Low EC (<0.7 mS cm−1) is caused by rain and snowmelt flushing the soil and displacing ions in soil solution. Soil below the drip tubing experiences lower solute concentration factors (1–1.65) due to excess irrigation water and also contains relatively abundant native gypsum (2.4 ± 1.7 wt.%). Geochemical simulations show gypsum dissolution decreases soil-water SAR to <7 and increases the EC to around 4.1 mS cm−1, thus limiting negative impacts from sodicity. With sustained irrigation, however, downward flow of excess irrigation water depletes gypsum, increasing soil-water SAR to >14 and decreasing EC in soil water to 3.2 mS cm−1. Increased sodicity in the subsurface, rather than the surface, indicates that deep SDI can be a viable means of irrigating with sodic waters.
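    The sodium adsorption ratio used throughout this abstract has a standard definition: SAR = [Na⁺] / √(([Ca²⁺] + [Mg²⁺]) / 2), with cation concentrations in meq/L. A minimal sketch of that calculation is below; the ion concentrations are hypothetical values, chosen only to illustrate how a dilute water can still reach the SAR of 54 reported for this produced water.

    ```python
    from math import sqrt

    def sar(na_meq_l: float, ca_meq_l: float, mg_meq_l: float) -> float:
        """Sodium adsorption ratio from cation concentrations in meq/L."""
        return na_meq_l / sqrt((ca_meq_l + mg_meq_l) / 2.0)

    # Hypothetical concentrations (not measured values from the study):
    # modest Na with very little Ca and Mg yields a high SAR.
    print(round(sar(27.0, 0.3, 0.2), 1))  # → 54.0
    ```

    Because Ca and Mg enter under a square root in the denominator, dissolving even a small amount of gypsum (CaSO₄·2H₂O) raises the denominator and can pull SAR down sharply, which is the buffering mechanism the abstract describes below the drip tubing.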

    Tracking solutes and water from subsurface drip irrigation application of coalbed methane–produced waters, Powder River Basin, Wyoming

    One method to beneficially use water produced from coalbed methane (CBM) extraction is subsurface drip irrigation (SDI) of croplands. In SDI systems, treated CBM water (injectate) is supplied to the soil at depth, with the purpose of preventing the buildup of detrimental salts near the surface. The technology is expanding within the Powder River Basin, but little research has been published on its environmental impacts. This article reports on initial results from tracking water and solutes from the injected CBM-produced waters at an SDI system in Johnson County, Wyoming. In the first year of SDI operation, soil moisture significantly increased in the SDI areas, but well water levels increased only modestly, suggesting that most of the water added was stored in the vadose zone or lost to evapotranspiration. The injectate has lower concentrations of most inorganic constituents relative to ambient groundwater at the site but exhibits a high sodium adsorption ratio. Changes in groundwater chemistry during the same period of SDI operation were small; an increase in groundwater specific conductance relative to pre-SDI conditions was observed in a single well. Conversely, groundwater samples collected beneath another SDI field showed decreased concentrations of several constituents since the start of SDI operation. Groundwater specific conductance at the 12 other wells showed no significant changes. Major controls on and compositional variability of groundwater, surface water, and soil water chemistry are discussed in detail. Findings from this research provide an understanding of water and salt dynamics associated with SDI systems using CBM-produced water.

    Stratification of Risk of Early-Onset Sepsis in Newborns ≥34 Weeks’ Gestation

    OBJECTIVE: To define a quantitative stratification algorithm for the risk of early-onset sepsis (EOS) in newborns ≥34 weeks’ gestation. METHODS: We conducted a retrospective nested case-control study that used split validation. Data collected on each infant included sepsis risk at birth based on objective maternal factors, demographics, specific clinical milestones, and vital signs during the first 24 hours after birth. Using a combination of recursive partitioning and logistic regression, we developed a risk classification scheme for EOS on the derivation dataset. This scheme was then applied to the validation dataset. RESULTS: Using a base population of 608 014 live births ≥34 weeks’ gestation at 14 hospitals between 1993 and 2007, we identified all 350 EOS cases <72 hours of age and frequency matched them by hospital and year of birth to 1063 controls. Using maternal and neonatal data, we defined a risk stratification scheme that divided the neonatal population into 3 groups: treat empirically (4.1% of all live births, 60.8% of all EOS cases, sepsis incidence of 8.4/1000 live births), observe and evaluate (11.1% of births, 23.4% of cases, 1.2/1000), and continued observation (84.8% of births, 15.7% of cases, incidence 0.11/1000). CONCLUSIONS: It is possible to combine objective maternal data with evolving objective neonatal clinical findings to define more efficient strategies for the evaluation and treatment of EOS in term and late preterm infants. Judicious application of our scheme could result in decreased antibiotic treatment in 80 000 to 240 000 US newborns each year.
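    The per-group incidences quoted in the RESULTS follow arithmetically from the base numbers (608 014 live births, 350 EOS cases) and the per-group percentages. A short sanity check of that arithmetic:

    ```python
    # Cross-check the abstract's per-group sepsis incidences from its
    # reported base numbers: 608,014 live births and 350 EOS cases overall.
    BIRTHS, CASES = 608_014, 350

    def incidence_per_1000(frac_births: float, frac_cases: float) -> float:
        """Group incidence = (cases in group) / (births in group) * 1000."""
        return (CASES * frac_cases) / (BIRTHS * frac_births) * 1000

    for name, fb, fc in [("treat empirically", 0.041, 0.608),
                         ("observe and evaluate", 0.111, 0.234),
                         ("continued observation", 0.848, 0.157)]:
        print(f"{name}: {incidence_per_1000(fb, fc):.2f} per 1000 live births")
    ```

    The computed values (about 8.5, 1.2, and 0.11 per 1000) agree with the reported 8.4, 1.2, and 0.11 up to rounding of the published percentages.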

    Economic evaluation of caffeine for apnea of prematurity

    In comparison with placebo, caffeine therapy for apnea of prematurity in infants weighing less than 1250 g is economically appealing for infants up to 18 to 21 months' corrected age.