
    Microbial partitioning in urban stormwaters

    Contamination by high concentrations of fecal indicator bacteria is one of the most common causes of surface water quality impairment in the United States; however, little quantitative data on microbial transport in receiving waters is currently available for use in designing watershed restoration plans. In this study, association with settleable particles (partitioning), a behavior that can affect in-stream fate and transport but is frequently neglected in water quality models, is characterized through the analysis of samples from several watersheds. Results suggest that stormwater flows, although intermittent, contribute the majority of indicator organism inputs to receiving waters, as cumulative storm loadings can equal several years' worth of background loadings. Loadings of microorganisms associated with settleable particles appear to be transported largely in the first flush of storm events. Observations of particle association by fecal indicator bacteria appear to be a reasonable approximation of the partitioning behavior of Salmonella; however, Salmonella, as well as the protozoan pathogens Cryptosporidium and Giardia, were readily recoverable from samples meeting current water quality standards. Monitoring data from two suburban detention basins suggest that settleable indicator organisms and Salmonella are removed at a higher rate than their free-phase counterparts, indicating that sedimentation may be an important microbial removal mechanism in stormwater treatment structures. However, although mean removals in one pond approached the USEPA's typical rate of 65%, effluent concentrations remained several orders of magnitude greater than recommended levels. Comparisons of free-phase and settleable E. coli concentrations as measured by a culture-based technique and by quantitative polymerase chain reaction (qPCR) may support previous studies suggesting that particle association reduces cell die-off in addition to accelerating sedimentation in the water column, although further investigation of potential qPCR inhibition is required. Despite significant differences between enumeration techniques in free-phase E. coli concentrations, measures of total concentration were equivalent and produced similar conclusions regarding water body impairment. Regardless of detection method or indicator organism used in assessment, compiled data indicate that all four study watersheds will be in violation of recommended standards following storm events.
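    As a rough illustration of why a ~65% removal rate can still leave effluent orders of magnitude above recommended levels: percent removal is a linear measure, while exceedance spans log scales. The concentrations and the 235 CFU/100 mL single-sample criterion below are hypothetical assumptions for the sketch, not data from the study.

```python
import math

def percent_removal(c_in, c_out):
    """Percent removal between influent and effluent concentrations."""
    return 100.0 * (1.0 - c_out / c_in)

def log_reduction(c_in, c_out):
    """Log10 reduction between influent and effluent concentrations."""
    return math.log10(c_in / c_out)

# Hypothetical stormwater E. coli concentrations (CFU per 100 mL)
c_in = 1.0e5                      # assumed influent concentration
c_out = c_in * (1.0 - 0.65)      # effluent after 65% removal
criterion = 235.0                 # illustrative single-sample criterion

print(percent_removal(c_in, c_out))          # 65.0
print(round(log_reduction(c_in, c_out), 2))  # 0.46 -- less than half a log
print(round(c_out / criterion))              # ~149x the criterion
```

    Even a 65% reduction removes less than half a log10 of organisms, so an influent several logs above a criterion leaves effluent still far in exceedance.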

    What’s in Your Water? Development and Evaluation of the Virginia Household Water Quality Program and Virginia Master Well Owner Network

    Approximately one-fifth of Virginians (about 1.7 million people) rely on private water supplies (e.g., wells, springs, cisterns) for their household water. Unlike public water systems, private systems are not regulated by the Environmental Protection Agency (EPA). As a result, private water system owners are solely responsible for system maintenance and water quality but are often unaware of common issues and lack access to objective information. We report on the development and evaluation of the Virginia Household Water Quality Program (VAHWQP), an ongoing Virginia Cooperative Extension (VCE) program that provides affordable water testing and education about private water supply system maintenance and groundwater protection. A companion capacity-building program, the Virginia Master Well Owner Network (VAMWON), trains volunteers, agency collaborators, and VCE agents who support the goals and objectives of the VAHWQP by conducting drinking water clinics and other outreach efforts. Program assessment findings indicate that VAHWQP drinking water clinic participants regard this programming favorably and are taking recommended actions. We discuss the program assessment framework and continued efforts to improve these programs to achieve long-term behavioral changes regarding water testing and system maintenance, which will yield safer private water supplies and improved environmental stewardship.

    Pit latrine fecal sludge resistance using a dynamic cone penetrometer in low-income areas in Mzuzu city, Malawi

    Pit latrines can provide improved household sanitation, but without effective and inexpensive emptying options, they are often abandoned once full and may pose a public health threat. Emptying can be difficult, as the sludge contents of each pit latrine differ. The design of effective emptying technologies (e.g., pumps) is limited by a lack of data characterizing typical in situ latrine sludge resistance. This investigation aimed to better understand the community education and technical engineering needs necessary to improve pit latrine management. In low-income areas within Mzuzu city, Malawi, 300 pit latrines from three distinct areas were assessed using a dynamic cone penetrometer to quantify fecal sludge strength, and household members were surveyed to determine their knowledge of desludging procedures and practices likely to impact fecal sludge characteristics. The results demonstrate a significant difference in sludge strength between lined and unlined pits within a defined area, though sludge hardened with depth regardless of pit type or region. There was only limited association between cone penetration depth and household survey data. To promote the adoption of pit emptying, it is recommended that households be provided with information that supports pit emptying, such as latrine construction designs, local pit emptying options, and cost. This study indicates that use of a penetrometer test in the field prior to pit latrine emptying may facilitate selection of appropriate pit emptying technology.

    Springing for Safe Water: Drinking Water Quality and Source Selection in Central Appalachian Communities

    Issues surrounding water infrastructure, access, and quality are well documented in the Central Appalachian region of the United States. Even in cases where residents have in-home piped point-of-use (POU) water, some rely on alternative drinking water sources for daily needs, including water collection from roadside springs. This effort aims to better understand and document spring usage in the region by identifying the factors that influence drinking water source selection and comparing household and spring water quality to Safe Drinking Water Act (SDWA) health-based and aesthetic contaminant recommendations. Households were recruited from communities surrounding known springs in three states (Kentucky, Virginia, and West Virginia). First- and second-draw, in-home POU tap water samples were collected from participating households and compared to samples collected from local springs on the same day. Samples were analyzed for fecal indicator bacteria and inorganic ions. Study participants completed surveys to document perceptions of household drinking water and typical usage. The majority of survey participants (82.6%) did not trust their home tap water due to aesthetic issues. Water quality results suggested that fecal indicator bacteria were more common in spring water, while several metallic ions were recovered in higher concentrations from household samples. These observations highlight that health risks and perceptions may differ between sources.

    Putting Corporate Social Responsibility to Work in Mining Communities: Exploring Community Needs for Central Appalachian Wastewater Treatment

    Due to the finite nature of non-renewable mineral and energy resources such as coal, resource extraction is inherently unsustainable; however, mining and related activities can contribute to sustainable development. Indeed, the principles of corporate social responsibility (CSR) require that mine operators design and conduct their activities in ways that provide net positive impacts on surrounding communities and environments. In Central Appalachia, there appears to be a particularly ripe opportunity for the coal industry to put CSR to work: participation in sustainable solutions to the long-standing problem of inadequately treated wastewater discharges, which not only represent a potential human health hazard but also contribute to the relatively high incidence of bacterial impairments in the region's surface waters. In this paper, we outline the underlying factors of this problem and the advantages of industry-aided solutions in a region where limited economic and technical resources are not always aligned with social and environmental needs. We also suggest a framework for problem-solving, which necessarily involves all interested stakeholders, and identify the primary challenges that must be overcome in pursuit of sustainable solutions.

    Assessing the Impact of Climate Change and Land Use Variation on Microbial Transport Using Watershed Scale-modeling

    Uncertainty surrounding microbial fate and transport renders the assessment of climate change effects on waterborne pathogens complex and difficult to forecast. The objective of this study is to use watershed modeling to predict the impacts of future climate change and land management scenarios on microbial water quality. Preliminary findings suggest an increased risk to human health due to direct consequences of climate change. Results of watershed-scale microbial load modeling can inform the adoption of pollution control measures required to protect human health and aid the development of new water policy.

    Challenges in providing safe drinking water in Central Appalachia

    This presentation reviews drinking water access in Central Appalachia and presents results from interconnected efforts investigating municipal water violations, private drinking water screenings, and water scavenging practices.

    Recovery of lead, iron, and copper from point-of-use filters to examine performance

    Over the last few decades, reliance on point-of-use (POU) treatment for removing actual or perceived contaminants from drinking water has increased within the United States. Understanding POU removal performance, and accurately estimating metals exposure at the tap, is critical for evaluating POU device effectiveness and potential reductions in contaminant exposure. Previous bench-scale efforts have documented significant removal of dissolved Pb by faucet-mounted POU filters; however, few efforts have challenged these filters with the extreme water quality conditions that are more common in homes reliant on private well water. Characterization of typical rates of metals uptake by POU filters would support improved exposure estimates and predictions, a better understanding of long-term filter performance under different conditions, and identification of conditions in which POU use is recommended. In the current study, standard faucet-mount activated carbon POU filters were tested in a laboratory setting to: 1) determine removal of Pb, Cu, and Fe under low- and high-concentration conditions designed to reflect previous observations of residential water quality; and 2) evaluate the effectiveness of an acid flow-through procedure in recovering metals from used POU filters exposed to varying concentrations of Pb, Cu, and Fe. Although the filters tested here successfully removed Pb and Cu at both high and low concentrations (>91% removal), Fe removal varied considerably. The acid flow-through procedure yielded mixed results: while 25.1-70.4% of influent Pb mass was recovered, recovery of Cu and Fe from the dosed filters was unpredictable. This was attributed in part to leaching from the filter media itself; in addition to Cu and Fe, concentrations of several other elements (e.g., Ti, Si, Al) increased and appeared to leach from control filters during the acid flow-through procedure. Given these results, alternative methods for assessing metals uptake by POU filters should be explored.
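    The recovery comparison described above amounts to simple mass-balance arithmetic. The sketch below is a hypothetical illustration, not the study's actual procedure or data: the function name and all masses are assumed, and a control-filter blank is subtracted to account for the media leaching the abstract notes.

```python
def percent_recovered(influent_ug, effluent_ug, eluate_ug, control_eluate_ug=0.0):
    """Percent of the metal mass retained on a filter that an acid
    flow-through recovers, after subtracting the mass a control
    (undosed) filter releases during the same procedure."""
    retained = influent_ug - effluent_ug       # mass captured by the filter
    recovered = eluate_ug - control_eluate_ug  # blank-corrected eluate mass
    return 100.0 * recovered / retained

# Hypothetical Pb masses (micrograms): 100 in, 9 out, 40 eluted,
# 5 released by a control filter.
print(round(percent_recovered(100.0, 9.0, 40.0, 5.0), 1))  # 38.5
```

    Without the blank correction, leaching from the filter media itself would inflate apparent recovery, which is one reason Cu and Fe recoveries were difficult to interpret.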