Evaluation of Karst Spring Water Quality Using Water Quality Indices in Northeast Tennessee
Protecting public health in the many US communities underserved or unserved by centralized water systems requires regular water quality testing and reporting, yet even after testing, easy-to-comprehend water quality information can be hard to obtain. Households served by water utilities receive water quality reports, but households depending on unregulated water systems such as wells and springs are often unaware of their water quality. This study therefore used multiple water quality parameters to evaluate karst spring water quality with two Water Quality Index (WQI) methods.
In-situ measurements of physicochemical parameters (pH, dissolved oxygen, temperature, turbidity, conductivity, specific conductance, total dissolved solids, and oxidation-reduction potential) were taken at 50 karst springs in east Tennessee during Summer 2021. Water samples were analyzed for microbial (fecal coliform and E. coli), nutrient (nitrate and nitrite), and radiological (radon) constituents using standard analytical methods. Springs generally met federal and state water quality safe limits for physicochemical parameters, but 100% of water samples contained fecal coliform and 90% contained E. coli, revealing widespread fecal contamination; 60% of springs exceeded radon concentrations of 300 pCi/L.
WQI method 1 (Brown et al. 1972) rated 12% of springs as very poor water quality and 88% as unfit for drinking. WQI method 2 (NSFWQI) rated 4% of the sampled springs as good, 92% as moderate, and 4% as bad. Water treatment to remove microbial pollution is advised before the studied springs are used as a drinking water source.
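For readers unfamiliar with the first method, the weighted arithmetic WQI of Brown et al. (1972) combines each parameter's sub-index rating with a unit weight inversely proportional to its permissible standard. The sketch below is minimal and assumes illustrative parameters, standards, and ideal values, not the study's actual inputs:

```python
# Minimal sketch of a weighted arithmetic Water Quality Index (WQI),
# in the spirit of Brown et al. (1972). Parameter list, standards, and
# ideal values below are illustrative placeholders, not the study's data.

def weighted_arithmetic_wqi(measured, standards, ideals):
    """WQI = sum(q_i * w_i) / sum(w_i), with unit weight w_i inversely
    proportional to the permissible standard S_i."""
    k = 1.0 / sum(1.0 / s for s in standards.values())  # proportionality constant
    num, den = 0.0, 0.0
    for p, v in measured.items():
        s, v0 = standards[p], ideals.get(p, 0.0)
        q = 100.0 * (v - v0) / (s - v0)  # sub-index rating for parameter p
        w = k / s                        # unit weight
        num += q * w
        den += w
    return num / den

# Hypothetical spring sample (units omitted for brevity)
measured  = {"pH": 7.8, "turbidity": 4.0, "nitrate": 6.5}
standards = {"pH": 8.5, "turbidity": 5.0, "nitrate": 10.0}  # permissible limits
ideals    = {"pH": 7.0}                                     # ideal value (pH only)
print(round(weighted_arithmetic_wqi(measured, standards, ideals), 1))
```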
Using Spatial Regression to Model Potentially Toxic Metal (PTM) Mobility Based on Physicochemical Soil Properties
Mining processes generate waste rock, tailings, and slag that can increase potentially toxic metal (PTM) concentrations in soils. Un-reclaimed, abandoned mine sites are particularly prone to leaching these contaminants, which may accumulate and pose significant environmental and public health concerns. The characterization and spatial delineation of PTMs in soils is vital for risk assessment and soil reclamation. Bumpus Cove, a once-active mining district of eastern Tennessee, is home to at least 47 abandoned, un-reclaimed mines, all permanently closed by the 1950s. This study evaluated soil physicochemical properties, determined the spatial extent of PTMs (Zn, Mn, Cu, Pb, and Cd), and examined the influence of soil properties on PTM distribution in Bumpus Cove, TN. Soil samples (n = 52) were collected from a 0.67 km² study area containing 6 known abandoned Pb, Zn, and Mn mines at the headwaters of Bumpus Cove Creek. Samples were analyzed for Zn, Mn, Cu, Pb, and Cd by microwave-assisted acid digestion and flame atomic absorption spectrometry (FAAS), yielding 12-1,354 mg/kg Zn, 6-2,574 mg/kg Mn, 1-65 mg/kg Cu, 33-2,271 mg/kg Pb, and 7-40 mg/kg Cd. Of the measured PTMs, only Pb exceeded permissible limits in soils. In addition to the PTM analyses, soil physical (texture, moisture content, and bulk density) and chemical (pH, cation exchange capacity (CEC), and total organic carbon (TOC)) properties were evaluated. Spatially weighted multivariate regression models developed for all PTMs using soil physicochemical properties produced improved results over ordinary least squares (OLS) regression models. Models for Zn (R² = 0.71) and Pb (R² = 0.69) retained the covariates pH, moisture content, and CEC (Zn) and pH and CEC (Pb). This study helps define PTM concentrations and transport and provides a reference for state and local entities responsible for contaminant monitoring in Bumpus Cove, TN.
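The abstract does not specify the exact estimator, but a common form of spatially weighted regression is the geographically weighted local fit: weighted least squares solved at each sample location with a distance-decay kernel, contrasted against a single global OLS fit. A minimal sketch follows, with synthetic coordinates, covariates, bandwidth, and response standing in for the soil dataset:

```python
# Rough sketch of a spatially weighted regression fit at each sample point,
# versus one global OLS fit. All data below are synthetic stand-ins for the
# study's soil samples (pH and CEC shown as predictors of a PTM like Zn).
import numpy as np

rng = np.random.default_rng(0)
n = 52
coords = rng.uniform(0, 1, size=(n, 2))            # sample locations (km)
X = np.column_stack([np.ones(n),                   # intercept
                     rng.normal(6.5, 0.5, n),      # pH (illustrative)
                     rng.normal(20, 5, n)])        # CEC (illustrative)
y = X @ np.array([50.0, 30.0, 4.0]) + rng.normal(0, 10, n)  # fake Zn (mg/kg)

def local_coeffs(x0, bandwidth=0.3):
    """Weighted least squares at location x0 with a Gaussian distance kernel."""
    d = np.linalg.norm(coords - x0, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)              # kernel weights
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta

global_beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
print("OLS coefficients:", global_beta.round(2))
print("local fit at first site:", local_coeffs(coords[0]).round(2))
```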
Identifying Untapped Potential: A Geospatial Analysis of Florida and California’s 2009 Recycled Water Production
Increased water demand attributed to population expansion, together with reduced freshwater availability caused by saltwater intrusion and drought, may lead to water shortages that can be addressed, in part, by the use of recycled water. Spatial patterns of recycled water use in Florida and California during 2009 were analyzed to detect gaps in distribution and identify potential areas for expansion. Databases of recycled water products and distribution centers for both states were developed by combining the 2008 Clean Water Needs Survey database with Florida's 2009 Reuse Inventory and California's 2009 Recycling Survey, respectively. Florida had nearly twice the number of distribution centers (n = 426) as California (n = 228) and produced a larger volume of recycled water (674.85 vs. 597.48 mgd; 1 mgd = 3.78 ML/d). Kernel Density Estimation showed that distribution is concentrated in central Florida (Orlando and Tampa), California's Central Valley region (Fresno and Bakersfield), and around major cities in California. Areas for growth were identified in the panhandle and southern regions of Florida and in northern, southwestern, and coastal California. Recycled water is an essential component of integrated water management, and broader adoption will increase water conservation in water-stressed coastal communities by allocating recycled water to purposes that once used potable freshwater.
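As a rough illustration of the density step, scipy's gaussian_kde can estimate facility density from point coordinates on a grid. The coordinates below are random placeholders, not the actual facility locations, and the study's implementation may use different kernels and bandwidths:

```python
# Sketch of kernel density estimation over point locations of recycled-water
# distribution centers. Points are random placeholders within a rough
# Florida-like extent, not the 2009 facility data.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
lon = rng.uniform(-87.6, -80.0, 426)   # illustrative longitudes
lat = rng.uniform(24.5, 31.0, 426)     # illustrative latitudes

kde = gaussian_kde(np.vstack([lon, lat]))            # bandwidth via Scott's rule
gx, gy = np.meshgrid(np.linspace(-87.6, -80.0, 100),
                     np.linspace(24.5, 31.0, 100))
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
print("peak density grid cell:", np.unravel_index(density.argmax(), density.shape))
```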
Florida’s Recycled Water Footprint: A Geospatial Analysis of Distribution (2009 and 2015)
Water shortages resulting from increased demand or reduced supply may be addressed, in part, by redirecting recycled water for irrigation, industrial reuse, groundwater recharge, and effluent discharge returned to streams. Recycled water is an essential component of integrated water management, and broader adoption will increase water conservation in water-stressed coastal communities. This study examined spatial patterns of recycled water use in Florida in 2009 and 2015 to detect gaps in distribution, quantify temporal change, and identify potential areas for expansion. Databases of recycled water products and distribution centers for Florida in 2009 and 2015 were developed by combining the 2008 and 2012 Clean Water Needs Survey databases with Florida's 2009 and 2015 Reuse Inventory databases, respectively. Florida increased recycled water production from 674.85 mgd in 2009 to 738.15 mgd in 2015, an increase of 63.30 mgd. The increase was primarily allocated to use in public access areas, groundwater recharge, and industrial reuse, all within the South Florida Water Management District (WMD). In particular, Miami was identified in 2009 as an area of opportunity for recycled water development, and by 2015 it had increased production and reduced the production gap. Overall, the South Florida WMD had the largest increase in production, 44.38 mgd (69%), while the Southwest Florida WMD decreased production by 1.68 mgd (3%). The overall increase in recycled water use may be related to higher demand from population growth, coupled with public programs and policy changes that promote recycled water use at both the municipal and individual levels.
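The quoted changes can be reproduced from the abstract's own figures; note that the 2009 district baselines below are back-calculated from the reported absolute and percentage changes (44.38 mgd at 69%; 1.68 mgd at 3%) and are otherwise illustrative:

```python
# Back-of-the-envelope check of the reported production changes. Statewide
# totals come from the abstract; the two WMD 2009 baselines are derived as
# change / (percent change), e.g. 44.38 / 0.69 ~= 64.32 mgd.
import pandas as pd

df = pd.DataFrame(
    {"district": ["Statewide", "South Florida WMD", "Southwest Florida WMD"],
     "mgd_2009": [674.85, 64.32, 56.00],
     "mgd_2015": [738.15, 108.70, 54.32]})

df["change_mgd"] = df["mgd_2015"] - df["mgd_2009"]                # 63.30, 44.38, -1.68
df["change_pct"] = (100 * df["change_mgd"] / df["mgd_2009"]).round(1)
print(df)
```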
Emergence of COVID-19 and Patterns of Early Transmission in an Appalachian Sub-Region
Background: In mid-March 2020, very few cases of COVID-19 had been confirmed in the Central Blue Ridge Region, an area in Appalachia that includes 47 jurisdictions across northeast Tennessee, western North Carolina, and southwest Virginia. The authors describe the emergence of cases and outbreaks in the region between March 18 and June 11, 2020.
Methods: Data were collected from the health department websites of Tennessee, North Carolina, and Virginia beginning in mid-March 2020 for an ongoing set of COVID-19 monitoring projects, including a newsletter for local healthcare providers and a Geographic Information Systems (GIS) dashboard. In Fall 2020, the authors used these databases to conduct descriptive and geospatial cluster analyses of case incidence and fatalities over space and time.
Results: In the Central Blue Ridge Region, there were 4432 cumulative cases as of June 11, or 163.22 cases per 100,000 residents. Multiple days with particularly high case counts were connected to outbreaks reported by local news outlets and health departments. Most of these outbreaks were linked to congregate settings such as schools, long-term care facilities, and food processing facilities.
Implications: By examining data available for a largely rural region that includes jurisdictions across three states, the authors were able to describe and disseminate information about COVID-19 case incidence and fatalities and to identify acute and prolonged local outbreaks. Continuing to follow, interpret, and report accurate and timely COVID-19 case data in regions like this one is vital to residents, businesses, healthcare providers, and policymakers.
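As a quick arithmetic check, the reported incidence rate implies the region's combined population:

```python
# The cases-per-100,000 figure in the abstract implies the region's
# population; this reproduces the arithmetic.
cases = 4432
rate_per_100k = 163.22
implied_population = cases / rate_per_100k * 100_000
print(f"implied regional population: {implied_population:,.0f}")  # ~2.7 million

def incidence_per_100k(cases, population):
    """Cumulative incidence rate per 100,000 residents."""
    return cases / population * 100_000

print(round(incidence_per_100k(4432, implied_population), 2))     # 163.22
```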
Factors Associated with COVID-19 Vaccine Hesitancy in South Central Appalachia
Introduction: COVID-19, caused by the newly emergent SARS-CoV-2 virus, reached pandemic status in March 2020. By the middle of August 2020, there were over 160,000 deaths attributed to COVID-19 in the U.S., with those in rural areas outpacing urban counterparts. Prior to emergency approval of the Pfizer, Moderna, and Johnson & Johnson vaccine formulations, mitigation efforts addressing individual behavior were challenging. However, even with the entrance of these three new vaccines, herd immunity was not achieved in rural areas, as vaccine uptake there remained low. Although there has since been an abundance of COVID-19-related research addressing health literacy, vaccine hesitancy, and overall medical mistrust, few of these studies focus on Appalachia.
Purpose: This study identifies barriers and facilitators to adherence with COVID-19 mitigation, focusing specifically on vaccine hesitancy in South Central Appalachia.
Methods: A secondary data analysis was conducted with a subset of Appalachian residents from the COVID-19 Public Health survey. Participants were grouped by county using Appalachian Regional Commission (ARC) economic county designations. The dependent variable, vaccine hesitancy, was explored in relation to five categories of independent variables: (1) demographics (spanning four conceptual areas); (2) beliefs; (3) actions; (4) medical mistrust; and (5) health literacy.
Results: Findings indicate that vaccine-hesitancy attributes include beliefs about the threat of COVID-19, overstatement of illness severity, perceived risk of vaccines, perceived lack of vaccine safety information from manufacturers, and the desire to make an independent decision about vaccination. Findings from this study are comparable to those of HPV vaccine studies in Appalachia.
Implications: As interventions are developed for Appalachia, it is paramount to focus vaccine administration at both the individual and population levels.
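The abstract does not state the model form, but a natural way to relate a binary hesitancy outcome to belief, mistrust, and health-literacy measures is logistic regression. A hypothetical sketch with synthetic scores (variable names and data are placeholders, not the survey's items):

```python
# Sketch of the kind of model the analysis implies: a binary vaccine-hesitancy
# outcome regressed on belief, medical-mistrust, and health-literacy scores.
# Data and coefficients are simulated for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
X = np.column_stack([rng.normal(size=n),   # belief score (hypothetical)
                     rng.normal(size=n),   # medical-mistrust score (hypothetical)
                     rng.normal(size=n)])  # health-literacy score (hypothetical)
logit_p = 0.8 * X[:, 0] + 1.1 * X[:, 1] - 0.6 * X[:, 2]
hesitant = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))  # simulated outcome

model = sm.Logit(hesitant, sm.add_constant(X)).fit(disp=False)
print(model.summary2().tables[1].round(3))  # coefficients, std errors, p-values
```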
Wake-up Call in East Tennessee? Correlating Flood Losses to National Flood Insurance Program Enrollment (1978-2006)
The National Flood Insurance Program (NFIP) provides federally backed insurance for properties in Special Flood Hazard Areas, yet many property owners do not enroll in the program. I compared flood losses and flood insurance enrollment for three Tennessee communities (Chattanooga, Elizabethton, and Pigeon Forge) to investigate the relationship between flooding and NFIP enrollment. Normalized flood losses and insurance purchases were cross-correlated using lags of zero through nine years to investigate the relationship between flood losses in one year and NFIP enrollment in subsequent years. The correlation between flood losses and NFIP enrollment is significant in the year in which flood losses occurred for Chattanooga and Elizabethton (r = 0.39 and 0.42, respectively; p < 0.05). In Pigeon Forge, flood losses correlate with NFIP enrollment in the following year (r = 0.43, p = 0.02).
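A minimal sketch of the lagged cross-correlation procedure, using synthetic loss and enrollment series in place of the NFIP and flood-loss records:

```python
# Sketch of the lagged cross-correlation described above: Pearson r between
# normalized flood losses in year t and NFIP purchases in year t + lag, for
# lags 0-9. Both series here are synthetic placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
years = 29                                    # 1978-2006
losses = rng.gamma(2.0, 1.0, years)           # normalized flood losses (fake)
purchases = np.roll(losses, 1) + rng.normal(0, 0.5, years)  # lag-1 response (fake)

for lag in range(10):
    x = losses[: years - lag]                 # losses in year t
    y = purchases[lag:]                       # enrollment in year t + lag
    r, p = pearsonr(x, y)
    print(f"lag {lag}: r = {r:+.2f}, p = {p:.3f}")
```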
Spatiotemporal Analysis of the COVID-19 Pandemic in School-age Children (5-18 years) in Washington and Johnson County, TN
COVID-19, as named by the World Health Organization, is the disease caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). This study is a spatiotemporal analysis of the COVID-19 pandemic in school-age children (5-18 years) in Washington and Johnson Counties, Tennessee, and the possible relationship between public policies and the rate of infection. The first cases in Tennessee were documented in March 2020, and data have been collected since that time. Daily data are accessible on the Tennessee Department of Health COVID-19 dashboard, with the number of new cases, hospitalizations, and deaths grouped by county for ages 5-11 years and 12-18 years. As the disease spread, government officials mandated various policies (mask mandates, stay-at-home orders, restrictions on public gatherings, and school closures), though many schools eventually allowed physical attendance. Emerging spatiotemporal hot spots are analyzed to identify statistically significant clusters of hot and cold spots using the Moran's I statistic in ArcGIS. The change point detection tool in ArcGIS, which makes inferences about significant changes in trends over time, was used to identify when significant changes occurred. This is an ongoing project that will inform the approach I adopt for my thesis; statistical tools will be used to determine the correlation between the time a change occurred and the implementation of policies, assuming an estimated 14-day lag. Finally, findings from the two age groups will be compared. This study aims to help policymakers make better-informed decisions when responding to future pandemics.
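The study uses ArcGIS; as an open-source analogue of the global Moran's I step, the libpysal and esda packages can compute the statistic on a lattice. The 10x10 grid and case rates below are synthetic; real use would build spatial weights from the county or tract polygons:

```python
# Sketch of a global Moran's I test, the statistic behind the hot/cold-spot
# step. A regular 10x10 lattice with fake case rates stands in for the
# actual geography and incidence data.
import numpy as np
from libpysal.weights import lat2W
from esda.moran import Moran

rng = np.random.default_rng(4)
rates = rng.poisson(20, 100).astype(float)  # fake case rates, one per grid cell
w = lat2W(10, 10)                           # rook-contiguity weights for the grid

mi = Moran(rates, w, permutations=999)      # permutation-based significance
print(f"Moran's I = {mi.I:.3f}, pseudo p-value = {mi.p_sim:.3f}")
```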
Gully Morphology, Hillslope Erosion, and Precipitation Characteristics in the Appalachian Valley and Ridge Province, Southeastern USA
This study investigates gully erosion on an east Tennessee hillslope in a humid subtropical climate. The study area is deeply gullied in Ultisols (Acrisols, according to the World Reference Base for Soil Resources), with thirty years of undisturbed erosional history and no efforts to correct or halt the erosion. The objectives are (1) to examine how different gully morphologies (channel, sidewall, and interfluve) behave in response to precipitation-driven erosion, and (2) to identify an appropriate temporal scale at which precipitation-driven erosion can be measured to improve soil loss prediction. Precipitation parameters (total accumulation, duration, average intensity, and maximum intensity) extracted from data collected at an on-site weather station were statistically correlated with erosion data. Erosion data were collected from erosion pins installed at 78 locations in four gully systems, spanning three morphological settings: interfluves, channels, and sidewalls. Kruskal-Wallis non-parametric tests and Mann-Whitney U-tests indicated that the different morphological settings responded differently to precipitation (p < 0.00). For channels and sidewalls, regression models relating erosion to precipitation parameters retained antecedent precipitation and precipitation accumulation or duration (R² = 0.50, p < 0.00 for channels; R² = 0.28, p < 0.00 for sidewalls), but precipitation intensity variables were not retained. For interfluves, less than 20% of the variability in erosion could be explained by precipitation parameters. Precipitation duration and accumulation (including antecedent precipitation accumulation) were more important than precipitation intensity in initiating and propagating erosion in this geomorphic and climatic setting, though other processes, including mass wasting and eolian erosion, likely contribute. High correlation coefficients between aggregate precipitation parameters and erosion indicate that a suitable temporal scale for relating precipitation to soil erosion is the synoptic timescale, which captures natural precipitation cycles and corresponding measurable soil erosion.
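A minimal sketch of the non-parametric comparisons described above, using synthetic erosion-pin readings for the three morphological settings:

```python
# Kruskal-Wallis test across the three morphological settings, followed by
# pairwise Mann-Whitney U tests. Erosion-pin readings are simulated, not the
# study's measurements; group sizes are arbitrary.
import numpy as np
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(5)
erosion = {"channel":    rng.normal(12, 4, 30),  # change in pin exposure (mm, fake)
           "sidewall":   rng.normal(8, 3, 26),
           "interfluve": rng.normal(2, 2, 22)}

h, p = kruskal(*erosion.values())
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.4f}")

for a, b in combinations(erosion, 2):
    u, p = mannwhitneyu(erosion[a], erosion[b])
    print(f"{a} vs {b}: U = {u:.0f}, p = {p:.4f}")
```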