Climate-Induced Changes in the Risk of Hydrological Failure of Major Dams in California
Existing major reservoirs in California, with an average age above 50 years, were built in the previous century with limited data records and flood hazard assessment. Changes in climate and land use are anticipated to alter the statistical properties of inflow to these infrastructure systems and potentially increase their hydrological failure probability. Because of the large socioeconomic repercussions of infrastructure incidents, revisiting dam failure risks associated with possible shifts in the streamflow regime is fundamental for societal resilience. Here we compute historical and projected flood return periods as a proxy for potential changes in the risk of hydrological failure of dams in a warming climate. Our results show that hydrological failure probability is likely to increase for most dams in California by 2100. Notably, the New Don Pedro, Shasta, Lewiston, and Trinity Dams are associated with the highest potential changes in flood hazard.
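A minimal sketch of the kind of calculation the abstract describes: fitting an extreme-value distribution to annual maximum inflows and comparing the return period of a fixed design flood between a historical and a projected record. The data, the design-flood definition, and the 50-year horizon are illustrative assumptions, not the study's inputs.

```python
# Sketch only: how a design flood's return period, and hence exceedance
# probability, can shift between a historical and a projected record.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
hist_annual_max = rng.gumbel(loc=900.0, scale=250.0, size=60)    # m^3/s, synthetic "historical" inflow maxima
proj_annual_max = rng.gumbel(loc=1100.0, scale=320.0, size=60)   # m^3/s, synthetic "projected" inflow maxima

def return_period(annual_max, flood_value):
    """Return period (years) of flood_value under a GEV fit to annual maxima."""
    shape, loc, scale = stats.genextreme.fit(annual_max)
    p_exceed = stats.genextreme.sf(flood_value, shape, loc=loc, scale=scale)
    return 1.0 / p_exceed

design_flood = np.quantile(hist_annual_max, 0.99)   # stand-in for a spillway design flood
T_hist = return_period(hist_annual_max, design_flood)
T_proj = return_period(proj_annual_max, design_flood)

# Probability of at least one exceedance over an n-year horizon: 1 - (1 - 1/T)^n
n_years = 50
print(f"historical return period: {T_hist:.0f} yr, projected: {T_proj:.0f} yr")
print(f"{n_years}-yr exceedance probability: "
      f"{1 - (1 - 1/T_hist)**n_years:.2f} -> {1 - (1 - 1/T_proj)**n_years:.2f}")
```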
Changes in the Exposure of California’s Levee-Protected Critical Infrastructure to Flooding Hazard in a Warming Climate
Levee systems are an important part of California's water infrastructure, engineered to provide resilience against flooding and reduce flood losses. Growth in California is partly associated with costly infrastructure developments that led to population expansion in levee-protected areas. Therefore, potential changes in flood hazard could have significant socioeconomic consequences over levee-protected areas, especially in the face of a changing climate. In this study, we examine the possible impacts of a warming climate on flood hazard over levee-protected land in California. We use gridded maximum daily runoff from global climate models (GCMs) that represent a wide range of variability among the climate projections, and are recommended by California's Fourth Climate Change Assessment Report, to investigate possible climate-induced changes. We also quantify the exposure of several types of critical infrastructure protected by the levee systems (e.g. roads, electric power transmission lines, natural gas pipelines, petroleum pipelines, and railroads) to flooding. Our results provide a detailed picture of changes in flood risk for different levees and the potential societal consequences (e.g. exposure of people and critical infrastructure). Levee systems in the northern part of the Central Valley and the coastal counties of Southern California are likely to see the highest increase in flood hazard relative to the past. The most evident change is projected for the northern region of the Central Valley, including Butte, Glenn, Yuba, Sutter, Sacramento, and San Joaquin counties. In the leveed regions of these counties, model simulations of the future indicate that the historical 100-year runoff can potentially increase up to threefold under RCP8.5. We argue that levee operation and maintenance, along with emergency preparation plans, should take into account the changes in frequencies and intensities of flood hazard in a changing climate to ensure the safety of levee systems and the infrastructure they protect.
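A minimal sketch of how such a change could be quantified per grid cell: fit a Gumbel distribution to historical and projected annual maximum daily runoff and take the ratio of the two 100-year quantiles. The synthetic arrays, cell count, and distribution choice are assumptions for illustration only, not the study's data or method.

```python
# Sketch only: future vs. historical 100-year daily runoff for a handful of
# hypothetical leveed grid cells, using Gumbel fits to annual maxima.
import numpy as np
from scipy import stats

def runoff_100yr(annual_max_runoff):
    """100-year quantile from a Gumbel (EV1) fit to annual maximum runoff."""
    loc, scale = stats.gumbel_r.fit(annual_max_runoff)
    return stats.gumbel_r.ppf(1.0 - 1.0 / 100.0, loc=loc, scale=scale)

rng = np.random.default_rng(0)
n_cells, n_years = 5, 45
hist = rng.gumbel(20.0, 6.0, size=(n_cells, n_years))   # mm/day, synthetic historical annual maxima
fut = rng.gumbel(30.0, 9.0, size=(n_cells, n_years))    # mm/day, synthetic end-of-century maxima

change_ratio = np.array([runoff_100yr(fut[i]) / runoff_100yr(hist[i]) for i in range(n_cells)])
print("future / historical 100-yr runoff per cell:", np.round(change_ratio, 2))
# Cells with ratios approaching 3 would correspond to the roughly threefold
# increases reported for parts of the northern Central Valley.
```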
A New Normal for Streamflow in California in a Warming Climate: Wetter Wet Seasons and Drier Dry Seasons
In this study, we investigate changes in future streamflows in California using bias-corrected and routed streamflows derived from global climate model (GCM) simulations under two representative concentration pathways (RCPs): RCP4.5 and RCP8.5. Unlike previous studies that have focused mainly on mean streamflow, annual maxima, or seasonality, we focus on projected changes across the distribution of streamflow and the underlying causes. We report opposing trends in the two tails of the future streamflow simulations: lower low flows and higher high flows, with no change in the overall mean of future flows relative to the historical baseline (statistically significant at the 0.05 level). Furthermore, results show that streamflow is projected to increase during most of the rainy season (December to March) while it is expected to decrease during the rest of the year (i.e., wetter rainy seasons and drier dry seasons). We argue that the projected changes to streamflow in California are driven by the expected changes to snow patterns and precipitation extremes in a warming climate. Changes to future low flows and extreme high flows can have significant implications for water resource planning, drought management, and infrastructure design and risk assessment.
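A minimal sketch of the tail-versus-mean contrast described above, using synthetic lognormal daily flows rather than the study's routed GCM streamflows: the future series is given a wider spread while its mean is held fixed, so the low-flow quantile drops while the high-flow quantile rises.

```python
# Sketch only: opposing changes in the two tails of daily streamflow with an
# (approximately) unchanged mean.
import numpy as np

rng = np.random.default_rng(1)
mu_h, sig_h = 3.0, 0.8                        # synthetic "historical" lognormal parameters
sig_f = 1.0                                   # wider spread in the "future"
mu_f = mu_h + (sig_h**2 - sig_f**2) / 2.0     # chosen so the lognormal mean stays the same

hist = rng.lognormal(mu_h, sig_h, size=365 * 30)   # m^3/s, 30 years of daily flows
fut = rng.lognormal(mu_f, sig_f, size=365 * 30)

for label, q in [("low flow (Q10) ", 0.10), ("median (Q50)   ", 0.50), ("high flow (Q99)", 0.99)]:
    print(f"{label}: {np.quantile(hist, q):7.1f} -> {np.quantile(fut, q):7.1f}")
print(f"mean           : {hist.mean():7.1f} -> {fut.mean():7.1f}")
```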
Evaluation of global impact models' ability to reproduce runoff characteristics over the central United States
The central United States experiences a wide array of hydrological extremes, with the 1993, 2008, 2013, and 2014 flooding events and the 1988 and 2012 droughts representing some of the most recent extremes, and is an area where water availability is critical for agricultural production. This study aims to evaluate the ability of a set of global impact models (GIMs) from the Water Model Intercomparison Project to reproduce the regional hydrology of the central United States for the period 1963–2001. Hydrological indices describing the annual daily maximum, medium, and minimum flow, and their timing, are extracted both from daily runoff modeled by nine GIMs and from observed daily streamflow measured at 252 river gauges. We compare trend patterns for these indices and the models' ability to capture runoff volume differences for the 1988 drought and 1993 flood. In addition, we use a subset of 128 gauges and corresponding grid cells to perform a detailed evaluation of the models on a gauge-to-grid-cell basis. Results indicate that these GIMs capture the overall trends in high, medium, and low flows well. However, the models differ from observations with respect to the timing of high and medium flows. More specifically, GIMs that only include the water balance tend to be closer to the observations than GIMs that also include the energy balance. In general, as would be expected, the performance of the GIMs is best when describing medium flows, as opposed to the two ends of the runoff spectrum. With regard to low flows, some of the GIMs contain considerable numbers of zero or very low values in their time series, undermining their ability to capture low-flow characteristics and weakening the ensemble output. Overall, this study provides a valuable examination of the capability of GIMs to reproduce observed regional hydrology over a range of quantities for the central United States.
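A minimal sketch of how the annual flow indices and their timing could be extracted from a daily series; the gauge record here is synthetic and the index names are illustrative, not the study's exact definitions.

```python
# Sketch only: annual maximum, median, and minimum daily flow plus the day of
# year of the extremes, the kind of indices compared between GIM runoff and
# gauge observations.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
dates = pd.date_range("1963-01-01", "2001-12-31", freq="D")
flow = pd.Series(rng.lognormal(2.0, 0.7, size=len(dates)), index=dates, name="q")  # m^3/s, synthetic

def annual_indices(q):
    return pd.Series({
        "q_max": q.max(),
        "q_med": q.median(),
        "q_min": q.min(),
        "doy_max": q.idxmax().dayofyear,   # timing of the annual high flow
        "doy_min": q.idxmin().dayofyear,   # timing of the annual low flow
    })

indices = flow.groupby(flow.index.year).apply(annual_indices).unstack()
print(indices.head())
```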
Climate-informed environmental inflows to revive a drying lake facing meteorological and anthropogenic droughts
The rapid shrinkage of Lake Urmia, one of the world's largest saline lakes, located in northwestern Iran, is a tragic wake-up call to revisit the principles of water resources management based on the socio-economic and environmental dimensions of sustainable development. The overarching goal of this paper is to set a framework for deriving dynamic, climate-informed environmental inflows for drying lakes considering both meteorological/climatic and anthropogenic conditions. We report on the compounding effects of meteorological drought and unsustainable water resource management that contributed to Lake Urmia's contemporary environmental catastrophe. Using rich datasets of hydrologic attributes, water demands and withdrawals, as well as water management infrastructure (i.e. reservoir capacity and operating policies), we provide a quantitative assessment of the basin's water resources, demonstrating that Lake Urmia reached a tipping point in the early 2000s. The lake level failed to rebound to its designated ecological threshold (1274 m above sea level) during a relatively normal hydro-period immediately after the drought of record (1998–2002). The collapse was caused by a marked overshoot of the basin's hydrologic capacity due to growing anthropogenic drought in the face of extreme climatological stressors. We offer a dynamic environmental inflow plan for different climate conditions (dry, wet and near normal), combined with three representative water withdrawal scenarios. Assuming effective implementation of the proposed 40% reduction in current water withdrawals, the required environmental inflows range from 2900 million cubic meters per year (mcm yr⁻¹) during dry conditions to 5400 mcm yr⁻¹ during wet periods, with the average being 4100 mcm yr⁻¹. Finally, for different environmental inflow scenarios, we estimate the expected recovery time for re-establishing the ecological level of Lake Urmia.
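A heavily simplified sketch of the recovery-time idea: a constant annual water balance marching lake volume toward the volume associated with the ecological level. Every number here (initial volume, target volume, evaporative loss) is a hypothetical placeholder, not a value from the basin model.

```python
# Sketch only: years needed for a lake volume to reach a target volume under a
# fixed annual environmental inflow and a fixed annual loss.
def simulate_lake(volume0_mcm, inflow_mcm, loss_mcm_per_yr, years):
    """March lake volume forward with a constant net annual balance (mcm)."""
    v, trajectory = volume0_mcm, []
    for _ in range(years):
        v = max(v + inflow_mcm - loss_mcm_per_yr, 0.0)
        trajectory.append(v)
    return trajectory

# Hypothetical numbers purely for illustration.
target_volume = 14000.0    # mcm, stand-in for the volume at the 1274 m ecological level
traj = simulate_lake(volume0_mcm=2000.0, inflow_mcm=4100.0, loss_mcm_per_yr=3400.0, years=30)
recovery = next((yr + 1 for yr, v in enumerate(traj) if v >= target_volume), None)
print("years to reach target volume:", recovery)
```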
Accounting for Stream Bank Storage for a Seasonal Groundwater Model
In recent research on groundwater and surface water interaction, attention has focused on the study of water exchanges between the near-stream aquifer and the stream. One of the important near-stream processes is bank storage. The aim of this thesis is to document the procedure required to develop a bank storage model that can be linked into a MODFLOW groundwater model. For this purpose, a groundwater model and a MATLAB code that simulates the bank storage process were developed. The two models were linked through the well package of MODFLOW. Results indicated that stage hydrographs entering the stream system produced similar net fluxes of water between surface water and groundwater, regardless of the number of stage rises or the hydrograph shape, provided they had the same average stream stage. In addition, the results show that reaches that are gaining during normal flow of the stream network can become losing during high-flow periods.
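A minimal sketch in Python (in place of the thesis's MATLAB code) of a linearized bank-storage exchange: the stream-aquifer flux is taken as proportional to the head difference across the bank, and the near-bank head relaxes toward the stage. A flux series like this is the sort of time-varying rate that could be passed to MODFLOW through the well (WEL) package. All parameter values are illustrative.

```python
# Sketch only: stream-aquifer exchange flux for a stream stage time series,
# with a single lumped near-bank head responding to the exchange.
import numpy as np

def bank_storage_flux(stage, h0, conductance, storage_coeff, dt=1.0):
    """Per-step stream->aquifer flux (positive = water moves into the bank)."""
    h = h0
    flux = np.empty_like(stage)
    for i, s in enumerate(stage):
        q = conductance * (s - h)              # exchange driven by the head difference
        h = h + dt * q / storage_coeff         # near-bank head relaxes toward the stage
        flux[i] = q
    return flux

# A flood pulse: stage rises above the initial aquifer head, then recedes.
t = np.arange(0, 60.0, 1.0)                              # days
stage = 10.0 + 2.0 * np.exp(-((t - 15.0) / 6.0) ** 2)    # m
q = bank_storage_flux(stage, h0=10.0, conductance=5.0, storage_coeff=50.0)

# During the pulse the reach loses water to the bank; late in the recession the
# sign flips and the stored water returns to the stream.
print("peak influx to bank:", round(q.max(), 2), " late-recession return flow:", round(q[-1], 3))
```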
A simulation study to examine the sensitivity of the Pettitt test to detect abrupt changes in mean
The Pettitt test is a non-parametric test that has been used in a number of hydroclimatological studies to detect abrupt changes in the mean of the distribution of the variable of interest. This test is based on the Mann-Whitney two-sample (rank-based) test and allows the detection of a single shift at an unknown point in time. It is often used to detect shifts in extremes because it makes no distributional assumptions. However, the downside of not specifying a distribution is that the Pettitt test may be inefficient in detecting breaks when dealing with extremes. Here we adopt a Monte Carlo approach to examine the sensitivity of the Pettitt test in detecting shifts in the mean under different conditions (location of the break within the series, magnitude of the shift, record length, level of variability in the data, extreme vs. non-extreme records, and pre-assigned significance level). The simulation results show that the sensitivity of this test in detecting abrupt changes increases with the magnitude of the shift and with record length. The number of detections is higher when the time series represents the central part of the distribution (e.g. changes in a time series of medians), while the skill decreases toward either low or high extremes (e.g. changes in a time series of maxima). Furthermore, the number of detections decreases as the variability in the data increases. Finally, abrupt changes are more easily detected when they occur toward the center of the time series.
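A minimal sketch of the kind of Monte Carlo experiment described above: a basic implementation of the Pettitt statistic with its approximate p-value, applied repeatedly to synthetic normal series containing a known mean shift to estimate a detection rate. Shift sizes, record length, noise level, and significance level are illustrative choices, not the study's full experimental design.

```python
# Sketch only: Pettitt change-point test plus a Monte Carlo detection-rate estimate.
import numpy as np

def pettitt_test(x):
    """Return (change_point_index, approx_p_value) using the Pettitt (1979) statistic."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    signs = np.sign(x[:, None] - x[None, :])               # sign(x_i - x_j) for all pairs
    u = np.array([signs[: k + 1, k + 1 :].sum() for k in range(n - 1)])  # U_k
    k_max = int(np.argmax(np.abs(u)))
    K = np.abs(u[k_max])
    p = 2.0 * np.exp(-6.0 * K**2 / (n**3 + n**2))           # approximate significance
    return k_max, min(p, 1.0)

def detection_rate(n=60, shift=1.0, sigma=1.0, break_frac=0.5, n_sim=500, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    brk, hits = int(n * break_frac), 0
    for _ in range(n_sim):
        x = rng.normal(0.0, sigma, n)
        x[brk:] += shift                                     # impose an abrupt shift in the mean
        _, p = pettitt_test(x)
        hits += p < alpha
    return hits / n_sim

for shift in (0.25, 0.5, 1.0):
    print(f"shift = {shift:>4} sd: detection rate = {detection_rate(shift=shift):.2f}")
```

As the abstract notes, the detection rate rises with the magnitude of the shift (and, if `n` is increased, with record length).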
A Multi-Model Nonstationary Rainfall-Runoff Modeling Framework: Analysis and Toolbox
We present a framework and toolbox for multi-model (one at a time) nonstationary modeling of the rainfall-runoff (RR) transformation. The time-varying design of the five conceptual RR models available in the toolbox allows for modeling processes that are inherently nonstationary. The Nonstationary Rainfall-Runoff Toolbox (NRRT) delivers insights into underlying watershed processes through interactive tuning of model parameters to reflect temporal nonstationarities. The toolbox includes a number of performance metrics, along with visual graphics, to evaluate the goodness-of-fit of the model simulations. Our analysis shows that the proposed time-varying RR modeling framework successfully captures the nonstationary behavior of the Wights catchment in Australia. A multi-model analysis of this catchment, which has undergone deforestation, provides insights into the functionality of different conceptual modules of RR models and how well they represent the real world.
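A minimal sketch of the time-varying-parameter idea, not one of NRRT's five models: a single linear-reservoir rainfall-runoff model whose runoff coefficient drifts linearly over the simulation, standing in for a catchment whose response changes (e.g. after deforestation). All forcing and parameter values are synthetic.

```python
# Sketch only: a conceptual rainfall-runoff model with time-varying parameters.
import numpy as np

def linear_reservoir(rain, k_series, c_series, s0=0.0):
    """Runoff from a linear reservoir with time-varying recession (k) and runoff (c) parameters."""
    s, q = s0, np.empty_like(rain)
    for t, (p, k, c) in enumerate(zip(rain, k_series, c_series)):
        s = s + c * p            # fraction c of rainfall becomes effective input to storage
        q[t] = k * s             # outflow proportional to storage
        s -= q[t]
    return q

n = 3650                                       # ten years of daily data
rng = np.random.default_rng(3)
rain = rng.gamma(0.4, 8.0, size=n)             # mm/day, synthetic forcing
k = np.full(n, 0.05)                           # stationary recession parameter
c = np.linspace(0.2, 0.45, n)                  # runoff coefficient rising as the catchment changes

q = linear_reservoir(rain, k, c)
print("mean runoff, first vs last year:", round(q[:365].mean(), 2), round(q[-365:].mean(), 2))
```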
Probabilistic Hazard Assessment of Contaminated Sediment in Rivers
We propose a probabilistic framework rooted in multivariate and copula theory to assess the heavy metal hazard associated with contaminated sediment in freshwater rivers that provide crucial ecosystem services such as municipal water source, eco-tourism, and agricultural irrigation. Exploiting the dependence structure between suspended sediment concentration (SSC) and different heavy metals, we estimate the hazard probability associated with each heavy metal at different SSC levels. We derive these relationships for the warm (spring-summer) and cold (fall-winter) seasons, as well as for stormflow conditions, to unpack their nonlinear associations under different environmental conditions. To demonstrate its efficacy, we apply our proposed generic framework to Fountain Creek, CO, and show that heavy metal concentrations in the warm season and under stormflow conditions carry a higher hazard likelihood than in the cold season. Under both warm-season and stormflow conditions, the probability of exceeding the maximum allowable threshold for all studied heavy metals (Cu, Zn, and Pb, in recoverable form) at a standard hardness of 100 mg/l CaCO3 and at a high level of SSC (95th percentile) is consistently more than 80% at our study site. Moreover, a longitudinal study along Fountain Creek demonstrates that urban and agricultural land use considerably increase the likelihood of violating water quality standards compared to natural land cover. The novelty of this study lies in introducing a probabilistic hazard assessment framework that enables robust risk assessment, with important policy implications, about the likelihood of different heavy metals violating water quality standards under various SSC levels.
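A minimal sketch of a copula-style conditional exceedance estimate on synthetic data, using a Gaussian copula only (the study's framework is more general and its thresholds are hardness-dependent): the probability that a metal concentration exceeds a water-quality threshold given that SSC sits at its 95th percentile. The marginal fits, correlation, and threshold value are all illustrative.

```python
# Sketch only: P(metal > threshold | SSC at its 95th percentile) via a Gaussian copula.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 500
# Synthetic, positively dependent SSC (mg/l) and metal (ug/l) observations.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=n)
ssc = np.exp(4.0 + 1.0 * z[:, 0])
metal = np.exp(1.5 + 0.6 * z[:, 1])

# 1) Gaussian copula: normal scores from empirical ranks, then their correlation.
def normal_scores(x):
    ranks = stats.rankdata(x) / (len(x) + 1.0)
    return stats.norm.ppf(ranks)

rho = np.corrcoef(normal_scores(ssc), normal_scores(metal))[0, 1]

# 2) Marginal for the metal (lognormal fit) to place the threshold on the copula scale.
shape, loc, scale = stats.lognorm.fit(metal, floc=0)
threshold = 13.0                                        # ug/l, illustrative standard
z_thr = stats.norm.ppf(stats.lognorm.cdf(threshold, shape, loc=loc, scale=scale))

# 3) Condition on SSC at its 95th percentile: Z_metal | Z_ssc ~ N(rho*z_ssc, 1 - rho^2).
z_ssc = stats.norm.ppf(0.95)
p_exceed = stats.norm.sf((z_thr - rho * z_ssc) / np.sqrt(1.0 - rho**2))
print(f"P(metal > {threshold} ug/l | SSC at 95th pct) ~ {p_exceed:.2f}")
```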