Environment identification based memory scheme for estimation of distribution algorithms in dynamic environments
Copyright © Springer-Verlag 2010. In estimation of distribution algorithms (EDAs), the joint probability distribution of high-performance solutions is represented by a probability model; that is, the probability model characterizes the priority search areas of the solution space. From this point of view, an environment identification-based memory management scheme (EI-MMS) is proposed to adapt binary-coded EDAs to dynamic optimization problems (DOPs). Within this scheme, the probability models that characterize the search space of the changing environment are stored and retrieved to adapt EDAs to environmental changes. A diversity-loss correction scheme and a boundary correction scheme are combined to counteract the diversity loss during the static evolutionary process within each environment. Experimental results demonstrate the validity of the EI-MMS and indicate that it can be applied to any binary-coded EDA. In comparison with three state-of-the-art algorithms, the univariate marginal distribution algorithm (UMDA) using the EI-MMS performs better on three decomposable DOPs. To understand the EI-MMS more deeply, a sensitivity analysis of its parameters is also carried out. This work was supported by the National Natural Science Foundation of China (NSFC) under Grant 60774064 and the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant EP/E060722/01.
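The probability model at the core of a binary-coded EDA such as UMDA is simply the vector of marginal probabilities that each bit equals 1, estimated from the selected high-performance solutions. The sketch below shows one UMDA generation in Python on a toy OneMax problem; the EI-MMS itself (storing and retrieving such models per identified environment) and the two correction schemes are not shown, and the selection ratio and clipping margins are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def umda_step(population, fitness, p_select=0.5):
    """One UMDA generation: select the best solutions, estimate the marginal
    probability of each bit being 1, then sample a new population."""
    n, n_bits = population.shape
    k = max(1, int(n * p_select))
    elite = population[np.argsort(fitness)[-k:]]              # best k (maximization)
    model = elite.mean(axis=0)                                # the probability model
    model = np.clip(model, 1.0 / n_bits, 1.0 - 1.0 / n_bits)  # avoid bit fixation
    new_pop = (rng.random((n, n_bits)) < model).astype(int)
    return new_pop, model

# OneMax toy problem: fitness = number of 1-bits
pop = rng.integers(0, 2, size=(50, 20))
for _ in range(30):
    fit = pop.sum(axis=1)
    pop, model = umda_step(pop, fit)
```

In the EI-MMS, it is vectors like `model` that would be stored in memory for each identified environment and retrieved when that environment recurs.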
Simulating California reservoir operation using the classification and regression-tree algorithm combined with a shuffled cross-validation scheme
The controlled outflows from a reservoir or dam depend heavily on the decisions made by reservoir operators rather than on a natural hydrological process. Differences therefore exist between the natural upstream inflows to reservoirs and the controlled outflows that supply downstream users. As decision makers become aware of a changing climate, reservoir management requires adaptable means of incorporating more information into decision making, such as water delivery requirements, environmental constraints, and dry/wet conditions. In this paper, a robust reservoir outflow simulation model is presented that incorporates a well-developed data-mining model, the classification and regression tree (CART), to predict complicated human-controlled reservoir outflows and extract reservoir operation patterns. A shuffled cross-validation approach is further implemented to improve CART's predictive performance. An application study of nine major reservoirs in California is carried out. Results produced by the enhanced CART, the original CART, and random forest are compared with observations. The statistical measures show that the enhanced CART and random forest generally outperform the CART control run, and that the enhanced CART gives better predictive performance than random forest in simulating peak flows. The results also show that the proposed model consistently and reasonably predicts expert release decisions. Experiments indicate that the release operation at Lake Oroville is dominated by the SWP allocation amount, and that reservoirs at low elevations are more sensitive to inflow amount than others.
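The interaction between CART and shuffled cross-validation can be illustrated with scikit-learn on synthetic, serially ordered data; the paper's reservoir records, predictors, and exact shuffling scheme are not reproduced here, so all names and parameters below are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(42)

# Time-ordered stand-in for a reservoir record: the driver drifts over time,
# as inflows and allocations do across wet and dry years.
x_driver = np.sort(rng.random(500))               # serially ordered predictor
X = np.column_stack([x_driver, rng.random(500)])  # second column: irrelevant noise
y = 2.0 * x_driver + rng.normal(0.0, 0.05, 500)   # outflow proxy

cart = DecisionTreeRegressor(max_depth=5, random_state=0)

# Contiguous folds force the tree to extrapolate outside its training range;
# shuffling mixes "wet" and "dry" samples into every fold.
plain_score = cross_val_score(cart, X, y, cv=KFold(5, shuffle=False)).mean()
shuffled_score = cross_val_score(
    cart, X, y, cv=KFold(5, shuffle=True, random_state=0)
).mean()
```

On serially ordered records like this, the shuffled scheme yields a markedly higher cross-validated R² than contiguous folds, which is the kind of gain the enhanced CART exploits.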
Modeling and analysis of the variability of the water cycle in the upper Rio Grande basin at high resolution
Estimating water budgets in a small-scale basin is a challenge, especially in the mountainous western United States, where the terrain is complex and observational data in mountain areas are sparse. This manuscript reports on research that downscaled five years (1999-2004) of hydrometeorological fields over the upper Rio Grande basin from a 2.5° NCEP-NCAR reanalysis to a 4-km local scale using a regional climate model [the fifth-generation Pennsylvania State University-National Center for Atmospheric Research Mesoscale Model (MM5), version 3]. The model reproduces the terrain-related precipitation distribution, including the diurnal, seasonal, and interannual variability of precipitation, although poor snow simulation caused it to overestimate precipitation and evapotranspiration in the cold season. The outcomes of the coupled model are also comparable to offline Variable Infiltration Capacity (VIC) and Land Data Assimilation System (LDAS)/Mosaic land-surface simulations driven by observed and/or analyzed surface meteorological data. © 2007 American Meteorological Society
Modeling intraseasonal features of 2004 North American monsoon precipitation
This study examines the capabilities and limitations of the fifth-generation Pennsylvania State University-National Center for Atmospheric Research Mesoscale Model (MM5) in predicting the precipitation and circulation features that accompanied the 2004 North American monsoon (NAM). When the model is reinitialized every 5 days to restrain the growth of modeling errors, its precipitation results, evaluated at subseasonal time scales (not for individual rainfall events), become comparable with ground- and satellite-based observations as well as with the NAM's diagnostic characteristics. The modeled monthly precipitation captures the evolution patterns of monsoon rainfall, although it underestimates the rainfall amount and coverage area in comparison with observations. The modeled daily precipitation shows the transition from dry to wet episodes over the Arizona-New Mexico region on the monsoon onset day, as well as the multiday heavy-rainfall (>1 mm day⁻¹) and dry periods after the onset; all of these predictions agree with observed variations. The model also accurately simulated the onset and ending dates of four major moisture surges over the Gulf of California during the 2004 monsoon season. The model reproduced the strong diurnal variability of NAM precipitation but did not predict the observed diurnal shift of the precipitation peak from the mountains to the coast between local afternoon and late night. In general, the model is able to reproduce the major patterns and dynamic variations of NAM rainfall at intraseasonal time scales, but errors in precipitation quantity, pattern, and timing remain. The numerical study suggests that these errors are due largely to deficiencies in the model's cumulus convective parameterization scheme, which is responsible for the model's precipitation generation. © 2007 American Meteorological Society
A new evolutionary search strategy for global optimization of high-dimensional problems
Global optimization of high-dimensional problems in practical applications remains a major challenge to the evolutionary computation research community. This paper demonstrates the weakness of randomization-based evolutionary algorithms in searching high-dimensional spaces. A new strategy, SP-UCI, is developed to treat the complexity caused by high dimensionality. This strategy features a slope-based searching kernel and a scheme for maintaining the particle population's capability of searching over the full search space. Examination of this strategy on a suite of sophisticated composition benchmark functions demonstrates that SP-UCI surpasses two popular algorithms, the particle swarm optimizer (PSO) and differential evolution (DE), on high-dimensional problems. Experimental results also corroborate the argument that, in high-dimensional optimization, only problems with well-formed fitness landscapes are solvable, and slope-based schemes are preferable to randomization-based ones. © 2011 Elsevier Inc. All rights reserved
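A "slope-based searching kernel" exploits the local orientation of the fitness landscape rather than purely random perturbation. A minimal example of the idea is the downhill-simplex reflect-and-shrink move sketched below; this is only an illustration of a slope-driven move on a 10-dimensional sphere function, not SP-UCI's actual kernel.

```python
import numpy as np

def simplex_step(points, f):
    """One slope-driven move: reflect the worst vertex through the centroid
    of the others; if that fails, shrink the simplex toward the best vertex."""
    vals = np.array([f(p) for p in points])
    worst, best = np.argmax(vals), np.argmin(vals)
    centroid = (points.sum(axis=0) - points[worst]) / (len(points) - 1)
    reflected = 2.0 * centroid - points[worst]
    points = points.copy()
    if f(reflected) < vals[worst]:
        points[worst] = reflected                              # downhill move
    else:
        points = points[best] + 0.5 * (points - points[best])  # shrink step
    return points

# 10-dimensional sphere function as a smooth test landscape (minimum at 0)
f = lambda x: float(np.sum(x ** 2))
rng = np.random.default_rng(1)
pts = rng.normal(0.0, 1.0, size=(12, 10))
init_mean = np.mean([f(p) for p in pts])
for _ in range(200):
    pts = simplex_step(pts, f)
final_mean = np.mean([f(p) for p in pts])
```

Each accepted move is dictated by the geometry of the current population rather than by a random draw, which is the contrast the abstract draws with randomization-based PSO and DE.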
Estimation of surface longwave radiation components from ground-based historical net radiation and weather data
A methodology is described for estimating ground upwelling longwave, clear-sky downwelling longwave, cloud downwelling longwave, and net shortwave radiation (L↑, Lsky↓, Lcld↓, and Sn) at 30-min temporal scales from long-term ground-based net radiation and meteorological observations. Components of surface radiation can be estimated from empirical models, cloud radiation models, and remote sensing observations. The proposed method combines local calibration of empirical models with the radiative energy balance method to obtain the dual-directional, dual-spectral components of surface radiation for offline land-surface process modeling and ecosystem studies. By extracting information about radiation components from long-term net radiation and concurrent weather data, the utility of tower net radiation observations is maximized. Four test sites with multiyear radiation records were used to evaluate the method. The results show that, compared with empirical models using default parameters, the proposed method produces more accurate estimates of the longwave surface components (L↑, Lsky↓, Lcld↓) and net shortwave radiation (Sn). Overall, the estimated and observed surface radiation components show high correlations (>0.90), a high index of agreement (>0.89), and low errors (root-mean-square error <30 W m⁻² and bias <11 W m⁻²) at all of the test sites. The advantage of this scheme is that the refinement is achieved using information from the historical net radiation and weather data at each observation site, without additional measurements. The method is applicable to the many existing observation sites worldwide that have long-term (at least 1 year) continuous net radiation records. Copyright 2008 by the American Geophysical Union
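Two of the components can be illustrated with standard empirical forms: upwelling longwave from the Stefan-Boltzmann law, and clear-sky downwelling longwave from Brutsaert's (1975) clear-sky emissivity. These are generic textbook formulas, not the locally calibrated models of the paper, and the emissivity value and inputs below are illustrative assumptions.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m-2 K-4

def upwelling_longwave(t_surf_k, emissivity=0.97):
    """Ground upwelling longwave L^ from surface temperature (K)."""
    return emissivity * SIGMA * t_surf_k ** 4

def clearsky_downwelling(t_air_k, vapor_hpa):
    """Clear-sky downwelling longwave Lsky using Brutsaert's (1975)
    emissivity, with air temperature in K and vapor pressure in hPa."""
    eps_sky = 1.24 * (vapor_hpa / t_air_k) ** (1.0 / 7.0)
    return eps_sky * SIGMA * t_air_k ** 4

l_up = upwelling_longwave(300.0)            # ~445 W m-2 for a 300 K surface
l_dn = clearsky_downwelling(295.0, 15.0)    # ~348 W m-2 for this example
```

In the paper's scheme, such empirical components are locally calibrated at each site and then reconciled against the measured net radiation through the radiative energy balance.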
A solution to the crucial problem of population degeneration in high-dimensional evolutionary optimization
Three popular evolutionary optimization algorithms are tested on high-dimensional benchmark functions, and an important phenomenon responsible for many failures - population degeneration - is discovered: through evolution, the population of searching particles degenerates into a subspace of the search space that excludes the global optimum. The search then tends to be confined to this subspace and eventually misses the global optimum. Principal component analysis (PCA) is introduced to detect population degeneration and to remedy its adverse effects. The experimental results reveal that an algorithm's efficacy and efficiency are closely related to the population degeneration phenomenon. Guidelines for improving evolutionary algorithms for high-dimensional global optimization are presented. An application to highly nonlinear hydrological models demonstrates the efficacy of the improved evolutionary algorithms in solving complex practical problems. © 2011 IEEE
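Population degeneration can be detected exactly as described: run PCA on the particle positions and look for principal axes with near-zero variance. A minimal NumPy sketch, with one dimension artificially collapsed for illustration (the tolerance and remedy are assumptions, not the paper's settings):

```python
import numpy as np

def degeneration_check(population, tol=1e-6):
    """PCA on the particle population: a near-zero eigenvalue of the
    covariance matrix means the population has collapsed into a
    lower-dimensional subspace of the search space."""
    centered = population - population.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)        # ascending order
    lost_dims = int(np.sum(eigvals < tol))   # number of degenerate axes
    return lost_dims, eigvals

rng = np.random.default_rng(0)
pop = rng.normal(size=(40, 10))              # 40 particles in 10 dimensions
pop[:, 3] = 0.5 * pop[:, 0]                  # artificially collapse one axis
lost, eigvals = degeneration_check(pop)      # lost == 1
```

A remedy in the spirit of the paper would re-inject variance along the degenerate eigenvectors so that the search can leave the subspace.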
Handling boundary constraints for particle swarm optimization in high-dimensional search space
Although the popular particle swarm optimizer (PSO) is being applied extensively to real-world problems that often have high-dimensional and complex fitness landscapes, the effects of boundary constraints on PSO have not received adequate attention in the literature. In accordance with the theoretical analysis in [11], our numerical experiments show that in high-dimensional search spaces, particles tend to fly outside the boundary with very high probability during the first few iterations. Consequently, the method used to handle boundary violations is critical to the performance of PSO. In this study, we reveal that the widely used random and absorbing bound-handling schemes may paralyze PSO on high-dimensional and complex problems, and we explore in detail the distinct mechanisms responsible for the failures of these two schemes. Finally, we suggest that using high-dimensional and complex benchmark functions, such as the composition functions in [19], is a prerequisite for identifying the potential problems of applying PSO to real-world applications, because certain properties of the standard benchmark functions mask these problems. © 2011 Elsevier Inc. All rights reserved
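The two bound-handling schemes under scrutiny have simple generic forms: the absorbing scheme clamps a violating coordinate onto the nearest bound, while the random scheme resamples it uniformly inside the bounds. The sketch below shows these textbook versions, which may differ in detail from the exact variants analyzed in the paper:

```python
import numpy as np

def absorb(x, lo, hi):
    """Absorbing scheme: clamp violating coordinates onto the boundary."""
    return np.clip(x, lo, hi)

def randomize(x, lo, hi, rng):
    """Random scheme: resample violating coordinates uniformly in-bounds."""
    out = x.copy()
    bad = (x < lo) | (x > hi)
    out[bad] = rng.uniform(lo, hi, size=bad.sum())
    return out

rng = np.random.default_rng(0)
x = np.array([-1.5, 0.2, 3.7, 0.9])   # two coordinates violate [0, 1]
xa = absorb(x, 0.0, 1.0)              # violators pinned to the bounds
xr = randomize(x, 0.0, 1.0, rng)      # violators resampled in-bounds
```

In high dimensions almost every particle violates some bound early on, so absorption pins particles onto boundary hyperplanes while resampling destroys the velocity-driven search, which is the failure mechanism contrast the study examines.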
Model performance of downscaling 1999-2004 hydrometeorological fields to the upper Rio Grande basin using different forcing datasets
This study downscaled more than five years (1999-2004) of hydrometeorological fields over the upper Rio Grande basin (URGB) to a 4-km resolution using a regional model [the fifth-generation Pennsylvania State University-National Center for Atmospheric Research (NCAR) Mesoscale Model (MM5), version 3] and two forcing datasets: National Centers for Environmental Prediction (NCEP)-NCAR reanalysis-1 (R1) and North American Regional Reanalysis (NARR) data. The long-term high-resolution simulations show detailed patterns of hydroclimatological fields that are closely tied to the characteristics of the regional terrain; the most important of these patterns are precipitation localization features caused by the complex topography. In comparison with station observations, the downscaling, whichever forcing field was used, generated more accurate surface temperature and humidity fields than the Eta Model and NARR data, although it still contained marked errors, such as a negative (positive) bias in the daily maximum (minimum) temperature and overestimated precipitation, especially in the cold season. Comparing the downscaling results forced by NARR and R1 with both gridded and station observations shows that, under NARR forcing, MM5 produced generally better results for precipitation, temperature, and humidity than under R1 forcing; these improvements were most apparent in winter and spring. During the warm season, although the use of NARR improved the precipitation estimates statistically at the regional (basin) scale, it substantially underestimated them over the southern URGB, partly because the NARR forcing data exhibited warm and dry biases in the monsoon-active region during the simulation period and partly because of improper domain selection. Analyses also indicate that over mountainous regions, both the Climate Prediction Center's (CPC's) gridded (0.25°) forcing and the NARR forcing underestimate precipitation in comparison with station gauge data. © 2008 American Meteorological Society
Impacts of model calibration on high-latitude land-surface processes: PILPS 2(e) calibration/validation experiments
In the PILPS 2(e) experiment, the Snow Atmosphere Soil Transfer (SAST) land-surface scheme, developed from the Biosphere-Atmosphere Transfer Scheme (BATS), had difficulty accurately simulating the patterns and quantities of runoff resulting from heavy snowmelt in the high-latitude Torne-Kalix River basin (shared by Sweden and Finland). This difficulty exposes a model deficiency in runoff formulation. After subsurface runoff was represented and the parameters were calibrated, the accuracy of the hydrograph prediction improved substantially. However, even with accurate precipitation and runoff, the predicted soil moisture and its variation remained highly "model-dependent". Knowledge gained from the experiment is discussed. © 2003 Elsevier Science B.V. All rights reserved