Shuffled Complex Evolution Model Calibrating Algorithm: Enhancing its Robustness and Efficiency
Shuffled Complex Evolution—University of Arizona (SCE-UA) has been used extensively and has proved to be a robust and
efficient global optimization method for the calibration of conceptual models. In this paper, two enhancements to the SCE-UA
algorithm are proposed, one to improve its exploration and another to improve its exploitation of the search space.
A strategically located initial population is used to improve the exploration capability and a modification to the downhill
simplex search method enhances its exploitation capability. This enhanced version of SCE-UA is tested, first on a suite of test
functions and then on a conceptual rainfall-runoff model using synthetically generated runoff values. It is observed that the
strategically located initial population drastically reduces the number of failures and the modified simplex search also leads to
a significant reduction in the number of function evaluations to reach the global optimum, when compared with the original
SCE-UA. Thus, the two enhancements significantly improve the robustness and efficiency of the SCE-UA model calibrating
algorithm.
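The downhill simplex search modified by the second enhancement can be sketched as follows. This is an illustrative reflect/expand/contract step of the standard Nelder-Mead method; the coefficients and the test function are assumptions, not the paper's modified variant.

```python
# Illustrative reflect/expand/contract step of the downhill simplex
# (Nelder-Mead) search; coefficients and test function are assumptions,
# not the paper's modified variant.

def simplex_step(points, f, alpha=1.0, gamma=2.0, beta=0.5):
    """Replace the worst vertex by reflection, expansion, or contraction."""
    pts = sorted(points, key=f)              # best vertex first, worst last
    best, worst = pts[0], pts[-1]
    n = len(best)
    # centroid of every vertex except the worst
    cen = [sum(p[i] for p in pts[:-1]) / (len(pts) - 1) for i in range(n)]
    refl = [cen[i] + alpha * (cen[i] - worst[i]) for i in range(n)]
    if f(refl) < f(best):                    # very good: try expanding further
        exp = [cen[i] + gamma * (refl[i] - cen[i]) for i in range(n)]
        pts[-1] = exp if f(exp) < f(refl) else refl
    elif f(refl) < f(worst):                 # merely better: accept reflection
        pts[-1] = refl
    else:                                    # worse: contract toward centroid
        pts[-1] = [cen[i] + beta * (worst[i] - cen[i]) for i in range(n)]
    return pts

sphere = lambda p: sum(x * x for x in p)     # simple convex test function
simplex = [[2.0, 0.0], [0.0, 2.0], [2.0, 2.0]]
for _ in range(50):
    simplex = simplex_step(simplex, sphere)
print(sphere(min(simplex, key=sphere)))      # converges toward the optimum 0
```

SCE-UA interleaves local steps of this kind with the shuffling of complexes; it is this local step that the paper's modification targets to reduce the number of function evaluations.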
Simulation models of technological innovation: A Review
The use of simulation modelling techniques in studies of technological innovation dates back to Nelson and Winter's 1982 book "An Evolutionary Theory of Economic Change" and is an area which has been steadily expanding ever since. Four main issues are identified in reviewing the key contributions to this burgeoning literature. Firstly, a key driver in the construction of computer simulations has been the desire to develop more complicated theoretical models capable of dealing with the complex phenomena characteristic of technological innovation. Secondly, no single model captures all of the dimensions and stylised facts of innovative learning. Indeed, this paper argues that one can usefully distinguish between the various contributions according to the particular dimensions of the learning process which they explore. To this end the paper develops a taxonomy which distinguishes between these dimensions and also clarifies the quite different perspectives underpinning the contributions made by mainstream economists and non-mainstream, neo-Schumpeterian economists. This brings us to a third point highlighted in the paper: the character of the simulation models that are developed is heavily influenced by the generic research questions of these different schools of thought. Finally, attention is drawn to an important distinction between the process of learning and adaptation within a static environment, and dynamic environments in which the introduction of new artefacts and patterns of behaviour changes the selective pressure faced by agents. We show that modellers choosing to explore one or other of these settings reveal quite different conceptual understandings of "technological innovation".
Shape optimization in aeronautical applications using neural networks
An optimization methodology based on neural networks was developed for use in 2D optimal shape design problems. Neural networks were used as a parameterization scheme to represent the shape function, and an edge-based high-resolution scheme for the solution of the compressible Euler equations was used to model the flow around the shape. The global system incorporates neural networks and the Euler fluid solver into the C++ Flood optimization framework containing a library of optimization algorithms. The optimization scheme was applied to a minimal drag problem in an unconstrained optimization case and a constrained case in hypersonic flow using evolutionary training algorithms. The results indicate that the minimum drag problem is solved to a high degree of accuracy but at high computational cost. For more complex shapes, parallel computing methods are required to reduce computational time.
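The parameterization idea can be illustrated with a hypothetical sketch (not the paper's C++ Flood implementation): a small network maps chord position to a surface ordinate, and its weights serve as the design variables an optimizer would tune.

```python
# Hypothetical sketch of a neural-network shape parameterization (not the
# paper's C++ Flood setup): a 1-3-1 tanh network maps chord position
# x in [0, 1] to a surface ordinate y(x); the weights are the design variables.
import math

def shape(x, weights):
    """y(x) from a 1-3-1 tanh network; weights packs [w1(3), b1(3), w2(3), b2]."""
    w1, b1, w2 = weights[0:3], weights[3:6], weights[6:9]
    b2 = weights[9]
    hidden = [math.tanh(w1[i] * x + b1[i]) for i in range(3)]
    y = sum(w2[i] * hidden[i] for i in range(3)) + b2
    return y * x * (1.0 - x)   # pin y = 0 at the leading and trailing edges

params = [0.5, -1.0, 2.0, 0.1, 0.0, -0.2, 1.0, 0.8, -0.5, 0.0]
profile = [shape(i / 10, params) for i in range(11)]
print(profile[0], profile[10])  # endpoints are pinned to zero
```

A flow solver would then evaluate the drag of each candidate profile, and an evolutionary algorithm would adjust `params`.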
Contribution to the definition of non deterministic robust optimization in aeronautics accounting with variable uncertainties
Shape optimization is a widely studied problem in aeronautics. It can be applied to many disciplines in this field, namely efficiency improvement of engine blades, noise reduction of engine nozzles, or reduction of the fuel consumption of aircraft. Optimization for general purposes is also of increasing interest in many other fields. Traditionally, optimization procedures were based on deterministic methodologies, as in Hamalainen et al. (2000), where the optimum working point was fixed. However, not considering what happens in the vicinity of the defined working conditions can produce problems such as loss of efficiency and performance. That is, in many cases, if the real working point differs from the original one, even by a small distance, efficiency is reduced considerably, as pointed out in Huyse and Lewis (2001). Non-deterministic methodologies have been applied to many fields (Papadrakakis, Lagaros and Tsompanakis, 1998; Plevris, Lagaros and Papadrakakis, 2005). One of the most widespread non-deterministic methodologies is stochastic analysis. The time-consuming calculations required in Computational Fluid Dynamics (CFD) have prevented an extensive application of stochastic analysis to shape optimization. Stochastic analysis was first developed in structural mechanics several years ago. Uncertainty quantification and variability studies can help to deal with intrinsic errors of the processes or methods. The result to consider for design optimization is no longer a point, but a range of values that defines the area where, on average, optimal output values are obtained. The optimal value could be worse than other optima but, considering its vicinity, it is clearly the most robust with regard to input variability. Uncertainty quantification has been a topic of increasing interest in recent years. It provides several techniques to evaluate uncertain input parameters and their effects on the outcomes.
This research presents a methodology to integrate evolutionary algorithms and stochastic analysis, in order to deal with uncertainty and to obtain robust solutions.
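One simple way to realize such a robust objective is a Monte Carlo sketch under assumed Gaussian input scatter (an illustration, not the thesis' actual procedure): each design is scored by its mean objective over sampled perturbations, so a sharp optimum is penalized relative to a flatter, slightly worse one.

```python
# Illustrative robust scoring under input uncertainty (assumed Gaussian
# scatter; not the thesis' actual method): average the objective over
# sampled perturbations of the design variables.
import random

def robust_score(design, objective, sigma=0.1, n_samples=200, seed=0):
    """Mean objective over Gaussian perturbations of the design variables."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        perturbed = [x + rng.gauss(0.0, sigma) for x in design]
        total += objective(perturbed)
    return total / n_samples

# A sharp global optimum at x = 0 (value -0.5) versus a flatter,
# deterministically worse local optimum at x = 3 (value 0).
f = lambda p: min(200.0 * p[0] ** 2 - 0.5, (p[0] - 3.0) ** 2)

print(robust_score([0.0], f))  # sharp optimum degrades under input scatter
print(robust_score([3.0], f))  # flat optimum stays close to its nominal value
```

Under the deterministic objective the sharp optimum wins, but the robust score ranks the flat optimum higher, which is exactly the reversal the robust formulation is meant to capture.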
Adaptive evolution in static and dynamic environments
This thesis provides a framework for describing a canonical evolutionary system. Populations of individuals are envisaged as traversing a search space structured by genetic and developmental operators under the influence of selection. Selection acts on individuals' phenotypic expressions, guiding the population over an evaluation landscape, which describes an idealised evaluation surface over the phenotypic space. The corresponding valuation landscape describes evaluations over the genotypic space and may be transformed by within generation adaptive (learning) or maladaptive (fault induction) local search.
Populations subjected to particular genetic and selection operators are claimed to evolve towards a region of the valuation landscape with a characteristic local ruggedness, as given by the runtime operator correlation coefficient. This corresponds to the view of evolution discovering an evolutionarily stable population, or quasi-species, held in a state of dynamic equilibrium by the operator set and evaluation function. This is demonstrated by genetic algorithm experiments using the NK landscapes and a novel, evolvable evaluation function, The Tower of Babel. In fluctuating environments of varying temporal ruggedness, different operator sets are correspondingly more or less adapted.
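The NK landscapes mentioned above can be sketched as a minimal illustrative implementation of Kauffman's model (parameter values here are arbitrary): each of N loci contributes a random value that depends on itself and K other loci, so larger K yields a more rugged landscape.

```python
# Minimal illustrative implementation of an NK fitness landscape
# (Kauffman's model); n, k, and the seeds are arbitrary choices.
import random

def make_nk(n, k, seed=0):
    rng = random.Random(seed)
    # each locus depends on itself plus k randomly chosen other loci
    neighbours = [rng.sample([j for j in range(n) if j != i], k)
                  for i in range(n)]
    tables = [{} for _ in range(n)]

    def fitness(bits):
        total = 0.0
        for i in range(n):
            key = (bits[i],) + tuple(bits[j] for j in neighbours[i])
            if key not in tables[i]:
                tables[i][key] = rng.random()  # lazily drawn contribution
            total += tables[i][key]
        return total / n                       # mean per-locus contribution

    return fitness

f = make_nk(n=10, k=2)          # larger k gives a more rugged landscape
g = random.Random(42)
genome = [g.randrange(2) for _ in range(10)]
score = f(genome)
print(0.0 <= score <= 1.0)      # prints True
```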
Quantitative genetics analyses of populations in sinusoidally fluctuating conditions are shown to describe certain well-known electronic filters. This observation suggests the notion of Evolutionary Signal Processing. Genetic algorithm experiments in which a population tracks a sinusoidally fluctuating optimum support this view. Using a self-adaptive mutation rate, it is possible to tune the evolutionary filter to the environmental frequency. For a time-varying frequency, the mutation rate reacts accordingly. With local search, the valuation landscape is transformed through temporal smoothing. By coevolving modifier genes for individual learning and the rate at which the benefits may be directly transmitted to the next generation, the relative adaptedness of individual learning and cultural inheritance according to the rate of environmental change is demonstrated.
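The tracking experiment can be illustrated with a toy self-adaptive scheme; the (mu, lambda)-style selection and all parameter values below are assumptions for illustration, not the thesis' experimental setup.

```python
# Toy sketch of a population tracking a sinusoidally fluctuating optimum
# with self-adaptive mutation step sizes; the selection scheme and all
# parameters are assumptions, not the thesis' experiments.
import math
import random

rng = random.Random(1)
# individual = (value, step size); error = distance to the moving target
pop = [(rng.uniform(-1.0, 1.0), 0.5) for _ in range(20)]
errors = []
for t in range(300):
    target = math.sin(0.05 * t)          # slowly fluctuating environment
    offspring = []
    for _ in range(100):
        x, s = rng.choice(pop)
        # each individual's step size mutates alongside its value
        s = max(s * math.exp(0.3 * rng.gauss(0.0, 1.0)), 1e-4)
        offspring.append((x + s * rng.gauss(0.0, 1.0), s))
    # keep the 20 closest trackers as the next generation
    pop = sorted(offspring, key=lambda ind: (ind[0] - target) ** 2)[:20]
    errors.append(abs(pop[0][0] - target))

print(sum(errors[100:]) / len(errors[100:]))   # small mean tracking error
```

Because step sizes hitchhike with the values they produce, the population settles on mutation rates matched to how fast the environment moves, which is the tuning effect the abstract describes.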