
    Automatic surrogate model type selection during the optimization of expensive black-box problems

    The use of Surrogate Based Optimization (SBO) has become commonplace for optimizing expensive black-box simulation codes. A popular SBO method is the Efficient Global Optimization (EGO) approach. However, the performance of SBO methods critically depends on the quality of the guiding surrogate. In EGO the surrogate type is usually fixed to Kriging, even though this may not be optimal for all problems. In this paper the authors propose to extend the well-known EGO method with an automatic surrogate model type selection framework that is able to dynamically select the best model type (including hybrid ensembles) depending on the data available so far. Hence, the expected improvement criterion is always based on the best approximation available at each step of the optimization process. The approach is demonstrated on a structural optimization problem, i.e., reducing the stress on a truss-like structure. Results show that the proposed algorithm consistently finds better optima than traditional Kriging-based infill optimization.
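
    A minimal sketch of the idea described above, not the authors' implementation: an EGO-style infill loop that re-selects the surrogate type each iteration by cross-validation before computing expected improvement. The candidate model zoo, the toy objective, and the search grid are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern
from sklearn.linear_model import BayesianRidge
from sklearn.model_selection import cross_val_score

def objective(x):                       # placeholder for the expensive simulator
    return np.sin(3 * x) + 0.1 * x**2

def expected_improvement(mu, sigma, f_best):
    # EI for minimization: (f_best - mu) * Phi(z) + sigma * phi(z)
    sigma = np.maximum(sigma, 1e-12)
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

candidates = [                          # surrogate types competing each iteration
    GaussianProcessRegressor(kernel=RBF(), normalize_y=True),
    GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True),
    BayesianRidge(),                    # a cheap Bayesian-linear alternative
]

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(6, 1))     # small initial design
y = objective(X).ravel()
grid = np.linspace(-2, 2, 400).reshape(-1, 1)

for _ in range(15):                     # infill iterations
    # pick the surrogate type that generalizes best on the data so far
    scores = [cross_val_score(m, X, y, cv=3,
                              scoring="neg_mean_squared_error").mean()
              for m in candidates]
    model = candidates[int(np.argmax(scores))]
    model.fit(X, y)
    mu, sigma = model.predict(grid, return_std=True)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best value found:", y.min())
```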

    Evolutionary model type selection for global surrogate modeling

    Due to the scale and computational complexity of currently used simulation codes, global surrogate models (metamodels) have become indispensable tools for exploring and understanding the design space. Due to their compact formulation they are cheap to evaluate and thus readily facilitate visualization, design space exploration, rapid prototyping, and sensitivity analysis. They can also be used as accurate building blocks in design packages or larger simulation environments. Consequently, there is great interest in techniques that facilitate the construction of such approximation models while minimizing the computational cost and maximizing model accuracy. Many surrogate model types exist (Support Vector Machines, Kriging, Neural Networks, etc.), but no type is optimal in all circumstances, nor is there any hard theory available that can help make this choice. In this paper we present an automatic approach to the model type selection problem. We describe an adaptive global surrogate modeling environment with adaptive sampling, driven by speciated evolution. Different model types are evolved cooperatively using a Genetic Algorithm (heterogeneous evolution) and compete to approximate the iteratively selected data. In this way the optimal model type and complexity for a given data set or simulation code can be determined dynamically. The utility and performance of the approach are demonstrated on a number of problems where it outperforms traditional sequential execution of each model type.
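
    A toy sketch of the heterogeneous-evolution idea, not the paper's speciated GA: individuals of different surrogate types compete on the same data, so the best-generalizing type emerges from selection. The model zoo, fitness measure, and truncation-plus-immigration scheme are illustrative assumptions.

```python
import random
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(80, 2))
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])      # stand-in for simulator data

def random_individual():
    # an individual = (model type, hyper-parameters)
    kind = random.choice(["svr", "mlp", "gp"])
    if kind == "svr":
        return ("svr", {"C": 10 ** random.uniform(-1, 2)})
    if kind == "mlp":
        return ("mlp", {"hidden_layer_sizes": (random.choice([5, 10, 20]),)})
    return ("gp", {})

def build(ind):
    kind, params = ind
    if kind == "svr":
        return SVR(**params)
    if kind == "mlp":
        return MLPRegressor(max_iter=2000, **params)
    return GaussianProcessRegressor(**params)

def fitness(ind):
    # generalization error is what the model types compete on
    return cross_val_score(build(ind), X, y, cv=3,
                           scoring="neg_mean_squared_error").mean()

population = [random_individual() for _ in range(12)]
for generation in range(5):
    survivors = sorted(population, key=fitness, reverse=True)[:6]
    population = survivors + [random_individual() for _ in range(6)]

best = max(population, key=fitness)
print("winning model type:", best[0])
```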

    Temporal Feature Selection with Symbolic Regression

    Building and discovering useful features when constructing machine learning models is the central task for the machine learning practitioner. Good features are useful not only in increasing the predictive power of a model but also in illuminating the underlying drivers of a target variable. In this research we propose a novel feature learning technique in which symbolic regression is endowed with a "Range Terminal" that allows it to explore functions of the aggregate of variables over time. We test the Range Terminal on a synthetic data set and a real-world data set in which we predict seasonal greenness using satellite-derived temperature and snow data over a portion of the Arctic. On the synthetic data set we find symbolic regression with the Range Terminal outperforms standard symbolic regression and Lasso regression. On the Arctic data set we find it outperforms standard symbolic regression, fails to beat Lasso regression, but finds useful features describing the interaction between Land Surface Temperature, snow, and seasonal vegetative growth in the Arctic.
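
    A hedged sketch of one plausible reading of a "Range Terminal": a terminal node that aggregates a time-indexed variable over a window, so evolved expressions can use features such as the mean temperature between two days of the year. The aggregation operators, window encoding, and synthetic series below are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def range_terminal(series, start, end, op="mean"):
    """Aggregate a daily series over the half-open window [start, end)."""
    window = np.asarray(series)[start:end]
    ops = {"mean": np.mean, "max": np.max, "min": np.min, "sum": np.sum}
    return ops[op](window)

# Example: a synthetic land-surface-temperature trace for one year
rng = np.random.default_rng(2)
lst = 10 * np.sin(np.linspace(0, np.pi, 365)) + rng.normal(0, 1, 365)

# Two candidate features an evolved expression tree might contain
spring_warmth = range_terminal(lst, 60, 120, op="mean")
peak_summer = range_terminal(lst, 150, 240, op="max")
print(spring_warmth, peak_summer)
```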

    Radial Basis Function Neural Networks: A Review

    Radial Basis Function neural networks (RBFNNs) represent an attractive alternative to other neural network models. One reason is that they form a unifying link between function approximation, regularization, noisy interpolation, classification, and density estimation. It is also the case that training RBF neural networks is faster than training multi-layer perceptron networks. RBFNN learning is usually split into an unsupervised part, where the centers and widths of the Gaussian basis functions are set, and a linear supervised part for weight computation. This paper reviews various learning methods for determining the centers, widths, and synaptic weights of RBFNNs. In addition, we point to some applications of RBFNNs in various fields. Finally, we name software that can be used for implementing RBFNNs.
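
    A minimal sketch of the two-stage training scheme the review describes: unsupervised placement of Gaussian centers and widths (here k-means plus a spacing-based width heuristic), followed by a linear least-squares solve for the output weights. The particular heuristics and the toy data are illustrative choices, not prescriptions from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_rbfnn(X, y, n_centers=10):
    # Stage 1 (unsupervised): centers via k-means, a shared width from center spacing
    km = KMeans(n_clusters=n_centers, n_init=10, random_state=0).fit(X)
    centers = km.cluster_centers_
    dists = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    width = dists[dists > 0].mean() / np.sqrt(2 * n_centers)

    # Stage 2 (supervised, linear): output weights by least squares on the design matrix
    Phi = np.exp(-np.linalg.norm(X[:, None, :] - centers[None, :, :],
                                 axis=-1) ** 2 / (2 * width ** 2))
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, width, weights

def predict_rbfnn(X, centers, width, weights):
    Phi = np.exp(-np.linalg.norm(X[:, None, :] - centers[None, :, :],
                                 axis=-1) ** 2 / (2 * width ** 2))
    return Phi @ weights

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X).ravel()
params = fit_rbfnn(X, y, n_centers=12)
print(predict_rbfnn(X, *params)[:5])
```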

    Meta-heuristic algorithms in car engine design: a literature survey

    Meta-heuristic algorithms are often inspired by natural phenomena, including the evolution of species in Darwinian natural selection theory, ant behaviors in biology, flocking behaviors of some birds, and annealing in metallurgy. Due to their great potential in solving difficult optimization problems, meta-heuristic algorithms have found their way into automobile engine design. Different optimization problems arise in different areas of car engine management, including calibration, control systems, fault diagnosis, and modeling. In this paper we review state-of-the-art applications of different meta-heuristic algorithms in engine management systems. The review covers a wide range of research, including the application of meta-heuristic algorithms to engine calibration, the optimization of engine control systems, engine fault diagnosis, and the optimization and modeling of different engine components. The meta-heuristic algorithms reviewed in this paper include evolutionary algorithms, evolution strategies, evolutionary programming, genetic programming, differential evolution, estimation of distribution algorithms, ant colony optimization, particle swarm optimization, memetic algorithms, and artificial immune systems.

    Curses, Tradeoffs, and Scalable Management: Advancing Evolutionary Multiobjective Direct Policy Search to Improve Water Reservoir Operations

    Optimal management policies for water reservoir operation are generally designed via stochastic dynamic programming (SDP). Yet the adoption of SDP in complex real-world problems is challenged by the three curses of dimensionality, modeling, and multiple objectives, which considerably limit its practical application. Alternatively, this study focuses on the use of evolutionary multiobjective direct policy search (EMODPS), a simulation-based optimization approach that combines direct policy search, nonlinear approximating networks, and multiobjective evolutionary algorithms to design Pareto-approximate closed-loop operating policies for multipurpose water reservoirs. This analysis explores the technical and practical implications of using EMODPS through a careful diagnostic assessment of the effectiveness and reliability of the overall EMODPS solution design as well as of the resulting Pareto-approximate operating policies. The EMODPS approach is evaluated on the multipurpose Hoa Binh water reservoir in Vietnam, where water operators seek to balance the conflicting objectives of maximizing hydropower production and minimizing flood risks. A key choice in the EMODPS approach is the selection of alternative formulations for flexibly representing reservoir operating policies. This study compares the relative performance of two widely used nonlinear approximating networks, namely artificial neural networks (ANNs) and radial basis functions (RBFs). The results show that RBF solutions are more effective than ANN solutions in designing Pareto-approximate policies for the Hoa Binh reservoir. Given the approximate nature of EMODPS, the diagnostic benchmarking uses SDP to evaluate the overall quality of the attained Pareto-approximate results. Although the Hoa Binh test case's relative simplicity should maximize the potential value of SDP, the results demonstrate that EMODPS successfully dominates the solutions derived via SDP.
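
    A hedged sketch of the direct-policy-search ingredient behind EMODPS: an operating policy parameterized as a sum of Gaussian RBFs maps the observed state (storage, day of year) to a release decision, and a multiobjective evolutionary algorithm would search the parameter vector against simulated objectives. The toy reservoir dynamics and the two objective proxies below are illustrative assumptions, not the Hoa Binh case study.

```python
import numpy as np

N_RBF = 4           # number of radial basis functions in the policy
STATE_DIM = 2       # (normalized storage, normalized day of year)

def rbf_policy(theta, state):
    """Map a state in [0,1]^2 to a release fraction in [0,1]."""
    theta = np.asarray(theta).reshape(N_RBF, 2 * STATE_DIM + 1)
    centers, radii, weights = theta[:, :STATE_DIM], theta[:, STATE_DIM:-1], theta[:, -1]
    phi = np.exp(-np.sum(((state - centers) / (np.abs(radii) + 1e-6)) ** 2, axis=1))
    return float(np.clip(weights @ phi, 0.0, 1.0))

def simulate(theta, inflows, capacity=1.0):
    """Return (negative hydropower proxy, flood proxy) for one inflow trace."""
    storage, energy, flood = 0.5 * capacity, 0.0, 0.0
    for day, inflow in enumerate(inflows):
        state = np.array([storage / capacity, day / len(inflows)])
        release = rbf_policy(theta, state) * storage
        storage = storage + inflow - release
        energy += release * storage            # crude head-times-flow proxy
        if storage > capacity:                 # spill counts toward flood risk
            flood += storage - capacity
            storage = capacity
    return -energy, flood                      # both objectives are minimized

rng = np.random.default_rng(4)
inflows = 0.05 + 0.04 * rng.random(365)
theta = rng.normal(size=N_RBF * (2 * STATE_DIM + 1))   # one candidate policy
print(simulate(theta, inflows))                # objectives an MOEA would trade off
```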

    A Prediction Modeling Framework For Noisy Welding Quality Data

    Numerous and varied research projects have been conducted to utilize historical manufacturing process data in product design. These manufacturing process data often contain inconsistencies, which makes it challenging to extract useful information from them. In resistance spot welding (RSW), data inconsistency is a well-known issue. In general, such inconsistent data are treated as noise and removed from the original dataset before conducting analyses or constructing prediction models. This may not be desirable for every design and manufacturing application, since all data can contain important information that further explains the process. In this research, we propose a prediction modeling framework that employs bootstrap aggregating (bagging) with support vector regression (SVR) as the base learning algorithm to improve prediction accuracy on such noisy data. Optimal hyper-parameters for SVR are selected by particle swarm optimization (PSO) with meta-modeling. Constructing bagging models requires substantially more computational cost than constructing a single model. Also, evolutionary computation algorithms, such as PSO, generally require a large number of candidate solution evaluations to achieve quality solutions. These two requirements greatly increase the overall computational cost of constructing effective bagging SVR models. Meta-modeling can be employed to reduce the computational cost when the fitness or constraint functions involve computationally expensive tasks or analyses. In our case, the objective function is associated with constructing bagging SVR models for candidate sets of hyper-parameters. Therefore, within PSO, a large number of bagging SVR models have to be constructed and evaluated, which is computationally expensive. The meta-modeling approach developed in this research, called MUGPSO, assists PSO in evaluating these candidate solutions (i.e., sets of hyper-parameters) by approximating the fitness function. Through this method, the number of real fitness function evaluations (i.e., bagging SVR model constructions) is reduced, which also reduces the overall computational cost. Using the Meta2 framework, one can expect an improvement in prediction accuracy with reduced computational time. Experiments are conducted on three artificially generated noisy datasets and a real RSW quality dataset. The results indicate that Meta2 is capable of providing promising solutions with noticeably reduced computational costs.
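
    A hedged sketch of the framework's core ingredients, not the MUGPSO algorithm itself: bagging with SVR base learners whose hyper-parameters would be searched by a swarm, with a cheap meta-model pre-screening candidate hyper-parameter points so that only the most promising ones pay for a full bagging-SVR construction. The screening rule, parameter ranges, and synthetic noisy data are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(150, 4))
y = X[:, 0] ** 2 + X[:, 1] + rng.normal(0, 0.5, 150)   # noisy quality-data stand-in

def true_fitness(log_C, log_gamma):
    """Expensive evaluation: build a bagging-SVR ensemble and cross-validate it."""
    model = BaggingRegressor(SVR(C=10 ** log_C, gamma=10 ** log_gamma),
                             n_estimators=10, random_state=0)
    return -cross_val_score(model, X, y, cv=3,
                            scoring="neg_mean_squared_error").mean()

# Evaluate a small initial set of hyper-parameter points exactly
evaluated = rng.uniform([-1, -3], [2, 0], size=(8, 2))
costs = np.array([true_fitness(c, g) for c, g in evaluated])

# Meta-model over hyper-parameter space, used to pre-screen new candidates
meta = GaussianProcessRegressor(normalize_y=True).fit(evaluated, costs)

candidates = rng.uniform([-1, -3], [2, 0], size=(30, 2))   # e.g., swarm positions
predicted = meta.predict(candidates)
promising = candidates[np.argsort(predicted)[:3]]          # screen down to the top 3

for c, g in promising:                                      # only these pay the full cost
    costs = np.append(costs, true_fitness(c, g))
    evaluated = np.vstack([evaluated, [c, g]])
print("best CV error found:", costs.min())
```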