An adaptive sampling and weighted ensemble of surrogate models for high dimensional global optimization problems

Abstract

Modern engineering design optimization relies heavily on high-fidelity computer simulations. Even though the computing power of computers has increased drastically, design optimization based on high-fidelity simulations is still time-consuming and often impractical. Surrogate modeling is a technique for replacing such high-fidelity simulations. This paper presents a novel approach, named weighted ensemble of surrogates (WESO), for computationally intensive optimization problems. The focus is on multi-modal functions, with the aim of identifying their global optima with relatively few function evaluations. The WESO search mechanism consists of two steps, explore and fit. The "explore" step explores the whole design region by generating sample points (agents) with the Latin hypercube sampling (LHS) technique to gain prior knowledge about the function of interest (learning phase). The "fit" step trains and fits a weighted ensemble of surrogate models over the promising region (training phase) to mimic the computationally intensive true function and replace it with a surrogate (cheap) function. The surrogates are then used to select candidate decision-variable points at which the true objective function and constraint functions are evaluated. Weights are then determined and assigned, and an ensemble of surrogates is constructed from the candidate sample points, over which optimization can be carried out. WESO has been evaluated on classical benchmark functions embedded in larger-dimensional spaces. WESO was also tested on the aerodynamic shape optimization of turbomachinery airfoils to demonstrate its ability to handle computationally intensive optimization problems. The results show to what extent combinations of models can outperform single surrogate models and provide insight into the scalability and robustness of the approach. WESO successfully identifies near-global solutions faster than other classical global optimization algorithms.
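To make the explore/fit mechanism described above concrete, the following is a minimal sketch, not the authors' released code, of the core idea: LHS for the "explore" step, and an ensemble of two cheap surrogates (an RBF interpolant and a quadratic response surface) whose weights are set by cross-validation error for the "fit" step. The function names, the choice of surrogates, and the inverse-RMSE weighting rule are illustrative assumptions; the adaptive infill of new candidate points is omitted.

```python
# Illustrative sketch of the explore/fit core (assumed details, not WESO itself).
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator


def expensive_objective(x):
    # Placeholder for the high-fidelity simulation; here a cheap multi-modal
    # Rastrigin-like test function is used purely for demonstration.
    return np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x)) + 10.0 * x.size


def explore(bounds, n_samples, seed=0):
    """Explore step: space-filling LHS samples over the whole design region."""
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    sampler = qmc.LatinHypercube(d=len(bounds), seed=seed)
    X = qmc.scale(sampler.random(n_samples), lo, hi)
    y = np.array([expensive_objective(x) for x in X])   # learning phase
    return X, y


def quadratic_surrogate(X, y):
    """Second illustrative surrogate: quadratic response surface via least squares."""
    def features(X):
        return np.hstack([np.ones((X.shape[0], 1)), X, X**2])
    coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
    return lambda Xq: features(np.atleast_2d(Xq)) @ coef


def fit_weighted_ensemble(X, y, k=5):
    """Fit step: weight each surrogate by its inverse k-fold cross-validation RMSE."""
    builders = [
        lambda Xt, yt: RBFInterpolator(Xt, yt, kernel="thin_plate_spline"),
        lambda Xt, yt: quadratic_surrogate(Xt, yt),
    ]
    folds = np.array_split(np.random.permutation(len(X)), k)
    errors = []
    for build in builders:
        sq_err = []
        for fold in folds:
            mask = np.ones(len(X), dtype=bool)
            mask[fold] = False
            model = build(X[mask], y[mask])
            sq_err.append(np.mean((model(X[fold]) - y[fold]) ** 2))
        errors.append(np.sqrt(np.mean(sq_err)))
    weights = 1.0 / (np.array(errors) + 1e-12)
    weights /= weights.sum()                              # normalize weights
    models = [build(X, y) for build in builders]          # refit on all data
    return lambda Xq: sum(w * m(np.atleast_2d(Xq)) for w, m in zip(weights, models))


if __name__ == "__main__":
    bounds = [(-5.12, 5.12)] * 10                 # 10-dimensional test problem
    X, y = explore(bounds, n_samples=100)         # learning phase
    ensemble = fit_weighted_ensemble(X, y)        # training phase
    # The cheap ensemble can now stand in for the true function inside any optimizer.
    print("ensemble prediction at best sample:", ensemble(X[np.argmin(y)]))
```

In a full surrogate-assisted loop of the kind the abstract describes, the ensemble would also propose new candidate points, the true function would be evaluated there, and the weights and surrogates would be refit as the sample set grows.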
