
    The True Destination of EGO is Multi-local Optimization

    Efficient global optimization (EGO) is a popular algorithm for the optimization of expensive multimodal black-box functions. One important reason for its popularity is its theoretical foundation of global convergence. However, as the budgets in expensive optimization are very small, the asymptotic properties play only a minor role, and the algorithm sometimes comes off badly in experimental comparisons. Many alternative variants have therefore been proposed over the years. In this work, we show experimentally that the algorithm instead has its strength in a setting where multiple optima are to be identified.
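The EGO loop sketched below is a minimal illustration of the algorithm the abstract discusses: fit a Gaussian-process surrogate to the points evaluated so far, then pick the next point by maximizing expected improvement over random candidates. The kernel length-scale, candidate count, and toy objective are all assumptions for the sketch, not values from the paper.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(A, B, ls=0.3):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(Xtr, ytr, Xte, noise=1e-6):
    # Standard GP regression equations with a small jitter for stability.
    K = rbf_kernel(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf_kernel(Xtr, Xte)
    mu = Ks.T @ np.linalg.solve(K, ytr)
    var = 1.0 - np.einsum('ij,ij->j', Ks, np.linalg.solve(K, Ks))
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, y_best):
    # EI for minimization: improvement below the current best.
    z = (y_best - mu) / sd
    return (y_best - mu) * norm.cdf(z) + sd * norm.pdf(z)

f = lambda x: np.sin(10 * x) + x          # hypothetical multimodal objective
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (5, 1))             # small initial design
y = f(X[:, 0])
for _ in range(15):                       # tiny budget, as in expensive optimization
    cand = rng.uniform(0, 1, (256, 1))    # random candidates stand in for an inner optimizer
    mu, sd = gp_posterior(X, y, cand)
    x_new = cand[np.argmax(expected_improvement(mu, sd, y.min()))][None, :]
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new[0, 0]))
print(y.min())
```

In a production EGO implementation the inner EI maximization would use a proper optimizer and the GP hyperparameters would be re-estimated each iteration; the random-candidate version above only shows the structure of the loop.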

    Metamodeling sampling criteria in a global optimization framework

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/76903/1/AIAA-2000-4921-115.pd

    A new Taxonomy of Continuous Global Optimization Algorithms

    Surrogate-based optimization, nature-inspired metaheuristics, and hybrid combinations have become state of the art in algorithm design for solving real-world optimization problems. Still, it is difficult for practitioners to get an overview that explains their advantages relative to the large number of methods available in the field of optimization. Available taxonomies lack the embedding of current approaches in the larger context of this broad field. This article presents a taxonomy of the field, which explores and matches algorithm strategies by extracting similarities and differences in their search strategies. A particular focus lies on algorithms using surrogates, nature-inspired designs, and those created by design optimization. The extracted features of components or operators allow us to create a set of classification indicators to distinguish between a small number of classes. The features allow a deeper understanding of the components of the search strategies and further indicate the close connections between the different algorithm designs. We present intuitive analogies to explain the basic principles of the search algorithms, particularly useful for novices in this research field. Furthermore, this taxonomy allows recommendations for the applicability of the corresponding algorithms.
    Comment: 35 pages total, 28 written pages, 4 figures, 2019 reworked version.

    A portfolio approach to massively parallel Bayesian optimization

    One way to reduce the time of conducting optimization studies is to evaluate designs in parallel rather than one at a time. For expensive-to-evaluate black boxes, batch versions of Bayesian optimization have been proposed. They work by building a surrogate model of the black box that can be used to select the designs to evaluate efficiently via an infill criterion. Still, with higher levels of parallelization becoming available, the strategies that work for a few tens of parallel evaluations become limiting, in particular due to the complexity of selecting more evaluations. This is even more crucial when the black box is noisy, necessitating more evaluations as well as repeated experiments. Here we propose a scalable strategy that can keep up with massive batching natively, focused on the exploration/exploitation trade-off and a portfolio allocation. We compare the approach with related methods on deterministic and noisy functions, for mono- and multi-objective optimization tasks. These experiments show similar or better performance than existing methods, while being orders of magnitude faster.
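One simple way to realize a portfolio allocation for a batch, in the spirit of the exploration/exploitation trade-off described above (though not the paper's exact algorithm), is to split the batch across several confidence-bound weights: some slots go to exploitation-heavy acquisitions, others to exploration-heavy ones. The weight values and batch size below are illustrative assumptions.

```python
import numpy as np

def batch_by_portfolio(mu, sd, q, betas=(0.0, 0.5, 1.0, 2.0)):
    """Select q distinct candidate indices for a batch (minimization).

    The batch is split across a portfolio of lower-confidence-bound
    weights: beta=0 is pure exploitation of the predicted mean, larger
    beta increasingly rewards posterior uncertainty (exploration).
    """
    chosen = []
    slot_groups = np.array_split(np.arange(q), len(betas))
    for beta, slots in zip(betas, slot_groups):
        score = mu - beta * sd               # lower is better
        for i in np.argsort(score):          # fill this weight's slots
            if len(slots) == 0:
                break
            if int(i) not in chosen:
                chosen.append(int(i))
                slots = slots[1:]
    return chosen

# Hypothetical surrogate posterior over 100 candidate designs.
rng = np.random.default_rng(1)
mu = rng.normal(size=100)
sd = rng.uniform(0.1, 1.0, 100)
batch = batch_by_portfolio(mu, sd, q=8)
print(batch)
```

Because each weight only ranks and filters candidates, the cost grows linearly in the batch size, which is what makes portfolio-style allocation attractive for massive batches compared with jointly optimizing a q-point acquisition.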

    Improving the optimisation performance of an ensemble of radial basis functions

    In this paper we investigate surrogate-based optimisation performance using two different ensemble approaches, and a novel update strategy based on the local Pearson correlation coefficient. The first ensemble is based on a selective approach, where ns RBFs are constructed and the most accurate RBF is selected for prediction at each iteration, while the others are ignored. The second ensemble uses a combined approach, which takes advantage of ns different RBFs, in the hope of reducing errors in the prediction through a weighted combination of the RBFs used. The update strategy uses the local Pearson correlation coefficient as a constraint to ignore domain areas where there is disagreement between the surrogates. In total, the performance of six different approaches is investigated, using five analytical test functions with 2 to 50 dimensions, and one engineering problem related to the frequency response of a satellite boom with 2 to 40 dimensions.
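A minimal sketch of the two ingredients the abstract describes: an ensemble of RBF interpolants with different shape parameters, combined by averaging (a stand-in for the paper's weighted combination), plus a local Pearson-correlation filter that flags regions where the surrogates disagree. The test function, shape parameters, neighbourhood size, and threshold are all assumptions.

```python
import numpy as np

def rbf_predict(Xtr, ytr, Xte, eps):
    """Interpolating Gaussian RBF with shape parameter eps."""
    dist = lambda A, B: np.linalg.norm(A[:, None] - B[None, :], axis=-1)
    Phi = np.exp(-(eps * dist(Xtr, Xtr)) ** 2)
    w = np.linalg.solve(Phi + 1e-10 * np.eye(len(Xtr)), ytr)
    return np.exp(-(eps * dist(Xte, Xtr)) ** 2) @ w

def local_pearson_mask(Xte, preds, k=8, tau=0.9):
    """Keep test points where every surrogate pair agrees locally:
    Pearson correlation over the k nearest neighbours >= tau."""
    keep = np.ones(len(Xte), bool)
    D = np.linalg.norm(Xte[:, None] - Xte[None, :], axis=-1)
    for i in range(len(Xte)):
        nb = np.argsort(D[i])[:k]
        for a in range(len(preds)):
            for b in range(a + 1, len(preds)):
                r = np.corrcoef(preds[a][nb], preds[b][nb])[0, 1]
                if r < tau:
                    keep[i] = False
    return keep

rng = np.random.default_rng(2)
f = lambda x: np.sin(3 * x[:, 0]) + x[:, 0] ** 2   # toy test function
Xtr = rng.uniform(-2, 2, (20, 1)); ytr = f(Xtr)
Xte = rng.uniform(-2, 2, (50, 1))

# Ensemble of ns = 3 RBFs differing only in their shape parameter.
preds = np.array([rbf_predict(Xtr, ytr, Xte, eps) for eps in (0.5, 1.0, 2.0)])
combined = preds.mean(axis=0)          # uniform weights as a placeholder
mask = local_pearson_mask(Xte, preds)  # True where surrogates agree locally
print(mask.sum(), "of", len(mask), "points pass the agreement filter")
```

In the paper's update strategy the mask would constrain where new infill points may be placed; here it is only computed and reported.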

    Efficient design optimization of high-performance MEMS based on a surrogate-assisted self-adaptive differential evolution

    High-performance microelectromechanical systems (MEMS) are playing a critical role in modern engineering systems. Due to computationally expensive numerical analysis and stringent design specifications nowadays, both the optimization efficiency and the quality of design solutions become challenges for available MEMS shape optimization methods. In this paper, a new method, called self-adaptive surrogate model-assisted differential evolution for MEMS optimization (ASDEMO), is presented to address these challenges. The main innovation of ASDEMO is a hybrid differential evolution mutation strategy combination and its self-adaptive adoption mechanism, which are proposed for online surrogate model-assisted MEMS optimization. The performance of ASDEMO is demonstrated by a high-performance electro-thermo-elastic micro-actuator, a high-performance corrugated membrane micro-actuator, and a highly multimodal mathematical benchmark problem. Comparisons with state-of-the-art methods verify the advantages of ASDEMO in terms of efficiency and optimization ability.
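The core pattern behind surrogate-assisted differential evolution can be sketched in a few lines: generate DE trial vectors as usual, rank them with a cheap surrogate, and spend the expensive true evaluation only on the most promising trial. This is a generic prescreening sketch, not ASDEMO itself; the nearest-neighbour surrogate, DE parameters, and sphere objective are illustrative assumptions.

```python
import numpy as np

def de_generation(pop, F=0.6, CR=0.9, rng=None):
    """One DE/rand/1/bin generation of trial vectors."""
    rng = rng if rng is not None else np.random.default_rng()
    n, d = pop.shape
    trials = pop.copy()
    for i in range(n):
        a, b, c = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])
        cross = rng.random(d) < CR
        trials[i, cross] = mutant[cross]
    return trials

def nearest_neighbour_surrogate(X, y):
    """Cheapest possible surrogate: predict the fitness of the nearest sample."""
    def predict(q):
        return y[np.argmin(np.linalg.norm(X - q, axis=1))]
    return predict

rng = np.random.default_rng(3)
f = lambda x: np.sum(x ** 2)                  # toy "expensive" objective
pop = rng.uniform(-5, 5, (12, 4))
fit = np.array([f(x) for x in pop])
init_best = fit.min()
for _ in range(20):
    model = nearest_neighbour_surrogate(pop, fit)
    trials = de_generation(pop, rng=rng)
    scores = np.array([model(t) for t in trials])
    i = int(np.argmin(scores))                # prescreen: best predicted trial
    yi = f(trials[i])                         # single true evaluation per generation
    if yi < fit[i]:
        pop[i], fit[i] = trials[i], yi
print(fit.min())
```

ASDEMO additionally combines several mutation strategies and adapts their use online from surrogate feedback; the sketch fixes a single DE/rand/1/bin strategy to keep the prescreening idea visible.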