
    Optimization of solder joints in embedded mechatronic systems via Kriging-assisted CMA-ES algorithm

    In power electronics applications, embedded mechatronic systems (MSs) must withstand severe operating conditions and high levels of thermomechanical stress. Thermal fatigue of the solder joints remains the main mechanism leading to rupture and malfunction of the complete MS, and it is the failure mode to which the lifetime of an embedded MS is most often linked. Consequently, robust and inexpensive design optimization is needed to increase the number of life cycles of the solder joints. This paper proposes an application of a metamodel-assisted evolution strategy (MA-ES) that significantly reduces the computational cost of ES caused by the expensive finite element simulation serving as the objective function. The proposed method couples a Kriging metamodel with the covariance matrix adaptation evolution strategy (CMA-ES): the Kriging metamodel replaces the finite element simulation in order to avoid the cost of fitness function evaluations. The metamodel is used together with CMA-ES and sequentially updated, and its fidelity (quality) is measured by its ability to rank the population through an approximate ranking procedure (ARP). The application of this method to the optimization of an MS demonstrates its efficiency and its ability to avoid prohibitive computational cost.
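    The surrogate-assisted loop this abstract describes — rank candidates with a cheap Kriging metamodel and spend true (finite element) evaluations only on the best-ranked ones — can be sketched as follows. This is a minimal illustration, not the paper's method: a plain RBF interpolator stands in for Kriging, the sphere function stands in for the finite element simulation, and all parameters (population size, step-size decay) are made up.

```python
import numpy as np

def expensive_f(x):
    # Stand-in for the costly finite element simulation (here: the sphere function).
    return float(np.sum(x ** 2))

def kriging_fit(X, y, length=1.0, nugget=1e-6):
    # Bare-bones RBF interpolation as a stand-in for a Kriging metamodel.
    K = np.exp(-0.5 * np.sum((X[:, None] - X[None, :]) ** 2, axis=-1) / length ** 2)
    alpha = np.linalg.solve(K + nugget * np.eye(len(X)), y)
    return X.copy(), alpha, length

def kriging_predict(model, Xq):
    X, alpha, length = model
    Kq = np.exp(-0.5 * np.sum((Xq[:, None] - X[None, :]) ** 2, axis=-1) / length ** 2)
    return Kq @ alpha

rng = np.random.default_rng(0)
dim, lam, mu, sigma = 2, 20, 5, 0.5
mean = rng.normal(size=dim)
X_arch = mean + rng.normal(size=(10, dim))            # initial design, truly evaluated
y_arch = np.array([expensive_f(x) for x in X_arch])
y0 = y_arch.min()

for gen in range(20):
    cand = mean + sigma * rng.normal(size=(lam, dim))  # lambda offspring
    model = kriging_fit(X_arch, y_arch)
    pred = kriging_predict(model, cand)                # cheap surrogate pre-screening
    elite = cand[np.argsort(pred)[:mu]]                # only the mu best-ranked candidates
    y_true = np.array([expensive_f(x) for x in elite]) # receive a true evaluation
    X_arch = np.vstack([X_arch, elite])
    y_arch = np.concatenate([y_arch, y_true])
    mean = elite.mean(axis=0)                          # recombination
    sigma *= 0.9                                       # simple step-size decay

print(f"best true fitness found: {y_arch.min():.4f}")
```

    Note that only mu = 5 of the lambda = 20 offspring per generation reach the expensive simulator; the other 15 are filtered out by the surrogate, which is the source of the cost reduction.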

    Self-Adaptive Surrogate-Assisted Covariance Matrix Adaptation Evolution Strategy

    This paper presents a novel mechanism to adapt surrogate-assisted population-based algorithms. This mechanism is applied to ACM-ES, a recently proposed surrogate-assisted variant of CMA-ES. The resulting algorithm, saACM-ES, adjusts online the lifelength of the current surrogate model (the number of CMA-ES generations before a new surrogate is learned) as well as the surrogate hyper-parameters. Both heuristics significantly improve the quality of the surrogate model, yielding a significant speed-up of saACM-ES compared to the ACM-ES and CMA-ES baselines. The empirical validation of saACM-ES on the BBOB-2012 noiseless testbed demonstrates the efficiency and the scalability of the proposed approach w.r.t. the problem dimension and the population size, reaching new best results on some of the benchmark problems. Comment: Genetic and Evolutionary Computation Conference (GECCO 2012), 2012.
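    The surrogate-lifelength idea — trust a surrogate for more generations when its ranking of the population agrees with the true objective, fewer when it does not — can be illustrated with a small sketch. The agreement measure is a Spearman-style rank correlation; the thresholds and update rules below are invented for illustration and are not taken from the saACM-ES paper.

```python
import numpy as np

def rank_agreement(pred, true):
    # Spearman-style correlation between the surrogate ranking and the true ranking
    # on a control set of individuals that received both evaluations.
    rp = np.argsort(np.argsort(pred)).astype(float)
    rt = np.argsort(np.argsort(true)).astype(float)
    rp -= rp.mean()
    rt -= rt.mean()
    return float(rp @ rt / np.sqrt((rp @ rp) * (rt @ rt)))

def adapt_lifelength(n_gen, agreement, n_max=10):
    # Trust a well-ranking surrogate for more generations before relearning it;
    # shorten the lifelength when its ranking disagrees with the true objective.
    if agreement > 0.9:
        return min(n_gen + 1, n_max)
    if agreement < 0.5:
        return max(n_gen // 2, 1)   # fall back toward frequent relearning
    return n_gen
```

    For example, a surrogate whose ranking matches the true objective perfectly (agreement 1.0) gets its lifelength extended by one generation, while one with agreement below 0.5 has it halved.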

    A new Taxonomy of Continuous Global Optimization Algorithms

    Surrogate-based optimization, nature-inspired metaheuristics, and hybrid combinations have become state of the art in algorithm design for solving real-world optimization problems. Still, it is difficult for practitioners to get an overview that explains their advantages in comparison to the large number of available methods in the scope of optimization. Available taxonomies lack the embedding of current approaches in the larger context of this broad field. This article presents a taxonomy of the field, which explores and matches algorithm strategies by extracting similarities and differences in their search strategies. A particular focus lies on algorithms using surrogates, nature-inspired designs, and those created by design optimization. The extracted features of components or operators allow us to create a set of classification indicators to distinguish between a small number of classes. The features allow a deeper understanding of the components of the search strategies and further indicate the close connections between the different algorithm designs. We present intuitive analogies to explain the basic principles of the search algorithms, particularly useful for novices in this research field. Furthermore, the taxonomy allows recommendations on the applicability of the corresponding algorithms. Comment: 35 pages total, 28 written pages, 4 figures, 2019 reworked version.

    Surrogate-Assisted Multiobjective Evolutionary Algorithms for Structural Shape and Sizing Optimisation

    The work in this paper proposes the hybridisation of the well-established strength Pareto evolutionary algorithm (SPEA2) and some commonly used surrogate models. The surrogate models are introduced into an evolutionary optimisation process to enhance the performance of the optimiser when solving design problems with expensive function evaluations. Several surrogate models, including quadratic function, radial basis function, neural network, and Kriging models, are employed in combination with SPEA2 using real codes. The various hybrid optimisation strategies are implemented on eight simultaneous shape and sizing design problems of structures, taking into account structural weight, lateral buckling, natural frequency, and stress. Structural analysis is carried out using a finite element procedure. The optimum results obtained are compared and discussed. The performance assessment is based on the hypervolume indicator. The performance of the surrogate models for estimating design constraints is investigated. It has been found that, by using a quadratic function surrogate model, the optimiser's searching performance is greatly improved.
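    A quadratic-function surrogate of the kind this paper finds most effective is simply a least-squares response surface over polynomial features. A minimal 2-D sketch, with a made-up linear-plus-quadratic constraint function standing in for an expensive structural response (the function and sample sizes are invented for illustration):

```python
import numpy as np

def quad_features(X):
    # Design matrix [1, x1, x2, x1^2, x1*x2, x2^2] for a 2-D quadratic response surface.
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 ** 2, x1 * x2, x2 ** 2])

rng = np.random.default_rng(1)
X_train = rng.uniform(-2.0, 2.0, size=(40, 2))
# A hypothetical stress-style constraint playing the role of an expensive FE response.
g_train = 1.0 + 2.0 * X_train[:, 0] - X_train[:, 1] + 0.5 * X_train[:, 0] ** 2

# Fit the response surface by least squares, then predict the constraint cheaply.
beta, *_ = np.linalg.lstsq(quad_features(X_train), g_train, rcond=None)
g_hat = float(quad_features(np.array([[1.0, 1.0]])) @ beta)
```

    Because the toy constraint is itself quadratic, the fitted surface reproduces it exactly; on a real structural response the surface is only a local approximation and must be refitted as the search moves.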

    A survey on handling computationally expensive multiobjective optimization problems with evolutionary algorithms

    This is the author accepted manuscript; the final version is available from Springer Verlag via the DOI in this record. Evolutionary algorithms are widely used for solving multiobjective optimization problems but are often criticized for the large number of function evaluations they need. Approximations, especially function approximations, also referred to as surrogates or metamodels, are commonly used in the literature to reduce the computation time. This paper presents a survey of 45 different algorithms proposed in the literature between 2008 and 2016 for handling computationally expensive multiobjective optimization problems. The algorithms are discussed according to the kind of approximation they use, such as problem, function, or fitness approximation, with most emphasis given to function approximation-based algorithms. We also compare these algorithms on criteria such as the metamodeling technique and evolutionary algorithm used, the type and dimensions of the problems solved, constraint handling, training time, and the type of evolution control. Furthermore, we identify and discuss promising elements and major open issues in the literature related to the use of approximations and the numerical settings used. In addition, we discuss how to select an algorithm for a given computationally expensive multiobjective optimization problem based on the dimensions of the objective and decision spaces and the available computation budget. The research of Tinkle Chugh was funded by the COMAS Doctoral Program (University of Jyväskylä) and the FiDiPro project DeCoMo (funded by Tekes, the Finnish Funding Agency for Innovation); the research of Dr. Karthik Sindhya was funded by the SIMPRO project, also funded by Tekes, as well as by DeCoMo.

    Guiding evolutionary search towards innovative solutions

    The main goal of this work is to develop a method that, operating on top of an evolutionary algorithm, increases its likelihood of finding innovative solutions. This likelihood is taken to increase with the diversity of the solutions found, provided that they are of sufficient quality. The developed method must be applicable in a scenario in which the search is required to start from a single, fixed solution. Therefore, a scheme is envisioned in which the search is performed sequentially, zooming in on a locally optimal solution and then exploring for a new, potentially high-quality region based on a memory of solutions encountered earlier in the search. Two exploration criteria, one using an archive of earlier solutions as memory and the other derived from a surrogate model trained on earlier solutions, were established to be worthwhile for integration into quality-based search. The resulting schemes were applied to a real-world airfoil optimization task, and both performed better than the baseline method of multiple standard optimization runs. The model-based approach delivers the best results, in the sense that it finds more solutions, more diverse solutions, and better-quality solutions than the baseline method. Honda Research Institute Europe (HRI-EU); Algorithms and the Foundations of Software Technology.
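    An archive-based exploration criterion of the kind described above can be as simple as preferring the candidate whose nearest previously visited solution is farthest away. A small sketch of that generic novelty criterion (not necessarily the thesis's exact formulation):

```python
import numpy as np

def most_novel(candidates, archive):
    # Exploration criterion: pick the candidate whose nearest archived solution
    # is farthest away, i.e. the point lying in the least-visited region.
    d = np.linalg.norm(candidates[:, None, :] - archive[None, :, :], axis=-1)
    return candidates[int(np.argmax(d.min(axis=1)))]
```

    In the sequential scheme sketched in the abstract, such a criterion would propose the starting point of the next quality-driven local search after each zoom-in has converged.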

    Towards Better Integration of Surrogate Models and Optimizers

    Surrogate-assisted evolutionary algorithms (SAEAs) have proven very effective in solving (synthetic and real-world) computationally expensive optimization problems with a limited number of function evaluations. The two main components of SAEAs are the surrogate model and the evolutionary optimizer, both of which use parameters to control their respective behaviour. These parameters are likely to interact closely, and hence the exploitation of any such relationships may lead to the design of an enhanced SAEA. In this chapter, as a first step, we focus on Kriging and the Efficient Global Optimization (EGO) framework. We discuss potentially profitable ways of better integrating model and optimizer, and we investigate in depth how different parameters of the model and the optimizer impact optimization results. In particular, we determine whether there are any interactions between these parameters and how the problem characteristics impact optimization results. In the experimental study, we use the popular Black-Box Optimization Benchmarking (BBOB) testbed. Interestingly, the analysis finds no evidence for significant interactions between model and optimizer parameters, but independently their performance has a significant interaction with the objective function. Based on our results, we make recommendations on how best to configure EGO.
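    The infill criterion at the heart of the EGO framework mentioned above is expected improvement, computed from the Kriging predictor's mean and standard deviation at a candidate point. Its closed form for minimisation is standard and needs only the Python standard library:

```python
from math import erf, exp, pi, sqrt

def expected_improvement(mu, sigma, y_best):
    # EGO infill criterion for minimisation: E[max(y_best - Y, 0)] with
    # Y ~ N(mu, sigma^2), where mu and sigma come from the Kriging predictor.
    if sigma <= 0.0:
        return max(y_best - mu, 0.0)      # no predictive uncertainty left
    z = (y_best - mu) / sigma
    cdf = 0.5 * (1.0 + erf(z / sqrt(2.0)))           # standard normal CDF at z
    pdf = exp(-0.5 * z * z) / sqrt(2.0 * pi)          # standard normal PDF at z
    return (y_best - mu) * cdf + sigma * pdf
```

    The two terms make the exploration/exploitation trade-off explicit: the first rewards a predicted mean below the incumbent, the second rewards predictive uncertainty, so points with larger sigma score higher at equal mean — which is exactly the model/optimizer coupling the chapter studies.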

    Evolution strategies for robust optimization

    Real-world (black-box) optimization problems often involve various types of uncertainty and noise emerging in different parts of the optimization problem. When this is not accounted for, optimization may fail or may yield solutions that are optimal in the classical strict notion of optimality but fail in practice. Robust optimization is the practice of optimization that actively accounts for uncertainties and/or noise. Evolutionary algorithms form a class of optimization algorithms that use the principle of evolution to find good solutions to optimization problems. Because uncertainty and noise are indispensable parts of nature, this class of algorithms is a logical choice for robust optimization scenarios. This thesis provides a clear definition of the term robust optimization, together with a comparison and practical guidelines on how evolution strategies, a subclass of evolutionary algorithms for real-parameter optimization problems, should be adapted for such scenarios.
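    One common adaptation of an evolution strategy to noisy evaluations is explicit averaging: resample the noisy objective several times per individual and rank by the mean, trading extra evaluations for a lower-variance fitness estimate. A minimal sketch (the noise model and sample counts are made up, and this is only one of the robustness techniques the thesis compares):

```python
import numpy as np

def noisy_f(x, rng):
    # A hypothetical noisy black box: true sphere fitness plus additive Gaussian noise.
    return float(np.sum(np.asarray(x) ** 2)) + rng.normal(scale=0.5)

def robust_fitness(x, rng, n_samples=25):
    # Explicit averaging: resample the noisy objective and rank individuals by the
    # mean, whose standard deviation shrinks by a factor sqrt(n_samples).
    return float(np.mean([noisy_f(x, rng) for _ in range(n_samples)]))

rng = np.random.default_rng(0)
single = noisy_f([0.0, 0.0], rng)                 # one draw: error up to ~1
averaged = robust_fitness([0.0, 0.0], rng, n_samples=100)   # error ~0.05
```

    The cost of the variance reduction is linear in the number of resamples, which is why adaptive schemes reallocate samples toward individuals near selection boundaries.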

    Efficient tuning in supervised machine learning

    The tuning of learning algorithm parameters has become more and more important in recent years. With the fast growth of computational power and available memory, databases have grown dramatically. This is very challenging for the tuning of parameters arising in machine learning, since training can become very time-consuming for large datasets. For this reason, efficient tuning methods are required that are able to improve the predictions of the learning algorithms. In this thesis we incorporate model-assisted optimization techniques for performing efficient optimization on noisy datasets with very limited budgets. Under this umbrella we also combine learning algorithms with methods for feature construction and selection. We propose to integrate a variety of elements into the learning process. For example, can tuning be helpful in learning tasks such as time series regression using state-of-the-art machine learning algorithms? Are statistical methods capable of reducing noise effects? Can surrogate models like Kriging learn a reasonable mapping of the parameter landscape to the quality measures, or are they deteriorated by disturbing factors? Summarizing all these parts, we analyze whether superior learning algorithms can be created, with a special focus on efficient runtimes. Besides the advantages of systematic tuning approaches, we also highlight possible obstacles and issues of tuning. Different tuning methods are compared and the impact of their features is exposed. It is a goal of this work to give users insights into applying state-of-the-art learning algorithms profitably in practice. Bundesministerium für Bildung und Forschung (Germany); Cologne University of Applied Sciences (Germany); Kind-Steinmüller-Stiftung (Gummersbach, Germany); Algorithms and the Foundations of Software Technology.
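    Model-assisted tuning of a single hyperparameter can be sketched as a very small sequential loop: fit a cheap surrogate to the (noisy) parameter-quality samples seen so far, then spend the next expensive evaluation at the surrogate's optimum. Everything below — the toy objective with its optimum at lam = 0.1, the noise level, and the quadratic surrogate in log-space — is a made-up illustration, not the thesis's setup:

```python
import numpy as np

rng = np.random.default_rng(2)

def cv_error(lam):
    # Hypothetical noisy tuning objective: cross-validation error of some learner
    # as a function of one regularisation parameter, minimised at lam = 0.1.
    return (np.log10(lam) + 1.0) ** 2 + rng.normal(scale=0.05)

lams = list(np.logspace(-3, 1, 9))                  # initial design over the parameter
errs = [cv_error(l) for l in lams]

for _ in range(5):                                  # sequential model-assisted tuning
    coeff = np.polyfit(np.log10(lams), errs, 2)     # cheap quadratic surrogate in log10
    grid = np.linspace(-3, 1, 401)
    nxt = 10.0 ** grid[int(np.argmin(np.polyval(coeff, grid)))]
    lams.append(nxt)
    errs.append(cv_error(nxt))                      # only the surrogate optimum is run

best_lam = lams[int(np.argmin(errs))]
```

    Fitting in log10-space matters here: regularisation parameters typically vary over orders of magnitude, and a quadratic in the raw parameter would be a poor surrogate for such a landscape.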