4,630 research outputs found

    Rank-Based Learning and Local Model Based Evolutionary Algorithm for High-Dimensional Expensive Multi-Objective Problems

    Surrogate-assisted evolutionary algorithms have been widely developed in recent years to solve complex and computationally expensive multi-objective optimization problems. However, when dealing with high-dimensional optimization problems, the performance of these surrogate-assisted multi-objective evolutionary algorithms deteriorates drastically. In this work, a novel Classifier-assisted rank-based learning and Local Model based multi-objective Evolutionary Algorithm (CLMEA) is proposed for high-dimensional expensive multi-objective optimization problems. The proposed algorithm consists of three parts: classifier-assisted rank-based learning, hypervolume-based non-dominated search, and local search in the relatively sparse objective space. Specifically, a probabilistic neural network is built as a classifier to divide the offspring into a number of ranks. The offspring in different ranks use a rank-based learning strategy to generate more promising and informative candidates for real function evaluations. Then, radial basis function networks are built as surrogates to approximate the objective functions. After searching for non-dominated solutions assisted by the surrogate model, the candidates with higher hypervolume improvement are selected for real evaluations. Subsequently, to maintain the diversity of solutions, the most uncertain sample point among the non-dominated solutions, measured by crowding distance, is selected as the guided parent to further infill the uncertain region of the front. Experimental results on benchmark problems and a real-world application to geothermal reservoir heat extraction optimization demonstrate that the proposed algorithm outperforms state-of-the-art surrogate-assisted multi-objective evolutionary algorithms. The source code for this work is available at https://github.com/JellyChen7/CLMEA
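
    The hypervolume-improvement-based infill step can be illustrated with a small sketch. The snippet below is a minimal bi-objective sketch, assuming SciPy RBF surrogates and an exact 2-D hypervolume routine; the helper names (nondominated, hypervolume_2d, select_infill) and the toy problem are illustrative and not taken from the CLMEA code base, which additionally includes the classifier-assisted rank-based learning and crowding-distance steps omitted here.

```python
# Minimal sketch of surrogate-assisted, hypervolume-improvement-based infill
# selection for a bi-objective minimisation problem (illustrative only; the
# classifier and crowding-distance components of CLMEA are omitted).
import numpy as np
from scipy.interpolate import RBFInterpolator

def nondominated(F):
    """Boolean mask of non-dominated rows of the objective matrix F (minimisation)."""
    keep = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        keep[i] = not dominates_i.any()
    return keep

def hypervolume_2d(front, ref):
    """Exact hypervolume of a 2-objective non-dominated front w.r.t. reference point ref."""
    P = front[np.argsort(front[:, 0])]
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in P:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

def select_infill(X_eval, F_eval, X_cand, ref):
    """Pick the candidate whose surrogate-predicted objectives yield the largest
    hypervolume improvement over the current non-dominated front."""
    surrogates = [RBFInterpolator(X_eval, F_eval[:, m]) for m in range(F_eval.shape[1])]
    F_pred = np.column_stack([s(X_cand) for s in surrogates])
    front = F_eval[nondominated(F_eval)]
    base = hypervolume_2d(front, ref)
    hvi = [hypervolume_2d(np.vstack([front, f])[nondominated(np.vstack([front, f]))], ref) - base
           for f in F_pred]
    return X_cand[int(np.argmax(hvi))]

# Toy usage: 5 decision variables, 2 conflicting quadratic objectives (assumed for illustration).
rng = np.random.default_rng(0)
X = rng.random((30, 5))
F = np.column_stack([np.sum(X**2, axis=1), np.sum((X - 1.0)**2, axis=1)])
x_next = select_infill(X, F, rng.random((100, 5)), ref=np.array([6.0, 6.0]))
```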

    Evolutionary Multiobjective Optimization Driven by Generative Adversarial Networks (GANs)

    Recently, an increasing number of works have proposed driving evolutionary algorithms with machine learning models. Usually, the performance of such model-based evolutionary algorithms depends heavily on the training quality of the adopted models. Since a certain amount of data (i.e., the candidate solutions generated by the algorithm) is usually required for model training, the performance deteriorates rapidly as the problem scale increases, due to the curse of dimensionality. To address this issue, we propose a multi-objective evolutionary algorithm driven by generative adversarial networks (GANs). At each generation of the proposed algorithm, the parent solutions are first classified into real and fake samples to train the GANs; then the offspring solutions are sampled by the trained GANs. Thanks to the powerful generative ability of the GANs, the proposed algorithm is capable of generating promising offspring solutions in a high-dimensional decision space with limited training data. The proposed algorithm is tested on 10 benchmark problems with up to 200 decision variables. Experimental results on these test problems demonstrate the effectiveness of the proposed algorithm.
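
    As a rough illustration of this real/fake split and GAN-based offspring sampling, the PyTorch sketch below trains a small GAN on a set of "real" parent decision vectors (assumed normalised to [0, 1]) and then draws offspring from the generator. The network sizes, optimiser settings, and the use of generator outputs as the fake samples are illustrative assumptions, not the paper's exact setup, which also prescribes how parents are labelled real or fake.

```python
# Minimal sketch: fit a small GAN on "real" parent decision vectors and sample
# offspring from the generator (illustrative, not the authors' implementation).
import torch
import torch.nn as nn

def train_gan_and_sample(parents_real, n_offspring, n_var, latent_dim=16, epochs=200):
    G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, n_var), nn.Sigmoid())
    D = nn.Sequential(nn.Linear(n_var, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCELoss()
    real = torch.as_tensor(parents_real, dtype=torch.float32)
    for _ in range(epochs):
        # Discriminator step: real parents vs. generated (fake) samples.
        fake = G(torch.randn(len(real), latent_dim)).detach()
        d_loss = bce(D(real), torch.ones(len(real), 1)) + bce(D(fake), torch.zeros(len(fake), 1))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()
        # Generator step: try to make generated samples look like real parents.
        g_loss = bce(D(G(torch.randn(len(real), latent_dim))), torch.ones(len(real), 1))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    with torch.no_grad():
        offspring = G(torch.randn(n_offspring, latent_dim))
    return offspring.numpy()  # offspring decision vectors in [0, 1] via the Sigmoid output
```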

    Uncertainty And Evolutionary Optimization: A Novel Approach

    Evolutionary algorithms (EAs) have been widely accepted as efficient solvers for complex real-world optimization problems, including engineering optimization. However, real-world optimization problems often involve uncertain environments, including noisy and/or dynamic ones, which pose major challenges to EA-based optimization. The presence of noise interferes with the evaluation and selection processes of an EA and thus adversely affects its performance. In addition, because noise complicates the evaluation of the fitness function, the fitness may need to be estimated instead of being evaluated directly. Several existing approaches attempt to address this problem, such as introducing diversity (hypermutation, random immigrants, special operators) or incorporating memory of the past (diploidy, case-based memory). However, these approaches fail to adequately address the problem. In this paper we propose a Distributed Population Switching Evolutionary Algorithm (DPSEA) that addresses the optimization of functions with noisy fitness using a distributed population-switching architecture to simulate a distributed, self-adaptive memory of the solution space. Local regression is used in the pseudo-populations to estimate the fitness. Successful application to benchmark test problems confirms the proposed method's superior performance in terms of both robustness and accuracy. Comment: In Proceedings of the 9th IEEE Conference on Industrial Electronics and Applications (ICIEA 2014), IEEE Press, pp. 988-983, 201
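
    A minimal sketch of the local-regression fitness-estimation idea follows, assuming noisy evaluations are kept in an archive and the underlying fitness at a point is estimated by weighted linear regression over its nearest archived neighbours. The kernel, bandwidth, and neighbourhood size are illustrative choices, and the distributed population-switching machinery itself is not shown.

```python
# Locally weighted linear regression over an archive of noisy evaluations,
# used here as a stand-in for the fitness-estimation step described above.
import numpy as np

def local_regression_fitness(x, X_arch, y_arch, k=15, bandwidth=0.3):
    """Estimate the (noise-free) fitness at x from its k nearest archive points."""
    d = np.linalg.norm(X_arch - x, axis=1)
    idx = np.argsort(d)[:k]
    Xn, yn, dn = X_arch[idx], y_arch[idx], d[idx]
    w = np.exp(-(dn / bandwidth) ** 2)            # Gaussian distance weights
    A = np.hstack([np.ones((len(Xn), 1)), Xn])    # design matrix with intercept
    sw = np.sqrt(w)[:, None]
    # Weighted least-squares fit of a local linear model around x.
    beta, *_ = np.linalg.lstsq(A * sw, yn * sw.ravel(), rcond=None)
    return float(np.concatenate(([1.0], x)) @ beta)

# Toy check on a noisy sphere function (assumed test problem, not from the paper).
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = np.sum(X**2, axis=1) + rng.normal(0.0, 0.1, size=200)   # noisy fitness evaluations
print(local_regression_fitness(np.zeros(3), X, y))           # smoothed estimate near the optimum
```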

    An Ensemble Surrogate-Based Framework for Expensive Multiobjective Evolutionary Optimization

    Surrogate-assisted evolutionary algorithms (SAEAs) have become very popular for tackling computationally expensive multiobjective optimization problems (EMOPs), as the surrogate models in SAEAs can approximate EMOPs well, thereby reducing the time cost of the optimization process. However, as the number of decision variables in EMOPs increases, the prediction accuracy of the surrogate models deteriorates, which inevitably worsens the performance of SAEAs. To deal with this issue, this article suggests an ensemble surrogate-based framework for tackling EMOPs. In this framework, a global surrogate model is trained over the entire search space to explore the global area, while a number of surrogate submodels are trained in different search subspaces to exploit the corresponding subareas, so as to enhance the prediction accuracy and reliability. Moreover, a new infill sampling criterion based on a set of reference vectors is designed to select promising samples for training the models. To validate the generality and effectiveness of the framework, three state-of-the-art evolutionary algorithms [nondominated sorting genetic algorithm III (NSGA-III), multiobjective evolutionary algorithm based on decomposition with differential evolution (MOEA/D-DE), and reference vector-guided evolutionary algorithm (RVEA)] are embedded, and the framework significantly improves their performance on most of the test EMOPs adopted in this article. When compared with some competitive SAEAs on EMOPs with up to 30 decision variables, the experimental results also validate the advantages of our approach in most cases.
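
    One illustrative reading of the global-plus-local ensemble is sketched below: a global RBF model fitted on all evaluated samples, plus local RBF submodels fitted on k-means clusters of those samples, with each prediction averaging the global model and the nearest local model. The clustering, the averaging weights, and the model choices are assumptions made for illustration; the reference-vector-based infill criterion and the embedding into NSGA-III, MOEA/D-DE, or RVEA are not shown.

```python
# Sketch of an ensemble surrogate: one global RBF model plus local RBF submodels
# fitted on clusters of the evaluated samples (illustrative reading of the abstract).
import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.interpolate import RBFInterpolator

class EnsembleSurrogate:
    def __init__(self, X, y, n_local=4):
        self.global_model = RBFInterpolator(X, y)
        # Partition the evaluated samples and fit one local RBF model per cluster.
        self.centroids, labels = kmeans2(X, n_local, minit='++', seed=0)
        self.local_models = []
        for c in range(n_local):
            Xi, yi = X[labels == c], y[labels == c]
            # Only fit a local model if the cluster has enough points for the RBF fit.
            self.local_models.append(RBFInterpolator(Xi, yi) if len(Xi) > X.shape[1] + 1 else None)

    def predict(self, Xq):
        g = self.global_model(Xq)
        # Route each query point to the local model of its nearest cluster centroid.
        nearest = np.argmin(np.linalg.norm(Xq[:, None, :] - self.centroids[None, :, :], axis=2), axis=1)
        out = np.empty(len(Xq))
        for i, (x, c) in enumerate(zip(Xq, nearest)):
            local = self.local_models[c]
            out[i] = g[i] if local is None else 0.5 * (g[i] + local(x[None, :])[0])
        return out

# Toy usage on a 10-variable quadratic (assumed for illustration).
rng = np.random.default_rng(0)
X = rng.random((60, 10))
y = np.sum((X - 0.5) ** 2, axis=1)
model = EnsembleSurrogate(X, y)
print(model.predict(rng.random((5, 10))))
```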

    Parametric Model Order Reduction of Guided Ultrasonic Wave Propagation in Fiber Metal Laminates with Damage

    This paper focuses on parametric model order reduction (PMOR) of guided ultrasonic wave propagation and its interaction with damage in a fiber metal laminate (FML). Structural health monitoring of FMLs seeks to detect, localize and characterize damage with high accuracy and a minimal number of sensors. This can be achieved by an inverse-problem analysis approach, which employs the signal measurement data recorded by sensors embedded in the structure. The inverse analysis requires solving the forward simulation of the underlying system several thousand times. These simulations are often exorbitantly expensive, which motivates the need to improve their computational efficiency. A PMOR approach based on the proper orthogonal decomposition method is presented in this paper. An adaptive parameter sampling technique is established with the aid of a surrogate model to efficiently update the reduced-order basis in a greedy fashion. A numerical experiment is conducted to illustrate the parametric training of the reduced-order model. The results show that the reduced-order solution obtained with the PMOR approach agrees closely with the high-fidelity solution.
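
    A compact sketch of the POD-plus-greedy idea follows, with a toy one-dimensional parametric field standing in for the expensive guided-wave solver (an assumption made for illustration). For brevity the greedy error indicator here is the exact projection error of candidate snapshots, whereas the approach described above uses a surrogate model precisely to avoid those extra full-order solves.

```python
# Sketch of POD-based parametric model order reduction with a greedy parameter-
# sampling loop. The "full-order model" is a toy stand-in, not a wave solver.
import numpy as np

def full_order_solution(mu, n=200):
    """Toy stand-in for an expensive solver: a parameter-dependent field on a 1-D grid."""
    x = np.linspace(0.0, 1.0, n)
    return np.sin(2 * np.pi * mu * x) * np.exp(-mu * x)

def pod_basis(snapshots, tol=1e-6):
    """POD basis from an SVD of the snapshot matrix (columns = snapshots)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, 1.0 - tol) + 1)   # rank capturing the energy target
    return U[:, :r]

def greedy_pod(candidate_mus, n_iter=5):
    """Greedily add the parameter whose snapshot is worst approximated by the current basis."""
    chosen = [candidate_mus[0]]
    S = np.column_stack([full_order_solution(mu) for mu in chosen])
    for _ in range(n_iter):
        V = pod_basis(S)
        # Projection error of each candidate snapshot onto the current POD subspace.
        errs = []
        for mu in candidate_mus:
            u = full_order_solution(mu)
            errs.append(np.linalg.norm(u - V @ (V.T @ u)))
        mu_next = candidate_mus[int(np.argmax(errs))]
        chosen.append(mu_next)
        S = np.column_stack([S, full_order_solution(mu_next)])
    return pod_basis(S), chosen

basis, sampled_mus = greedy_pod(np.linspace(0.5, 5.0, 40))
print(basis.shape, sampled_mus)
```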