14 research outputs found

    Application of artificial intelligence techniques in the optimization of single screw polymer extrusion

    As with most real optimization problems, polymer processing technologies can be seen as multi-objective optimization problems. Due to the high computation times required by the numerical modelling routines usually available to calculate the values of the objective function as a function of the decision variables, it is necessary to develop alternative optimization methodologies able to reduce the number of solutions to be evaluated, compared with the techniques normally employed, such as evolutionary algorithms. Therefore, this work proposes the use of artificial intelligence, based on a data analysis technique designated DAMICORE, to overcome those limitations. An example from single screw polymer extrusion is used to illustrate the efficient use of the proposed methodology. This research was partially funded by NAWA-Narodowa Agencja Wymiany Akademickiej under grant PPN/ULM/2020/1/00125 and by the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie Grant Agreement No 734205–H2020-MSCA-RISE-2016. The authors also acknowledge funding by FEDER funds through the COMPETE 2020 Programme and National Funds through FCT (Portuguese Foundation for Science and Technology) under the projects UIDB/05256/2020 and UID-P/05256/2020, the Center for Mathematical Sciences Applied to Industry (CeMEAI) and the support from the São Paulo Research Foundation (FAPESP grant No 2013/07375-0), the Center for Artificial Intelligence (C4AI-USP), the support from the São Paulo Research Foundation (FAPESP grant No 2019/07665-4) and the IBM Corporation.
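    DAMICORE builds on compression-based similarity between data items. As a minimal, illustrative sketch (not the paper's implementation), the normalized compression distance (NCD) that underlies such data-analysis pipelines can be computed with Python's standard `zlib`; the function name `ncd` is an assumption for this example:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: 0 for very similar data,
    approaching 1 for unrelated data."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)
```

    Pairwise NCD values can then feed a clustering or phylogram-construction step, which is where the cheap distance pays off: no problem-specific feature engineering is needed.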

    Evolving Neural Networks through a Reverse Encoding Tree

    NeuroEvolution is one of the most competitive evolutionary learning frameworks for designing novel neural networks for use in specific tasks, such as logic circuit design and digital gaming. However, the application of benchmark methods such as NeuroEvolution of Augmenting Topologies (NEAT) remains a challenge in terms of computational cost and search-time inefficiency. This paper advances a method which incorporates a type of topological edge coding, named Reverse Encoding Tree (RET), for evolving scalable neural networks efficiently. Using RET, two approaches -- NEAT with binary-search encoding (Bi-NEAT) and NEAT with golden-section-search encoding (GS-NEAT) -- have been designed to solve problems in benchmark continuous learning environments such as logic gates, CartPole, and Lunar Lander, and tested against classical NEAT and FS-NEAT as baselines. Additionally, we conduct a robustness test to evaluate the resilience of the proposed NEAT algorithms. The results show that the two proposed strategies deliver improved performance, characterized by (1) a higher accumulated reward within a finite number of time steps; (2) fewer episodes needed to solve problems in the targeted environments; and (3) adaptive robustness maintained under noisy perturbations, outperforming the baselines in all tested cases. Our analysis also suggests that RET opens up potential future research directions in dynamic environments. Code is available from https://github.com/HaolingZHANG/ReverseEncodingTree. Comment: Accepted to IEEE Congress on Evolutionary Computation (IEEE CEC) 2020. Lecture Presentation.
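    The golden-section search that GS-NEAT's encoding leans on is a classical bracketing method for unimodal functions. A generic, self-contained sketch (not the paper's code; the NEAT integration is omitted):

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by golden-section search.
    Each iteration shrinks the bracket by the factor 1/phi ~= 0.618."""
    invphi = (math.sqrt(5) - 1) / 2
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):          # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                    # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2
```

    The appeal for an encoding scheme is the fixed, geometry-driven reduction of the search interval, independent of the function's scale.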

    Particle swarm optimization with Monte-Carlo simulation and hypothesis testing for network reliability problem

    The performance of Monte-Carlo simulation (MCS) is highly related to the number of simulations. This paper introduces a hypothesis-testing technique and incorporates it into a particle swarm optimization (PSO) based Monte-Carlo simulation algorithm to solve the complex network reliability problem. The role of the hypothesis test is to reduce dispensable simulations in network system reliability estimation. The proposed technique contains three components: hypothesis testing, network reliability calculation, and a PSO algorithm for finding solutions. Hypothesis testing is used to abandon unpromising solutions; Monte-Carlo simulation is used to obtain the network reliability; and since the network reliability problem is NP-hard, a PSO algorithm is applied. Because the execution time decreases as the confidence level of the hypothesis test is lowered within a range, but the solution quality deteriorates once the confidence level passes a critical value, experiments are carried out at different confidence levels to find that critical value. The experimental results show that the proposed method can reduce the computational cost without any loss of performance under a certain confidence level.
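    The idea of abandoning unpromising candidates can be sketched on a toy five-edge bridge network: estimate two-terminal reliability by Monte-Carlo, and stop sampling a candidate early once a one-sided normal-approximation test shows its reliability is significantly below a target threshold. All names, the network, and the specific test are illustrative assumptions, not taken from the paper:

```python
import random

def system_works(e1, e2, e3, e4, e5):
    """Bridge network between s and t; minimal s-t paths:
    e1-e3, e2-e4, e1-e5-e4, e2-e5-e3."""
    return (e1 and e3) or (e2 and e4) or (e1 and e5 and e4) or (e2 and e5 and e3)

def sample(p, rng):
    return system_works(*(rng.random() < p for _ in range(5)))

def mc_reliability(p, n, rng):
    return sum(sample(p, rng) for _ in range(n)) / n

def screen(p, threshold, rng, batch=200, max_batches=50, z=1.96):
    """Abandon a candidate early when its estimate is significantly
    below `threshold` (one-sided z-test at confidence ~0.975)."""
    hits = n = 0
    for _ in range(max_batches):
        hits += sum(sample(p, rng) for _ in range(batch))
        n += batch
        r = hits / n
        se = (r * (1 - r) / n) ** 0.5 or 1.0 / n
        if r + z * se < threshold:   # clearly unpromising: stop sampling
            break
    return hits / n, n
```

    Loosening the confidence level (smaller `z`) abandons candidates sooner, mirroring the time/quality trade-off the paper studies.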

    Data-Driven Surrogate-Assisted Multiobjective Evolutionary Optimization of a Trauma System


    A survey on handling computationally expensive multiobjective optimization problems with evolutionary algorithms

    This is the author accepted manuscript; the final version is available from Springer Verlag via the DOI in this record. Evolutionary algorithms are widely used for solving multiobjective optimization problems but are often criticized for the large number of function evaluations they need. Approximations, especially function approximations, also referred to as surrogates or metamodels, are commonly used in the literature to reduce the computation time. This paper presents a survey of 45 different recent algorithms proposed in the literature between 2008 and 2016 to handle computationally expensive multiobjective optimization problems. The algorithms are discussed according to the kind of approximation they use, such as problem, function or fitness approximation, with most emphasis given to function-approximation-based algorithms. We also compare these algorithms on different criteria, such as the metamodeling technique and evolutionary algorithm used, the type and dimensions of the problems solved, constraint handling, training time, and the type of evolution control. Furthermore, we identify and discuss some promising elements and major issues among algorithms in the literature related to the use of an approximation and the numerical settings used. In addition, we discuss how to select an algorithm for a given computationally expensive multiobjective optimization problem based on the dimensions of both the objective and decision spaces and the available computation budget. The research of Tinkle Chugh was funded by the COMAS Doctoral Program (at the University of Jyväskylä) and the FiDiPro project DeCoMo (funded by Tekes, the Finnish Funding Agency for Innovation), and the research of Dr. Karthik Sindhya was funded by the SIMPRO project (also funded by Tekes) as well as DeCoMo.
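    The function-approximation idea the survey emphasizes can be sketched generically (this is an assumption-laden toy, not any surveyed algorithm): keep an archive of truly evaluated points, predict fitness for a pool of candidates with a cheap inverse-distance-weighted surrogate, and spend the expensive real evaluation only on the predicted best candidate:

```python
import math
import random

def idw_surrogate(archive, x, power=2):
    """Inverse-distance-weighted fitness prediction from an
    archive of (point, fitness) pairs."""
    num = den = 0.0
    for xi, fi in archive:
        d = math.dist(x, xi)
        if d == 0.0:
            return fi
        w = d ** -power
        num += w * fi
        den += w
    return num / den

def surrogate_assisted_search(f, dim, budget, rng, pool=20):
    """Minimize f using `budget` true evaluations; each generation
    prescreens a candidate pool with the surrogate."""
    archive = []
    for _ in range(5):                       # seed with true evaluations
        x = [rng.uniform(-5, 5) for _ in range(dim)]
        archive.append((x, f(x)))
    for _ in range(budget - 5):
        best = min(archive, key=lambda a: a[1])[0]
        cands = [[xi + rng.gauss(0, 0.5) for xi in best] for _ in range(pool)]
        x = min(cands, key=lambda c: idw_surrogate(archive, c))
        archive.append((x, f(x)))            # only one true evaluation
    return min(archive, key=lambda a: a[1])
```

    Per generation, the true function is called once instead of `pool` times, which is the evaluation saving surrogate-assisted methods aim for; evolution control (when to trust the surrogate) is the design axis the survey compares.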

    A Prediction Modeling Framework For Noisy Welding Quality Data

    Numerous and various research projects have been conducted to utilize historical manufacturing process data in product design. These manufacturing process data often contain inconsistencies, which makes it challenging to extract useful information from them. In resistance spot welding (RSW), data inconsistency is a well-known issue. In general, such inconsistent data are treated as noise and removed from the original dataset before analyses are conducted or prediction models are constructed. This may not be desirable for every design and manufacturing application, since all data can contain important information that further explains the process. In this research, we propose a prediction modeling framework which employs bootstrap aggregating (bagging) with support vector regression (SVR) as the base learning algorithm to improve the prediction accuracy on such noisy data. Optimal hyper-parameters for SVR are selected by particle swarm optimization (PSO) with meta-modeling. Constructing bagging models requires more computational cost than a single model. Also, evolutionary computation algorithms such as PSO generally require a large number of candidate-solution evaluations to achieve quality solutions. These two requirements greatly increase the overall computational cost of constructing effective bagging SVR models. Meta-modeling can be employed to reduce the computational cost when the fitness or constraint functions involve computationally expensive tasks or analyses. In our case, the objective function involves constructing bagging SVR models with candidate sets of hyper-parameters; with PSO, a large number of bagging SVR models therefore have to be constructed and evaluated, which is computationally expensive. The meta-modeling approach developed in this research, called MUGPSO, assists PSO in evaluating these candidate solutions (i.e., sets of hyper-parameters).
    MUGPSO approximates the fitness function of candidate solutions. Through this method, the number of real fitness function evaluations (i.e., constructions of bagging SVR models) is reduced, which also reduces the overall computational cost. Using the Meta2 framework, one can expect an improvement in prediction accuracy with reduced computational time. Experiments are conducted on three artificially generated noisy datasets and a real RSW quality dataset. The results indicate that Meta2 is capable of providing promising solutions with noticeably reduced computational costs.
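    The bagging component can be sketched in standard-library Python. A toy 1-nearest-neighbour regressor stands in for SVR here (SVR itself would need an external library), so this is a generic illustration of bootstrap aggregating, not the paper's framework:

```python
import random

def knn1_predict(train, x):
    """1-nearest-neighbour regression on (x, y) pairs (scalar x)."""
    return min(train, key=lambda p: abs(p[0] - x))[1]

def bagging_predict(data, x, n_models=25, rng=None):
    """Bootstrap aggregating: fit each base learner on a resample
    of the data (with replacement) and average the predictions."""
    rng = rng or random.Random(0)
    preds = []
    for _ in range(n_models):
        boot = [rng.choice(data) for _ in data]   # bootstrap resample
        preds.append(knn1_predict(boot, x))
    return sum(preds) / len(preds)
```

    Averaging over resamples damps the influence of individual noisy points, which is why bagging helps on inconsistent data; the cost is `n_models` training runs per candidate hyper-parameter set, which is exactly what motivates the meta-modeling step.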

    Reducing Fitness Evaluations Using Clustering Techniques and Neural Network Ensembles

    Jin Y, Sendhoff B. Reducing Fitness Evaluations Using Clustering Techniques and Neural Network Ensembles. In: Deb K, ed. Genetic and Evolutionary Computation – GECCO 2004. Genetic and Evolutionary Computation Conference, Seattle, WA, USA, June 26-30, 2004. Proceedings, Part I. Lecture Notes in Computer Science. Vol 3102. Berlin, Heidelberg: Springer; 2004: 688-699