    A General Framework for Statistical Performance Comparison of Evolutionary Computation Algorithms

    This paper proposes a statistical methodology for comparing the performance of evolutionary computation algorithms. A two-fold sampling scheme for collecting performance data is introduced, and these data are analyzed using bootstrap-based multiple hypothesis testing procedures. The proposed method is sufficiently flexible to allow the researcher to choose how performance is measured, does not rely upon distributional assumptions, and can be extended to analyze many other randomized numeric optimization routines. As a result, this approach offers a convenient, flexible, and reliable technique for comparing algorithms in a wide variety of applications.
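
    The core idea can be illustrated in a few lines of Python: gather a performance sample from repeated independent runs of each algorithm, then bootstrap a chosen statistic of the difference so that no distributional assumptions are needed. This is only a minimal sketch of bootstrap resampling, not the paper's exact two-fold sampling scheme or its multiple-testing procedure; the synthetic run data and the choice of the mean as the statistic are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative data: best objective value from 30 independent runs of two
        # hypothetical algorithms A and B (lower is better).
        runs_a = rng.normal(loc=1.00, scale=0.10, size=30)
        runs_b = rng.normal(loc=0.95, scale=0.12, size=30)

        def bootstrap_diff(a, b, stat=np.mean, n_boot=10_000):
            """Bootstrap distribution of stat(a) - stat(b)."""
            diffs = np.empty(n_boot)
            for i in range(n_boot):
                diffs[i] = (stat(rng.choice(a, size=a.size))
                            - stat(rng.choice(b, size=b.size)))
            return diffs

        diffs = bootstrap_diff(runs_a, runs_b)
        low, high = np.percentile(diffs, [2.5, 97.5])
        print(f"95% bootstrap CI for mean(A) - mean(B): [{low:.3f}, {high:.3f}]")
        # If the interval excludes zero, the observed performance difference is
        # unlikely to be a sampling artefact.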

    Firefly Algorithm, Stochastic Test Functions and Design Optimisation

    Modern optimisation algorithms are often metaheuristic, and they are very promising for solving NP-hard optimisation problems. In this paper, we show how to use the recently developed Firefly Algorithm to solve nonlinear design problems. For the standard pressure vessel design optimisation, the optimal solution found by FA is far better than the best solution previously reported in the literature. We also propose a few new test functions with either singularities or stochastic components but with known global optima, so they can be used to validate new optimisation algorithms. Possible topics for further research are also discussed.
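
    For context, the standard firefly update rule moves each dimmer firefly towards every brighter one, with an attractiveness that decays with distance, plus a small random perturbation. The sketch below applies it to a simple sphere function; the parameter values and test function are illustrative assumptions, not the paper's pressure vessel problem or its proposed stochastic test functions.

        import numpy as np

        rng = np.random.default_rng(1)

        def sphere(x):
            return np.sum(x ** 2)

        def firefly(obj, dim=2, n=20, iters=200, beta0=1.0, gamma=1.0, alpha=0.2):
            x = rng.uniform(-5, 5, size=(n, dim))
            f = np.array([obj(xi) for xi in x])
            for _ in range(iters):
                for i in range(n):
                    for j in range(n):
                        if f[j] < f[i]:  # firefly j is brighter (lower objective)
                            r2 = np.sum((x[i] - x[j]) ** 2)
                            beta = beta0 * np.exp(-gamma * r2)  # attractiveness
                            x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
                            f[i] = obj(x[i])
            best = np.argmin(f)
            return x[best], f[best]

        print(firefly(sphere))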

    Review of Metaheuristics and Generalized Evolutionary Walk Algorithm

    Metaheuristic algorithms are often nature-inspired, and they are becoming very powerful in solving global optimization problems. More than a dozen major metaheuristic algorithms have been developed over the last three decades, and there exist even more variants and hybrids of metaheuristics. This paper intends to provide an overview of nature-inspired metaheuristic algorithms, from a brief history to their applications. We try to analyze the main components of these algorithms and how and why they work. Then, we provide a unified view of metaheuristics by proposing a generalized evolutionary walk algorithm (GEWA). Finally, we discuss some of the important open questions.
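
    The GEWA itself is not specified in this listing, so the toy loop below only illustrates the two ingredients the abstract emphasises: randomization (a random-walk move) balanced against selection (keeping the better solution). The step size, acceptance rule, and test function are all assumptions rather than the paper's algorithm.

        import numpy as np

        rng = np.random.default_rng(2)

        def sphere(x):
            return np.sum(x ** 2)

        def evolutionary_walk(obj, dim=2, iters=1000, step=0.5):
            x = rng.uniform(-5, 5, size=dim)
            fx = obj(x)
            for _ in range(iters):
                candidate = x + step * rng.standard_normal(dim)  # random-walk move
                fc = obj(candidate)
                if fc < fx:                  # selection: keep the better solution
                    x, fx = candidate, fc
            return x, fx

        print(evolutionary_walk(sphere))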

    Deterministic Mutation-Based Algorithm for Model Structure Selection in Discrete-Time System Identification

    System identification is a method of determining a mathematical relation between the variables and terms of a process based on observed input-output data. Model structure selection is one of the important steps in a system identification process. Evolutionary computation (EC) is known to be an effective search and optimization method, and in this paper EC is proposed as a model structure selection algorithm. Since EC methods, such as the genetic algorithm, rely on randomness and probabilities, they become cumbersome when constraints are present in the search; EC then requires the incorporation of additional evaluation functions and hence additional computation time. A deterministic mutation-based algorithm is introduced to overcome this problem. Identification studies using NARX (Nonlinear AutoRegressive with eXogenous input) models on simulated systems and real plant data are used to demonstrate that the algorithm is able to detect significant variables and terms faster, and to select a simpler model structure, than other well-known EC methods.
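
    A rough picture of the setting: a candidate model structure can be encoded as a binary inclusion vector over a pool of regressor terms, each structure scored by a penalised least-squares fit, and the structure changed by flipping bits. The greedy single-bit flip below is only a crude stand-in for the paper's deterministic mutation operator, and the simulated (linear, for brevity) system, candidate terms, and complexity penalty are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)

        # Simulated plant: y[k] = 0.5*y[k-1] + 0.8*u[k-1] + noise.
        N = 200
        u = rng.uniform(-1, 1, N)
        y = np.zeros(N)
        for k in range(1, N):
            y[k] = 0.5 * y[k - 1] + 0.8 * u[k - 1] + 0.01 * rng.standard_normal()

        # Candidate regressors for predicting y[k]: y[k-1], y[k-2], u[k-1], u[k-2].
        X = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
        target = y[2:]

        def score(mask):
            """Penalised residual sum of squares for the terms selected by mask."""
            if not mask.any():
                return np.inf
            theta, *_ = np.linalg.lstsq(X[:, mask], target, rcond=None)
            rss = np.sum((target - X[:, mask] @ theta) ** 2)
            return rss + 0.1 * mask.sum()        # crude complexity penalty

        # Greedy single-bit flips: a stand-in for the deterministic mutation step.
        mask = np.ones(X.shape[1], dtype=bool)
        improved = True
        while improved:
            improved = False
            for i in range(mask.size):
                trial = mask.copy()
                trial[i] = not trial[i]
                if score(trial) < score(mask):
                    mask, improved = trial, True
        print("selected terms:", mask)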

    Measuring performance of optimization algorithms in evolutionary computation

    Reporting the results of optimization algorithms in evolutionary computation is a challenging task with many potential pitfalls. The source of these problems is their stochastic nature and their inability to guarantee an optimal solution in polynomial time. One of the basic questions that is often not addressed concerns the method of summarizing the entire distribution of solutions into a single value. Although the mean value is used by default for that purpose, the best solution obtained is also occasionally used in addition to or instead of it. Based on our analysis of different possibilities for measuring the performance of stochastic optimization algorithms presented in this paper, we propose quantiles as the standard measure of performance. Quantiles can be naturally interpreted for this purpose; moreover, they are defined even when the arithmetic mean is not, and they are applicable to multiple executions of an algorithm. Our study also showed that, contrary to many other fields, in the case of stochastic optimization algorithms greater variability in the measured data can be considered an advantage.
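
    A minimal illustration of the proposed summary: report quantiles of the run distribution rather than, or alongside, the mean. The synthetic, heavy-tailed run data below are an assumption, chosen only to show a case where the mean can mislead.

        import numpy as np

        rng = np.random.default_rng(4)

        # Illustrative best-fitness values from 50 independent runs (lower is better);
        # a heavy-tailed sample, where a few bad runs inflate the mean.
        results = rng.lognormal(mean=0.0, sigma=1.0, size=50)

        print("mean:  ", np.mean(results))
        print("median:", np.quantile(results, 0.5))
        print("Q1, Q3:", np.quantile(results, [0.25, 0.75]))
        # A quantile reads directly as "the algorithm reaches this value or better in
        # 50% (or 75%) of runs" and is defined even when the mean is not.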

    Designs of Digital Filters and Neural Networks using Firefly Algorithm

    The Firefly algorithm is an evolutionary algorithm that can be used to solve complex multi-parameter problems in less time. Here, the algorithm was applied to design digital filters of different orders as well as to determine the parameters of complex neural network designs. Digital filters have many applications in control systems, aerospace, telecommunications, medical equipment, digital appliances, audio recognition, and related fields. An Artificial Neural Network (ANN) is an information processing paradigm inspired by the way biological nervous systems, such as the brain, process information; it can be simulated on a computer to perform specific tasks such as clustering, classification, and pattern recognition. The designs obtained with the Firefly algorithm were compared against state-of-the-art methods: the digital filter designs produced results close to those of the Parks-McClellan method, which shows the algorithm's capability to handle complex problems. For the neural network designs, the Firefly algorithm was also able to optimize a number of parameter values efficiently. The algorithm's performance was tested by introducing various noise levels into the training inputs of the neural network designs, and it produced the desired output with negligible error in a time-efficient manner. Overall, the Firefly algorithm was found to be competitive with other popular optimization algorithms, such as Differential Evolution, Particle Swarm Optimization, and the Genetic Algorithm, in solving complex design optimization problems. It provides a number of adjustable parameters that can be tuned to the problem at hand, so it can be applied to a wide range of optimization problems, and it is capable of producing quality results in a reasonable amount of time.
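
    To make the filter-design side concrete, any such metaheuristic needs an objective to minimise over the filter coefficients; a common choice is the weighted deviation of the magnitude response from an ideal template, sketched below. The band edges, weighting, and filter length are assumptions, not the thesis's exact formulation; the Parks-McClellan method solves the weighted minimax version of this problem directly.

        import numpy as np
        from scipy.signal import freqz

        # Ideal low-pass template (illustrative band edges): passband up to 0.2*pi,
        # stopband from 0.3*pi, transition band unweighted.
        w_grid = np.linspace(0, np.pi, 512)
        desired = np.where(w_grid <= 0.2 * np.pi, 1.0, 0.0)
        weight = np.where((w_grid <= 0.2 * np.pi) | (w_grid >= 0.3 * np.pi), 1.0, 0.0)

        def filter_error(taps):
            """Weighted squared error of an FIR filter's magnitude response."""
            _, h = freqz(taps, worN=w_grid)
            return np.sum(weight * (np.abs(h) - desired) ** 2)

        # Any metaheuristic (firefly, PSO, DE, ...) can minimise filter_error over the taps.
        taps0 = np.zeros(21)
        taps0[10] = 1.0          # start from a unit impulse (all-pass response)
        print(filter_error(taps0))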

    Continuous flow optimisation of nanoparticle catalysts

    Continuous flow reactors offer a host of advantages over their more traditional batch counterparts. These include more controlled mixing, enhanced heat transfer, and increased safety when handling hazardous reagents, as only a small volume of material is present within the reactor at any one time. For these reasons, flow reactors are becoming increasingly popular for the synthesis of nanoparticle catalysts. Recent advances in reactor technology and automation have transformed how chemical products are developed and tested. Automated continuous flow reactors have been coupled with machine learning algorithms in closed feedback loops, allowing vast areas of multi-dimensional experimental space to be explored quickly and efficiently, significantly accelerating the identification of optimum synthesis conditions while both reducing costs and improving the sustainability of process development. This work describes the development of a novel two-stage autonomous reactor for the optimisation of nanoparticle catalysts by direct observation of their performance in a catalysed chemical reaction. The key advantage of this performance-directed system is that no offline processing or analysis of the nanoparticles is required, allowing both the nanoparticle properties and the nanoparticle-catalysed reaction conditions to be optimised in tandem by an automated system with no human intervention. Chapter 1 introduces the principles and methods underlying this work, with a focus on nanoparticle catalysts, flow reactor technologies, and optimisation algorithms. Chapter 2 describes a self-optimising reactor capable of nanoparticle-catalysed reaction optimisation. Chapter 3 shows the development of a reactor able to produce alloyed nanoparticle catalysts with tuneable composition. Chapter 4 describes a body of work on the computational modelling of nanoparticle-catalysed reactions for the evaluation of different optimisation algorithms. Chapter 5 concludes the project by presenting a two-stage reactor able to optimise both the physical properties of the nanoparticles and the conditions under which they were used to catalyse a reaction.
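
    The closed feedback loop described above can be caricatured in a few lines: an optimiser proposes synthesis conditions, the reactor measures the resulting performance, and the measurement is fed back until the optimum is found. The run_experiment function, its Gaussian yield surface, and the use of Nelder-Mead are all hypothetical stand-ins; a real system would run the reaction and analyse the product online.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(5)

        def run_experiment(conditions):
            """Stand-in for the automated reactor: noisy yield for given conditions."""
            temperature, residence_time = conditions
            true_yield = np.exp(-((temperature - 120) / 30) ** 2
                                - ((residence_time - 5) / 2) ** 2)
            return true_yield + 0.01 * rng.standard_normal()

        # Closed loop: the optimiser proposes conditions, the "reactor" measures the
        # outcome, and the negative yield is minimised.
        result = minimize(lambda c: -run_experiment(c),
                          x0=np.array([100.0, 3.0]),
                          method="Nelder-Mead")
        print("best conditions (T, t):", result.x, "yield:", -result.fun)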

    A general framework for statistical performance comparison of evolutionary computation algorithms

    This paper proposes a statistical methodology for comparing the performance of evolutionary computation algorithms. A two-fold sampling scheme for collecting performance data is introduced, and these data are assessed using a multiple hypothesis testing framework relying on a bootstrap resampling procedure. The proposed method offers a convenient, flexible, and reliable approach to comparing algorithms in a wide variety of applications. KEY WORDS: Evolutionary computation, statistics, performance.