
    Theory and practice of population diversity in evolutionary computation

    Divergence of character is a cornerstone of natural evolution. By contrast, evolutionary optimization processes are plagued by an endemic lack of population diversity: all candidate solutions eventually crowd the very same areas of the search space. The problem is usually labeled with the oxymoron “premature convergence” and has very different consequences across applications, almost all deleterious. At the same time, case studies from theoretical runtime analyses irrefutably demonstrate the benefits of diversity. This tutorial will give an introduction to the area of “diversity promotion”: we will define the term “diversity” in the context of evolutionary computation and show how practitioners have tried, with mixed results, to promote it. We will then analyze the benefits brought by population diversity in specific contexts, namely global exploration and enhancing the power of crossover. To this end, we will survey recent results from rigorous runtime analysis on selected problems. The presented analyses rigorously quantify the performance of evolutionary algorithms in the light of population diversity, laying the foundation for a rigorous understanding of how search dynamics are affected by the presence or absence of diversity and by the introduction of diversity mechanisms.
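    Any discussion of diversity promotion presupposes a way to measure diversity. As a minimal illustrative sketch (not taken from the tutorial), the snippet below computes one common measure for bit-string populations, the average pairwise Hamming distance; the function and variable names are assumptions chosen for illustration.

```python
# Illustrative sketch: quantify population diversity for bit-string individuals
# as the average pairwise Hamming distance (0 means the population has converged).
from itertools import combinations

def hamming(a, b):
    """Number of positions at which two bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def average_pairwise_diversity(population):
    """Mean Hamming distance over all distinct pairs of individuals."""
    pairs = list(combinations(population, 2))
    if not pairs:
        return 0.0
    return sum(hamming(a, b) for a, b in pairs) / len(pairs)

# Example: a fully converged population has diversity 0.
pop = [[1, 0, 1, 1], [1, 0, 1, 1], [1, 0, 1, 1]]
print(average_pairwise_diversity(pop))  # -> 0.0
```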

    Advances in Evolutionary Algorithms

    With the recent trends towards massive data sets and significant computational power, combined with advances in evolutionary algorithms, evolutionary computation is becoming much more relevant to practice. The aim of the book is to present recent improvements, innovative ideas and concepts in a part of the huge field of evolutionary algorithms.

    Automatic Generation of a Neural Network Architecture Using Evolutionary Computation

    This paper reports the application of evolutionary computation to the automatic generation of a neural network architecture. It is usual practice to use trial and error to find a suitable neural network architecture, which is not only time consuming but may not produce an optimal solution for a given problem. The use of evolutionary computation is a step towards automating architecture generation. The paper gives a brief introduction to the field as well as an implementation of automatic neural network generation using genetic programming.
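    As an illustrative sketch of the general idea, not the paper's implementation, the snippet below evolves an architecture encoded as a list of hidden-layer sizes with a simple genetic algorithm. The fitness function is a placeholder assumption standing in for training and validating a network built from the genome.

```python
# Illustrative sketch: evolve a neural network architecture encoded as a list of
# hidden-layer sizes. The fitness function is a stand-in; a real system would
# build, train and validate a network from each genome.
import random

random.seed(0)

def random_genome(max_layers=4, max_units=128):
    return [random.randint(1, max_units) for _ in range(random.randint(1, max_layers))]

def fitness(genome):
    # Placeholder assumption: reward a target total capacity, penalize depth.
    capacity = sum(genome)
    return -abs(capacity - 100) - 0.1 * len(genome)

def mutate(genome, max_units=128):
    g = genome[:]
    g[random.randrange(len(g))] = random.randint(1, max_units)
    return g

population = [random_genome() for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                                  # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(10)]

print(max(population, key=fitness))  # best architecture found
```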

    The 1/5-th Rule with Rollbacks: On Self-Adjustment of the Population Size in the (1+(λ,λ)) GA

    Self-adjustment of parameters can significantly improve the performance of evolutionary algorithms. A notable example is the (1+(λ,λ)) genetic algorithm, where the adaptation of the population size helps to achieve linear runtime on the OneMax problem. However, on problems which interfere with the assumptions behind the self-adjustment procedure, its usage can lead to performance degradation compared to static parameter choices. In particular, the one-fifth rule, which guides the adaptation in the example above, can raise the population size too fast on problems which are too far away from the perfect fitness-distance correlation. We propose a modification of the one-fifth rule in order to have less negative impact on the performance in scenarios where the original rule reduces the performance. Our modification, while still performing well on OneMax both theoretically and in practice, also shows better results on linear functions with random weights and on random satisfiable MAX-SAT instances.
    Comment: 17 pages, 2 figures, 1 table. An extended two-page abstract of this work will appear in the proceedings of the Genetic and Evolutionary Computation Conference, GECCO'1
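    For context, here is a minimal sketch of the baseline setting the abstract refers to: the (1+(λ,λ)) GA on OneMax with the standard one-fifth success rule adapting λ (shrink λ on a strict improvement, grow it by the fourth root of the update factor otherwise). The rollback modification proposed in the paper is not reproduced here, and details such as resampling ℓ = 0 are simplified.

```python
# Sketch of the (1+(lambda,lambda)) GA on OneMax with one-fifth-rule adaptation
# of lambda. Simplifications are noted in comments; not the paper's rollback variant.
import random

random.seed(1)

def onemax(x):
    return sum(x)

def flip_bits(x, positions):
    y = x[:]
    for i in positions:
        y[i] = 1 - y[i]
    return y

def one_plus_lambda_lambda_ga(n=100, update_factor=1.5):
    x = [random.randint(0, 1) for _ in range(n)]
    lam = 1.0
    evaluations = 0
    while onemax(x) < n:
        k = max(1, round(lam))
        # Mutation phase: flip exactly ell random bits in each of k offspring
        # (ell ~ Bin(n, lam/n); forced to at least 1 for simplicity).
        ell = max(1, sum(random.random() < lam / n for _ in range(n)))
        mutants = [flip_bits(x, random.sample(range(n), ell)) for _ in range(k)]
        x_prime = max(mutants, key=onemax)
        # Crossover phase: take each bit from x_prime with probability 1/lam.
        def cross():
            return [xp if random.random() < 1.0 / lam else xi
                    for xi, xp in zip(x, x_prime)]
        y = max((cross() for _ in range(k)), key=onemax)
        evaluations += 2 * k
        # One-fifth success rule: shrink lam on strict improvement, grow otherwise.
        if onemax(y) > onemax(x):
            lam = max(lam / update_factor, 1.0)
        else:
            lam = min(lam * update_factor ** 0.25, n)
        if onemax(y) >= onemax(x):   # elitist acceptance
            x = y
    return evaluations

print("evaluations until optimum:", one_plus_lambda_lambda_ga())
```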

    EvoPER: An R package for applying evolutionary computation methods in the parameter estimation of individual-based models implemented in Repast

    Individual-based models are complex and normally have a large number of input parameters which must be tuned in order to reproduce the experimental or observed data as accurately as possible. Hence one of the weakest points of this kind of model is that the modeler rarely has enough information about the correct values, or even the acceptable ranges, of the input parameters. Therefore, several parameter combinations must be checked to find an acceptable set of input factors minimizing the deviation between simulated and observed data. In practice, it is most of the time computationally infeasible to traverse the complete search space and check every parameter combination in order to find the best one. That is precisely the kind of combinatorial problem suited to evolutionary computation techniques. In this work we present EvoPER, an R package for simplifying parameter estimation using evolutionary computation techniques. The current version of EvoPER includes implementations of the PSO, SA and ACO algorithms for parameter estimation of models generated with the open source agent-based modeling toolkit Repast.
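    The snippet below sketches the underlying idea in Python rather than the EvoPER R API: search a model's parameter space so that simulated output matches observed data, here with a minimal particle swarm optimizer. The toy "simulator" and the observed values are illustrative assumptions; EvoPER's own PSO/SA/ACO implementations and interfaces differ.

```python
# Illustrative sketch: estimate model parameters by minimizing the deviation
# between simulated and observed data with a minimal particle swarm optimizer.
import random

random.seed(2)

def simulate(params):
    """Stand-in for running an individual-based model with the given parameters."""
    a, b = params
    return [a * t + b for t in range(10)]

observed = [2.0 * t + 1.0 for t in range(10)]  # pretend these were measured

def deviation(params):
    """Sum of squared differences between simulated and observed data."""
    return sum((s - o) ** 2 for s, o in zip(simulate(params), observed))

# Minimal PSO over the two parameters.
n_particles, iterations, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
pos = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=deviation)

for _ in range(iterations):
    for i in range(n_particles):
        for d in range(2):
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if deviation(pos[i]) < deviation(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=deviation)

print("estimated parameters:", gbest, "deviation:", deviation(gbest))
```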