
    Optimal Parameter Choices Through Self-Adjustment: Applying the 1/5-th Rule in Discrete Settings

    While evolutionary algorithms are known to be very successful for a broad range of applications, the algorithm designer is often left with many algorithmic choices, for example the size of the population, the mutation rates, and the crossover rates of the algorithm. These parameters are known to have a crucial influence on the optimization time and thus need to be chosen carefully, a task that often requires substantial effort. Moreover, the optimal parameters can change during the optimization process. It is therefore of great interest to design mechanisms that dynamically choose best-possible parameters. An example of such an update mechanism is the one-fifth success rule for step-size adaptation in evolution strategies. While in continuous domains this principle is also well understood from a mathematical point of view, no comparable theory is available for problems in discrete domains. In this work we show that the one-fifth success rule can be effective also in discrete settings. We consider the (1+(λ,λ)) GA proposed in [Doerr/Doerr/Ebel: From black-box complexity to designing new genetic algorithms, TCS 2015]. We prove that if its population size is chosen according to the one-fifth success rule, then the expected optimization time on OneMax is linear. This is better than what any static population size λ can achieve and is asymptotically optimal also among all adaptive parameter choices.
    Comment: This is the full version of a paper that is to appear at GECCO 2015.
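    As a concrete illustration of the adaptation scheme this abstract describes, the following minimal Python sketch runs a (1+(λ,λ)) GA on OneMax and adjusts the population size λ with the one-fifth success rule. The update factor F = 1.5, the function names, and the exact offspring bookkeeping are illustrative assumptions, not the authors' implementation.

    ```python
    import random

    def onemax(x):
        """Number of one-bits; the benchmark from the abstract."""
        return sum(x)

    def one_plus_ll_ga(n, F=1.5, seed=0):
        """(1+(lambda,lambda)) GA on OneMax with a one-fifth-rule-controlled
        population size. Returns the number of fitness evaluations used."""
        rng = random.Random(seed)
        x = [rng.randint(0, 1) for _ in range(n)]
        lam, evals = 1.0, 0
        while onemax(x) < n:
            k = max(1, round(lam))
            p, c = lam / n, 1.0 / lam    # mutation rate and crossover bias
            # Mutation phase: sample ell ~ Bin(n, p) once, then flip ell
            # uniformly chosen bits in each of the k mutants.
            ell = sum(rng.random() < p for _ in range(n))
            best_mut, best_mut_f = x, -1
            for _ in range(k):
                y = x[:]
                for i in rng.sample(range(n), ell):
                    y[i] ^= 1
                fy = onemax(y)
                if fy > best_mut_f:
                    best_mut, best_mut_f = y, fy
            # Crossover phase: take each bit from the best mutant with prob. c.
            best_cross, best_cross_f = x, -1
            for _ in range(k):
                y = [b if rng.random() < c else a for a, b in zip(x, best_mut)]
                fy = onemax(y)
                if fy > best_cross_f:
                    best_cross, best_cross_f = y, fy
            evals += 2 * k
            # One-fifth success rule: shrink lambda by F on strict improvement,
            # grow it by F**(1/4) otherwise; lambda is stable when roughly one
            # in five rounds succeeds.
            if best_cross_f > onemax(x):
                x, lam = best_cross, max(lam / F, 1.0)
            else:
                if best_cross_f == onemax(x):
                    x = best_cross
                lam = min(lam * F ** 0.25, float(n))
        return evals
    ```

    Intuitively, λ stays small while improvements are frequent and grows on the hard tail of the run, which is the self-adjusting behaviour behind the linear expected optimization time claimed above.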

    Use of Statistical Outlier Detection Method in Adaptive Evolutionary Algorithms

    In this paper, the issue of adapting probabilities for Evolutionary Algorithm (EA) search operators is revisited. A framework is devised for distinguishing between measurements of performance and the interpretation of those measurements for purposes of adaptation. Several examples of measurements and statistical interpretations are provided. Probability value adaptation is tested using an EA with 10 search operators against 10 test problems; the results indicate that both the type of measurement and its statistical interpretation play significant roles in EA performance. We also find that selecting operators based on the prevalence of outliers rather than on average performance provides considerable improvements to adaptive methods and soundly outperforms the non-adaptive case.
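    The distinction between a measurement and its statistical interpretation can be made concrete in code. The sketch below is an illustrative assumption throughout, not the paper's procedure: it scores each operator either by its mean improvement or by how often it produces outliers above the Tukey fence Q3 + 1.5·IQR of the pooled improvements, then turns the scores into selection probabilities with a small floor.

    ```python
    import statistics

    def operator_probabilities(history, use_outliers=True, p_min=0.02):
        """history: dict mapping operator name -> list of recent (nonnegative)
        fitness improvements. Returns a selection probability per operator."""
        pooled = [v for vals in history.values() for v in vals]
        q1, _, q3 = statistics.quantiles(pooled, n=4)
        fence = q3 + 1.5 * (q3 - q1)     # upper Tukey fence over all data
        scores = {}
        for op, vals in history.items():
            if use_outliers:
                # Interpretation A: how often does this operator produce an
                # exceptionally good child, regardless of its typical output?
                scores[op] = sum(v > fence for v in vals) / len(vals)
            else:
                # Interpretation B: the operator's average improvement.
                scores[op] = max(0.0, statistics.mean(vals))
        k, total = len(history), sum(scores.values())
        if total == 0:                   # no signal yet: choose uniformly
            return {op: 1.0 / k for op in history}
        # The probability floor keeps every operator alive so that it keeps
        # being measured even when it currently scores zero.
        return {op: p_min + (1.0 - k * p_min) * s / total
                for op, s in scores.items()}
    ```

    Scoring by outlier prevalence rewards operators whose occasional exceptional children drive progress even when their average output is mediocre, which is the effect the abstract reports as improving on mean-based adaptation.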

    First Steps Towards a Runtime Analysis of Neuroevolution

    We consider a simple setting in neuroevolution where an evolutionary algorithm optimizes the weights and activation functions of a simple artificial neural network. We then define simple example functions to be learned by the network and conduct rigorous runtime analyses for networks with a single neuron and for a more advanced structure with several neurons and two layers. Our results show that the proposed algorithm is generally efficient on two example problems designed for one neuron and efficient with at least constant probability on the example problem for a two-layer network. In particular, the so-called harmonic mutation operator, which chooses steps of size j with probability proportional to 1/j, turns out to be a good choice for the underlying search space. However, for the case of one neuron, we also identify situations with hard-to-overcome local optima. Experimental investigations of our neuroevolutionary algorithm and a state-of-the-art CMA-ES support the theoretical findings.
    Comment: 27 pages; full version of a paper published at FOGA 2023 and available at ACM.
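    The harmonic mutation operator mentioned above admits a very short sketch. The bound K on the step size and the integer-vector search space are assumptions for illustration, not details taken from the paper.

    ```python
    import random

    def harmonic_step(K, rng=random):
        """Draw j in {1,...,K} with P(j) proportional to 1/j, random sign."""
        (j,) = rng.choices(range(1, K + 1),
                           weights=[1.0 / i for i in range(1, K + 1)])
        return j if rng.random() < 0.5 else -j

    def mutate(weights, K, rng=random):
        """Apply one harmonic step to a uniformly chosen network weight."""
        child = list(weights)
        child[rng.randrange(len(child))] += harmonic_step(K, rng)
        return child
    ```

    Mixing frequent unit steps with occasional large jumps lets the operator both fine-tune a weight and escape regions where only a distant value improves, matching the abstract's finding that it suits the underlying search space.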

    08051 Abstracts Collection -- Theory of Evolutionary Algorithms

    From Jan. 27, 2008 to Feb. 1, 2008, the Dagstuhl Seminar 08051 "Theory of Evolutionary Algorithms" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.