
    Convergence Properties of Two (μ + λ) Evolutionary Algorithms on OneMax and Royal Roads Test Functions

    We present a number of bounds on the convergence time of two elitist population-based Evolutionary Algorithms using the recombination operator k-Bit-Swap, alongside a mainstream Randomized Local Search algorithm. We study the effect of the distribution of elite species and of population size. Comment: accepted for ECTA 201
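    A minimal sketch of the two ingredients named above, assuming k-Bit-Swap exchanges the values at k uniformly chosen positions between two parents (the exact operator definition is in the paper); the functions and the elitist loop below are illustrative, not the authors' implementation:

```python
import random

def onemax(x):
    """OneMax fitness: the number of ones in the bit string."""
    return sum(x)

def rls_step(x):
    """Randomized Local Search: flip exactly one uniformly chosen bit."""
    y = x[:]
    y[random.randrange(len(y))] ^= 1
    return y

def k_bit_swap(a, b, k=1):
    """k-Bit-Swap recombination (assumed semantics): exchange the values
    at k uniformly chosen positions between two parent strings."""
    a, b = a[:], b[:]
    for i in random.sample(range(len(a)), k):
        a[i], b[i] = b[i], a[i]
    return a, b

# Elitist (1+1)-style RLS loop on OneMax, n = 20.
n = 20
x = [random.randint(0, 1) for _ in range(n)]
steps = 0
while onemax(x) < n:
    y = rls_step(x)
    if onemax(y) >= onemax(x):  # elitism: accept the offspring only if no worse
        x = y
    steps += 1
print(steps)
```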

    Elitism Levels Traverse Mechanism for the Derivation of Upper Bounds on Unimodal Functions

    In this article we present an Elitism Levels Traverse Mechanism that we designed to find bounds on population-based Evolutionary Algorithms solving unimodal functions. We prove its efficiency theoretically and test it on the OneMax function, deriving the bound cμn log n − O(μn). This analysis can be generalized to any similar algorithm using variants of tournament selection and genetic operators that flip or swap only 1 bit in each string. Comment: accepted to the Congress on Evolutionary Computation (WCCI/CEC) 201
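    The class of algorithms the bound covers can be sketched as follows; the population size, tournament size, and replacement rule here are hypothetical choices for illustration (the traverse mechanism itself is a proof technique, not part of the algorithm):

```python
import random

def onemax(x):
    """OneMax fitness: the number of ones in the bit string."""
    return sum(x)

def tournament(pop, k=2):
    """Binary tournament selection: the best of k uniformly drawn individuals."""
    return max(random.sample(pop, k), key=onemax)

def flip_one_bit(x):
    """A genetic operator that flips exactly one bit, as in the analysis."""
    y = x[:]
    y[random.randrange(len(y))] ^= 1
    return y

n, mu = 30, 10
pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
evals = 0
while max(onemax(x) for x in pop) < n:
    child = flip_one_bit(tournament(pop))
    # Elitist replacement: the child replaces the current worst if no worse.
    worst = min(range(mu), key=lambda i: onemax(pop[i]))
    if onemax(child) >= onemax(pop[worst]):
        pop[worst] = child
    evals += 1
print(evals)
```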

    Analysis of Noisy Evolutionary Optimization When Sampling Fails

    In noisy evolutionary optimization, sampling is a common strategy to deal with noise. Under the sampling strategy, the fitness of a solution is evaluated multiple times (the number of evaluations is called the \emph{sample size}) independently, and its true fitness is then approximated by the average of these evaluations. Previous studies on sampling are mainly empirical. In this paper, we first investigate the effect of the sample size from a theoretical perspective. By analyzing the (1+1)-EA on the noisy LeadingOnes problem, we show that as the sample size increases, the running time first drops from exponential to polynomial, but then returns to exponential. This suggests that a proper sample size is crucial in practice. Then, we investigate what strategies can work when sampling with any fixed sample size fails. Through two illustrative examples, we prove that using parent or offspring populations can be better. Finally, we construct an artificial noisy example to show that when neither sampling nor populations are effective, adaptive sampling (i.e., sampling with an adaptive sample size) can work. This, for the first time, provides theoretical support for the use of adaptive sampling.
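    A sketch of the sampling strategy on a noisy LeadingOnes instance; the additive Gaussian noise model and the parameters n, m, and sigma are illustrative assumptions, not the noise models analyzed in the paper:

```python
import random

def leading_ones(x):
    """LeadingOnes: length of the longest prefix of ones."""
    count = 0
    for bit in x:
        if bit == 0:
            break
        count += 1
    return count

def noisy_fitness(x, sigma=1.0):
    """Noisy evaluation (illustrative additive Gaussian noise)."""
    return leading_ones(x) + random.gauss(0.0, sigma)

def sampled_fitness(x, m):
    """Sampling strategy: average of m independent noisy evaluations."""
    return sum(noisy_fitness(x) for _ in range(m)) / m

def one_plus_one_ea(n=20, m=10, max_evals=200_000):
    """(1+1)-EA with standard bit mutation and sample size m."""
    x = [random.randint(0, 1) for _ in range(n)]
    evals = 0
    while leading_ones(x) < n and evals < max_evals:
        y = [b ^ (random.random() < 1 / n) for b in x]  # flip each bit w.p. 1/n
        if sampled_fitness(y, m) >= sampled_fitness(x, m):
            x = y
        evals += 2 * m  # both parent and offspring are sampled m times
    return evals

print(one_plus_one_ea())
```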

    Upper Bounds on the Runtime of the Univariate Marginal Distribution Algorithm on OneMax

    A runtime analysis of the Univariate Marginal Distribution Algorithm (UMDA) is presented on the OneMax function for wide ranges of its parameters μ and λ. If μ ≥ c log n for some constant c > 0 and λ = (1 + Θ(1))μ, a general bound O(μn) on the expected runtime is obtained. This bound crucially assumes that all marginal probabilities of the algorithm are confined to the interval [1/n, 1 − 1/n]. If μ ≥ c′√n log n for a constant c′ > 0 and λ = (1 + Θ(1))μ, the behavior of the algorithm changes and the bound on the expected runtime becomes O(μ√n), which typically even holds if the borders on the marginal probabilities are omitted. The results supplement the recently derived lower bound Ω(μ√n + n log n) by Krejca and Witt (FOGA 2017) and turn out to be tight for the two very different values μ = c log n and μ = c′√n log n. They also improve the previously best known upper bound O(n log n log log n) by Dang and Lehre (GECCO 2015). Comment: Version 4: added illustrations and experiments; improved presentation in Section 2.2; to appear in Algorithmica; the final publication is available at Springer via http://dx.doi.org/10.1007/s00453-018-0463-
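    A compact sketch of UMDA on OneMax with the borders described above; the concrete choices of μ, λ, and the stopping rule are illustrative assumptions, not the constants from the analysis:

```python
import math
import random

def onemax(x):
    """OneMax fitness: the number of ones in the bit string."""
    return sum(x)

def umda(n=50, max_gens=10_000):
    """UMDA on OneMax with marginals clamped to [1/n, 1 - 1/n]
    (the 'borders' the first bound relies on)."""
    mu = max(2, int(3 * math.log(n)))   # mu >= c log n, c = 3 is illustrative
    lam = 2 * mu                        # lambda = (1 + Theta(1)) * mu
    p = [0.5] * n                       # marginal probability of a 1 per position
    for gen in range(max_gens):
        pop = [[int(random.random() < p[i]) for i in range(n)]
               for _ in range(lam)]
        if any(onemax(x) == n for x in pop):
            return (gen + 1) * lam      # evaluations until the optimum appeared
        best = sorted(pop, key=onemax, reverse=True)[:mu]  # truncation selection
        for i in range(n):
            freq = sum(x[i] for x in best) / mu
            p[i] = min(1 - 1 / n, max(1 / n, freq))        # clamp to the borders
    return max_gens * lam

print(umda())
```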