7 research outputs found

    Noisy Optimization: Convergence with a Fixed Number of Resamplings

    It is known that evolution strategies in continuous domains might not converge in the presence of noise. It is also known that, under mild assumptions, and using an increasing number of resamplings, one can mitigate the effect of additive noise and recover convergence. We show new sufficient conditions for the convergence of an evolutionary algorithm with a constant number of resamplings; in particular, we get fast rates (log-linear convergence) provided that the variance decreases around the optimum slightly faster than in the so-called multiplicative noise model. Keywords: noisy optimization, evolutionary algorithm, theory. Comment: EvoStar (2014).
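    Since the abstract describes an algorithmic recipe (evaluate each candidate a fixed number of times and rank by the averaged value), here is a minimal Python sketch of that idea on a noisy sphere. The (1, λ) self-adaptive ES, the noise exponent 1.1, and all parameter values are illustrative assumptions, not the paper's exact setting.

```python
import numpy as np

def noisy_sphere(x, rng):
    """Illustrative noisy objective: sphere value plus noise whose standard
    deviation shrinks near the optimum slightly faster than a multiplicative
    noise model (exponent 1.1 instead of 1.0; the exponent is an assumption)."""
    f = float(np.sum(x ** 2))
    return f + rng.normal(scale=f ** 1.1 + 1e-12)

def es_fixed_resamplings(dim=10, lam=8, k=5, iterations=300, seed=0):
    """(1, lambda) self-adaptive ES where every candidate is evaluated k times,
    with k held constant, and candidates are ranked by their averaged fitness."""
    rng = np.random.default_rng(seed)
    parent, sigma = rng.normal(size=dim), 1.0
    tau = 1.0 / np.sqrt(2 * dim)                      # standard self-adaptation rate
    for _ in range(iterations):
        sigmas = sigma * np.exp(tau * rng.normal(size=lam))
        xs = parent + sigmas[:, None] * rng.normal(size=(lam, dim))
        # constant number of resamplings: average k noisy evaluations per candidate
        scores = [np.mean([noisy_sphere(x, rng) for _ in range(k)]) for x in xs]
        best = int(np.argmin(scores))
        parent, sigma = xs[best], sigmas[best]
    return parent

print(float(np.sum(es_fixed_resamplings() ** 2)))     # squared distance to the optimum
```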

    A new selection ratio for large population sizes

    Motivated by parallel optimization, we study the Self-Adaptation algorithm for large population sizes. We first show that the current version of this algorithm does not reach the theoretical bounds; we then propose a very simple modification in the selection part of the evolution process. We show that this simple modification leads to a large improvement in speed-up when the population size is large.
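    As a rough illustration of where the selection step enters a self-adaptive ES, the sketch below exposes the selection ratio µ/λ as an explicit parameter. The update rules and the example ratio are assumptions for illustration only; the paper's actual modified selection rule is not reproduced here.

```python
import numpy as np

def sa_es(objective, dim, lam, select_ratio=0.25, iterations=100, seed=0):
    """Self-adaptive (mu/mu, lambda)-ES.  `select_ratio` = mu / lambda controls the
    selection part of the update; the paper proposes changing this part of the
    algorithm when lambda is very large (exact rule not reproduced here)."""
    rng = np.random.default_rng(seed)
    mu = max(1, int(select_ratio * lam))
    tau = 1.0 / np.sqrt(2 * dim)                      # self-adaptation learning rate
    mean, sigma = rng.normal(size=dim), 1.0
    for _ in range(iterations):
        # each offspring mutates its own step size, then its position
        sigmas = sigma * np.exp(tau * rng.normal(size=lam))
        xs = mean + sigmas[:, None] * rng.normal(size=(lam, dim))
        fitness = np.array([objective(x) for x in xs])
        best = np.argsort(fitness)[:mu]               # selection: keep the mu best of lambda
        mean = xs[best].mean(axis=0)
        sigma = np.exp(np.mean(np.log(sigmas[best])))  # geometric mean of selected step sizes
    return mean, sigma

# usage: large population with a smaller selection ratio (values are illustrative)
sphere = lambda x: float(np.sum(x ** 2))
m, s = sa_es(sphere, dim=10, lam=1000, select_ratio=0.05)
print(float(np.sum(m ** 2)), s)
```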

    Bias and variance in continuous EDA

    Estimation of Distribution Algorithms are based on statistical estimates. We show that by combining classical tools from statistics, namely bias/variance decomposition, reweighting, and quasi-randomization, we can strongly improve the convergence rate. All modifications are simple, compatible with most algorithms, and experimentally very efficient, in particular in the parallel case (large offspring populations).
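    The abstract names three statistical tools; the sketch below shows how two of them, quasi-random sampling and reweighting, can be plugged into a simple Gaussian EDA loop. The Sobol/inverse-CDF sampling, the linear rank weights, and the diagonal covariance are illustrative choices, not the estimators proposed in the paper.

```python
import numpy as np
from scipy.stats import norm, qmc

def eda_gaussian(objective, dim, lam=128, ratio=0.25, iterations=50, seed=0):
    """EMNA-style continuous EDA sketch: sample with scrambled Sobol points mapped
    through the Gaussian inverse CDF (quasi-randomization) and re-estimate the
    model from the selected points with rank-based weights (reweighting)."""
    sobol = qmc.Sobol(d=dim, scramble=True, seed=seed)
    mu = int(ratio * lam)
    mean, cov_diag = np.zeros(dim), np.ones(dim)
    for _ in range(iterations):
        u = sobol.random(lam)                          # quasi-random uniforms in [0,1)^dim
        z = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))     # map to standard normal variates
        xs = mean + np.sqrt(cov_diag) * z
        fitness = np.array([objective(x) for x in xs])
        sel = np.argsort(fitness)[:mu]                 # truncation selection
        # illustrative reweighting: linearly decreasing weights over the ranked selection
        w = np.arange(mu, 0, -1, dtype=float)
        w /= w.sum()
        mean = w @ xs[sel]
        cov_diag = w @ (xs[sel] - mean) ** 2 + 1e-12   # weighted diagonal variance estimate
    return mean

sphere = lambda x: float(np.sum(x ** 2))
print(float(np.sum(eda_gaussian(sphere, dim=5) ** 2)))
```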

    Lower bounds for evolution strategies using VC-dimension

    We derive lower bounds for comparison-based or selection-based algorithms, improving existing results in the continuous setting and extending them to non-trivial results in the discrete case. To that end, we introduce the use of the VC-dimension of the level sets of the fitness functions; the results are then obtained through Sauer's lemma. In the special case of optimization of the sphere function, improved lower bounds are obtained by bounding the number of sign conditions that can be realized by certain systems of equations. The results include several applications to the parametrization of sequential or parallel algorithms of type (µ +, λ)-ES.
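    For context, the combinatorial step the abstract refers to is Sauer's lemma: a family of sets of VC-dimension d induces only polynomially many distinct subsets on any finite point set.

```latex
% Sauer's lemma: a set family S of VC-dimension d induces at most
% sum_{i<=d} binom(m,i) <= (em/d)^d distinct traces on m >= d >= 1 points.
\[
  \bigl|\{\, S \cap A \;:\; S \in \mathcal{S} \,\}\bigr|
  \;\le\; \sum_{i=0}^{d} \binom{m}{i}
  \;\le\; \Bigl(\frac{e\,m}{d}\Bigr)^{\!d},
  \qquad |A| = m \ge d \ge 1 .
\]
```

    Here the set family is the collection of level sets of the fitness functions, so each comparison-based step can only separate the sampled points in polynomially many ways; roughly speaking, this cap on the information extracted per batch of evaluations is what yields the lower bounds.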