Evolution Strategies with Additive Noise: A Convergence Rate Lower Bound

Abstract

We consider the problem of optimizing functions corrupted with additive noise. It is known that evolutionary algorithms can reach a simple regret O(1/√n) within logarithmic factors, where n is the number of function evaluations. We show mathematically that this bound is tight, at least for a wide family of evolution strategies without large mutations.
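For concreteness, the following is a minimal sketch of the noisy optimization setting the abstract refers to, not the paper's construction or proof: a (1+1)-ES with small Gaussian mutations and a 1/5th-success step-size rule, run on a noise-corrupted sphere function. The objective, noise level, and step-size constants are illustrative assumptions; the simple regret is the true (noise-free) loss of the returned point.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, noise_std=1.0):
    """Sphere objective f(x) = ||x||^2 observed under additive Gaussian noise."""
    return float(np.dot(x, x)) + noise_std * rng.standard_normal()

def one_plus_one_es(dim=5, budget=10_000, sigma=0.3):
    """(1+1)-ES with a 1/5th-success-style step-size rule on the noisy objective."""
    x = rng.standard_normal(dim)
    fx = noisy_f(x)
    for _ in range(budget):
        y = x + sigma * rng.standard_normal(dim)   # small Gaussian mutation
        fy = noisy_f(y)
        if fy <= fx:                               # selection on noisy evaluations
            x, fx = y, fy
            sigma *= 1.1
        else:
            sigma *= 0.9
    return x

x_hat = one_plus_one_es()
# Simple regret of the recommendation: f(x_hat) - min f, with min f = 0 here.
print("simple regret:", float(np.dot(x_hat, x_hat)))
```

The paper's lower bound says that, for algorithms of this kind (no large mutations), the expected simple regret after n evaluations cannot shrink faster than order 1/√n, matching the known upper bound up to logarithmic factors.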
