An Exponential Lower Bound for the Runtime of the cGA on Jump Functions
In the first runtime analysis of an estimation-of-distribution algorithm
(EDA) on the multi-modal jump function class, Hasenöhrl and Sutton (GECCO
2018) proved that the runtime of the compact genetic algorithm with suitable
parameter choice on jump functions with high probability is at most polynomial
(in the dimension) if the jump size is at most logarithmic (in the dimension),
and is at most exponential in the jump size if the jump size is
super-logarithmic. The exponential runtime guarantee was achieved with a
hypothetical population size that is also exponential in the jump size.
Consequently, this setting cannot lead to a better runtime.
In this work, we show that any choice of the hypothetical population size
leads to a runtime that, with high probability, is at least exponential in the
jump size. This result might be the first non-trivial exponential lower bound
for EDAs that holds for arbitrary parameter settings.
Comment: To appear in the Proceedings of FOGA 2019. arXiv admin note: text overlap with arXiv:1903.1098
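For readers less familiar with the algorithm under discussion, the following minimal Python sketch shows the cGA together with a jump function. The function names, the frequency margins 1/n, and all parameter choices are our illustrative assumptions, not taken from the paper.

```python
import random

def jump(x, k):
    """Jump function: OneMax-like, but with a fitness valley of width k
    just below the optimum (the all-ones string)."""
    n, ones = len(x), sum(x)
    if ones == n or ones <= n - k:
        return k + ones
    return n - ones

def cga(f, n, mu, max_iters, rng):
    """Minimal sketch of the compact genetic algorithm (cGA): sample two
    offspring from the frequency vector, move each frequency by 1/mu
    towards the winner, and keep frequencies inside [1/n, 1 - 1/n]."""
    p = [0.5] * n                             # frequency vector
    for t in range(1, max_iters + 1):
        x = [int(rng.random() < p[i]) for i in range(n)]
        y = [int(rng.random() < p[i]) for i in range(n)]
        if f(x) < f(y):
            x, y = y, x                       # x is now the winner
        if sum(x) == n:                       # sampled the all-ones optimum
            return t
        for i in range(n):
            if x[i] != y[i]:                  # update only where the bits differ
                step = 1 / mu if x[i] == 1 else -1 / mu
                p[i] = min(1 - 1 / n, max(1 / n, p[i] + step))
    return None                               # budget exhausted
```

The update strength 1/mu is exactly what the hypothetical population size controls: larger mu means slower, more reliable frequency drift, which is the trade-off the lower bound above quantifies.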
Fourier Analysis Meets Runtime Analysis: Precise Runtimes on Plateaus
We propose a new method based on discrete Fourier analysis to analyze the
time evolutionary algorithms spend on plateaus. This immediately gives a
concise proof of the classic estimate of the expected runtime of the (1+1)
evolutionary algorithm on the Needle problem due to Garnier, Kallel, and
Schoenauer (1999).
We also use this method to analyze the runtime of the (1+1) evolutionary
algorithm on a new benchmark consisting of plateaus of effective size
which have to be optimized sequentially in a LeadingOnes fashion.
Using our new method, we determine the precise expected runtime both for
static and fitness-dependent mutation rates. We also determine the
asymptotically optimal static and fitness-dependent mutation rates. For , the optimal static mutation rate is approximately . The optimal
fitness-dependent mutation rate, when the first fitness-relevant bits have
been found, is asymptotically . These results, so far only proven for
the single-instance problem LeadingOnes, thus hold in much greater
generality. We expect similar extensions to hold for other important results on
LeadingOnes. We are also optimistic that our Fourier analysis approach can be
applied to other plateau problems as well.
Comment: 40 pages
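On the Needle function every non-optimal search point has the same fitness, so the (1+1) evolutionary algorithm performs an unbiased random walk on the plateau until it hits the needle. A minimal sketch, assuming standard bit mutation with rate 1/n and the all-ones string as the needle (all names are ours):

```python
import random

def needle_walk(n, rng):
    """(1+1) EA on Needle: fitness is positive only at the all-ones string,
    so on the plateau every offspring is accepted (it is never worse) and
    the search reduces to a random walk."""
    x = [rng.randrange(2) for _ in range(n)]
    t = 0
    while sum(x) < n:                 # needle (all-ones) not yet found
        t += 1
        # standard bit mutation: flip each bit independently with prob. 1/n
        x = [b ^ (rng.random() < 1 / n) for b in x]
    return t
```

The hitting time of such walks is exactly the quantity the Fourier-analysis method above computes precisely.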
Runtime Analysis of the (1+(λ,λ)) Genetic Algorithm on Random Satisfiable 3-CNF Formulas
The (1+(λ,λ)) genetic algorithm, first proposed at GECCO 2013,
showed a surprisingly good performance on some optimization problems. The
theoretical analysis so far was restricted to the OneMax test function, where
this GA profited from the perfect fitness-distance correlation. In this work,
we conduct a rigorous runtime analysis of this GA on random 3-SAT instances in
the planted solution model having at least logarithmic average degree, which
are known to have a weaker fitness distance correlation.
We prove that this GA with a fixed, not too large population size again obtains
runtimes better than Θ(n log n), which is a lower bound for most
evolutionary algorithms on pseudo-Boolean problems with a unique optimum.
However, the self-adjusting version of the GA risks reaching population sizes
at which the intermediate selection of the GA, due to the weaker
fitness-distance correlation, is not able to distinguish a profitable offspring
from others. We show that this problem can be overcome by equipping the
self-adjusting GA with an upper limit for the population size. Apart from
sparse instances, this limit can be chosen in a way that the asymptotic
performance does not worsen compared to the idealistic OneMax case. Overall,
this work shows that the GA can provably have a good
performance on combinatorial search and optimization problems also in the
presence of a weaker fitness-distance correlation.
Comment: An extended abstract of this report will appear in the proceedings of the 2017 Genetic and Evolutionary Computation Conference (GECCO 2017).
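The planted solution model mentioned above can be sketched as follows: fix a satisfying assignment (by symmetry, all-true) and sample clauses uniformly among the 3-clauses it satisfies. The generator below is our own illustration, not the paper's experimental setup:

```python
import random

def planted_3sat(n, m, rng=None):
    """Sample m random 3-clauses over n variables, each satisfied by the
    planted all-true assignment, i.e. containing at least one positive
    literal. Literal v denotes variable v, -v its negation."""
    rng = rng or random.Random(0)
    clauses = []
    while len(clauses) < m:
        variables = rng.sample(range(1, n + 1), 3)    # three distinct variables
        clause = [v if rng.random() < 0.5 else -v for v in variables]
        if any(lit > 0 for lit in clause):            # satisfied by all-true
            clauses.append(clause)
    return clauses
```

Each variable appears in about 3m/n clauses on average, so the abstract's requirement of at least logarithmic average degree corresponds to m = Ω(n log n) clauses in this sketch.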
A Tight Runtime Analysis for the cGA on Jump Functions---EDAs Can Cross Fitness Valleys at No Extra Cost
We prove that the compact genetic algorithm (cGA) with hypothetical
population size μ = Ω(√n log n), polynomially bounded in n, with high
probability finds the optimum of any n-dimensional jump function with jump
size k < (1/20) ln n in O(μ√n) iterations. Since it is known
that the cGA with high probability needs at least Ω(μ√n + n log n)
iterations to optimize the unimodal OneMax function, our result shows that,
in contrast to most classic evolutionary algorithms, the cGA here is able to
cross moderate-sized valleys of low fitness at no extra cost.
Our runtime guarantee improves over the recent upper bound O(μ n^1.5 log n),
valid for μ = Ω(n^3.5+ε), of Hasenöhrl and
Sutton (GECCO 2018). For the best choice of the hypothetical population size,
this result gives a runtime guarantee of O(n^5+ε), whereas ours
gives O(n log n).
We also provide a simple general method based on parallel runs that, under
mild conditions, (i)~overcomes the need to specify a suitable population size,
but gives a performance close to the one stemming from the best-possible
population size, and (ii)~transforms EDAs with high-probability performance
guarantees into EDAs with similar bounds on the expected runtime.
Comment: 25 pages, full version of a paper to appear at GECCO 2019
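The parallel-run idea in the last paragraph can be instantiated, for example, by interleaving runs whose population sizes are the powers of two and stopping as soon as any run succeeds. The generator-based scaffolding below is our own sketch of such a scheme, not the paper's construction:

```python
def parallel_runs(run_factory, exponents):
    """Interleave one run per population size 2**i, i in exponents.
    run_factory(mu) must return a generator that yields a falsy value per
    iteration while still searching and a truthy value once it succeeds.
    Returns (mu, rounds) for the first run to succeed."""
    runs = {2 ** i: run_factory(2 ** i) for i in exponents}
    rounds = 0
    while True:
        rounds += 1
        for mu, run in runs.items():     # give every active run one iteration
            if next(run):
                return mu, rounds
```

Because every round costs one iteration per run, the total work exceeds that of the best (unknown) population size only by roughly the number of parallel runs, which is the sense in which such a scheme "overcomes the need to specify a suitable population size".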
Probabilistic Tools for the Analysis of Randomized Optimization Heuristics
This chapter collects several probabilistic tools that proved to be useful in
the analysis of randomized search heuristics. This includes classic material
like Markov, Chebyshev and Chernoff inequalities, but also lesser known topics
like stochastic domination and coupling or Chernoff bounds for geometrically
distributed random variables and for negatively correlated random variables.
Most of the results presented here have appeared previously, some, however,
only in recent conference publications. While the focus is on collecting tools
for the analysis of randomized search heuristics, many of these may be useful
as well in the analysis of classic randomized algorithms or discrete random
structures.
Comment: 91 pages
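As a small numerical illustration of the kind of tool collected there, the multiplicative Chernoff bound P[X ≥ (1+δ)np] ≤ exp(−δ²np/3), valid for 0 < δ ≤ 1, can be checked against the exact binomial tail; the parameter choices are ours:

```python
from math import comb, exp

def binomial_tail(n, p, k):
    """Exact upper tail P[X >= k] for X ~ Bin(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

def chernoff_upper(n, p, delta):
    """Multiplicative Chernoff bound P[X >= (1+delta)*n*p] <= exp(-delta^2*n*p/3),
    valid for 0 < delta <= 1."""
    return exp(-delta ** 2 * n * p / 3)
```

For n = 100, p = 1/2, δ = 0.2 the exact tail P[X ≥ 60] is about 0.028 while the bound gives about 0.51: correct, but, as is typical for Chernoff bounds, far from tight at moderate problem sizes.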
Upper Bounds on the Runtime of the Univariate Marginal Distribution Algorithm on OneMax
A runtime analysis of the Univariate Marginal Distribution Algorithm (UMDA)
is presented on the OneMax function for wide ranges of its parameters μ and
λ. If μ ≥ c log n for some constant c > 0 and
λ = (1+Θ(1))μ, a general bound O(μn) on the expected runtime
is obtained. This bound crucially assumes that all marginal probabilities of
the algorithm are confined to the interval [1/n, 1 − 1/n]. If
μ ≥ c'√n log n for a constant c' > 0 and λ = (1+Θ(1))μ, the
behavior of the algorithm changes and the bound on the expected runtime becomes
O(μ√n), which typically even holds if the borders on the marginal
probabilities are omitted.
The results supplement the recently derived lower bound Ω(μ√n + n log n)
by Krejca and Witt (FOGA 2017) and turn out as
tight for the two very different values μ = c log n and μ = c'√n log n. They also improve the previously best known upper bound O(nλ log λ) by Dang and Lehre (GECCO 2015).
Comment: Version 4: added illustrations and experiments; improved presentation
in Section 2.2; to appear in Algorithmica; the final publication is available
at Springer via http://dx.doi.org/10.1007/s00453-018-0463-
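For concreteness, here is a minimal sketch of the UMDA on OneMax, including the borders [1/n, 1 − 1/n] on the marginal probabilities that the abstract's first bound crucially relies on; the parameter choices and names are ours:

```python
import random

def umda_onemax(n, mu, lam, max_gens=10_000, rng=None):
    """Sketch of the UMDA maximizing OneMax: sample lam offspring from the
    product distribution, select the mu best, re-estimate the marginals,
    and clamp them to [1/n, 1 - 1/n]."""
    rng = rng or random.Random(0)
    p = [0.5] * n                                    # marginal probabilities
    for g in range(1, max_gens + 1):
        pop = [[int(rng.random() < p[i]) for i in range(n)]
               for _ in range(lam)]
        pop.sort(key=sum, reverse=True)              # OneMax fitness = number of ones
        if sum(pop[0]) == n:
            return g                                 # optimum sampled
        best = pop[:mu]                              # truncation selection
        for i in range(n):
            freq = sum(x[i] for x in best) / mu      # marginal of the selected set
            p[i] = min(1 - 1 / n, max(1 / n, freq))  # borders from the abstract
    return max_gens
```

Omitting the clamping line lets marginals lock at 0 or 1, after which a wrongly fixed bit can never be sampled again; this is why the borders matter for the O(μn) regime with small μ.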