Optimal Parameter Choices Through Self-Adjustment: Applying the 1/5-th Rule in Discrete Settings
While evolutionary algorithms are known to be very successful for a broad
range of applications, the algorithm designer is often left with many
algorithmic choices, for example, the size of the population, the mutation
rates, and the crossover rates of the algorithm. These parameters are known to
have a crucial influence on the optimization time, and thus need to be chosen
carefully, a task that often requires substantial efforts. Moreover, the
optimal parameters can change during the optimization process. It is therefore
of great interest to design mechanisms that dynamically choose best-possible
parameters. An example for such an update mechanism is the one-fifth success
rule for step-size adaption in evolutionary strategies. While in continuous
domains this principle is well understood also from a mathematical point of
view, no comparable theory is available for problems in discrete domains.
In this work we show that the one-fifth success rule can be effective also in
discrete settings. We regard the $(1+(\lambda,\lambda))$~GA proposed in
[Doerr/Doerr/Ebel: From black-box complexity to designing new genetic
algorithms, TCS 2015]. We prove that if its population size is chosen according
to the one-fifth success rule then the expected optimization time on
\textsc{OneMax} is linear. This is better than what \emph{any} static
population size can achieve and is asymptotically optimal also among
all adaptive parameter choices.
Comment: This is the full version of a paper that is to appear at GECCO 201
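As a toy illustration of the mechanism described above, the following sketch applies a one-fifth success rule to the offspring population size of a plain $(1+\lambda)$ EA on OneMax. This is not the $(1+(\lambda,\lambda))$~GA analyzed in the paper; the update factor `F` and the bit-flip probability are illustrative choices.

```python
import random

def onemax(x):
    """Number of one-bits; the classic benchmark from the abstract."""
    return sum(x)

def one_plus_lambda_ea_fifth_rule(n, F=1.5, seed=0):
    """Simplified (1+lambda) EA on OneMax whose offspring population size
    follows a one-fifth success rule: shrink lambda after an improving
    generation, grow it slowly otherwise (update factors are illustrative)."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    lam = 1.0
    evals = 0
    while onemax(parent) < n:
        best = None
        for _ in range(max(1, round(lam))):
            # standard bit mutation: flip each bit with probability 1/n
            child = [b ^ (rng.random() < 1.0 / n) for b in parent]
            evals += 1
            if best is None or onemax(child) > onemax(best):
                best = child
        if onemax(best) > onemax(parent):   # success: exploit, shrink lambda
            parent, lam = best, max(1.0, lam / F)
        else:                               # failure: invest more offspring
            lam = min(float(n), lam * F ** 0.25)
    return evals, parent
```

The asymmetric update (shrink by $F$ on success, grow by $F^{1/4}$ on failure) is what makes roughly one success in five generations the equilibrium point.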
Runtime Analysis for Self-adaptive Mutation Rates
We propose and analyze a self-adaptive version of the $(1,\lambda)$
evolutionary algorithm in which the current mutation rate is part of the
individual and thus also subject to mutation. A rigorous runtime analysis on
the OneMax benchmark function reveals that a simple local mutation scheme for
the rate leads to an expected optimization time (number of fitness evaluations)
of $O(n\lambda/\log\lambda + n\log n)$ when $\lambda$ is at least $C \ln n$ for
some constant $C > 0$. For all values of $\lambda \ge C \ln n$, this
performance is asymptotically best possible among all $\lambda$-parallel
mutation-based unbiased black-box algorithms.
Our result shows that self-adaptation in evolutionary computation can find
complex optimal parameter settings on the fly. At the same time, it proves that
a relatively complicated self-adjusting scheme for the mutation rate proposed
by Doerr, Gie{\ss}en, Witt, and Yang~(GECCO~2017) can be replaced by our simple
endogenous scheme.
On the technical side, the paper contributes new tools for the analysis of
two-dimensional drift processes arising in the analysis of dynamic parameter
choices in EAs, including bounds on occupation probabilities in processes with
non-constant drift.
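The endogenous scheme described above, in which the mutation rate travels with the individual and is itself mutated, can be sketched roughly as follows. This is a simplified illustration, not the analyzed algorithm; the halving/doubling step and the rate bounds are assumptions.

```python
import random

def onemax(x):
    return sum(x)

def self_adaptive_one_comma_lambda(n, lam=8, seed=0):
    """Sketch of a (1,lambda) EA whose mutation rate is part of the
    individual: every offspring first halves or doubles the parent's rate
    (random choice), then mutates with its own new rate; the best offspring,
    together with its rate, becomes the next parent."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    rate = 2.0 / n                                # per-bit mutation probability
    evals = 0
    while onemax(x) < n:
        offspring = []
        for _ in range(lam):
            r = rate / 2 if rng.random() < 0.5 else rate * 2
            r = min(0.5, max(1.0 / (n * n), r))   # illustrative rate bounds
            y = [b ^ (rng.random() < r) for b in x]
            evals += 1
            offspring.append((onemax(y), r, y))
        _, rate, x = max(offspring, key=lambda t: t[0])  # comma selection
    return evals, x
```

Because the best offspring carries its own rate into the next generation, good rates are selected for indirectly, with no external control schedule.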
Simple Max-Min Ant Systems and the Optimization of Linear Pseudo-Boolean Functions
With this paper, we contribute to the understanding of ant colony
optimization (ACO) algorithms by formally analyzing their runtime behavior. We
study simple MAX-MIN ant systems on the class of linear pseudo-Boolean
functions defined on binary strings of length $n$. Our investigations point out
how progress according to function values is stored in the pheromones. We
provide a general upper bound of $O((n^3 \log n)/\rho)$ for two ACO variants on
all linear functions, where $\rho$ determines the pheromone update strength.
Furthermore, we show improved bounds for two well-known linear pseudo-Boolean
functions called OneMax and BinVal and give additional insights using an
experimental study.
Comment: 19 pages, 2 figures
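A minimal sketch of the kind of MAX-MIN ant system studied above, on OneMax as the simplest linear function. The pheromone bounds $[1/n, 1-1/n]$ follow the usual MMAS convention; the best-so-far reinforcement is an illustrative choice.

```python
import random

def mmas_onemax(n, rho=0.25, seed=0):
    """Sketch of a simple MAX-MIN ant system for a linear pseudo-Boolean
    function (OneMax here): one pheromone value per bit, reinforced toward
    the best-so-far solution with strength rho and clamped to [1/n, 1-1/n]."""
    rng = random.Random(seed)
    tau = [0.5] * n                      # pheromones start unbiased
    best, best_f = None, -1
    iters = 0
    while best_f < n:
        iters += 1
        # construct a solution: bit i is set with probability tau[i]
        x = [1 if rng.random() < t else 0 for t in tau]
        if sum(x) > best_f:
            best, best_f = x, sum(x)
        # evaporate and deposit toward the best-so-far solution
        tau = [min(1 - 1 / n, max(1 / n, (1 - rho) * t + rho * b))
               for t, b in zip(tau, best)]
    return iters, best
```

The update `(1 - rho) * t + rho * b` is exactly how "progress according to function values is stored in pheromone": larger $\rho$ commits faster to the best-so-far solution, which is why $\rho$ appears in the denominator of the runtime bound above.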
Analysis of Ant Colony Optimization and Population-Based Evolutionary Algorithms on Dynamic Problems
The Right Mutation Strength for Multi-Valued Decision Variables
The most common representation in evolutionary computation is the bit string.
This is ideal for modeling binary decision variables, but less useful for variables
taking more values. With very little theoretical work existing on how to use
evolutionary algorithms for such optimization problems, we study the run time
of simple evolutionary algorithms on some OneMax-like functions defined over
$\{0, 1, \ldots, r-1\}^n$. More precisely, we regard a variety of
problem classes requesting the component-wise minimization of the distance to
an unknown target vector $z \in \{0, 1, \ldots, r-1\}^n$. For such problems we see a crucial
difference in how we extend the standard-bit mutation operator to these
multi-valued domains. While it is natural to select each position of the
solution vector to be changed independently with probability $1/n$, there are
various ways to then change such a position. If we change each selected
position to a random value different from the original one, we obtain an
expected run time of $\Theta(nr \log n)$. If we change each selected position
by either $+1$ or $-1$ (random choice), the optimization time reduces to
$\Theta(n(r + \log n))$. If we use a random mutation strength $i \in \{1, \ldots, r-1\}$
with probability inversely proportional to $i$ and change
the selected position by either $+i$ or $-i$ (random choice), then the
optimization time becomes $\Theta(n \log(r)(\log(n) + \log(r)))$, bringing down
the dependence on $r$ from linear to polylogarithmic. One of our results
depends on a new variant of the lower bounding multiplicative drift theorem.
Comment: an extended abstract of this work is to appear at GECCO 201
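The three mutation operators compared above can be sketched as follows. This is an illustrative implementation; the clipping at the domain boundary is an assumption, since the abstract does not specify how out-of-range steps are handled.

```python
import random

def mutate(x, r, mode, rng):
    """Three ways to extend standard bit mutation to {0, ..., r-1}^n, matching
    the operators compared in the abstract. Each position is selected
    independently with probability 1/n; steps leaving the domain are clipped
    (an assumption)."""
    n = len(x)
    y = list(x)
    for i in range(n):
        if rng.random() < 1.0 / n:
            if mode == "uniform":        # a random value different from x[i]
                y[i] = rng.choice([v for v in range(r) if v != x[i]])
            elif mode == "pm1":          # +1 or -1, random choice
                y[i] = min(r - 1, max(0, x[i] + rng.choice((-1, 1))))
            elif mode == "harmonic":     # step size j with probability ~ 1/j
                j = rng.choices(range(1, r),
                                weights=[1.0 / s for s in range(1, r)])[0]
                y[i] = min(r - 1, max(0, x[i] + rng.choice((-1, 1)) * j))
    return y
```

The harmonic distribution over step sizes is what allows both large jumps early (when far from the target) and fine steps late, yielding the polylogarithmic dependence on $r$.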
Offspring Population Size Matters when Comparing Evolutionary Algorithms with Self-Adjusting Mutation Rates
We analyze the performance of the 2-rate $(1+\lambda)$ Evolutionary Algorithm
(EA) with self-adjusting mutation rate control, its 3-rate counterpart, and a
$(1+\lambda)$~EA variant using multiplicative update rules on the OneMax
problem. We compare their efficiency for a broad range of offspring population
sizes $\lambda$ and problem sizes $n$.
Our empirical results show that the ranking of the algorithms is very
consistent across all tested dimensions, but strongly depends on the population
size. While for small values of $\lambda$ the 2-rate EA performs best, the
multiplicative updates become superior starting from some threshold value of
$\lambda$ between 50 and 100. Interestingly, for population sizes around 50,
the $(1+\lambda)$~EA with static mutation rate performs on par with the best
of the self-adjusting algorithms.
We also consider how the lower bound $p_{\min}$ for the mutation rate
influences the efficiency of the algorithms. We observe that for the 2-rate EA
and the EA with multiplicative update rules the more generous bound
$p_{\min} = 1/n^2$ gives better results than $p_{\min} = 1/n$ when $\lambda$ is
small. For both algorithms the situation reverses for large~$\lambda$.
Comment: To appear at Genetic and Evolutionary Computation Conference
(GECCO'19). v2: minor language revision
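A rough sketch of the 2-rate scheme discussed above, on OneMax. This is illustrative only: the rule for choosing the next rate follows common descriptions of the 2-rate $(1+\lambda)$ EA, but the clamping interval and the default $p_{\min} = 1/n^2$ are assumptions made for the sketch.

```python
import random

def two_rate_ea(n, lam=10, p_min=None, seed=0):
    """Sketch of the 2-rate (1+lambda) EA on OneMax: half of the offspring
    mutate with rate r/2, the other half with 2r; with probability 1/2 the
    next rate is the rate of this generation's best offspring, otherwise the
    rate is halved or doubled at random. Rates are clamped to [p_min, 1/4]."""
    rng = random.Random(seed)
    if p_min is None:
        p_min = 1.0 / (n * n)            # the "generous" lower bound
    x = [rng.randint(0, 1) for _ in range(n)]
    r = 2.0 / n
    evals = 0
    while sum(x) < n:
        best, best_f, best_r = None, -1, r
        for j in range(lam):
            rj = r / 2 if j < lam // 2 else 2 * r
            y = [b ^ (rng.random() < rj) for b in x]
            evals += 1
            if sum(y) > best_f:
                best, best_f, best_r = y, sum(y), rj
        if best_f >= sum(x):             # plus selection: never accept worse
            x = best
        r = best_r if rng.random() < 0.5 else r * rng.choice((0.5, 2.0))
        r = min(0.25, max(p_min, r))
    return evals, x
```

The choice of $p_{\min}$ matters because with $\lambda$ offspring per generation, a rate that has drifted too low wastes the whole generation, while a generous lower bound lets the random halving/doubling recover faster.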
OneMax in Black-Box Models with Several Restrictions
Black-box complexity studies lower bounds for the efficiency of
general-purpose black-box optimization algorithms such as evolutionary
algorithms and other search heuristics. Different models exist, each one being
designed to analyze a different aspect of typical heuristics such as the memory
size or the variation operators in use. While most of the previous works focus
on one particular such aspect, we consider in this work how the combination of
several algorithmic restrictions influence the black-box complexity. Our
testbed are so-called OneMax functions, a classical set of test functions that
is intimately related to classic coin-weighing problems and to the board game
Mastermind.
We analyze in particular the combined memory-restricted ranking-based
black-box complexity of OneMax for different memory sizes. While its isolated
memory-restricted as well as its ranking-based black-box complexity for bit
strings of length $n$ is only of order $n/\log n$, the combined model does not
allow for algorithms being faster than linear in $n$, as can be seen by
standard information-theoretic considerations. We show that this linear bound
is indeed asymptotically tight. Similar results are obtained for other memory-
and offspring-sizes. Our results also apply to the (Monte Carlo) complexity of
OneMax in the recently introduced elitist model, in which only the best-so-far
solution can be kept in the memory. Finally, we also provide improved lower
bounds for the complexity of OneMax in the regarded models.
Our result enlivens the quest for natural evolutionary algorithms optimizing
OneMax in $o(n \log n)$ iterations.
Comment: This is the full version of a paper accepted to GECCO 201
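The "standard information-theoretic considerations" behind the linear lower bound can be sketched as follows (informal; a constant memory size $\mu$ and constant offspring size $\lambda$ are assumed):

```latex
% One ranking-based query on a memory of \mu solutions with \lambda offspring
% reveals at most the relative ranking, i.e. at most
\log_2 \binom{\mu + \lambda}{\lambda} = O(1) \text{ bits for constant } \mu, \lambda,
% while identifying the hidden target z \in \{0,1\}^n of a OneMax instance
% requires distinguishing 2^n candidates, i.e.
\log_2 2^n = n \text{ bits},
% so any algorithm in the combined model needs \Omega(n) queries.
```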