Optimal Parameter Choices Through Self-Adjustment: Applying the 1/5-th Rule in Discrete Settings
While evolutionary algorithms are known to be very successful for a broad
range of applications, the algorithm designer is often left with many
algorithmic choices, for example, the size of the population, the mutation
rates, and the crossover rates of the algorithm. These parameters are known to
have a crucial influence on the optimization time, and thus need to be chosen
carefully, a task that often requires substantial effort. Moreover, the
optimal parameters can change during the optimization process. It is therefore
of great interest to design mechanisms that dynamically choose best-possible
parameters. An example of such an update mechanism is the one-fifth success
rule for step-size adaptation in evolution strategies. While in continuous
domains this principle is well understood also from a mathematical point of
view, no comparable theory is available for problems in discrete domains.
In this work we show that the one-fifth success rule can be effective also in
discrete settings. We consider the $(1+(\lambda,\lambda))$~GA proposed in
[Doerr/Doerr/Ebel: From black-box complexity to designing new genetic
algorithms, TCS 2015]. We prove that if its population size is chosen according
to the one-fifth success rule then the expected optimization time on
\textsc{OneMax} is linear. This is better than what \emph{any} static
population size can achieve and is asymptotically optimal also among
all adaptive parameter choices.
Comment: This is the full version of a paper that is to appear at GECCO 2015.
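The self-adjustment mechanism described above can be sketched in a few lines. The sketch below is a simplified stand-in, assuming a (1+lambda) EA with standard bit mutation on OneMax rather than the actual (1+(lambda,lambda)) GA; the function names and the update factor F = 1.5 are illustrative assumptions, not taken from the paper.

```python
import random

def mutate(x, rng):
    """Standard bit mutation: flip each bit independently with prob 1/n."""
    n = len(x)
    return [b ^ (rng.random() < 1.0 / n) for b in x]

def one_fifth_update(lam, success, n, F=1.5):
    """One-fifth success rule in a discrete setting: shrink the population
    size after a success, grow it slowly after a failure, so that successes
    occur in roughly one fifth of the generations. Clamped to [1, n]."""
    lam = lam / F if success else lam * F ** 0.25
    return min(max(lam, 1.0), float(n))

def self_adjusting_opl_ea(n=64, seed=1):
    """(1+lambda) EA on OneMax with lambda controlled by the one-fifth rule,
    a simplified stand-in for the self-adjusting (1+(lambda,lambda)) GA.
    Returns the number of fitness evaluations until the optimum is found."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    lam, evals = 1.0, 0
    while sum(x) < n:
        offspring = [mutate(x, rng) for _ in range(int(round(lam)))]
        evals += len(offspring)
        best = max(offspring, key=sum)
        success = sum(best) > sum(x)
        if sum(best) >= sum(x):
            x = best
        lam = one_fifth_update(lam, success, n)
    return evals
```

Because failures increase lambda by the factor F^{1/4} while successes decrease it by F, lambda is stationary exactly when about one in five generations is successful.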
Improved Approximation Algorithms for the Min-Max Selecting Items Problem
We give a simple deterministic approximation
algorithm for the Min-Max Selecting Items problem. While our main goal is
simplicity, this result also improves over the previous best approximation
ratio due to Kasperski, Kurpisz, and Zieli\'nski (Information Processing
Letters, 2013). Despite using the
method of pessimistic estimators, the algorithm has a polynomial runtime also
in the RAM model of computation. We also show that the LP formulation for this
problem by Kasperski and Zieli\'nski (Annals of Operations Research (2009)),
which is the basis for the previous work and ours, has a nontrivial
integrality gap.
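For concreteness, one standard way to formalize the objective is as a cost matrix with one row per scenario. The sketch below (the representation and function names are my own assumptions, and the brute-force baseline is exponential, for tiny instances only) evaluates the min-max value of a selection and enumerates all selections:

```python
from itertools import combinations

def minmax_value(costs, selection):
    """Worst-case (over scenarios) total cost of a selection of items.
    costs[s][i] = cost of item i under scenario s."""
    return max(sum(row[i] for i in selection) for row in costs)

def brute_force(costs, p):
    """Exact baseline by enumerating all p-element selections."""
    n = len(costs[0])
    return min(
        (minmax_value(costs, sel), sel) for sel in combinations(range(n), p)
    )
```

On the instance `[[1, 2, 3], [3, 2, 1]]` with p = 2, items 0 and 2 cost 4 in both scenarios, beating the other two selections, whose worst scenario costs 5.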
Unbiased Black-Box Complexities of Jump Functions
We analyze the unbiased black-box complexity of jump functions with small,
medium, and large sizes of the fitness plateau surrounding the optimal
solution.
Among other results, we show that when the jump size is large, that is, when
only a small constant fraction of the fitness values is visible, the unbiased
black-box complexities for higher arities are of the same order as those for
the simple \textsc{OneMax} function. Even for the extreme jump function, in
which all but the two fitness values $n/2$ and $n$ are blanked out,
polynomial-time mutation-based (i.e., unary unbiased) black-box optimization
algorithms exist. This is quite surprising given that for the extreme jump
function almost the whole search space (all but a vanishing fraction) is a
plateau of constant fitness.
To prove these results, we introduce new tools for the analysis of unbiased
black-box complexities, for example, selecting the new parent individual not
only by comparing the fitnesses of the competing search points, but also by
taking into account the (empirical) expected fitnesses of their offspring.
Comment: This paper is based on results presented in the conference versions
[GECCO 2011] and [GECCO 2014].
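For reference, the jump benchmark can be sketched using the classic formalization of Droste, Jansen, and Wegener, assuming bit strings represented as 0/1 lists; the exact parameterization used in the abstract may differ.

```python
def jump(x, m):
    """Jump function with jump size m (classic formalization following
    Droste, Jansen, and Wegener): the fitness guides search toward strings
    with n - m ones, then a gap of low-fitness strings surrounds the
    all-ones optimum."""
    n, ones = len(x), sum(x)
    if ones <= n - m or ones == n:
        return m + ones
    return n - ones
```

For n = 6 and m = 2, the all-ones string scores 8, a string with four ones scores 6, but a string with five ones falls into the gap and scores only 1.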
Simple and Optimal Randomized Fault-Tolerant Rumor Spreading
We revisit the classic problem of spreading a piece of information in a group
of fully connected processors. By suitably adding a small dose of
randomness to the protocol of Gasieniec and Pelc (1996), we derive for the first
time protocols that (i) use a linear number of messages, (ii) are correct even
when an arbitrary number of adversarially chosen processors do not
participate in the process, and (iii) with high probability have the
asymptotically optimal logarithmic runtime when at least an arbitrarily
small constant fraction of the processors are working. In addition, our
protocols require neither that the system is synchronized nor that all
processors are woken up simultaneously at time zero; they are fully based on
push-operations, and they do not need an a priori estimate on the number of
failed nodes.
Our protocols thus overcome the typical disadvantages of the two known
approaches, algorithms based on random gossip (typically needing a large number
of messages due to their unorganized nature) and algorithms based on fair
workload splitting (which are either not {time-efficient} or require intricate
preprocessing steps plus synchronization).
Comment: This is the author-generated version of a paper which is to appear in
Distributed Computing, Springer, DOI: 10.1007/s00446-014-0238-z. It is
available online from
http://link.springer.com/article/10.1007/s00446-014-0238-z. This version
contains some new results (Section 6).
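A minimal round-based simulation illustrates the push-based setting. This is classic uniform push gossip under a synchronous model, not the message-optimal protocol of the paper; the function name and parameters are illustrative assumptions.

```python
import random

def push_rumor(n, failed, seed=0):
    """Synchronous randomized push: in each round, every informed working
    node calls one uniformly random node; calls to failed nodes are lost.
    Returns the number of rounds until all working nodes are informed."""
    rng = random.Random(seed)
    working = [i for i in range(n) if i not in failed]
    informed = {working[0]}            # one working node starts with the rumor
    rounds = 0
    while len(informed) < len(working):
        rounds += 1
        for u in list(informed):
            v = rng.randrange(n)       # may hit a failed node: message wasted
            if v not in failed:
                informed.add(v)
    return rounds
```

With, say, 50 nodes of which 10 are failed, the rumor still reaches all 40 working nodes, at the price of wasted calls to failed nodes; this is the "unorganized nature" of gossip that the abstract contrasts with workload-splitting approaches.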
Runtime Analysis of the $(1+(\lambda,\lambda))$ Genetic Algorithm on Random Satisfiable 3-CNF Formulas
The $(1+(\lambda,\lambda))$ genetic algorithm, first proposed at GECCO 2013,
showed a surprisingly good performance on some optimization problems. The
theoretical analysis so far was restricted to the OneMax test function, where
this GA profited from the perfect fitness-distance correlation. In this work,
we conduct a rigorous runtime analysis of this GA on random 3-SAT instances in
the planted solution model having at least logarithmic average degree, which
are known to have a weaker fitness distance correlation.
We prove that this GA with a fixed, not too large population size again
obtains runtimes better than $\Theta(n \log n)$, which is a lower bound for
most evolutionary algorithms on pseudo-Boolean problems with a unique optimum.
However, the self-adjusting version of the GA risks reaching population sizes
at which the intermediate selection of the GA, due to the weaker
fitness-distance correlation, is not able to distinguish a profitable offspring
from others. We show that this problem can be overcome by equipping the
self-adjusting GA with an upper limit for the population size. Apart from
sparse instances, this limit can be chosen such that the asymptotic
performance does not worsen compared to the idealistic OneMax case. Overall,
this work shows that the $(1+(\lambda,\lambda))$ GA can provably perform well
on combinatorial search and optimization problems also in the
presence of a weaker fitness-distance correlation.
Comment: An extended abstract of this report will appear in the proceedings of
the 2017 Genetic and Evolutionary Computation Conference (GECCO 2017).
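The planted solution model can be sketched as follows, assuming the standard formulation in which clauses are drawn uniformly among those satisfied by a hidden assignment; the abstract's restriction to at least logarithmic average degree is not enforced here, and the names are illustrative.

```python
import random

def planted_3sat(n, m, seed=0):
    """Random satisfiable 3-CNF in the planted-solution model: fix a hidden
    assignment, then sample m clauses uniformly among the 3-clauses that it
    satisfies. A literal is a pair (var, negated?)."""
    rng = random.Random(seed)
    planted = [rng.randint(0, 1) for _ in range(n)]
    clauses = []
    while len(clauses) < m:
        vars3 = rng.sample(range(n), 3)
        lits = [(v, rng.randint(0, 1)) for v in vars3]
        if any(planted[v] != neg for v, neg in lits):  # satisfied by planted
            clauses.append(lits)
    return planted, clauses

def satisfies(assign, clauses):
    """True iff every clause contains at least one true literal."""
    return all(any(assign[v] != neg for v, neg in cl) for cl in clauses)
```

By construction the planted assignment satisfies every sampled clause, so the instance is guaranteed satisfiable; the number of satisfied clauses then serves as the fitness function the GA optimizes.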