Simple Max-Min Ant Systems and the Optimization of Linear Pseudo-Boolean Functions
With this paper, we contribute to the understanding of ant colony
optimization (ACO) algorithms by formally analyzing their runtime behavior. We
study simple MAX-MIN ant systems on the class of linear pseudo-Boolean
functions defined on binary strings of length n. Our investigations point out
how the progress according to function values is stored in the pheromone
values. We provide a general upper bound of O((n^3 log n)/ρ) for two ACO
variants on all linear functions, where ρ determines the pheromone update
strength. Furthermore, we show improved bounds for two well-known linear
pseudo-Boolean functions called OneMax and BinVal and give additional insights
using an experimental study.
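To make the algorithmic setting more concrete, the sketch below shows, under common textbook conventions rather than the exact variants analysed in the paper, how a simple MAX-MIN ant system on bit strings can be organised: each bit is sampled according to its pheromone value, the best-so-far solution is reinforced with strength ρ, and pheromones are clamped to the usual borders 1/n and 1−1/n. All function names and parameter defaults here are illustrative assumptions.

```python
import random

def construct(tau):
    """Sample a bit string: bit i is set to 1 with probability tau[i]."""
    return [1 if random.random() < t else 0 for t in tau]

def update(tau, best, rho):
    """MMAS-style update: move each pheromone value towards the best-so-far
    solution with strength rho, clamped to the borders [1/n, 1 - 1/n]."""
    n = len(tau)
    lo, hi = 1.0 / n, 1.0 - 1.0 / n
    return [min(hi, max(lo, (1.0 - rho) * t + rho * b)) for t, b in zip(tau, best)]

def mmas(f, n=20, rho=0.1, budget=5000):
    """Bare-bones best-so-far MMAS maximising a pseudo-Boolean function f."""
    tau = [0.5] * n
    best = construct(tau)
    for _ in range(budget):
        x = construct(tau)
        if f(x) >= f(best):          # keep the best-so-far solution, accepting ties
            best = x
        tau = update(tau, best, rho)
    return best

if __name__ == "__main__":
    onemax = sum                     # OneMax: the number of 1-bits
    print("ones found:", sum(mmas(onemax)))
```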
Improved Fixed-Budget Results via Drift Analysis
Fixed-budget theory is concerned with computing or bounding the fitness value
achievable by randomized search heuristics within a given budget of fitness
function evaluations. Despite recent progress in fixed-budget theory, there is
a lack of general tools to derive such results. We transfer drift theory, the
key tool to derive expected optimization times, to the fixed-budget
perspective. A first and easy-to-use statement concerned with iterating drift
in so-called greed-admitting scenarios immediately translates into bounds on
the expected function value. Afterwards, we consider a more general tool based
on the well-known variable drift theorem. Applications of this technique to the
LeadingOnes benchmark function yield statements that are more precise than the
previous state of the art.
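To illustrate the fixed-budget perspective, the following small simulation sketch, assuming a standard (1+1) EA with mutation rate 1/n on LeadingOnes (parameter choices are purely illustrative), estimates the expected fitness reached within a fixed budget of fitness evaluations; this is the kind of quantity that the drift-based statements described above bound analytically.

```python
import random

def leading_ones(x):
    """Number of consecutive 1-bits at the start of the bit string."""
    k = 0
    for bit in x:
        if bit == 0:
            break
        k += 1
    return k

def one_plus_one_ea(n, budget):
    """Standard (1+1) EA with mutation rate 1/n; returns the fitness reached
    after the given budget of fitness evaluations."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = leading_ones(x)
    for _ in range(budget):
        y = [b ^ 1 if random.random() < 1.0 / n else b for b in x]
        fy = leading_ones(y)
        if fy >= fx:
            x, fx = y, fy
    return fx

if __name__ == "__main__":
    n, budget, runs = 100, 2000, 200
    avg = sum(one_plus_one_ea(n, budget) for _ in range(runs)) / runs
    print(f"average LeadingOnes value after {budget} evaluations: {avg:.1f}")
```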
Emergence of Diversity and its Benefits for Crossover in Genetic Algorithms
Population diversity is essential for avoiding premature convergence
in Genetic Algorithms (GAs) and for the effective use of
crossover. Yet the dynamics of how diversity emerges in populations are
not well understood. We use rigorous runtime analysis to gain insight into
population dynamics and GA performance for a standard (µ+1) GA and the
Jump_k test function. By studying the stochastic process underlying
the size of the largest collection of identical genotypes we show that the
interplay of crossover followed by mutation may serve as a catalyst leading
to a sudden burst of diversity. This leads to improvements of the
expected optimisation time of order Ω(n/log n) compared to mutation-only
algorithms like the (1+1) EA.
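For reference, a common textbook form of the Jump_k benchmark mentioned above is sketched below (the paper may use a slightly different normalisation): strings are rewarded for their number of 1-bits, except on a "gap" of strings with more than n−k but fewer than n ones, which mutation alone has to jump across to reach the all-ones optimum.

```python
def jump_k(x, k):
    """Jump_k in its common textbook form: fitness increases with the number
    of 1-bits, except on the gap just below the all-ones optimum."""
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones                  # the gap: more than n-k but fewer than n ones
```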
Escaping Local Optima Using Crossover with Emergent Diversity
Population diversity is essential for avoiding premature
convergence in Genetic Algorithms and for the effective
use of crossover. Yet the dynamics of how diversity emerges in
populations are not well understood. We use rigorous run time
analysis to gain insight into population dynamics and Genetic
Algorithm performance for the (μ+1) Genetic Algorithm and the
Jump test function. We show that the interplay of crossover
followed by mutation may serve as a catalyst leading to a
sudden burst of diversity. This leads to significant improvements
of the expected optimisation time compared to mutation-only
algorithms like the (1+1) Evolutionary Algorithm. Moreover,
increasing the mutation rate by an arbitrarily small constant
factor can facilitate the generation of diversity, leading to even
larger speedups. Experiments were conducted to complement our
theoretical findings and further highlight the benefits of crossover
on the function class.
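As an illustration of the algorithm family discussed in the last two abstracts, the sketch below shows one step of a (µ+1) GA under common textbook conventions: two parents are chosen uniformly at random, uniform crossover is applied with some probability pc, the offspring is mutated bitwise with rate chi/n, and it replaces a worst individual if it is no worse. The crossover probability, tie-breaking rule, and the names pc and chi are assumptions made for this sketch; choosing chi slightly larger than 1 corresponds to the small constant-factor increase of the mutation rate mentioned above.

```python
import random

def mu_plus_one_ga_step(pop, fitness, pc=0.5, chi=1.0):
    """One step of a (mu+1) GA sketch: parent selection, uniform crossover
    with probability pc, bitwise mutation with rate chi/n, and replacement
    of a worst individual if the offspring is no worse."""
    n = len(pop[0])
    p1, p2 = random.choice(pop), random.choice(pop)
    if random.random() < pc:
        child = [random.choice(pair) for pair in zip(p1, p2)]   # uniform crossover
    else:
        child = list(p1)
    child = [b ^ 1 if random.random() < chi / n else b for b in child]
    worst = min(range(len(pop)), key=lambda i: fitness(pop[i]))
    if fitness(child) >= fitness(pop[worst]):
        pop[worst] = child
    return pop
```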
Analysis of the (1 + 1) EA on subclasses of linear functions under uniform and linear constraints
Linear functions have gained great attention in the runtime analysis of evolutionary computation methods. The corresponding investigations have provided many effective tools for analyzing more complex problems. So far, the runtime analysis of evolutionary algorithms has mainly focused on unconstrained problems, but problems occurring in applications frequently involve constraints. There is therefore a strong need to extend the existing analyses, and the methods developed for unconstrained problems, to settings involving constraints. In this paper, we consider the behavior of the classical (1 + 1) Evolutionary Algorithm on linear functions under a linear constraint. We show tight bounds in the case where the constraint is given by the OneMax function and the objective function is given by either the OneMax or the BinVal function. For the general case we present upper and lower bounds.
Tobias Friedrich, Timo Kötzing, J.A. Gregor Lagodzinski, Frank Neumann, Martin Schirneck
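For readers unfamiliar with the benchmark functions named in this abstract, the sketch below gives the standard definitions of OneMax, BinVal, a general linear pseudo-Boolean function, and a uniform (cardinality) constraint; the bit ordering used in BinVal and the bound name B are conventions chosen for illustration only.

```python
def one_max(x):
    """OneMax: the number of 1-bits in the string."""
    return sum(x)

def bin_val(x):
    """BinVal: the string read as a binary number, with x[0] most significant."""
    return sum(b << (len(x) - 1 - i) for i, b in enumerate(x))

def linear(x, w):
    """A general linear pseudo-Boolean function with weight vector w."""
    return sum(wi * b for wi, b in zip(w, x))

def feasible_uniform(x, B):
    """Uniform constraint: a solution is feasible if it has at most B 1-bits."""
    return sum(x) <= B
```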
Feature-based diversity optimization for problem instance classification
Parallel Problem Solving from Nature – PPSN XIV
Understanding the behaviour of heuristic search methods is a challenge. This even holds for simple local search methods such as 2-OPT for the Traveling Salesperson Problem. In this paper, we present a general framework that is able to construct a diverse set of instances that are hard or easy for a given search heuristic. Such a diverse set is obtained by using an evolutionary algorithm to construct hard or easy instances that are diverse with respect to different features of the underlying problem. Examining the constructed instance sets, we show that many combinations of two or three features give a good classification of the TSP instances in terms of whether they are hard for 2-OPT to solve.
Wanru Gao, Samadhi Nallaperuma, and Frank Neumann
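The 2-OPT local search referred to above can be sketched as follows (a plain textbook implementation on a distance matrix, not the particular code analysed in the paper): a 2-OPT move removes two edges of the tour and reconnects it by reversing the segment between them, and the search repeats improving moves until no such move remains.

```python
def tour_length(tour, dist):
    """Length of the closed tour under the distance matrix dist."""
    return sum(dist[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))

def two_opt_move(tour, i, j):
    """Apply a 2-OPT move: reverse the segment tour[i..j], swapping two edges."""
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def two_opt(tour, dist):
    """Repeat improving 2-OPT moves until a local optimum is reached."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                candidate = two_opt_move(tour, i, j)
                if tour_length(candidate, dist) < tour_length(tour, dist):
                    tour, improved = candidate, True
    return tour

if __name__ == "__main__":
    dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 8], [10, 4, 8, 0]]
    print(two_opt([0, 2, 1, 3], dist))
```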
On the Analysis of Simple Genetic Programming for Evolving Boolean Functions
This work presents a first step towards a systematic time and space complexity analysis of genetic programming (GP) for evolving functions with desired input/output behaviour. Two simple GP algorithms, called (1+1) GP and (1+1) GP*, equipped with minimal function (F) and terminal (L) sets are considered for evolving two standard classes of Boolean functions. It is rigorously proved that both algorithms are efficient for the easy problem of evolving conjunctions of Boolean variables with the minimal sets. However, if an extra function (i.e., NOT) is added to F, then the algorithms require at least exponential time to evolve the conjunction of n variables. On the other hand, it is proved that both algorithms fail to evolve the difficult parity function in polynomial time with probability at least exponentially close to 1. Concerning generalisation, it is shown how the quality of the evolved conjunctions depends on the size of the training set s, while the evolved exclusive disjunctions generalise equally badly independently of s.
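As a small illustration of the setting (the names, the target, and the way the training set is sampled below are assumptions for this sketch, not taken from the paper), the fitness of a candidate program can be measured as the number of training inputs on which it disagrees with the target conjunction of all variables:

```python
import itertools
import random

def target_and(bits):
    """Target behaviour: the conjunction of all n Boolean variables."""
    return all(bits)

def errors(program, training_set):
    """Number of training inputs on which the candidate program disagrees with
    the target conjunction; (1+1) GP-style algorithms minimise such a count."""
    return sum(bool(program(x)) != target_and(x) for x in training_set)

if __name__ == "__main__":
    n = 4
    truth_table = list(itertools.product([0, 1], repeat=n))  # all 2^n inputs
    sample = random.sample(truth_table, 8)                    # a training set of size s = 8
    candidate = lambda x: x[0] and x[1]                       # a partial conjunction
    print("errors on the sample:", errors(candidate, sample))
```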
First-Hitting Times Under Additive Drift
For the last ten years, almost every theoretical result concerning the
expected run time of a randomized search heuristic has used drift theory,
making it arguably the most important tool in this domain. Its success is due
to its ease
of use and its powerful result: drift theory allows the user to derive bounds
on the expected first-hitting time of a random process by bounding expected
local changes of the process -- the drift. This is usually far easier than
bounding the expected first-hitting time directly.
Due to the widespread use of drift theory, it is of utmost importance to have
the best drift theorems possible. We improve the fundamental additive,
multiplicative, and variable drift theorems by stating them in a form as
general as possible and providing examples of why the restrictions we keep are
still necessary. Our additive drift theorem for upper bounds only requires the
process to be nonnegative, that is, we remove unnecessary restrictions like a
finite, discrete, or bounded search space. As corollaries, the same is true for
our upper bounds in the case of variable and multiplicative drift.
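For orientation, the additive drift upper bound described here can be stated in a simplified textbook form over a nonnegative process (a sketch of the standard statement, not quoted verbatim from the paper):

```latex
% Additive drift, upper bound (simplified textbook form).
% (X_t)_{t \ge 0}: a nonnegative random process adapted to its history;
% T := \min\{t \ge 0 : X_t = 0\} is the first-hitting time of 0.
\[
  \mathbb{E}\bigl[X_t - X_{t+1} \mid X_0, \dots, X_t\bigr] \ge \delta
  \ \text{ on the event } \{t < T\} \text{ for some } \delta > 0
  \quad\Longrightarrow\quad
  \mathbb{E}[T \mid X_0] \le \frac{X_0}{\delta}.
\]
```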
Unbiased Black-Box Complexities of Jump Functions