
    Escaping Local Optima Using Crossover with Emergent Diversity

    Population diversity is essential for avoiding premature convergence in Genetic Algorithms and for the effective use of crossover. Yet the dynamics of how diversity emerges in populations are not well understood. We use rigorous runtime analysis to gain insight into population dynamics and Genetic Algorithm performance for the (μ+1) Genetic Algorithm and the Jump test function. We show that the interplay of crossover followed by mutation may serve as a catalyst, leading to a sudden burst of diversity. This leads to significant improvements in the expected optimisation time compared to mutation-only algorithms like the (1+1) Evolutionary Algorithm. Moreover, increasing the mutation rate by an arbitrarily small constant factor can facilitate the generation of diversity, leading to even larger speedups. Experiments were conducted to complement our theoretical findings and further highlight the benefits of crossover on this function class.
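    As a concrete reference for the setting analysed above, the following is a minimal Python sketch of the Jump_k benchmark and one generation of a plain (μ+1) GA that applies crossover followed by mutation. Uniform crossover, a crossover probability of 0.5, the mutation rate χ/n, and the tie-breaking in survivor selection are illustrative assumptions, not the paper's exact setup.

```python
import random

def jump(x, k):
    """Textbook Jump_k: OneMax with a fitness valley of width k before the optimum."""
    n, ones = len(x), sum(x)
    if ones == n or ones <= n - k:
        return k + ones
    return n - ones  # inside the gap, fitness decreases with more ones

def mu_plus_one_ga_step(pop, k, pc=0.5, chi=1.0):
    """One generation of a plain (mu+1) GA: uniform crossover with probability pc,
    then standard bit mutation with rate chi/n, then removal of a worst individual."""
    n = len(pop[0])
    if random.random() < pc:
        p1, p2 = random.sample(pop, 2)
        child = [random.choice(pair) for pair in zip(p1, p2)]
    else:
        child = list(random.choice(pop))
    child = [b ^ (random.random() < chi / n) for b in child]
    pop.append(child)
    pop.remove(min(pop, key=lambda x: jump(x, k)))
    return pop

random.seed(1)
pop = [[random.randint(0, 1) for _ in range(40)] for _ in range(10)]
for _ in range(5000):
    pop = mu_plus_one_ga_step(pop, k=3)
print(max(jump(x, k=3) for x in pop))
```

    The abstract's observation that a slightly increased mutation rate helps corresponds to choosing chi marginally above 1 in this sketch.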

    Standard Steady State Genetic Algorithms Can Hillclimb Faster than Mutation-only Evolutionary Algorithms

    Explaining to what extent the real power of genetic algorithms lies in the ability of crossover to recombine individuals into higher-quality solutions is an important problem in evolutionary computation. In this paper we show how the interplay between mutation and crossover can make genetic algorithms hillclimb faster than their mutation-only counterparts. We devise a Markov chain framework that allows us to rigorously prove an upper bound on the runtime of standard steady-state genetic algorithms hillclimbing the ONEMAX function. The bound establishes that, for moderate population sizes, steady-state genetic algorithms are 25% faster than all evolutionary algorithms using only standard bit mutation with a static mutation rate, up to lower-order terms. The analysis also suggests that larger populations may be faster than populations of size 2. We present a lower bound for a greedy (2+1) GA that matches the upper bound for populations larger than 2, rigorously proving that 2 individuals cannot outperform larger population sizes under greedy selection and greedy crossover, up to lower-order terms. In complementary experiments the best population size is greater than 2 and the greedy genetic algorithms are faster than standard ones, further suggesting that the derived lower bound also holds for the standard steady-state (2+1) GA.
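    A minimal sketch of the kind of steady-state GA the bound concerns, on ONEMAX: each step picks two parents, applies uniform crossover, mutates with rate 1/n, and replaces a worst individual. Uniform parent selection with replacement and the tie-breaking rule are simplifying assumptions made here, not the paper's exact algorithm.

```python
import random

def steady_state_ga(n, mu=2, pc=1.0, max_evals=10**6):
    """Steady-state (mu+1) GA on OneMax; returns the number of fitness
    evaluations until the all-ones string is sampled."""
    onemax = sum
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    evals = mu
    while evals < max_evals:
        p1, p2 = random.choice(pop), random.choice(pop)
        child = ([random.choice(pair) for pair in zip(p1, p2)]
                 if random.random() < pc else list(p1))
        child = [b ^ (random.random() < 1.0 / n) for b in child]
        evals += 1
        if onemax(child) == n:
            return evals
        pop.append(child)
        pop.remove(min(pop, key=onemax))  # cut a worst individual
    return evals

print(steady_state_ga(100))
```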

    A self-organizing random immigrants genetic algorithm for dynamic optimization problems

    In this paper a genetic algorithm for dynamic optimization problems is proposed in which, in every generation, the worst individual and the individuals with indices close to it are replaced by randomly generated individuals. In the proposed genetic algorithm, the replacement of an individual can affect other individuals in a chain reaction. The new individuals are preserved in a subpopulation whose size is defined by the number of individuals created in the current chain reaction. If the fitness values are similar, as is the case with small diversity, one single replacement can affect a large number of individuals in the population. This simple approach can drive the system to self-organizing behaviour, which can be useful for controlling the diversity level of the population and hence allows the genetic algorithm to escape from local optima once the problem changes due to the dynamics. This work was supported by FAPESP (Proc. 04/04289-6).
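    The index-neighbourhood replacement at the core of the scheme can be sketched as follows; the chain-reaction bookkeeping and the subpopulation accounting described above are omitted, and the replacement radius is a hypothetical parameter chosen here only for illustration.

```python
import random

def replace_worst_neighbourhood(pop, fitness, radius=1):
    """Replace the worst individual and the individuals whose population
    indices are within `radius` of it with random immigrants (bit strings).
    This illustrates only the index-based replacement, not the full
    self-organising chain-reaction mechanism of the paper."""
    n = len(pop[0])
    worst = min(range(len(pop)), key=lambda i: fitness(pop[i]))
    for i in range(worst - radius, worst + radius + 1):
        pop[i % len(pop)] = [random.randint(0, 1) for _ in range(n)]
    return pop

pop = [[random.randint(0, 1) for _ in range(8)] for _ in range(6)]
pop = replace_worst_neighbourhood(pop, fitness=sum)
```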

    An Exponential Lower Bound for the Runtime of the cGA on Jump Functions

    In the first runtime analysis of an estimation-of-distribution algorithm (EDA) on the multi-modal jump function class, Hasenöhrl and Sutton (GECCO 2018) proved that, with a suitable parameter choice and with high probability, the runtime of the compact genetic algorithm on jump functions is at most polynomial (in the dimension) if the jump size is at most logarithmic (in the dimension), and at most exponential in the jump size if the jump size is super-logarithmic. The exponential runtime guarantee was achieved with a hypothetical population size that is also exponential in the jump size; consequently, this setting cannot lead to a better runtime. In this work, we show that any choice of the hypothetical population size leads to a runtime that, with high probability, is at least exponential in the jump size. This result might be the first non-trivial exponential lower bound for EDAs that holds for arbitrary parameter settings.
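    For reference, here is a textbook sketch of the compact genetic algorithm (cGA) whose lower bound is discussed above; K is the hypothetical population size, and the frequency borders at 1/n and 1 − 1/n are the usual restriction, assumed here rather than taken from the paper.

```python
import random

def cga(fitness, n, K, max_iters=10**5):
    """Compact GA: keep a frequency vector p, sample two solutions per step,
    and shift each frequency by 1/K towards the bits of the better one."""
    p = [0.5] * n
    for _ in range(max_iters):
        x = [random.random() < pi for pi in p]
        y = [random.random() < pi for pi in p]
        if fitness(y) > fitness(x):
            x, y = y, x  # x is now the winner
        for i in range(n):
            if x[i] != y[i]:
                p[i] += 1.0 / K if x[i] else -1.0 / K
                p[i] = min(1 - 1.0 / n, max(1.0 / n, p[i]))  # border restriction
        if all(pi >= 1 - 1.0 / n for pi in p):
            break  # model has converged to (almost) all ones
    return p

p = cga(sum, n=50, K=25)  # OneMax as the fitness function
```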

    Do sophisticated evolutionary algorithms perform better than simple ones?

    Evolutionary algorithms (EAs) come in all shapes and sizes. Theoretical investigations focus on simple, bare-bones EAs while applications often use more sophisticated EAs that perform well on the problem at hand. What is often unclear is whether a large degree of algorithm sophistication is necessary, and if so, how much performance is gained by adding complexity to an EA. We address this question by comparing the performance of a wide range of theory-driven EAs, from bare-bones algorithms like the (1+1) EA, a (2+1) GA and simple population-based algorithms to more sophisticated ones like the (1+(λ,λ)) GA and algorithms using fast (heavy-tailed) mutation operators, against sophisticated and highly effective EAs from specific applications. This includes a famous and highly cited Genetic Algorithm for the Multidimensional Knapsack Problem and the Parameterless Population Pyramid for Ising Spin Glasses and MaxSat. While for the Multidimensional Knapsack Problem the sophisticated algorithm performs best, surprisingly, for large Ising and MaxSat instances the simplest algorithm performs best. We also derive conclusions about the usefulness of populations, crossover and fast mutation operators. Empirical results are supported by statistical tests and contrasted against theoretical work in an attempt to link theoretical and empirical results on EAs.
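    One of the theory-driven components compared above, the fast (heavy-tailed) mutation operator, is easy to state: sample a mutation strength α from a power law on {1, …, n/2} and mutate each bit with probability α/n. The exponent β = 1.5 below is the value commonly used in the literature and is an assumption here, as is the choice of upper limit.

```python
import random

def fast_mutate(x, beta=1.5):
    """Heavy-tailed ('fast') mutation: draw alpha from a power-law
    distribution on {1, ..., n//2} with exponent beta, then flip each
    bit independently with probability alpha/n."""
    n = len(x)
    weights = [i ** (-beta) for i in range(1, n // 2 + 1)]
    alpha = random.choices(range(1, n // 2 + 1), weights=weights)[0]
    return [b ^ (random.random() < alpha / n) for b in x]

offspring = fast_mutate([0] * 50)
```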

    Digital Ecosystems: Ecosystem-Oriented Architectures

    We view Digital Ecosystems as the digital counterparts of biological ecosystems. Here, we are concerned with the creation of these Digital Ecosystems, exploiting the self-organising properties of biological ecosystems to evolve high-level software applications. We therefore created the Digital Ecosystem, a novel optimisation technique inspired by biological ecosystems, in which the optimisation works at two levels: a first optimisation, the migration of agents distributed in a decentralised peer-to-peer network, operating continuously in time; this process feeds a second optimisation, based on evolutionary computing, that operates locally on single peers and aims to find solutions satisfying locally relevant constraints. The Digital Ecosystem was then measured experimentally through simulations, with measures originating from theoretical ecology, evaluating its likeness to biological ecosystems. This included its responsiveness to requests for applications from the user base, as a measure of ecological succession (ecosystem maturity). Overall, we have advanced the understanding of Digital Ecosystems, creating Ecosystem-Oriented Architectures where the word ecosystem is more than just a metaphor.

    Runtime Analysis of a Heavy-Tailed (1+(λ,λ)) Genetic Algorithm on Jump Functions

    It was recently observed that the (1+(λ,λ)) genetic algorithm can comparably easily escape the local optimum of the jump functions benchmark. Consequently, this algorithm can optimize the jump function with jump size k in an expected runtime of only n^{(k+1)/2} k^{-k/2} e^{O(k)} fitness evaluations (Antipov, Doerr, Karavaev (GECCO 2020)). To obtain this performance, however, a non-standard parameter setting depending on the jump size k was used. To overcome this difficulty, we propose to choose two parameters of the (1+(λ,λ)) genetic algorithm randomly from a power-law distribution. Via a mathematical runtime analysis, we show that this algorithm with natural instance-independent choices of the distribution parameters on all jump functions with jump size at most n/4 has a performance close to what the best instance-specific parameters in the previous work obtained. This price for instance-independence can be made as small as an O(n log(n)) factor. Given the difficulty of the jump problem and the runtime losses from using mildly suboptimal fixed parameters (also discussed in this work), this appears to be a fair price.
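    A simplified sketch of one iteration of a heavy-tailed (1+(λ,λ)) GA: here a single parameter λ is drawn from a power law and the standard coupling p = λ/n, c = 1/λ is kept, whereas the paper samples two parameters independently; the exponent β = 2.5 and the upper limit n/2 are illustrative assumptions.

```python
import random

def power_law(u, beta=2.5):
    """Sample from {1, ..., u} with Pr[i] proportional to i^(-beta)."""
    weights = [i ** (-beta) for i in range(1, u + 1)]
    return random.choices(range(1, u + 1), weights=weights)[0]

def one_plus_ll_step(x, f):
    """One iteration: mutation phase (lam offspring flipping ell bits each),
    then crossover phase (lam biased crossovers with the best mutant)."""
    n = len(x)
    lam = power_law(n // 2)
    p, c = lam / n, 1.0 / lam
    ell = sum(random.random() < p for _ in range(n))  # ell ~ Bin(n, p)
    mutants = []
    for _ in range(lam):
        y = list(x)
        for i in random.sample(range(n), ell):
            y[i] ^= 1
        mutants.append(y)
    best_mutant = max(mutants, key=f)
    offspring = [[yi if random.random() < c else xi  # take bit from mutant w.p. c
                  for xi, yi in zip(x, best_mutant)] for _ in range(lam)]
    y = max(offspring, key=f)
    return y if f(y) >= f(x) else x

x = [0] * 40
for _ in range(200):
    x = one_plus_ll_step(x, sum)  # OneMax for demonstration
```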

    When move acceptance selection hyper-heuristics outperform Metropolis and elitist evolutionary algorithms and when not

    Selection hyper-heuristics (HHs) are automated algorithm selection methodologies that choose between different heuristics during the optimisation process. Recently, selection HHs choosing between a collection of elitist randomised local search heuristics with different neighbourhood sizes have been shown to optimise standard unimodal benchmark functions from evolutionary computation in the optimal expected runtime achievable with the available low-level heuristics. In this paper, we extend our understanding of the performance of HHs to the domain of multimodal optimisation by considering a Move Acceptance HH (MAHH) from the literature that can switch between elitist and non-elitist heuristics during the run. In essence, MAHH is a non-elitist search heuristic that differs from other search heuristics in the source of its non-elitism. We first identify the range of parameters that allow MAHH to hillclimb efficiently and prove that it can optimise the standard hillclimbing benchmark function OneMax in the best expected asymptotic time achievable by unbiased mutation-based randomised search heuristics. Afterwards, we use standard multimodal benchmark functions to highlight function characteristics where MAHH outperforms elitist evolutionary algorithms and the well-known Metropolis non-elitist algorithm by quickly escaping local optima, and ones where it does not. Since MAHH is essentially a non-elitist random local search heuristic, the paper is of independent interest to researchers in the fields of artificial intelligence and randomised search heuristics.
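    The acceptance mechanism behind MAHH can be sketched in a few lines: random local search in which each move is accepted either unconditionally (ALLMOVES, chosen with probability p) or only if it improves (ONLYIMPROVING). The value p = 0.1 and the strict-improvement rule are placeholders; the paper determines the admissible parameter range.

```python
import random

def mahh(f, n, p=0.1, max_iters=10**5):
    """Move Acceptance Hyper-Heuristic sketch: one-bit-flip local search,
    choosing the ALLMOVES acceptance operator with probability p and the
    ONLYIMPROVING operator otherwise."""
    x = [random.randint(0, 1) for _ in range(n)]
    for _ in range(max_iters):
        y = list(x)
        y[random.randrange(n)] ^= 1
        if random.random() < p or f(y) > f(x):
            x = y  # non-elitism stems solely from the ALLMOVES branch
    return x

best = mahh(sum, n=100)  # OneMax for demonstration
```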