294 research outputs found

    Evolutionary computation and Wright's equation

    In this paper, Wright's equation, formulated in 1931, is proven and applied to evolutionary computation. Wright's equation shows that evolution performs gradient ascent in a landscape defined by the average fitness of the population. The average fitness $W$ is defined in terms of the marginal gene frequencies $p_i$. Wright's equation is only approximately valid in population genetics, but it exactly describes the behavior of our univariate marginal distribution algorithm (UMDA). We apply Wright's equation to a specific fitness function defined by Wright. Furthermore, we introduce mutation into Wright's equation and UMDA, and show that mutation moves the stable attractors from the boundary into the interior. We compare Wright's equation with the diversified replicator equation, and show that a fast version of Wright's equation gives very good results for optimizing a class of binary fitness functions.
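
    A minimal sketch of the UMDA loop described above, for binary strings with truncation selection (the OneMax fitness, the border restriction, and all parameter values are illustrative assumptions rather than the paper's setup):

        import numpy as np

        def onemax(x):
            """Illustrative fitness: the number of ones in the bit string."""
            return int(x.sum())

        def umda(fitness, n=20, pop_size=100, mu=50, generations=100, seed=None):
            """Univariate Marginal Distribution Algorithm on binary strings.

            Each generation is sampled from a product of Bernoulli marginals p[i];
            the best mu individuals are selected and the marginals are re-estimated
            from the selected set.
            """
            rng = np.random.default_rng(seed)
            p = np.full(n, 0.5)                             # initial marginal frequencies
            for _ in range(generations):
                pop = (rng.random((pop_size, n)) < p).astype(int)
                fit = np.array([fitness(ind) for ind in pop])
                selected = pop[np.argsort(fit)[-mu:]]       # truncation selection
                p = selected.mean(axis=0)                   # re-estimated marginals
                p = np.clip(p, 1.0 / n, 1.0 - 1.0 / n)      # border restriction keeping p off the boundary (assumption)
            return p

        print(umda(onemax))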

    A probabilistic evolutionary optimization approach to compute quasiparticle braids

    Topological quantum computing is an alternative framework for avoiding the quantum decoherence problem in quantum computation. The problem of executing a gate in this framework can be posed as the problem of braiding quasiparticles. Because these quasiparticles are non-Abelian, the problem can be reduced to finding an optimal product of braid generators, where optimality is defined in terms of the gate approximation and the braid's length. In this paper we propose the use of different variants of estimation of distribution algorithms to deal with the problem. Furthermore, we investigate how the regularities of the braid optimization problem can be translated into statistical regularities by means of the Boltzmann distribution. We show that our best algorithm is able to produce many solutions that approximate the target gate with an accuracy on the order of $10^{-6}$ and have lengths up to 9 times shorter than those expected from braids of the same accuracy obtained with other methods. (Comment: 9 pages, 7 figures. Accepted at SEAL 201)
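
    A hedged sketch of the kind of search described above: a univariate estimation-of-distribution loop over fixed-length products of braid generators, scoring each candidate by its distance to the target gate. The generator matrices and target are inputs the caller must supply (e.g. the braid-group representation of the anyon model at hand), all parameters are illustrative assumptions, and a full treatment would also trade off braid length against accuracy, as the abstract describes:

        import numpy as np

        def braid_unitary(word, generators):
            """Multiply a word of generator indices into a single matrix."""
            u = np.eye(generators[0].shape[0], dtype=complex)
            for g in word:
                u = generators[g] @ u
            return u

        def eda_braid_search(generators, target, length=12, pop_size=200, mu=40,
                             iters=200, seed=None):
            """Univariate EDA: one categorical distribution per braid position,
            re-estimated each iteration from the best mu sampled braids."""
            rng = np.random.default_rng(seed)
            k = len(generators)
            probs = np.full((length, k), 1.0 / k)
            best_word, best_err = None, np.inf
            for _ in range(iters):
                pop = np.array([[rng.choice(k, p=probs[i]) for i in range(length)]
                                for _ in range(pop_size)])
                err = np.array([np.linalg.norm(braid_unitary(w, generators) - target, 2)
                                for w in pop])
                elite = pop[np.argsort(err)[:mu]]
                for i in range(length):
                    counts = np.bincount(elite[:, i], minlength=k) + 1.0   # Laplace smoothing
                    probs[i] = counts / counts.sum()
                if err.min() < best_err:
                    best_err, best_word = float(err.min()), pop[np.argmin(err)]
            return best_word, best_err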

    Level-Based Analysis of the Population-Based Incremental Learning Algorithm

    The Population-Based Incremental Learning (PBIL) algorithm uses a convex combination of the current model and the empirical model to construct the next model, which is then sampled to generate offspring. The Univariate Marginal Distribution Algorithm (UMDA) is a special case of the PBIL in which the current model is ignored. Dang and Lehre (GECCO 2015) showed that the UMDA can optimise LeadingOnes efficiently; it remained open whether the PBIL performs equally well. Here, by applying the level-based theorem together with the Dvoretzky-Kiefer-Wolfowitz inequality, we show that the PBIL optimises LeadingOnes in expected time $\mathcal{O}(n\lambda \log \lambda + n^2)$ for a population size $\lambda = \Omega(\log n)$, which matches the bound for the UMDA. Finally, we show that the result carries over to BinVal, giving the first runtime result for the PBIL on the BinVal problem. (Comment: To appear)
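
    A minimal sketch of the PBIL update described above, for binary strings (the learning rate rho, truncation selection, and the parameter values are illustrative assumptions); setting rho = 1 discards the current model and recovers the UMDA:

        import numpy as np

        def leading_ones(x):
            """Number of consecutive ones counted from the left."""
            zeros = np.flatnonzero(x == 0)
            return len(x) if zeros.size == 0 else int(zeros[0])

        def pbil(fitness, n=50, pop_size=100, mu=25, rho=0.1, generations=300, seed=None):
            """PBIL: next model = (1 - rho) * current model + rho * empirical model."""
            rng = np.random.default_rng(seed)
            p = np.full(n, 0.5)
            for _ in range(generations):
                pop = (rng.random((pop_size, n)) < p).astype(int)
                fit = np.array([fitness(ind) for ind in pop])
                empirical = pop[np.argsort(fit)[-mu:]].mean(axis=0)   # empirical model of the best mu
                p = (1.0 - rho) * p + rho * empirical                 # convex combination
                p = np.clip(p, 1.0 / n, 1.0 - 1.0 / n)                # border restriction (assumption)
            return p

        print(pbil(leading_ones))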

    Replica Symmetry Breaking in the Random Replicant Model

    We study the statistical mechanics of a model describing the coevolution of species that interact in a random way. We find that at high competition, replica symmetry is broken. We solve the model in the one-step replica symmetry breaking approximation and compare our findings with accurate numerical simulations. (Comment: 12 pages, TeX, 5 postscript figures available upon request, submitted to Journal of Physics A: Mathematical and General)
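
    As a loose illustration of the model class (random replicant dynamics), here is a sketch that integrates replicator dynamics with a random symmetric coupling matrix; the Gaussian couplings, their scaling, and the Euler integration are assumptions for illustration, not the paper's definitions:

        import numpy as np

        def random_replicator(n=50, competition=1.0, steps=20000, dt=1e-3, seed=0):
            """Replicator dynamics x_i' = x_i * (f_i - mean fitness) on the simplex,
            with f_i = sum_j J_ij x_j for a random symmetric coupling matrix J."""
            rng = np.random.default_rng(seed)
            a = rng.normal(size=(n, n))
            J = competition * (a + a.T) / np.sqrt(2 * n)   # symmetric random couplings
            x = np.full(n, 1.0 / n)                        # start at the simplex centre
            for _ in range(steps):
                f = J @ x
                x = x + dt * x * (f - x @ f)               # Euler step of replicator dynamics
                x = np.clip(x, 0.0, None)
                x = x / x.sum()                            # project back onto the simplex
            return x

        x = random_replicator()
        print("species above threshold:", int((x > 1e-6).sum()))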

    Implementation of Standard Genetic Algorithm on MIMD machines

    Genetic Algorithms (GAs) have been implemented on a number of multiprocessor machines. In many cases, the GA has been adapted to the hardware structure of the system. This paper describes the implementation of a standard genetic algorithm on several MIMD multiprocessor systems. It discusses the data dependencies of the different parts of the algorithm and the changes necessary to adapt the serial version to the parallel versions. Timing measurements and speedups are given for a common problem implemented on all machines.
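
    As a rough, modern illustration of the parallelisation idea (not the paper's MIMD implementation): fitness evaluation has no data dependencies between individuals, so it is the natural part of a standard GA to distribute across processors. A sketch using Python's multiprocessing, with an illustrative fitness function and parameters:

        import numpy as np
        from multiprocessing import Pool

        def fitness(ind):
            """Stand-in for an expensive, independent fitness evaluation."""
            return float(ind.sum())

        def ga_step(pop, fits, rng, p_mut=0.01):
            """One serial generation: tournament selection, one-point crossover, mutation."""
            n, length = pop.shape
            new = []
            while len(new) < n:
                i, j = rng.integers(n, size=2)
                a = pop[i] if fits[i] >= fits[j] else pop[j]      # tournament winner 1
                i, j = rng.integers(n, size=2)
                b = pop[i] if fits[i] >= fits[j] else pop[j]      # tournament winner 2
                cut = rng.integers(1, length)
                child = np.concatenate([a[:cut], b[cut:]])        # one-point crossover
                child = child ^ (rng.random(length) < p_mut)      # bit-flip mutation
                new.append(child)
            return np.array(new)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            pop = rng.integers(0, 2, size=(64, 100))
            with Pool() as pool:                                  # distribute fitness evaluation
                for _ in range(50):
                    fits = pool.map(fitness, list(pop))
                    pop = ga_step(pop, fits, rng)
                print("best fitness:", max(pool.map(fitness, list(pop))))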

    Explicit memory schemes for evolutionary algorithms in dynamic environments

    Copyright @ 2007 Springer-Verlag. Problem optimization in dynamic environments has attracted a growing interest from the evolutionary computation community in recent years due to its importance in real-world optimization problems. Several approaches have been developed to enhance the performance of evolutionary algorithms for dynamic optimization problems, of which the memory scheme is a major one. This chapter investigates the application of explicit memory schemes for evolutionary algorithms in dynamic environments. Two kinds of explicit memory schemes, direct memory and associative memory, are studied within two classes of evolutionary algorithms, genetic algorithms and univariate marginal distribution algorithms, for dynamic optimization problems. Based on a series of systematically constructed dynamic test environments, experiments are carried out to investigate these explicit memory schemes, and the performance of the direct and associative memory schemes is compared and analysed. The experimental results show the efficiency of the memory schemes for evolutionary algorithms in dynamic environments, especially when the environment changes cyclically. The results also indicate that the effect of the memory schemes depends not only on the dynamic problems and dynamic environments but also on the evolutionary algorithm used.
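
    A hedged sketch of a direct memory scheme of the kind studied above: the best individual of each generation is stored in a small memory, and when the environment changes the stored solutions are re-evaluated under the new fitness and the best one is reinserted into the population. The memory size, replacement policy, and mutation-only reproduction are illustrative assumptions, not the chapter's exact scheme.

        import numpy as np

        class DirectMemoryGA:
            """GA with a small explicit memory of past best solutions."""

            def __init__(self, n=50, pop_size=60, memory_size=5, p_mut=0.02, seed=0):
                self.rng = np.random.default_rng(seed)
                self.pop = self.rng.integers(0, 2, size=(pop_size, n))
                self.memory = []
                self.memory_size = memory_size
                self.p_mut = p_mut

            def step(self, fitness, environment_changed=False):
                fits = np.array([fitness(ind) for ind in self.pop], dtype=float)
                if environment_changed and self.memory:
                    # re-evaluate the memory under the new environment, reinsert the best entry
                    mem_fits = [fitness(m) for m in self.memory]
                    worst = int(np.argmin(fits))
                    self.pop[worst] = self.memory[int(np.argmax(mem_fits))]
                    fits[worst] = max(mem_fits)
                # store the current best individual in the bounded memory
                self.memory.append(self.pop[int(np.argmax(fits))].copy())
                self.memory = self.memory[-self.memory_size:]
                # mutation-only reproduction of the better half, kept short for illustration
                order = np.argsort(fits)[::-1]
                parents = self.pop[order[: len(self.pop) // 2]]
                children = parents ^ (self.rng.random(parents.shape) < self.p_mut)
                self.pop = np.vstack([parents, children])
                return float(fits.max())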

    Renormalization for Discrete Optimization

    The renormalization group has proven to be a very powerful tool in physics for treating systems with many length scales. Here we show how it can be adapted to provide a new class of algorithms for discrete optimization. The heart of our method uses renormalization and recursion, and these processes are embedded in a genetic algorithm. The system is self-consistently optimized on all scales, leading to a high probability of finding the ground state configuration. To demonstrate the generality of such an approach, we perform tests on traveling salesman and spin glass problems. The results show that our "genetic renormalization algorithm" is extremely powerful. (Comment: 4 pages, no figures)
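
    The abstract stays high-level; as a loose, non-genetic illustration of the coarse-grain-and-recurse idea only (emphatically not the paper's algorithm), here is a decimation-style renormalization for the trivially solvable case of an open 1D spin chain:

        import numpy as np

        def rg_ground_state(J):
            """Decimation for an open 1D chain with energy E = -sum_i J[i]*s[i]*s[i+1].

            Neighbouring spins are merged pairwise into block spins, the internal bond
            fixes their relative orientation, the couplings between blocks are
            renormalized, and the procedure recurses before unfolding the choices.
            """
            J = np.asarray(J, dtype=float)
            n = len(J) + 1                                    # number of spins
            if n == 1:
                return np.array([1])
            m = (n + 1) // 2                                  # number of block spins
            rel = np.ones(n, dtype=int)
            for k in range(n // 2):
                rel[2 * k + 1] = 1 if J[2 * k] >= 0 else -1   # satisfy the internal bond
            J_renorm = [J[2 * k + 1] * rel[2 * k + 1] for k in range(m - 1)]
            S = rg_ground_state(J_renorm)                     # recurse on the coarse chain
            s = np.ones(n, dtype=int)
            for k in range(m):
                s[2 * k] = S[k]
                if 2 * k + 1 < n:
                    s[2 * k + 1] = rel[2 * k + 1] * S[k]
            return s

        J = np.random.default_rng(0).normal(size=19)
        s = rg_ground_state(J)
        print("unsatisfied bonds:", int(np.sum(J * s[:-1] * s[1:] < 0)))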

    Analytical study of the effect of recombination on evolution via DNA shuffling

    We investigate a multi-locus evolutionary model based on the DNA shuffling protocol widely applied in in vitro directed evolution. This model incorporates selection, recombination and point mutations. The simplicity of the model allows us to obtain a full analytical treatment of both its dynamical and equilibrium properties for the case of an infinite population. We also briefly discuss finite population size corrections.
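
    A hedged sketch of a finite-population simulation in the spirit of this model, with shuffling represented as multi-parent uniform recombination; the additive fitness, proportionate selection, and all rates are illustrative assumptions rather than the paper's definitions:

        import numpy as np

        def shuffle_evolve(pop_size=200, loci=20, generations=100, mu=0.01,
                           parents_per_child=4, seed=0):
            """Selection + shuffling-style recombination + point mutation.

            Each locus of a child is copied from one of several selected parents,
            mimicking fragment reassembly; fitness is an additive stand-in.
            """
            rng = np.random.default_rng(seed)
            pop = rng.integers(0, 2, size=(pop_size, loci))
            for _ in range(generations):
                fitness = pop.sum(axis=1).astype(float)                  # additive fitness (assumption)
                probs = fitness / fitness.sum() if fitness.sum() > 0 else None
                children = np.empty_like(pop)
                for c in range(pop_size):
                    idx = rng.choice(pop_size, size=parents_per_child, p=probs)   # proportionate selection
                    donors = rng.integers(0, parents_per_child, size=loci)
                    children[c] = pop[idx][donors, np.arange(loci)]      # locus-wise shuffling
                    children[c] = children[c] ^ (rng.random(loci) < mu)  # point mutations
                pop = children
            return pop.sum(axis=1).mean() / loci                         # mean beneficial-allele frequency

        print("mean beneficial-allele frequency:", shuffle_evolve())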