
    Cumulative Step-size Adaptation on Linear Functions

    The CSA-ES is an Evolution Strategy with Cumulative Step-size Adaptation, where the step size is adapted by measuring the length of a so-called cumulative path. The cumulative path is a combination of the previous steps realized by the algorithm, where the importance of each step decreases with time. This article studies the CSA-ES on composites of strictly increasing functions with affine linear functions through the investigation of its underlying Markov chains. Rigorous results on the change and the variation of the step size are derived with and without cumulation. The step size diverges geometrically fast in most cases. Furthermore, the influence of the cumulation parameter is studied. Comment: arXiv admin note: substantial text overlap with arXiv:1206.120
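
    Since the abstract describes the core CSA mechanism (an exponentially fading cumulative path whose length drives the step-size update), a minimal Python sketch may help. It follows textbook (1,lambda)-CSA-ES conventions; the parameter defaults (c_sigma, d_sigma, the chi_n normalizer) and the linear objective f(x) = x[0] are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def csa_es_linear(n=10, lam=10, sigma=1.0, iters=200, seed=0):
    """Minimal (1, lambda)-CSA-ES sketch on the linear function f(x) = x[0].

    Standard textbook parameter choices are assumed; the abstract itself
    does not specify them.
    """
    rng = np.random.default_rng(seed)
    c_sigma = 4.0 / (n + 4.0)                          # cumulation parameter (assumed default)
    d_sigma = 1.0                                      # damping factor (assumed)
    chi_n = np.sqrt(n) * (1 - 1/(4*n) + 1/(21*n**2))   # approximation of E||N(0, I)||
    x = np.zeros(n)
    p_sigma = np.zeros(n)                              # cumulative path

    for _ in range(iters):
        # Sample lambda offspring and keep the best on f(x) = x[0] (minimization).
        z = rng.standard_normal((lam, n))
        f_vals = (x + sigma * z)[:, 0]
        z_sel = z[np.argmin(f_vals)]
        x = x + sigma * z_sel
        # Cumulative path: exponentially fading combination of the selected steps.
        p_sigma = (1 - c_sigma) * p_sigma + np.sqrt(c_sigma * (2 - c_sigma)) * z_sel
        # Increase the step size when the path is longer than expected under random selection.
        sigma *= np.exp((c_sigma / d_sigma) * (np.linalg.norm(p_sigma) / chi_n - 1))
    return sigma

# On a linear function the step size is expected to grow (diverge) geometrically.
print(csa_es_linear())
```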

    Self-adaptation in evolution strategies

    In this thesis, an analysis of self-adaptive evolution strategies (ES) is provided. Evolution strategies are population-based search heuristics, usually applied in continuous search spaces, which utilize the evolutionary principles of recombination, mutation, and selection. Self-adaptation in evolution strategies usually aims at steering the mutation process. The mutation process depends on several parameters, most notably on the mutation strength. In a sense, this parameter controls the spread of the population due to random mutation. The mutation strength has to be varied during the optimization process: a mutation strength that was advantageous in the beginning of the run, for instance, when the ES was far away from the optimizer, may become unsuitable when the ES is close to the optimizer. Self-adaptation is one of the means applied. In short, self-adaptation means that the adaptation of the mutation strength is left to the ES itself. The mutation strength becomes a part of an individual’s genome and is also subject to recombination and mutation. Provided that the resulting offspring has a sufficiently “good” fitness, it is selected into the parent population. Two types of evolution strategies are considered in this thesis: the (1,lambda)-ES with one parent and lambda offspring, and intermediate ES with a parental population of mu individuals. The latter ES type applies intermediate recombination in the creation of the offspring. Furthermore, the analysis is restricted to two types of fitness functions: the sphere model and ridge functions. The thesis uses a dynamic systems approach, the evolution equations first introduced by Hans-Georg Beyer, and analyzes the mean value dynamics of the ES.
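
    The self-adaptation scheme described here, where the mutation strength travels with the genome and is itself mutated before the object variables, can be illustrated with a short sketch. The log-normal mutation rule, the learning rate tau, and the sphere fitness below are standard textbook choices assumed for illustration, not details from the thesis.

```python
import numpy as np

def one_comma_lambda_sa_es(n=10, lam=10, sigma=1.0, iters=500, seed=0):
    """Minimal (1, lambda) self-adaptive ES sketch on the sphere f(x) = ||x||^2.

    Each offspring carries its own mutation strength, mutated log-normally
    with learning rate tau (a common default, assumed here).
    """
    rng = np.random.default_rng(seed)
    tau = 1.0 / np.sqrt(2 * n)          # learning rate for the strength mutation (assumed)
    x = rng.standard_normal(n)

    for _ in range(iters):
        # Each offspring first mutates its own mutation strength, then its object variables.
        sigmas = sigma * np.exp(tau * rng.standard_normal(lam))
        offspring = x + sigmas[:, None] * rng.standard_normal((lam, n))
        fitness = np.sum(offspring**2, axis=1)
        best = np.argmin(fitness)
        # Comma selection: the best offspring replaces the parent, strength included.
        x, sigma = offspring[best], sigmas[best]
    return np.sum(x**2), sigma

print(one_comma_lambda_sa_es())
```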

    Cumulative Step-size Adaptation on Linear Functions: Technical Report

    The CSA-ES is an Evolution Strategy with Cumulative Step-size Adaptation, where the step size is adapted by measuring the length of a so-called cumulative path. The cumulative path is a combination of the previous steps realized by the algorithm, where the importance of each step decreases with time. This article studies the CSA-ES on composites of strictly increasing functions with affine linear functions through the investigation of its underlying Markov chains. Rigorous results on the change and the variation of the step size are derived with and without cumulation. The step size diverges geometrically fast in most cases. Furthermore, the influence of the cumulation parameter is studied. Comment: Parallel Problem Solving From Nature (2012)

    On the Benefits of Distributed Populations for Noisy Optimization

    While in the absence of noise no improvement in local performance can be gained from retaining any but the best candidate solution found so far, it has been shown experimentally that in the presence of noise, operating with a non-singular population of candidate solutions can have a marked and positive effect on the local performance of evolution strategies. So as to determine the reasons for the improved performance, we study the evolutionary dynamics of the ES in the presence of noise. Considering a simple, idealized environment, a moment-based approach that utilizes recent results involving concomitants of selected order statistics is developed. This approach yields an intuitive explanation for the performance advantage of multi-parent strategies in the presence of noise. It is then shown that the idealized dynamic process considered does bear relevance to optimization problems in high-dimensional search spaces.
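
    A toy sketch of the multi-parent effect the abstract refers to: under noisy evaluation, intermediate recombination (averaging the mu selected offspring) tends to average out noise in the selected steps. The noisy sphere objective, the constants, and the helper name below are assumptions for illustration only, not the paper's experimental setup.

```python
import numpy as np

def mu_mu_lambda_es_noisy(n=10, mu=3, lam=12, sigma=0.3, noise=1.0, iters=300, seed=0):
    """Toy (mu/mu, lambda)-ES sketch on a noisy sphere: f(x) = ||x||^2 + Gaussian noise.

    The centroid of the mu best offspring (judged on noisy fitness) becomes the
    new parent, so individual evaluation errors partly cancel out.
    """
    rng = np.random.default_rng(seed)
    x = np.ones(n)
    for _ in range(iters):
        offspring = x + sigma * rng.standard_normal((lam, n))
        noisy_fitness = np.sum(offspring**2, axis=1) + noise * rng.standard_normal(lam)
        selected = offspring[np.argsort(noisy_fitness)[:mu]]
        # Intermediate recombination: average the mu selected candidates.
        x = selected.mean(axis=0)
    return np.sum(x**2)

# Compare keeping only the single best (mu=1) against averaging three parents (mu=3)
# under the same noise level and random seed.
print(mu_mu_lambda_es_noisy(mu=1), mu_mu_lambda_es_noisy(mu=3))
```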

    Bandit-based Estimation of Distribution Algorithms for Noisy Optimization: Rigorous Runtime Analysis

    We show complexity bounds for noisy optimization, in frameworks in which noise is stronger than in previously published papers [19]. We also propose an algorithm based on bandits (variants of [16]) that reaches the bound within logarithmic factors. We emphasize the differences with empirically derived published algorithms.

    Noisy Optimization Complexity Under Locality Assumption

    In spite of various recent publications on the subject, there are still gaps between upper and lower bounds in evolutionary optimization for noisy objective functions. In this paper we reduce the gap, and get tight bounds within logarithmic factors in the case of small noise and no long-distance influence on the objective function.

    Biologically inspired evolutionary temporal neural circuits

    Biological neural networks have always motivated the creation of new artificial neural networks, and in this case a new autonomous temporal neural network system. Among the more challenging problems of temporal neural networks are the design and incorporation of short- and long-term memories as well as the choice of network topology and training mechanism. In general, delayed copies of network signals can form short-term memory (STM), providing a limited temporal history of events similar to FIR filters, whereas the synaptic connection strengths as well as delayed feedback loops (IIR circuits) can constitute longer-term memories (LTM). This dissertation introduces a new general evolutionary temporal neural network framework (GETnet) through automatic design of arbitrary neural networks with STM and LTM. GETnet is a step towards the realization of general intelligent systems that need minimal or no human intervention and can be applied to a broad range of problems. GETnet utilizes nonlinear moving average/autoregressive nodes and sub-circuits that are trained by enhanced gradient descent and evolutionary search in terms of architecture, synaptic delay, and synaptic weight spaces. The mixture of Lamarckian and Darwinian evolutionary mechanisms facilitates the Baldwin effect and speeds up the hybrid training. The ability to evolve arbitrary adaptive time-delay connections enables GETnet to find novel answers to many classification and system identification tasks expressed in the general form of desired multidimensional input and output signals. Simulations using the Mackey-Glass chaotic time series and fingerprint perspiration-induced temporal variations are given to demonstrate the above-stated capabilities of GETnet.
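
    As a small illustration of the tapped-delay (short-term memory) idea described above, here is a sketch of a nonlinear moving-average node: delayed copies of the input act as an FIR-style temporal history. The tanh nonlinearity, the helper name nma_node, and the parameters are assumed for illustration and are not GETnet's exact node model.

```python
import numpy as np

def nma_node(x, weights, delays, bias=0.0):
    """Sketch of a nonlinear moving-average (tapped-delay) node.

    x       : 1-D input signal
    weights : synaptic weights, one per tap
    delays  : integer delays (in samples), one per tap
    """
    y = np.zeros_like(x, dtype=float)
    for t in range(len(x)):
        s = bias
        for w, d in zip(weights, delays):
            if t - d >= 0:
                s += w * x[t - d]       # delayed copy of the input = short-term memory
        y[t] = np.tanh(s)               # assumed nonlinearity for the sketch
    return y

signal = np.sin(0.3 * np.arange(50))
print(nma_node(signal, weights=[0.8, -0.4], delays=[0, 3])[:5])
```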