
    Bounding Bloat in Genetic Programming

    While many optimization problems work with a fixed number of decision variables, and thus a fixed-length representation of possible solutions, genetic programming (GP) works on variable-length representations. A naturally occurring problem is that of bloat (unnecessary growth of solutions) slowing down optimization. Theoretical analyses could so far not bound bloat and required explicit assumptions on its magnitude. In this paper we analyze bloat in mutation-based genetic programming for the two test functions ORDER and MAJORITY. We overcome previous assumptions on the magnitude of bloat and give matching or close-to-matching upper and lower bounds for the expected optimization time. In particular, we show that the (1+1) GP takes (i) $\Theta(T_{init} + n \log n)$ iterations with bloat control on ORDER as well as MAJORITY; and (ii) $O(T_{init} \log T_{init} + n (\log n)^3)$ and $\Omega(T_{init} + n \log n)$ (and $\Omega(T_{init} \log T_{init})$ for $n = 1$) iterations without bloat control on MAJORITY.
    Comment: An extended abstract has been published at GECCO 201
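    To make the setting concrete, the following is a minimal Python sketch of a (1+1) GP with bloat control on MAJORITY, with trees flattened to lists of literals. The acceptance rule, the leaf-based mutation (the paper uses an HVL-style operator on trees), and all names are illustrative assumptions rather than the paper's exact algorithm; $T_{init}$ corresponds to the size of the initial solution.

```python
import random

def majority(expr, n):
    """MAJORITY: variable i is expressed if the positive literal x_i
    occurs at least once and at least as often as the negated one."""
    pos, neg = [0] * n, [0] * n
    for var, negated in expr:
        if negated:
            neg[var] += 1
        else:
            pos[var] += 1
    return sum(1 for i in range(n) if pos[i] >= 1 and pos[i] >= neg[i])

def mutate(expr, n):
    """One mutation step: insert, delete, or substitute a random leaf."""
    expr = list(expr)
    leaf = (random.randrange(n), random.random() < 0.5)
    op = random.choice(["insert", "delete", "substitute"])
    if op == "insert" or not expr:
        expr.insert(random.randrange(len(expr) + 1), leaf)
    elif op == "delete":
        del expr[random.randrange(len(expr))]
    else:
        expr[random.randrange(len(expr))] = leaf
    return expr

def one_plus_one_gp(n, t_init, max_iters=10**6):
    """(1+1) GP with bloat control: accept strictly better fitness, or
    equal fitness if the solution did not grow."""
    expr = [(random.randrange(n), random.random() < 0.5) for _ in range(t_init)]
    fit = majority(expr, n)
    for t in range(max_iters):
        if fit == n:
            return t
        child = mutate(expr, n)
        cfit = majority(child, n)
        if cfit > fit or (cfit == fit and len(child) <= len(expr)):
            expr, fit = child, cfit
    return max_iters
```

    In this sketch, the difference between running with and without bloat control is exactly the `len(child) <= len(expr)` tie-breaker: dropping it yields the uncontrolled variant analyzed in part (ii).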

    Destructiveness of Lexicographic Parsimony Pressure and Alleviation by a Concatenation Crossover in Genetic Programming

    For theoretical analyses, two specifics distinguish GP from many other areas of evolutionary computation. First, the variable-size representations, which in particular can lead to bloat (i.e., the growth of individuals with redundant parts). Second, the role and realization of crossover, which is particularly central in GP due to the tree-based representation. Whereas some theoretical work on GP has studied the effects of bloat, crossover has had a surprisingly small share in this work. We analyze a simple crossover operator in combination with local search, where a preference for small solutions minimizes bloat (lexicographic parsimony pressure); the resulting algorithm is denoted Concatenation Crossover GP. For this purpose, three variants of the well-studied MAJORITY test function with large plateaus are considered. We show that the Concatenation Crossover GP can efficiently optimize these test functions, while local search cannot be efficient for all three variants, regardless of whether bloat control is employed.
    Comment: to appear in PPSN 201
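    The two ingredients named in the abstract can be sketched in a few lines of Python; the fitness function and the surrounding population loop are omitted, and all names are ours, so this illustrates the operators rather than the paper's implementation.

```python
def concatenation_crossover(parent_a, parent_b):
    """Concatenation crossover: the offspring is simply the two
    (flattened) parents joined together."""
    return parent_a + parent_b

def lexicographic_accept(child_fit, child_size, cur_fit, cur_size):
    """Lexicographic parsimony pressure: compare fitness first and break
    ties in favour of the smaller solution, which keeps bloat in check."""
    return (child_fit, -child_size) >= (cur_fit, -cur_size)

# Equal fitness: the child is accepted only if it is not larger.
assert lexicographic_accept(5, 9, 5, 10)
assert not lexicographic_accept(5, 11, 5, 10)
```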

    On the Analysis of Simple Genetic Programming for Evolving Boolean Functions

    This work presents a first step towards a systematic time and space complexity analysis of genetic programming (GP) for evolving functions with desired input/output behaviour. Two simple GP algorithms, called (1+1) GP and (1+1) GP*, equipped with minimal function (F) and terminal (L) sets, are considered for evolving two standard classes of Boolean functions. It is rigorously proved that both algorithms are efficient for the easy problem of evolving conjunctions of Boolean variables with the minimal sets. However, if an extra function (i.e., NOT) is added to F, then the algorithms require at least exponential time to evolve the conjunction of n variables. On the other hand, it is proved that both algorithms fail to evolve the difficult parity function in polynomial time with probability at least exponentially close to 1. Concerning generalisation, it is shown how the quality of the evolved conjunctions depends on the size of the training set s, while the evolved exclusive disjunctions generalise equally badly, independently of s.
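    As an illustration of the evaluation setting, the sketch below measures fitness on a training set of size s when evolving the conjunction of n variables with the minimal sets F = {AND} and L = {x_1, ..., x_n}; the uniform sampling of training inputs and all names are assumptions made for illustration.

```python
import random

def target_and(x):
    """The target function: the conjunction (AND) of all n variables."""
    return all(x)

def evolved_value(leaves, x):
    """A tree over F = {AND} and L = {x_1, ..., x_n} computes exactly
    the conjunction of the variables at its leaves."""
    return all(x[i] for i in leaves)

def training_fitness(leaves, training_set):
    """Number of training inputs on which the evolved conjunction agrees
    with the target; this is the quantity the (1+1) GP maximizes."""
    return sum(evolved_value(leaves, x) == target_and(x) for x in training_set)

# A training set of s uniformly random inputs (the sampling scheme here
# is an assumption for illustration).
n, s = 10, 50
training_set = [[random.random() < 0.5 for _ in range(n)] for _ in range(s)]
print(training_fitness({0, 3, 7}, training_set))
```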

    Ants Easily Solve Stochastic Shortest Path Problems


    Theoretical Study of Optimizing Rugged Landscapes with the cGA

    Estimation of distribution algorithms (EDAs) provide a distribution-based approach to optimization which adapts its probability distribution during the run of the algorithm. We contribute to the theoretical understanding of EDAs and point out that their distribution approach makes them more suitable for dealing with rugged fitness landscapes than classical local search algorithms. Concretely, we make the OneMax function rugged by adding noise to each fitness value. The cGA can nevertheless find solutions with $n(1 - \varepsilon)$ many 1s, even for high variance of the noise. In contrast, RLS and the (1+1) EA, with high probability, only find solutions with $n(1/2 + o(1))$ many 1s, even for noise with small variance.
    Tobias Friedrich, Timo Kötzing, Frank Neumann, Aishwarya Radhakrishna
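    A minimal Python sketch of the cGA update rule on a noisy OneMax follows; the additive Gaussian noise model and the hypothetical population-size parameter K are assumptions made for illustration, not necessarily the paper's exact setting.

```python
import random

def noisy_onemax(x, sigma):
    """OneMax with additive Gaussian noise of variance sigma^2 -- one way
    to make the landscape rugged (the exact noise model is assumed here)."""
    return sum(x) + random.gauss(0.0, sigma)

def cga(n, K, sigma, max_iters=10**5):
    """Compact GA: sample two offspring from a frequency vector p and shift
    each p_i by 1/K toward the winner of the (noisy) comparison."""
    p = [0.5] * n
    for _ in range(max_iters):
        x = [random.random() < pi for pi in p]
        y = [random.random() < pi for pi in p]
        if noisy_onemax(x, sigma) < noisy_onemax(y, sigma):
            x, y = y, x  # make x the winner
        for i in range(n):
            if x[i] != y[i]:
                p[i] += 1 / K if x[i] else -1 / K
                p[i] = min(1 - 1 / n, max(1 / n, p[i]))  # border restriction
        if all(pi >= 1 - 1 / n for pi in p):
            break
    return p
```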

    Analysis of the (1+1) EA on LeadingOnes with Constraints

    Understanding how evolutionary algorithms perform on constrained problems has gained increasing attention in recent years. In this paper, we study how evolutionary algorithms optimize constrained versions of the classical LeadingOnes problem. We first provide a runtime analysis for the classical (1+1) EA on the LeadingOnes problem with a deterministic cardinality constraint, giving $\Theta(n(n - B)\log(B) + n^2)$ as the tight bound. Our results show that the behaviour of the algorithm is highly dependent on the constraint bound of the uniform constraint. Afterwards, we consider the problem in the context of stochastic constraints and provide insights on how the (1+1) EA is able to deal with these constraints in a sampling-based setting.
    Tobias Friedrich, Timo Kötzing, Aneta Neumann, Frank Neumann, Aishwarya Radhakrishna
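    The constrained setting can be illustrated with a short Python sketch of the (1+1) EA on LeadingOnes with a deterministic cardinality constraint B; penalizing infeasible solutions below every feasible one is a common choice and an assumption here, not necessarily the paper's constraint handling.

```python
import random

def constrained_leading_ones(x, B):
    """LeadingOnes under a cardinality constraint: at most B ones are
    allowed; infeasible solutions are penalized below all feasible ones
    (an illustrative choice of constraint handling)."""
    if sum(x) > B:
        return -sum(x)
    ones = 0
    for bit in x:
        if bit == 0:
            break
        ones += 1
    return ones

def one_plus_one_ea(n, B, max_iters=10**7):
    """(1+1) EA: flip each bit independently with probability 1/n and keep
    the offspring if it is at least as good. The constrained optimum is
    the string 1^B 0^(n-B) with LeadingOnes value B."""
    x = [0] * n  # start from the all-zeros string (feasible; an assumption)
    fx = constrained_leading_ones(x, B)
    for t in range(max_iters):
        y = [bit ^ (random.random() < 1 / n) for bit in x]
        fy = constrained_leading_ones(y, B)
        if fy >= fx:
            x, fx = y, fy
        if fx == B:
            return t  # iterations until the constrained optimum was found
    return max_iters
```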