
    OneMax in Black-Box Models with Several Restrictions

    Black-box complexity studies lower bounds for the efficiency of general-purpose black-box optimization algorithms such as evolutionary algorithms and other search heuristics. Different models exist, each one designed to analyze a different aspect of typical heuristics, such as the memory size or the variation operators in use. While most previous works focus on one particular such aspect, in this work we consider how the combination of several algorithmic restrictions influences the black-box complexity. Our testbed is the class of so-called OneMax functions, a classical set of test functions that is intimately related to classic coin-weighing problems and to the board game Mastermind. In particular, we analyze the combined memory-restricted ranking-based black-box complexity of OneMax for different memory sizes. While its isolated memory-restricted as well as its ranking-based black-box complexity for bit strings of length $n$ is only of order $n/\log n$, the combined model does not allow for algorithms faster than linear in $n$, as can be seen by standard information-theoretic considerations. We show that this linear bound is indeed asymptotically tight. Similar results are obtained for other memory and offspring sizes. Our results also apply to the (Monte Carlo) complexity of OneMax in the recently introduced elitist model, in which only the best-so-far solution can be kept in memory. Finally, we also provide improved lower bounds for the complexity of OneMax in the regarded models. Our result enlivens the quest for natural evolutionary algorithms optimizing OneMax in $o(n \log n)$ iterations. Comment: This is the full version of a paper accepted to GECCO 201
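
    To make the setting concrete, here is a minimal Python sketch (not taken from the paper; names and the query budget are illustrative) of the generalized OneMax function class and a trivial black-box algorithm that only observes fitness values and counts queries. The memory and ranking restrictions studied in the paper would further limit what such an algorithm may store and see:

    import random

    def onemax(z, x):
        # OneMax_z(x): number of bit positions in which x agrees with the hidden target z
        return sum(int(a == b) for a, b in zip(z, x))

    def random_search(n, budget=10_000, seed=0):
        # Baseline black-box algorithm: query uniformly random bit strings,
        # using only the returned fitness values (no access to z itself).
        rng = random.Random(seed)
        z = [rng.randint(0, 1) for _ in range(n)]   # hidden optimum, unknown to the algorithm
        best, queries = -1, 0
        while queries < budget and best < n:
            x = [rng.randint(0, 1) for _ in range(n)]
            best = max(best, onemax(z, x))
            queries += 1
        return queries, best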

    Complexity Theory for Discrete Black-Box Optimization Heuristics

    A predominant topic in the theory of evolutionary algorithms and, more generally, the theory of randomized black-box optimization techniques is running time analysis. Running time analysis aims at understanding the performance of a given heuristic on a given problem by bounding the number of function evaluations that the heuristic needs to identify a solution of a desired quality. As in general algorithms theory, this running time perspective is most useful when it is complemented by a meaningful complexity theory that studies the limits of algorithmic solutions. In the context of discrete black-box optimization, several black-box complexity models have been developed to analyze the best possible performance that a black-box optimization algorithm can achieve on a given problem. The models differ in the classes of algorithms to which their lower bounds apply. In this way, black-box complexity contributes to a better understanding of how certain algorithmic choices (such as the amount of memory used by a heuristic, its selective pressure, or properties of the strategies it uses to create new solution candidates) influence performance. In this chapter we review the different black-box complexity models that have been proposed in the literature, survey the bounds that have been obtained for these models, and discuss how the interplay of running time analysis and black-box complexity can inspire new algorithmic solutions to well-researched problems in evolutionary computation. We also discuss several interesting open questions for future work. Comment: This survey article is to appear (in a slightly modified form) in the book "Theory of Randomized Search Heuristics in Discrete Search Spaces", which will be published by Springer in 2018. The book is edited by Benjamin Doerr and Frank Neumann. Missing numbers of pointers to other chapters of this book will be added as soon as possible.
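
    As a hedged illustration of the running time notion (a standard textbook example, not taken from the chapter), the following Python sketch counts the function evaluations that the simple (1+1) evolutionary algorithm spends until it reaches a target fitness:

    import random

    def one_plus_one_ea(f, n, target, seed=0):
        # Count the function evaluations the (1+1) EA needs to reach fitness `target`.
        rng = random.Random(seed)
        x = [rng.randint(0, 1) for _ in range(n)]
        fx, evals = f(x), 1
        while fx < target:
            y = [b ^ (rng.random() < 1.0 / n) for b in x]   # standard bit mutation
            fy = f(y)
            evals += 1
            if fy >= fx:                                    # elitist selection
                x, fx = y, fy
        return evals

    # Empirical running time on OneMax with n = 50 (expected order n log n)
    n = 50
    print(one_plus_one_ea(lambda x: sum(x), n, target=n))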

    Playing Mastermind With Constant-Size Memory

    We analyze the classic board game of Mastermind with $n$ holes and a constant number of colors. A result of Chvátal (Combinatorica 3 (1983), 325-329) states that the codebreaker can find the secret code with $\Theta(n / \log n)$ questions. We show that this bound remains valid if the codebreaker may only store a constant number of guesses and answers. In addition to an intrinsic interest in this question, our result also disproves a conjecture of Droste, Jansen, and Wegener (Theory of Computing Systems 39 (2006), 525-544) on the memory-restricted black-box complexity of the OneMax function class. Comment: 23 pages
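
    For reference, a minimal sketch (illustrative only, not the paper's constant-memory strategy) of the Mastermind feedback that the codebreaker receives for a single guess:

    from collections import Counter

    def feedback(secret, guess):
        # Standard Mastermind answer: black pegs = exact position matches,
        # white pegs = correct colors placed in wrong positions.
        black = sum(s == g for s, g in zip(secret, guess))
        common = sum((Counter(secret) & Counter(guess)).values())
        return black, common - black

    print(feedback([0, 1, 2, 2], [2, 1, 2, 0]))   # -> (2, 2)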

    Black-Box Complexity of the Binary Value Function

    The binary value function, or BinVal, has appeared in several studies in the theory of evolutionary computation as one of the extreme examples of linear pseudo-Boolean functions. Its unbiased black-box complexity was previously shown to be at most $\lceil \log_2 n \rceil + 2$, where $n$ is the problem size. We augment it with an upper bound of $\log_2 n + 2.42141558 - o(1)$, which is more precise for many values of $n$. We also present a lower bound of $\log_2 n + 1.1186406 - o(1)$. Additionally, we prove that BinVal is an easiest function among all unimodal pseudo-Boolean functions, at least for unbiased algorithms. Comment: 24 pages, one figure. An extended two-page abstract of this work will appear in the proceedings of the Genetic and Evolutionary Computation Conference, GECCO'1
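
    A minimal sketch of the function itself (the bit-weight convention is an assumption; papers differ on which end of the string carries the most significant bit):

    def binval(x):
        # BinVal(x) = sum_i 2^i * x_i: the bit list read as a binary number,
        # least significant bit first in this sketch.
        return sum(bit << i for i, bit in enumerate(x))

    assert binval([1, 0, 1]) == 5   # 1*1 + 0*2 + 1*4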

    Unbiased Black-Box Complexities of Jump Functions

    We analyze the unbiased black-box complexity of jump functions with small, medium, and large sizes of the fitness plateau surrounding the optimal solution. Among other results, we show that when the jump size is $(1/2 - \varepsilon)n$, that is, when only a small constant fraction of the fitness values is visible, the unbiased black-box complexities for arities $3$ and higher are of the same order as those for the simple OneMax function. Even for the extreme jump function, in which all but the two fitness values $n/2$ and $n$ are blanked out, polynomial-time mutation-based (i.e., unary unbiased) black-box optimization algorithms exist. This is quite surprising given that for the extreme jump function almost the whole search space (all but a $\Theta(n^{-1/2})$ fraction) is a plateau of constant fitness. To prove these results, we introduce new tools for the analysis of unbiased black-box complexities, for example, selecting the new parent individual not only by comparing the fitnesses of the competing search points, but also by taking into account the (empirical) expected fitnesses of their offspring. Comment: This paper is based on results presented in the conference versions [GECCO 2011] and [GECCO 2014]
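
    As a hedged illustration of the function class (one common formalization consistent with the abstract's description of blanked-out fitness values; the paper's exact definition may differ in details):

    def jump(x, ell):
        # Jump function with jump size ell: the OneMax slope is visible only
        # strictly between ell and n - ell ones; everything else except the
        # optimum is blanked out to a plateau of constant fitness.
        n, ones = len(x), sum(x)
        if ones == n:                # global optimum
            return n
        if ell < ones < n - ell:     # visible part of the slope
            return ones
        return 0                     # blanked-out plateau

    # Extreme jump: ell = n/2 - 1 leaves only the fitness values n/2 and n visible
    print(jump([1, 0] * 5, ell=4))   # n = 10, 5 ones -> 5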

    Reducing the Arity in Unbiased Black-Box Complexity

    We show that for all $1 < k \leq \log n$ the $k$-ary unbiased black-box complexity of the $n$-dimensional OneMax function class is $O(n/k)$. This indicates that the power of higher-arity operators is much stronger than what the previous $O(n/\log k)$ bound by Doerr et al. (Faster black-box algorithms through higher arity operators, Proc. of FOGA 2011, pp. 163-172, ACM, 2011) suggests. The key to this result is an encoding strategy, which might be of independent interest. We show that, using $k$-ary unbiased variation operators only, we may simulate an unrestricted memory of size $O(2^k)$ bits. Comment: An extended abstract of this paper has been accepted for inclusion in the proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2012)
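
    To make the notion of arity concrete, here are two textbook unbiased variation operators of arity 1 and 2 (standard examples only; the paper's encoding strategy builds considerably more elaborate $k$-ary operators):

    import random

    def unary_flip(x, rng=random):
        # Arity-1 unbiased operator: standard bit mutation, flipping each bit
        # independently with probability 1/n (invariant under bit flips and
        # permutations of the positions).
        n = len(x)
        return [b ^ (rng.random() < 1.0 / n) for b in x]

    def binary_uniform_crossover(x, y, rng=random):
        # Arity-2 unbiased operator: take each bit from a uniformly chosen parent.
        return [a if rng.random() < 0.5 else b for a, b in zip(x, y)]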

    Better Fixed-Arity Unbiased Black-Box Algorithms

    In their GECCO'12 paper, Doerr and Doerr proved that the $k$-ary unbiased black-box complexity of OneMax on $n$ bits is $O(n/k)$ for $2 \le k \le O(\log n)$. We propose an alternative strategy for achieving this unbiased black-box complexity when $3 \le k \le \log_2 n$. While it is based on the same idea of block-wise optimization, it uses $k$-ary unbiased operators in a different way. For each block of size $2^{k-1}-1$ we set up, in $O(k)$ queries, a virtual coordinate system, which enables us to use an arbitrary unrestricted algorithm to optimize this block. This is possible because this coordinate system introduces a bijection between unrestricted queries and a subset of the $k$-ary unbiased operators. We note that this technique does not depend on the problem being OneMax and can be used in more general contexts. Together, this constitutes an algorithm which is conceptually simpler than the one by Doerr and Doerr and at the same time achieves better constant factors in the asymptotic notation. Our algorithm works in $(2+o(1)) \cdot n/(k-1)$ queries, where the $o(1)$ term relates to $k$. Our experimental evaluation of this algorithm shows its efficiency already for $3 \le k \le 6$. Comment: An extended abstract will appear at GECCO'1
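
    A very rough sketch of the block-wise outer structure the abstract describes (the virtual coordinate system and the actual $k$-ary unbiased operators are the paper's contribution and are not reproduced here; block size and indexing are assumptions):

    def blocks(n, k):
        # Partition the n bit positions into consecutive blocks of size 2^(k-1) - 1,
        # the block size used in the abstract; the last block may be shorter.
        size = 2 ** (k - 1) - 1
        return [list(range(i, min(i + size, n))) for i in range(0, n, size)]

    print(blocks(20, 4))   # blocks of size 7 covering positions 0..19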
