
    Performance analysis of randomised search heuristics operating with a fixed budget

    Jansen, T., Zarges, C. (2013). Performance analysis of randomised search heuristics operating with a fixed budget. Theoretical Computer Science, 545, 39-58.
    When no good problem-specific algorithm is available for a difficult real-world optimisation problem, randomised search heuristics are often used in the hope that they deliver good solutions in acceptable time. Theoretical analysis usually concentrates on the average time needed to find an optimal or approximately optimal solution. This matches neither the application in practice nor the empirical analysis, since optimal solutions are usually not known and, even if found, cannot be recognised; more often, the algorithms are simply stopped after some time. This motivates a theoretical analysis that concentrates on the quality of the best solution obtained after a pre-specified number of function evaluations, called the budget. Using this perspective, two simple randomised search heuristics, random local search and the (1+1) evolutionary algorithm, are analysed on some well-known example problems. Upper and lower bounds on the expected quality of a solution for a fixed budget of function evaluations are proven. The analysis reveals novel and challenging problems in the study of randomised search heuristics and demonstrates the potential of this shift in perspective from expected run time to expected solution quality.
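    To make the fixed-budget perspective concrete, here is a minimal sketch of the (1+1) evolutionary algorithm run for an exact budget of function evaluations on OneMax; the function names and parameters are illustrative assumptions, not the paper's own code.

```python
import random

def one_max(x):
    # OneMax: number of 1-bits; the all-ones string is the unique optimum.
    return sum(x)

def one_plus_one_ea(n, budget, f=one_max, seed=None):
    """Run the (1+1) EA with standard bit mutation for exactly `budget`
    fitness evaluations and report the best quality reached (a sketch of
    the fixed-budget perspective, not the paper's analysis)."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = f(x)                                # evaluation 1
    for _ in range(budget - 1):              # remaining evaluations
        # Standard bit mutation: flip each bit independently with prob. 1/n.
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        fy = f(y)
        if fy >= fx:                         # keep the offspring if not worse
            x, fx = y, fy
    return fx

# Solution quality reached with a fixed budget of 1000 evaluations on 100 bits.
print(one_plus_one_ea(n=100, budget=1000, seed=0))
```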

    Analysis of Randomised Search Heuristics for Dynamic Optimisation

    Dynamic optimisation is an area of application where randomised search heuristics like evolutionary algorithms and artificial immune systems are often successful. The theoretical foundation of this important topic suffers from a lack of a generally accepted analytical framework as well as a lack of widely accepted example problems. This article tackles both problems by discussing necessary conditions for useful and practically relevant theoretical analysis as well as introducing a concrete family of dynamic example problems that draws inspiration from a well-known static example problem and exhibits a bi-stable dynamic. After the stage has been set this way, the framework is made concrete by presenting the results of thorough theoretical and statistical analysis for mutation-based evolutionary algorithms and artificial immune systems.
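    The abstract does not reproduce the paper's concrete problem family. Purely as a loose illustration of a bi-stable dynamic, the sketch below assumes a OneMax-style function whose target oscillates between the all-ones and all-zeros strings, and runs random local search on it; every name and the dynamic itself are assumptions, not the paper's definitions.

```python
import random

def bistable_onemax(x, t, period):
    """Hypothetical bi-stable dynamic fitness: count matches with a target
    that switches between all-ones and all-zeros every `period` steps."""
    target = 1 if (t // period) % 2 == 0 else 0
    return sum(1 for b in x if b == target)

def rls_on_dynamic(n, steps, period, seed=None):
    # Random local search re-evaluates the parent in each step because the
    # fitness function may have changed in the meantime.
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    trace = []
    for t in range(steps):
        y = x[:]
        y[rng.randrange(n)] ^= 1             # flip exactly one bit
        if bistable_onemax(y, t, period) >= bistable_onemax(x, t, period):
            x = y
        trace.append(bistable_onemax(x, t, period))
    return trace

# Fitness trace across two switches of the bi-stable target.
print(rls_on_dynamic(n=50, steps=200, period=50, seed=1)[-5:])
```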

    Lexicase selection in Learning Classifier Systems

    The lexicase parent selection method selects parents by considering performance on individual data points in random order instead of using a fitness function based on aggregated accuracy over the data. While the method has demonstrated promise in genetic programming and, more recently, in genetic algorithms, its applications in other forms of evolutionary machine learning have not been explored. In this paper, we investigate the use of lexicase parent selection in Learning Classifier Systems (LCS) and study its effect on classification problems in a supervised setting. We further introduce a new variant of lexicase selection, called batch-lexicase selection, which allows for the tuning of selection pressure. We compare the two lexicase selection methods with tournament and fitness proportionate selection methods on binary classification problems. We show that batch-lexicase selection results in the creation of more generic rules, which is favorable for generalization on future data. We further show that batch-lexicase selection results in better generalization in situations of partial or missing data.
    Comment: Genetic and Evolutionary Computation Conference, 201
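    For orientation, here is a hedged Python sketch of the two selection schemes. The interface `error(individual, case)` is an assumed abstraction, and the batch variant follows one plausible reading of the abstract (grouping cases into batches and filtering on aggregated batch error), not necessarily the paper's exact definition.

```python
import random

def lexicase_select(population, cases, error, rng=random):
    """Lexicase selection: filter candidates on one training case at a
    time, in random order, keeping only those with the lowest error."""
    pool = list(population)
    order = list(cases)
    rng.shuffle(order)
    for case in order:
        best = min(error(ind, case) for ind in pool)
        pool = [ind for ind in pool if error(ind, case) == best]
        if len(pool) == 1:
            break
    return rng.choice(pool)

def batch_lexicase_select(population, cases, error, batch_size, rng=random):
    """Assumed batch variant: filter on total error over each batch of
    cases; the batch size then acts as a knob on selection pressure
    (batch_size=1 recovers plain lexicase selection)."""
    order = list(cases)
    rng.shuffle(order)
    pool = list(population)
    for start in range(0, len(order), batch_size):
        batch = order[start:start + batch_size]
        scores = [sum(error(ind, c) for c in batch) for ind in pool]
        best = min(scores)
        pool = [ind for ind, s in zip(pool, scores) if s == best]
        if len(pool) == 1:
            break
    return rng.choice(pool)
```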

    On Easiest Functions for Mutation Operators in Bio-Inspired Optimisation

    Understanding which function classes are easy and which are hard for a given algorithm is a fundamental question for the analysis and design of bio-inspired search heuristics. A natural starting point is to consider the easiest and hardest functions for an algorithm. For the (1+1) EA using standard bit mutation (SBM) it is well known that OneMax is an easiest function with a unique optimum while Trap is a hardest. In this paper we extend the analysis of easiest function classes to the contiguous somatic hypermutation (CHM) operator used in artificial immune systems. We define a function MinBlocks and prove that it is an easiest function for the (1+1) EA using CHM, presenting both a runtime and a fixed-budget analysis. Since MinBlocks is, up to a factor of 2, a hardest function for standard bit mutations, we consider the effects of combining both operators into a hybrid algorithm. We rigorously prove that, by combining the advantages of k operators, several hybrid algorithmic schemes have optimal asymptotic performance on the easiest functions for each individual operator. In particular, the hybrid algorithms using CHM and SBM have optimal asymptotic performance on both OneMax and MinBlocks. We then investigate easiest functions for hybrid schemes and show that an easiest function for a hybrid algorithm is not just a trivial weighted combination of the respective easiest functions for each operator.
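    The sketch below contrasts the two mutation operators and adds an illustrative stand-in for MinBlocks that rewards strings with few maximal blocks of equal bits; the segment-sampling scheme for CHM and the exact MinBlocks definition are assumptions here, so consult the paper for the precise formulations.

```python
import random

def sbm(x, rng=random):
    """Standard bit mutation: flip each bit independently with prob. 1/n."""
    n = len(x)
    return [b ^ (rng.random() < 1.0 / n) for b in x]

def chm(x, rng=random):
    """Contiguous somatic hypermutation (sketch): invert all bits in a
    randomly chosen contiguous segment. Variants in the literature differ
    in how the segment is sampled (e.g. wrap-around); not modelled here."""
    n = len(x)
    i, j = sorted((rng.randrange(n), rng.randrange(n)))
    return x[:i] + [1 - b for b in x[i:j + 1]] + x[j + 1:]

def min_blocks_like(x):
    """Illustrative stand-in for MinBlocks: the fewer maximal blocks of
    equal bits, the higher the value (constant strings are optimal)."""
    blocks = 1 + sum(1 for a, b in zip(x, x[1:]) if a != b)
    return len(x) - blocks

# One simple hybrid scheme in the spirit of the paper: alternate operators,
# so the algorithm can inherit each operator's strength on its easiest function.
def hybrid_step(x, t, rng=random):
    return chm(x, rng) if t % 2 == 0 else sbm(x, rng)
```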

    Understanding randomised search heuristics: lessons from the evolution of theory: A case study
