
    Modelling Genetic Improvement Landscapes with Local Optima Networks

    Local optima networks are a compact representation of the global structure of a search space, and can be used for analysis and visualisation. This paper provides one of the first analyses of program search spaces using local optima networks. These are generated by sampling the search space, recording the progress of an Iterated Local Search algorithm. Source code mutations in comparison and Boolean operators are considered. The search spaces of two small benchmark programs, the triangle and TCAS programs, are analysed and visualised. Results show a high level of neutrality, i.e. connected test-equivalent mutants. It is also generally relatively easy to find a path from a random mutant to a mutant that passes all test cases.

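    The sampling procedure described in the abstract above can be sketched compactly: an Iterated Local Search run is instrumented so that every local optimum reached becomes a node of the network, and every observed transition between optima becomes a weighted edge. The sketch below is a minimal illustration only; fitness, random_solution, neighbours and perturb are hypothetical problem-specific callables, not the paper's actual mutation operators or test-suite-based fitness.

        # Minimal sketch: building a local optima network (LON) by recording
        # the progress of an Iterated Local Search (ILS) run.
        # Solutions are assumed hashable (e.g. tuples of edits); all callables
        # passed in are hypothetical placeholders, not the paper's operators.
        from collections import defaultdict

        def hill_climb(sol, fitness, neighbours):
            # Greedy best-improvement climb to a local optimum.
            while True:
                best = max(neighbours(sol), key=fitness, default=None)
                if best is None or fitness(best) <= fitness(sol):
                    return sol
                sol = best

        def sample_lon(random_solution, fitness, neighbours, perturb, iters=1000):
            nodes = set()                 # local optima discovered so far
            edges = defaultdict(int)      # (from, to) -> observed transition count
            current = hill_climb(random_solution(), fitness, neighbours)
            nodes.add(current)
            for _ in range(iters):
                candidate = hill_climb(perturb(current), fitness, neighbours)
                nodes.add(candidate)
                edges[(current, candidate)] += 1
                # Accepting equal-fitness optima lets the walk traverse the
                # neutral (test-equivalent) networks noted in the abstract.
                if fitness(candidate) >= fitness(current):
                    current = candidate
            return nodes, edges
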
    Quantifying the Impact of Parameter Tuning on Nature-Inspired Algorithms

    The problem of parameterization is often central to the effective deployment of nature-inspired algorithms. However, finding the optimal set of parameter values for a combination of problem instance and solution method is highly challenging, and few concrete guidelines exist on how and when such tuning may be performed. Previous work tends to either focus on a specific algorithm or use benchmark problems, and both of these restrictions limit the applicability of any findings. Here, we examine a number of different algorithms, and study them in a "problem agnostic" fashion (i.e., one that is not tied to specific instances) by considering their performance on fitness landscapes with varying characteristics. Using this approach, we make a number of observations on which algorithms may (or may not) benefit from tuning, and in which specific circumstances.
    Comment: 8 pages, 7 figures. Accepted at the European Conference on Artificial Life (ECAL) 2013, Taormina, Italy.

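    One way to read the "problem agnostic" protocol above is as a sweep of a single algorithm parameter across landscapes of varying ruggedness. The sketch below uses an NK-style landscape generator and a (1+1) EA, with the per-bit mutation rate as the tuned parameter; these are generic stand-ins assumed for illustration, not the algorithms or problems studied in the paper.

        # Generic illustration of a tuning sweep over landscapes of varying
        # ruggedness: an NK-style landscape (higher K = more rugged) and a
        # (1+1) EA whose per-bit mutation rate is the tuned parameter.
        # These are stand-ins, not the algorithms or problems from the paper.
        import random

        def make_nk(n, k, rng):
            links = [[i] + rng.sample([j for j in range(n) if j != i], k)
                     for i in range(n)]
            tables = [{} for _ in range(n)]          # lazily filled contribution tables
            def fitness(bits):
                total = 0.0
                for i in range(n):
                    key = tuple(bits[j] for j in links[i])
                    tables[i].setdefault(key, rng.random())
                    total += tables[i][key]
                return total / n
            return fitness

        def one_plus_one_ea(fitness, n, rate, evals, rng):
            parent = [rng.randint(0, 1) for _ in range(n)]
            best = fitness(parent)
            for _ in range(evals):
                child = [b ^ (rng.random() < rate) for b in parent]
                f = fitness(child)
                if f >= best:
                    parent, best = child, f
            return best

        rng = random.Random(0)
        n = 20
        for k in (1, 4, 8):                          # increasing epistasis / ruggedness
            fit = make_nk(n, k, rng)
            scores = {rate: one_plus_one_ea(fit, n, rate, 2000, rng)
                      for rate in (1 / n, 2 / n, 4 / n)}
            print(k, max(scores, key=scores.get), scores)
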
    Visualising the Global Structure of Search Landscapes: Genetic Improvement as a Case Study

    The search landscape is a common metaphor for describing the structure of computational search spaces. Different landscape metrics can be computed and used to predict search difficulty. Yet the metaphor falls short in visualisation terms, because it is hard to represent complex landscapes, both in size and in dimensionality. This paper combines Local Optima Networks, as a compact representation of the global structure of a search space, with dimensionality reduction, using the t-Distributed Stochastic Neighbour Embedding (t-SNE) algorithm, in order both to bring the metaphor to life and to convey new insight into the search process. As a case study, two benchmark programs, under a Genetic Improvement bug-fixing scenario, are analysed and visualised using the proposed method. Local Optima Networks for both iterated local search and a hybrid genetic algorithm, across different neighbourhoods, are compared, highlighting the differences in how the landscape is explored.

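    The combination described above, a LON for the global structure plus t-SNE for a low-dimensional layout, can be prototyped with off-the-shelf libraries. The sketch below assumes scikit-learn, networkx and matplotlib, and assumes the local optima are fixed-length 0/1 genotypes; both assumptions are for illustration and do not reflect the paper's exact pipeline.

        # Sketch: lay out a LON in 2-D with t-SNE and draw it.
        # Assumes fixed-length 0/1 genotypes and the scikit-learn / networkx /
        # matplotlib stack; hypothetical helper, not the paper's pipeline.
        import numpy as np
        import networkx as nx
        import matplotlib.pyplot as plt
        from sklearn.manifold import TSNE

        def plot_lon(optima, edges, fitnesses):
            # optima: list of equal-length 0/1 genotypes
            # edges:  list of (i, j) index pairs between local optima
            X = np.asarray(optima, dtype=float)
            coords = TSNE(n_components=2, perplexity=5,      # perplexity < number of optima
                          init="random", random_state=0).fit_transform(X)
            g = nx.DiGraph()
            g.add_nodes_from(range(len(optima)))
            g.add_edges_from(edges)
            pos = {i: coords[i] for i in range(len(optima))}
            nx.draw(g, pos, node_size=40, node_color=fitnesses,
                    cmap=plt.cm.viridis, arrows=True, width=0.5)
            plt.show()
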
    The Local Optima Level in Chemotherapy Schedule Optimisation

    In this paper, a multi-drug Chemotherapy Schedule Optimisation Problem (CSOP) is subjected to Local Optima Network (LON) analysis. LONs capture global patterns in fitness landscapes. CSOPs have not previously been subjected to fitness landscape analysis; we fill this gap, constructing LONs and studying them for meaningful structure. The CSOP formulation presents novel challenges and questions for the LON model because there are infeasible regions in the fitness landscape and an unknown global optimum; it also brings a topic from healthcare to LON analysis. Two LON construction algorithms are proposed for sampling CSOP fitness landscapes: a Markov-Chain Construction Algorithm and a Hybrid Construction Algorithm. The results provide new insight into LONs of highly constrained spaces, and into the proficiency of search operators on the CSOP. Iterated Local Search and Memetic Search, which are the foundations for the LON algorithms, are found to markedly outperform a Genetic Algorithm from the literature.

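    The infeasible regions mentioned above must be handled before any LON sampler can walk the space; one common option is a penalised objective, so that infeasible schedules remain comparable rather than being discarded outright. The sketch below is only a generic illustration of that idea: the efficacy and toxicity functions and the toxicity limit are hypothetical placeholders, not the paper's CSOP formulation or its construction algorithms.

        # Minimal sketch of a penalised objective for a constrained schedule space.
        # efficacy, toxicity and tox_limit are hypothetical placeholders; the paper's
        # CSOP uses its own constraints and LON construction algorithms.
        def penalised_fitness(schedule, efficacy, toxicity, tox_limit, penalty=1e3):
            # schedule: sequence of per-cycle drug doses
            violation = max(0.0, toxicity(schedule) - tox_limit)
            return efficacy(schedule) - penalty * violation   # infeasible points stay ranked
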
    Biology of Applied Digital Ecosystems

    A primary motivation for our research in Digital Ecosystems is the desire to exploit the self-organising properties of biological ecosystems. Ecosystems are thought to be robust, scalable architectures that can automatically solve complex, dynamic problems. However, the biological processes that contribute to these properties have not been made explicit in Digital Ecosystems research. Here, we discuss how biological properties contribute to the self-organising features of biological ecosystems, including population dynamics, evolution, a complex dynamic environment, and spatial distributions that generate local interactions. The potential for exploiting these properties in artificial systems is then considered. We suggest that several key features of biological ecosystems have not been fully explored in existing digital ecosystems, and discuss how mimicking these features may assist in developing robust, scalable, self-organising architectures. An example architecture, the Digital Ecosystem, is considered in detail. The Digital Ecosystem is then measured experimentally through simulations, with measures originating from theoretical ecology, to confirm its likeness to a biological ecosystem, including the responsiveness to requests for applications from the user base as a measure of 'ecological succession' (development).
    Comment: 9 pages, 4 figures, conference.

    The consideration of surrogate model accuracy in single-objective electromagnetic design optimization

    The computational cost of evaluating the objective function in electromagnetic optimal design problems necessitates the use of cost-effective techniques. This paper describes how one popular technique, surrogate modelling, has been used in the single-objective optimization of electromagnetic devices. Three different types of surrogate model are considered, namely polynomial approximation, artificial neural networks and kriging. The importance of considering surrogate model accuracy is emphasised, and techniques used to improve accuracy for each type of model are discussed. Developments in this area outside the field of electromagnetic design optimization are also mentioned. It is concluded that surrogate model accuracy is an important factor which should be considered during an optimization search, and that developments have been made elsewhere in this area which are yet to be implemented in electromagnetic design optimization.

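    A concrete way to picture surrogate-assisted optimization is the loop below: fit a kriging (Gaussian process) model to the designs evaluated so far, screen many candidates cheaply on the surrogate, then verify the most promising one with the expensive solver and refit. This is a generic sketch under stated assumptions; expensive_eval and bounds are placeholders standing in for a field simulation and its design-variable ranges, not a specific electromagnetic benchmark or the paper's procedure.

        # Sketch of surrogate-assisted minimisation with a kriging (Gaussian
        # process) model. expensive_eval and bounds are placeholders standing in
        # for an electromagnetic simulation and its design-variable ranges.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        def surrogate_optimise(expensive_eval, bounds, n_init=10, n_iter=30, seed=0):
            rng = np.random.default_rng(seed)
            dim = len(bounds)
            lo, hi = np.array(bounds, dtype=float).T
            X = rng.uniform(lo, hi, size=(n_init, dim))        # initial design of experiments
            y = np.array([expensive_eval(x) for x in X])
            gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
            for _ in range(n_iter):
                gp.fit(X, y)                                   # refit: surrogate accuracy matters
                cand = rng.uniform(lo, hi, size=(1000, dim))   # cheap screening on the surrogate
                x_new = cand[np.argmin(gp.predict(cand))]
                y_new = expensive_eval(x_new)                  # verify with the true model
                X, y = np.vstack([X, x_new]), np.append(y, y_new)
            return X[np.argmin(y)], y.min()
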
    Computationally Efficient Local Optima Network Construction

    The codebase for this paper is available at https://github.com/fieldsend/local_optima_networks
    There has been an increasing amount of research on the visualisation of search landscapes through the use of exact and approximate local optima networks (LONs). Although there are many papers describing the construction of a LON, there is a dearth of released code to support the general practitioner in constructing a LON for their problem. Furthermore, a naive implementation of the algorithms described in work on LONs will lead to inefficient and costly code, due to the possibility of repeatedly re-evaluating neighbourhood members and of partially overlapping greedy paths. Here we discuss algorithms for the efficient computation of both exact and approximate LONs, and provide open source code online. We also provide empirical illustrations of the reduction in the number of recursive greedy calls and quality function calls that can be obtained on NK model landscapes, and on discretised versions of the IEEE CEC 2013 niching competition test functions, using the developed framework compared to naive implementations. In many instances, improvements of multiple orders of magnitude are observed.
    This work was supported by the Engineering and Physical Sciences Research Council [grant number EP/N017846/1]. The author would like to thank Sébastien Vérel and Gabriela Ochoa for providing inspirational invited talks on LONs at the University of Exeter during this grant, and also Ozgur Akman, Khulood Alyahya and Kevin Doherty.
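
    The two inefficiencies called out above, re-evaluating neighbourhood members and re-walking partially overlapping greedy paths, are both caching problems. The sketch below is an illustrative stand-in using a memoised quality function and a basin cache; the released code at the repository linked above is the authoritative implementation.

        # Illustrative caching, not the released implementation: memoise the
        # quality function and remember the local optimum reached from every
        # solution already visited, so overlapping greedy paths are not re-walked.
        from functools import lru_cache

        def make_cached_descent(fitness, neighbours):
            cached_fitness = lru_cache(maxsize=None)(fitness)  # no repeated evaluations
            basin_of = {}                                      # solution -> its local optimum

            def descend(sol):
                # Assumes hashable solutions and non-empty neighbourhoods.
                path = []
                while sol not in basin_of:
                    path.append(sol)
                    best = max(neighbours(sol), key=cached_fitness)
                    if cached_fitness(best) <= cached_fitness(sol):
                        basin_of[sol] = sol                    # sol is itself a local optimum
                        break
                    sol = best
                opt = basin_of[sol]
                for s in path:                                 # share the endpoint along the path
                    basin_of[s] = opt
                return opt

            return descend, cached_fitness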