
    Understanding Phase Transitions with Local Optima Networks: Number Partitioning as a Case Study

    Phase transitions play an important role in understanding search difficulty in combinatorial optimisation. However, previous attempts have not revealed a clear link between fitness landscape properties and the phase transition. We explore whether the global landscape structure of the number partitioning problem changes across the phase transition. Using the local optima network model, we analyse a number of instances before, during, and after the phase transition. We compute relevant network and neutrality metrics and, importantly, identify and visualise the funnel structure with an approach (monotonic sequences) inspired by theoretical chemistry. While most metrics remain oblivious to the phase transition, our results reveal that the funnel structure clearly changes: easy instances feature a single funnel, or a small number of dominant funnels, leading to global optima; hard instances have a large number of suboptimal funnels attracting the search. Our study brings new insights and tools to the study of phase transitions in combinatorial optimisation.
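    The local optima underlying such a network can be enumerated for a toy instance by exhaustive hill climbing. The sketch below is illustrative only (the instance size, seed, and helper names are assumptions, not the paper's experimental setup): it hill-climbs from every bit string of a small number partitioning instance and counts the distinct local optima and their basin sizes, the raw ingredients of a local optima network.

    ```python
    import itertools
    import random

    def npp_fitness(bits, nums):
        """Absolute difference between the two subset sums (lower is better)."""
        return abs(sum(n if b else -n for b, n in zip(bits, nums)))

    def hill_climb(bits, nums):
        """Best-improvement hill climbing under one-bit-flip moves."""
        bits = list(bits)
        while True:
            current = npp_fitness(bits, nums)
            best_move, best_fit = None, current
            for i in range(len(bits)):
                bits[i] ^= 1
                f = npp_fitness(bits, nums)
                if f < best_fit:
                    best_move, best_fit = i, f
                bits[i] ^= 1
            if best_move is None:          # no strictly improving flip: local optimum
                return tuple(bits), current
            bits[best_move] ^= 1

    random.seed(0)
    nums = [random.randrange(1, 100) for _ in range(10)]
    basin = {}
    for start in itertools.product([0, 1], repeat=len(nums)):
        opt, fit = hill_climb(start, nums)
        # The two complementary encodings describe the same partition; merge them.
        key = min(opt, tuple(1 - b for b in opt))
        basin[key] = basin.get(key, 0) + 1

    print(len(basin), "local optima; largest basins:",
          sorted(basin.values(), reverse=True)[:5])
    ```

    A full local optima network would additionally record escape edges between these optima; funnel identification via monotonic sequences is then a traversal of that graph.
    
    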

    How to Escape Local Optima in Black Box Optimisation: When Non-elitism Outperforms Elitism

    Escaping local optima is one of the major obstacles to function optimisation. Using the metaphor of a fitness landscape, local optima correspond to hills separated by fitness valleys that have to be overcome. We define a class of fitness valleys of tunable difficulty by considering their length, the length of the Hamming path between the two optima, and their depth, the drop in fitness. For this function class we present a runtime comparison between stochastic search algorithms using different search strategies. The (1+1) EA is a simple and well-studied evolutionary algorithm that has to jump across the valley to a point of higher fitness because it does not accept worsening moves (elitism). In contrast, the Metropolis algorithm and the Strong Selection Weak Mutation (SSWM) algorithm, a famous process in population genetics, are both able to cross the fitness valley by accepting worsening moves. We show that the runtime of the (1+1) EA depends critically on the length of the valley, while the runtimes of the non-elitist algorithms depend crucially on the depth of the valley. Moreover, we show that both SSWM and Metropolis can also efficiently optimise a rugged function consisting of consecutive valleys.
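    The contrast between elitist and non-elitist acceptance can be seen on a tiny hand-made valley. The sketch below is a simplification, not the paper's analysis: it uses ±1 local moves on an integer line (the (1+1) EA actually mutates bit strings globally), and the fitness values, temperature, and step budget are arbitrary choices for illustration. Elitist acceptance gets trapped at the local optimum before the valley; Metropolis acceptance can cross it.

    ```python
    import math
    import random

    def metropolis_accept(delta, temperature, rng):
        """Accept improvements always; accept worsenings with prob exp(delta / T)."""
        return delta >= 0 or rng.random() < math.exp(delta / temperature)

    def run(fitness, steps, accept, rng):
        """±1 random local search on an integer line; returns the best index seen."""
        x, best = 0, 0
        for _ in range(steps):
            y = max(0, min(len(fitness) - 1, x + rng.choice((-1, 1))))
            if accept(fitness[y] - fitness[x]):
                x = y
            if fitness[x] > fitness[best]:
                best = x
        return best

    # A valley of depth 2 between a local optimum (index 3, fitness 3)
    # and the global optimum (index 8, fitness 5).
    fitness = [0, 1, 2, 3, 1, 1, 1, 1, 5]
    rng = random.Random(1)
    elitist = run(fitness, 5000, lambda d: d >= 0, rng)
    metro = run(fitness, 5000, lambda d: metropolis_accept(d, 1.0, rng), rng)
    print("elitist best:", elitist, "metropolis best:", metro)
    ```

    Shallower valleys are crossed more easily by Metropolis (acceptance probability exp(-depth/T) per attempt), which mirrors the paper's finding that non-elitist runtimes scale with valley depth rather than length.
    
    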

    Local Optima Networks of the Permutation Flow-Shop Problem

    This article extracts and analyzes local optima networks for the permutation flow-shop problem. Two widely used move operators for permutation representations, namely swap and insertion, are incorporated into the network landscape model. The performance of a heuristic search algorithm on this problem is also analyzed. In particular, we study the correlation between local optima network features and the performance of an iterated local search heuristic. Our analysis reveals that network features can explain and predict problem difficulty. The evidence confirms the superiority of the insertion operator for this problem.
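    The two move operators compared above are easy to state concretely. The sketch below (an illustration, not the paper's code) generates the swap and insertion neighbourhoods of a permutation; note that insertion yields the larger neighbourhood, (n-1)² distinct permutations versus n(n-1)/2 for swap, once duplicate adjacent moves are merged.

    ```python
    def swap_neighbours(perm):
        """All permutations reachable by exchanging the items at two positions."""
        n = len(perm)
        for i in range(n):
            for j in range(i + 1, n):
                q = list(perm)
                q[i], q[j] = q[j], q[i]
                yield tuple(q)

    def insertion_neighbours(perm):
        """All permutations reachable by removing one item and reinserting it."""
        n = len(perm)
        for i in range(n):
            rest = perm[:i] + perm[i + 1:]
            for j in range(n):
                if j != i:  # reinserting at the same position is the identity move
                    yield tuple(rest[:j] + (perm[i],) + rest[j:])

    p = (0, 1, 2, 3)
    print("swap:", len(set(swap_neighbours(p))),
          "insertion:", len(set(insertion_neighbours(p))))
    ```

    For a 4-element permutation this gives 6 swap neighbours and 9 distinct insertion neighbours; the richer insertion neighbourhood is one intuition behind its superiority for flow-shop schedules.
    
    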

    Testing, Evaluation and Performance of Optimization and Learning Systems

    Benchmarks and test suites are widely used to evaluate optimization and learning systems. The advantage is that these test problems provide an objective means of comparing systems. The potential disadvantage is that systems can become overfitted to work well on benchmarks, so that good performance on benchmarks does not generalize to real-world problems. The meaning and significance of benchmarks is examined in light of theoretical results such as "No Free Lunch." The "structure" of common benchmarks is also explored.

    Genetic and Local Search Algorithms as Robust and Simple Optimization Tools

    One of the attractive features of recent metaheuristics is their robustness and simplicity. To investigate this direction, the single machine scheduling problem is solved by various genetic algorithms (GA) and random multi-start local search algorithms (MLS), using rather simple definitions of neighbors, mutations, and crossovers. The results indicate that: (1) the performance of GA is not sensitive to the choice of crossover if implemented with mutations, (2) a simple implementation of MLS is usually competitive with (or even better than) GA, (3) a GRASP-type modification of MLS improves its performance to some extent, and (4) GA combined with local search is quite effective if longer computational time is allowed.
    Key words: combinatorial optimization, metaheuristics, genetic algorithm, local search, GRASP, single machine scheduling.
    1. Introduction. There are numerous combinatorial optimization problems for which computing exact optimal solutions is computationally intractable, e.g., those known..
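    The MLS baseline described above is a few lines of code. The sketch below is an assumed setup, not the paper's implementation: a random single-machine instance with weighted tardiness as the objective, first-improvement local search over adjacent swaps, restarted from random orders, keeping the best schedule found.

    ```python
    import random

    def total_weighted_tardiness(seq, jobs):
        """jobs[i] = (processing_time, due_date, weight); lower cost is better."""
        t, cost = 0, 0
        for i in seq:
            p, d, w = jobs[i]
            t += p
            cost += w * max(0, t - d)
        return cost

    def local_search(seq, jobs):
        """First-improvement local search under adjacent-swap moves."""
        seq = list(seq)
        cur = total_weighted_tardiness(seq, jobs)
        improved = True
        while improved:
            improved = False
            for i in range(len(seq) - 1):
                seq[i], seq[i + 1] = seq[i + 1], seq[i]
                new = total_weighted_tardiness(seq, jobs)
                if new < cur:
                    cur, improved = new, True
                else:  # not an improvement: undo the swap
                    seq[i], seq[i + 1] = seq[i + 1], seq[i]
        return seq, cur

    def multi_start(jobs, starts, rng):
        """Restart local search from random orders; keep the best local optimum."""
        best_seq, best_cost = None, float("inf")
        order = list(range(len(jobs)))
        for _ in range(starts):
            rng.shuffle(order)
            seq, cost = local_search(order, jobs)
            if cost < best_cost:
                best_seq, best_cost = seq, cost
        return best_seq, best_cost

    rng = random.Random(42)
    jobs = [(rng.randrange(1, 10), rng.randrange(5, 40), rng.randrange(1, 5))
            for _ in range(8)]
    seq, cost = multi_start(jobs, 20, rng)
    print("best sequence:", seq, "weighted tardiness:", cost)
    ```

    A GRASP-type variant would replace the random restart order with a greedily randomized construction; hybridizing GA with this local search corresponds to result (4) above.
    
    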