
    When hypermutations and ageing enable artificial immune systems to outperform evolutionary algorithms

    We present a time complexity analysis of the Opt-IA artificial immune system (AIS). We first highlight the power and limitations of its distinguishing operators (i.e., hypermutations with mutation potential and ageing) by analysing them in isolation. Recent work has shown that ageing combined with local mutations can help escape local optima on a dynamic optimisation benchmark function. We generalise this result by rigorously proving that, compared to evolutionary algorithms (EAs), ageing leads to impressive speed-ups on the standard Cliff benchmark function when using both local and global mutations. Unless the stop at first constructive mutation (FCM) mechanism is applied, we show that hypermutations require exponential expected runtime to optimise any function with a polynomial number of optima. If instead FCM is used, the expected runtime is at most a linear factor larger than the upper bound achieved for any random local search algorithm using the artificial fitness levels method. Nevertheless, we prove that algorithms using hypermutations can be considerably faster than EAs at escaping local optima. An analysis of the complete Opt-IA reveals that it is efficient on the previously considered functions and highlights problems where the use of the full algorithm is crucial. We complete the picture by presenting a class of functions for which Opt-IA fails with overwhelming probability while standard EAs are efficient.
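
    Below is a minimal Python sketch of the two distinguishing operators analysed above, static hypermutation with FCM and pure ageing, for bitstring maximisation. It is an illustrative reading of the operators, not the authors' implementation; the names, the mutation potential and the maximum age tau are assumptions, and whether a "constructive" mutation must strictly improve the parent or merely not worsen it varies between formulations (a non-worsening flip is used here).

        import random

        def hypermutate_fcm(x, fitness, potential):
            """Static hypermutation with stop at first constructive mutation (FCM):
            flip up to `potential` distinct bits one at a time and stop as soon as a
            flip does not worsen the parent (taken here as the constructive mutation)."""
            parent_fitness = fitness(x)
            y = list(x)
            for pos in random.sample(range(len(x)), min(potential, len(x))):
                y[pos] = 1 - y[pos]
                if fitness(y) >= parent_fitness:   # first constructive mutation: stop early
                    return y
            return y                               # no constructive flip found: return the last mutant

        def age_population(individuals, ages, tau):
            """Static pure ageing: every individual grows one generation older and
            individuals whose age exceeds tau are removed from the population."""
            survivors = [(x, a + 1) for x, a in zip(individuals, ages) if a + 1 <= tau]
            return [x for x, _ in survivors], [a for _, a in survivors]

    Without the FCM check the operator always flips all `potential` bits before returning; that is the variant which, as stated above, requires exponential expected runtime on any function with polynomially many optima.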

    Artificial Immune Systems for Combinatorial Optimisation: A Theoretical Investigation

    We focus on the clonal selection inspired computational models of the immune system developed for general-purpose optimisation. Our aim is to highlight when these artificial immune systems (AIS) are more efficient than evolutionary algorithms (EAs). Compared to traditional EAs, AIS use considerably higher mutation rates (hypermutations) for variation, and give higher selection probabilities to more recent solutions and lower ones to older solutions (ageing). We consider the standard Opt-IA, which includes both of the AIS distinguishing features, and argue why it is of greater applicability than other popular AIS. Our first result is the proof that the stop at first constructive mutation (FCM) version of its hypermutation operator is essential. Without it, the hypermutations cannot optimise any function with an arbitrary polynomial number of optima. Afterwards we show that the hypermutations are exponentially faster than the standard bit mutation operator used in traditional EAs at escaping from local optima of standard benchmark function classes and of the NP-hard Partition problem. If the basin of attraction of the local optima is not too large, then ageing allows even greater speed-ups. For the Cliff benchmark function this can make the difference between exponential and quasi-linear expected time. If the basin of attraction is too large, then ageing can implicitly detect the local optimum and escape it by automatically restarting the search process. The described power of hypermutations and ageing allows us to prove that they guarantee (1+ε) approximations for Partition in expected polynomial time for any constant ε. These features come at the expense of the hypermutations being a linear factor slower than EAs for standard unimodal benchmark functions and of eliminating the power of ageing at escaping local optima in the complete Opt-IA. We show that hypermutating with inversely proportional rates mitigates such drawbacks at the expense of losing the explorative advantages of the standard operator. We conclude the thesis by designing fast hypermutation operators that are provably a linear factor faster than the traditional ones for the unimodal benchmark functions and Partition, while maintaining explorative power and working in harmony together with ageing.
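
    The complete Opt-IA combines these operators in a clonal-selection loop. The following is a rough, hypothetical sketch of one generation of such a loop; the parameter names (dup clones per individual, maximum age tau, population size mu) and the topping-up with random age-0 individuals are assumptions about a typical Opt-IA scheme rather than the exact algorithm analysed in the thesis. The hypermutate argument can be any mutation operator, for example a wrapper around the hypermutate_fcm sketch from the previous entry.

        import random

        def opt_ia_generation(population, ages, fitness, hypermutate, dup, tau, mu):
            """One generation of a generic Opt-IA-style loop:
            cloning, hypermutation, ageing and selection."""
            n = len(population[0])
            # 1. Cloning: each individual produces dup copies that inherit its age.
            clones = [(list(x), a) for x, a in zip(population, ages) for _ in range(dup)]
            # 2. Hypermutation: mutate every clone; clones that strictly improve
            #    on their parent are assigned age 0.
            mutants = []
            for x, a in clones:
                y = hypermutate(x, fitness)
                mutants.append((y, 0 if fitness(y) > fitness(x) else a))
            # 3. Ageing: everybody grows older; individuals older than tau die.
            pool = [(x, a + 1) for x, a in list(zip(population, ages)) + mutants if a + 1 <= tau]
            # 4. Selection: keep the best mu survivors, topping up with random
            #    age-0 individuals if fewer than mu survived ageing.
            pool.sort(key=lambda xa: fitness(xa[0]), reverse=True)
            pool = pool[:mu]
            while len(pool) < mu:
                pool.append(([random.randint(0, 1) for _ in range(n)], 0))
            return [x for x, _ in pool], [a for _, a in pool]

    For instance, hypermutate=lambda x, f: hypermutate_fcm(x, f, len(x)) would plug in the FCM operator with a linear static mutation potential.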

    On inversely proportional hypermutations with mutation potential

    Artificial Immune Systems (AIS) employing hypermutations with linear static mutation potential have recently been shown to be very effective at escaping local optima of combinatorial optimisation problems, at the expense of being slower than standard evolutionary algorithms during the exploitation phase. In this paper we prove that considerable speed-ups in the exploitation phase may be achieved with dynamic inversely proportional mutation potentials (IPM) and argue that the potential should decrease inversely to the distance to the optimum rather than to the difference in fitness. Afterwards we define a simple (1+1) Opt-IA that uses IPM hypermutations and ageing, intended for realistic applications where optimal solutions are unknown. The aim of the AIS is to approximate the ideal behaviour of the inversely proportional hypermutations better and better as the search space is explored. We prove that such desired behaviour, and the related speed-ups, occur for a well-studied bimodal benchmark function called TwoMax. Furthermore, we prove that the (1+1) Opt-IA with IPM efficiently optimises a third bimodal function, Cliff, by escaping its local optima, while Opt-IA with static potential cannot and requires expected runtime that is exponential in the distance between the cliff and the optimum.
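
    To make the inversely proportional idea concrete, here is a small, hypothetical Python sketch in the spirit of the abstract: the number of flipped bits shrinks with the estimated distance to the optimum, using an externally supplied estimate of the optimal fitness. The linear scaling, the names and the defaults are assumptions, not the exact IPM operator defined in the paper.

        import random

        def ipm_potential(f_current, f_estimated_opt, n, min_flips=1):
            """Mutation potential that shrinks roughly in proportion to the estimated
            distance to the optimum: about n flips far away from it, down to min_flips
            once f_current reaches the estimate. The linear scaling is an assumption."""
            if f_estimated_opt <= 0:
                return n
            remaining = max(0.0, 1.0 - f_current / f_estimated_opt)  # estimated relative distance
            return max(min_flips, int(round(remaining * n)))

        def ipm_hypermutate(x, fitness, f_estimated_opt):
            """Flip ipm_potential(...) distinct, uniformly chosen bits of x.
            An FCM check could be layered on top, as in the static operator."""
            n = len(x)
            k = ipm_potential(fitness(x), f_estimated_opt, n)
            y = list(x)
            for pos in random.sample(range(n), k):
                y[pos] = 1 - y[pos]
            return y

    In a (1+1) setting the estimate f_estimated_opt could simply be the best fitness found so far, so that the operator approaches the ideal inversely proportional behaviour as the search space is explored, which is the effect described in the abstract.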

    The (1+(λ,λ)) Genetic Algorithm on the Vertex Cover Problem: Crossover Helps Leaving Plateaus
