
    On Easiest Functions for Mutation Operators in Bio-Inspired Optimisation

    Understanding which function classes are easy and which are hard for a given algorithm is a fundamental question for the analysis and design of bio-inspired search heuristics. A natural starting point is to consider the easiest and hardest functions for an algorithm. For the (1+1) EA using standard bit mutation (SBM) it is well known that OneMax is an easiest function with a unique optimum while Trap is a hardest one. In this paper we extend the analysis of easiest function classes to the contiguous somatic hypermutation (CHM) operator used in artificial immune systems. We define a function MinBlocks and prove that it is an easiest function for the (1+1) EA using CHM, presenting both a runtime and a fixed budget analysis. Since MinBlocks is, up to a factor of 2, a hardest function for standard bit mutations, we consider the effects of combining both operators into a hybrid algorithm. We rigorously prove that by combining the advantages of k operators, several hybrid algorithmic schemes have optimal asymptotic performance on the easiest functions for each individual operator. In particular, the hybrid algorithms using CHM and SBM have optimal asymptotic performance on both OneMax and MinBlocks. We then investigate easiest functions for hybrid schemes and show that an easiest function for a hybrid algorithm is not just a trivial weighted combination of the respective easiest functions for each operator.
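
    To make the operators named above concrete, here is a minimal sketch of the (1+1) EA together with simplified versions of standard bit mutation and contiguous somatic hypermutation on OneMax. The CHM variant shown (flipping one random, non-wrapping contiguous segment), the parameter choices and the improvement criterion are illustrative assumptions; the paper's exact CHM operator and its MinBlocks function follow the formal definitions given there.

        import random

        def one_max(x):
            # OneMax: the number of 1-bits; maximised by the all-ones string.
            return sum(x)

        def sbm(x, rng):
            # Standard bit mutation (SBM): flip each bit independently with probability 1/n.
            n = len(x)
            return [b ^ 1 if rng.random() < 1.0 / n else b for b in x]

        def chm(x, rng):
            # Contiguous somatic hypermutation (CHM), one simple variant:
            # choose a random start position and length, then flip the whole
            # contiguous segment. Other variants wrap around or flip each bit
            # in the region only with some probability.
            n = len(x)
            start = rng.randrange(n)
            length = rng.randint(1, n)
            y = x[:]
            for i in range(start, min(start + length, n)):
                y[i] ^= 1
            return y

        def one_plus_one_ea(fitness, mutate, n=50, budget=10_000, seed=0):
            # (1+1) EA: keep the offspring whenever it is at least as good as the parent.
            rng = random.Random(seed)
            x = [rng.randint(0, 1) for _ in range(n)]
            for _ in range(budget):
                y = mutate(x, rng)
                if fitness(y) >= fitness(x):
                    x = y
            return x, fitness(x)

        print(one_plus_one_ea(one_max, sbm)[1])  # SBM suits OneMax well
        print(one_plus_one_ea(one_max, chm)[1])  # CHM behaves very differently here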

    Artificial immune systems can find arbitrarily good approximations for the NP-hard number partitioning problem

    Typical artificial immune system (AIS) operators such as hypermutations with mutation potential and ageing allow AISs to efficiently overcome local optima from which evolutionary algorithms (EAs) struggle to escape. Such behaviour has been shown for artificial example functions constructed especially to show difficulties that EAs may encounter during the optimisation process. However, no evidence is available indicating that these two operators exhibit similar behaviour also on more realistic problems. In this paper we perform an analysis for the standard NP-hard Partition problem from combinatorial optimisation and rigorously show that hypermutations and ageing allow AISs to efficiently escape from local optima where standard EAs require exponential time. As a result we prove that while EAs and random local search (RLS) may get trapped on 4/3 approximations, AISs find arbitrarily good approximate solutions of ratio (1+ϵ) within n(ϵ^(−(2/ϵ)−1))(1 − ϵ)^(−2) e^3 2^(2/ϵ) + 2n^3 2^(2/ϵ) + 2n^3 function evaluations in expectation. This expectation is polynomial in the problem size and exponential only in 1/ϵ.
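
    As a quick illustration of the problem setting (not of the paper's algorithms), the sketch below encodes a Partition instance in the bit-string form commonly used in such runtime analyses: bit i assigns the i-th number to one of two bins, and the objective, to be minimised, is the weight of the heavier bin. The instance is made up for illustration only.

        def partition_makespan(weights, x):
            # Weight of the heavier of the two bins induced by the bit string x.
            bin0 = sum(w for w, b in zip(weights, x) if b == 0)
            bin1 = sum(w for w, b in zip(weights, x) if b == 1)
            return max(bin0, bin1)

        weights = [10, 7, 6, 5, 4]                            # total 32, perfect split is 16
        print(partition_makespan(weights, [0, 1, 1, 0, 1]))   # 15 vs 17 -> 17
        print(partition_makespan(weights, [0, 0, 1, 1, 1]))   # 17 vs 15 -> 17
        print(partition_makespan(weights, [0, 1, 0, 1, 1]))   # 16 vs 16 -> 16 (optimal)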

    Artificial Immune Systems for Combinatorial Optimisation: A Theoretical Investigation

    We focus on the clonal selection inspired computational models of the immune system developed for general-purpose optimisation. Our aim is to highlight when these artificial immune systems (AIS) are more efficient than evolutionary algorithms (EAs). Compared to traditional EAs, AIS use considerably higher mutation rates (hypermutations) for variation, give higher selection probabilities to more recent solutions and lower selection probabilities to older ones (ageing). We consider the standard Opt-IA, which includes both of the AIS distinguishing features, and argue why it is of greater applicability than other popular AIS. Our first result is the proof that the stop at first constructive mutation version of its hypermutation operator is essential: without it, the hypermutations cannot efficiently optimise any function with a polynomial number of optima. Afterwards we show that the hypermutations are exponentially faster than the standard bit mutation operator used in traditional EAs at escaping from local optima of standard benchmark function classes and of the NP-hard Partition problem. If the basin of attraction of the local optima is not too large, then ageing allows even greater speed-ups. For the Cliff benchmark function this can make the difference between exponential and quasi-linear expected time. If the basin of attraction is too large, then ageing can implicitly detect the local optimum and escape it by automatically restarting the search process. The described power of hypermutations and ageing allows us to prove that they guarantee (1+ϵ) approximations for Partition in expected polynomial time for any constant ϵ. These features come at the expense of the hypermutations being a linear factor slower than EAs for standard unimodal benchmark functions and of eliminating the power of ageing at escaping local optima in the complete Opt-IA. We show that hypermutating with inversely proportional rates mitigates such drawbacks at the expense of losing the explorative advantages of the standard operator. We conclude the thesis by designing fast hypermutation operators that are provably a linear factor faster than the traditional ones for the unimodal benchmark functions and Partition, while maintaining explorative power and working in harmony with ageing.
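
    The "stop at first constructive mutation" (FCM) mechanism highlighted above can be pictured with the following simplified static hypermutation: flip up to a fixed number of distinct bits one at a time, in random order, and stop as soon as the fitness improves. The parameter names, the default mutation potential and the strict-improvement criterion are illustrative assumptions; the operator analysed in the thesis follows its formal definition.

        import random

        def hypermutation_fcm(x, fitness, rng, potential=None):
            # Simplified static hypermutation with FCM. Without FCM the operator
            # would always return the string with all `potential` bits flipped.
            n = len(x)
            potential = n if potential is None else potential
            base = fitness(x)
            y = x[:]
            for i in rng.sample(range(n), potential):
                y[i] ^= 1
                if fitness(y) > base:   # first constructive mutation: stop early
                    return y
            return y                    # no improvement found: return fully mutated point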

    When hypermutations and ageing enable artificial immune systems to outperform evolutionary algorithms

    We present a time complexity analysis of the Opt-IA artificial immune system (AIS). We first highlight the power and limitations of its distinguishing operators (i.e., hypermutations with mutation potential and ageing) by analysing them in isolation. Recent work has shown that ageing combined with local mutations can help escape local optima on a dynamic optimisation benchmark function. We generalise this result by rigorously proving that, compared to evolutionary algorithms (EAs), ageing leads to impressive speed-ups on the standard Cliff benchmark function both when using local and when using global mutations. Unless the stop at first constructive mutation (FCM) mechanism is applied, we show that hypermutations require exponential expected runtime to optimise any function with a polynomial number of optima. If instead FCM is used, the expected runtime is at most a linear factor larger than the upper bound achieved for any random local search algorithm using the artificial fitness levels method. Nevertheless, we prove that algorithms using hypermutations can be considerably faster than EAs at escaping local optima. An analysis of the complete Opt-IA reveals that it is efficient on the previously considered functions and highlights problems where the use of the full algorithm is crucial. We complete the picture by presenting a class of functions for which Opt-IA fails with overwhelming probability while standard EAs are efficient.
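
    The ageing operator referred to throughout can be summarised with the bookkeeping below: every individual carries an age that grows by one per generation, and individuals older than a threshold tau are removed and replaced by fresh random ones, which acts as an implicit restart. How offspring inherit age and how removals are randomised differ between variants; this is only one plausible reading, not the exact operator analysed in the paper.

        import random

        def ageing_step(population, tau, n, rng):
            # population: list of (bit string, age) pairs; returns the next population.
            survivors = []
            for x, age in population:
                age += 1
                if age <= tau:
                    survivors.append((x, age))
                else:
                    fresh = [rng.randint(0, 1) for _ in range(n)]
                    survivors.append((fresh, 0))
            return survivors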

    Artificial Immune Systems can find arbitrarily good approximations for the NP-Hard partition problem

    Typical Artificial Immune System (AIS) operators such as hypermutations with mutation potential and ageing allow AISs to efficiently overcome local optima from which Evolutionary Algorithms (EAs) struggle to escape. Such behaviour has been shown for artificial example functions such as Jump, Cliff or Trap constructed especially to show difficulties that EAs may encounter during the optimisation process. However, no evidence is available indicating that similar effects may also occur in more realistic problems. In this paper we perform an analysis for the standard NP-Hard Partition problem from combinatorial optimisation and rigorously show that hypermutations and ageing allow AISs to efficiently escape from local optima where standard EAs require exponential time. As a result we prove that while EAs and Random Local Search may get trapped on 4/3 approximations, AISs find arbitrarily good approximate solutions of ratio (1+ϵ) for any constant ϵ within a time that is polynomial in the problem size and exponential only in 1/ϵ.
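
    For orientation, the benchmark functions named in this abstract are standard pseudo-Boolean constructions; the sketch below gives the textbook definitions of Jump and Cliff over bit strings (maximisation, with the all-ones string optimal) as they usually appear in the runtime-analysis literature, not as reproduced from the paper itself.

        def jump(x, k):
            # Jump_k: OneMax with a gap of width k just below the optimum,
            # which mutation-based EAs must cross in a single step.
            n, ones = len(x), sum(x)
            return k + ones if (ones <= n - k or ones == n) else n - ones

        def cliff(x, d):
            # Cliff_d: OneMax with a fitness drop of roughly d once more than
            # n - d bits are set; accepting worse offspring or restarting
            # (e.g. via ageing) helps to cross the cliff.
            n, ones = len(x), sum(x)
            return ones if ones <= n - d else ones - d + 0.5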