2 research outputs found

    A non-revisiting particle swarm optimization

    In this article, a non-revisiting particle swarm optimization (NrPSO) is proposed. NrPSO integrates the non-revisiting scheme with a standard particle swarm optimization (PSO), guaranteeing that no updated position has been evaluated before. This property yields two advantages: 1) it reduces the computational cost of evaluating a time-consuming and expensive objective function, and 2) it helps prevent premature convergence. The non-revisiting scheme acts as a self-adaptive mutation, so particles adaptively switch between local search and global search. In addition, since the adaptive mutation scheme of NrPSO involves no parameter, whereas other variants of PSO involve at least two performance-sensitive parameters, the performance of NrPSO is more reliable. The simulation results show that NrPSO outperforms four variants of PSO on both uni-modal and multi-modal functions with dimensions up to 40. We also show that the overhead and archive size of NrPSO are insignificant, so NrPSO is practical for real-world applications. In addition, the performance of NrPSO is shown to be insensitive to the chosen parameter values. © 2008 IEEE.
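The non-revisiting idea the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's method: NrPSO maintains its archive with a tree structure over space partitions, while this sketch simply quantizes positions onto a grid and keeps a set of visited cells; the function names (`nrpso`, `sphere`, `quantize`) and all constants are assumptions chosen for the example. A particle whose PSO update lands in an already-evaluated cell is perturbed until it reaches an unvisited cell, so the objective is never evaluated twice at the same point, which illustrates both claimed advantages (no wasted expensive evaluations, and a built-in mutation that counters premature convergence).

```python
import random

def sphere(x):
    # Toy uni-modal objective standing in for an expensive function.
    return sum(v * v for v in x)

def quantize(x, step=0.1):
    # Map a continuous position onto a discrete grid-cell key.
    return tuple(round(v / step) for v in x)

def nrpso(f, dim=2, swarm=10, iters=50, bounds=(-5.0, 5.0), seed=1):
    random.seed(seed)
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    archive = {quantize(x) for x in pos}          # cells already evaluated
    pbest = [list(x) for x in pos]
    pbest_f = [f(x) for x in pos]
    g = min(range(swarm), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]
    for _ in range(iters):
        for i in range(swarm):
            # Standard PSO velocity/position update.
            vel[i] = [0.7 * v + 1.5 * random.random() * (p - x)
                      + 1.5 * random.random() * (gb - x)
                      for x, v, p, gb in zip(pos[i], vel[i], pbest[i], gbest)]
            cand = [min(hi, max(lo, x + v)) for x, v in zip(pos[i], vel[i])]
            # Non-revisiting check: mutate until the cell is unvisited,
            # so no position is ever evaluated twice (self-adaptive mutation).
            while quantize(cand) in archive:
                cand = [min(hi, max(lo, x + random.gauss(0, 0.5)))
                        for x in cand]
            archive.add(quantize(cand))
            pos[i] = cand
            fx = f(cand)
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = list(cand), fx
                if fx < gbest_f:
                    gbest, gbest_f = list(cand), fx
    return gbest_f, len(archive)
```

Note that the archive grows by exactly one cell per particle per iteration, which mirrors the abstract's point that the archive size stays modest relative to the number of evaluations.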

    Continuous non-revisiting genetic algorithm

    The non-revisiting genetic algorithm (NrGA) is extended to handle continuous search spaces. The extended model, Continuous NrGA (cNrGA), employs the same tree-structure archive as NrGA to memorize the evaluated solutions, dividing the search space into non-overlapping partitions according to the distribution of the solutions. cNrGA is a bi-modulus evolutionary algorithm consisting of a genetic algorithm module (GAM) and an adaptive mutation module (AMM). When GAM generates an offspring, the offspring is sent to AMM and mutated according to the density of the solutions stored in the memory archive. A high solution density at a point in the search space implies a high probability that the point is close to the optimum, so a near search is suggested; conversely, a far search is recommended for a point with low solution density. Benefiting from the space-partitioning scheme, a fast solution-density approximation is obtained, and the adaptive mutation scheme naturally avoids generating out-of-bound solutions. The performance of cNrGA is tested on 14 benchmark functions with dimensions ranging from 2 to 40, and compared with a real-coded GA, differential evolution, the covariance matrix adaptation evolution strategy and two improved particle swarm optimization algorithms. The simulation results show that cNrGA outperforms the other algorithms for multi-modal function optimization.
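The density-driven near/far search can be sketched as below. This is an illustrative approximation only: cNrGA derives the solution density from its tree-structure space partitions, whereas this sketch estimates it by counting archived solutions within a fixed radius; the names (`adaptive_mutate`, `local_density`) and the `sigma = base_sigma / (1 + d)` rule are assumptions for the example, not the paper's formula. The point shown is the qualitative behavior: a dense neighborhood yields a small mutation step (near search), a sparse one yields a large step (far search), and clipping to the bounds keeps every mutant in-range.

```python
import random

def local_density(point, archive, radius=0.5):
    # Count archived solutions inside an axis-aligned box around `point`
    # (a stand-in for the tree-based density estimate in cNrGA).
    return sum(1 for a in archive
               if all(abs(x - y) <= radius for x, y in zip(point, a)))

def adaptive_mutate(offspring, archive, bounds=(-5.0, 5.0), base_sigma=2.0):
    lo, hi = bounds
    d = local_density(offspring, archive)
    # Dense region -> small sigma (near search); sparse -> large (far search).
    sigma = base_sigma / (1 + d)
    child = [min(hi, max(lo, x + random.gauss(0, sigma))) for x in offspring]
    archive.append(tuple(child))          # memorize the evaluated solution
    return child, sigma
```

With an empty archive the step size equals `base_sigma` (pure far search); as evaluated solutions accumulate around a point, the step shrinks and the search turns local, which is the self-adapting behavior the abstract attributes to AMM.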