16,474 research outputs found

    An island based hybrid evolutionary algorithm for optimization

    This is a post-print version of the article. Copyright @ 2008 Springer-Verlag. Evolutionary computation has become an important problem-solving methodology among search and optimization techniques, and in recent years more and more evolutionary variants, especially hybrid evolutionary algorithms, have been developed. This paper proposes an island-based hybrid evolutionary algorithm (IHEA) for optimization, built on particle swarm optimization (PSO), fast evolutionary programming (FEP), and the estimation of distribution algorithm (EDA). Within IHEA, an island model is designed so that the three component algorithms cooperatively search the space for the global optimum. By combining their strengths, IHEA substantially improves on the optimization performance of each basic algorithm, and experimental results demonstrate that it outperforms all three components on the test problems. This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant EP/E060722/1.
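    The island-model idea above can be illustrated with a short sketch: several sub-populations evolve independently under different variation schemes and periodically exchange their best members. This is a minimal, hypothetical illustration of the cooperation mechanism, not the authors' IHEA; the objective, population sizes, perturbation schemes, and migration scheme below are arbitrary choices.

```python
# Illustrative island-model sketch (not the authors' IHEA): three islands,
# each evolving its own sub-population with a different (deliberately
# simplified) perturbation scale, exchanging their best members every few
# generations. Names such as `sphere` and the constants are hypothetical.
import random

def sphere(x):                          # simple test objective to minimise
    return sum(v * v for v in x)

DIM, POP, ISLANDS, GENS, MIGRATE_EVERY = 5, 10, 3, 100, 10

def new_individual():
    return [random.uniform(-5, 5) for _ in range(DIM)]

def perturb(x, scale):                  # stand-in for PSO/FEP/EDA-style variation
    return [v + random.gauss(0, scale) for v in x]

# one sub-population per island, each with its own mutation scale
islands = [[new_individual() for _ in range(POP)] for _ in range(ISLANDS)]
scales = [0.1, 0.5, 1.0]

for gen in range(GENS):
    for k, pop in enumerate(islands):
        # greedy (1+1)-style update of every individual on this island
        for i, x in enumerate(pop):
            child = perturb(x, scales[k])
            if sphere(child) < sphere(x):
                pop[i] = child
    if gen % MIGRATE_EVERY == 0:
        # ring migration: each island sends its best to the next island,
        # replacing that island's worst member
        bests = [min(pop, key=sphere) for pop in islands]
        for k, pop in enumerate(islands):
            worst = max(range(POP), key=lambda i: sphere(pop[i]))
            pop[worst] = list(bests[(k - 1) % ISLANDS])

best = min((x for pop in islands for x in pop), key=sphere)
print("best value:", sphere(best))
```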

    A new approach to particle swarm optimization algorithm

    A particularly interesting group of algorithms consists of those that implement co-evolution or co-operation in natural environments, which yields much more powerful implementations. The main aim is to obtain an algorithm whose operation is not influenced by the environment. An unusual view of optimization algorithms made it possible to develop a new algorithm and to define its metaphors for two groups of algorithms. These studies treat the particle swarm optimization algorithm as a model of predator and prey. New properties of the algorithm are shown, resulting from a co-operation mechanism that determines its operation and significantly reduces environmental influence. Definitions of behavior-scenario functions give the algorithm a new feature that allows it to self-control the optimization process. This approach can also be applied successfully in computer games. The properties of the new algorithm make it worthy of interest, practical application, and further research on its development, and this study can also serve as inspiration for other solutions implementing co-operation or co-evolution.
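    As a hedged sketch of the predator-prey mechanism described above (in the general spirit of predator-prey PSO, not the paper's specific behavior-scenario functions), the update below adds a repulsion term that grows as a single predator, which chases the swarm's best particle, approaches a prey particle. All constants and the repulsion form are illustrative assumptions.

```python
# Minimal predator-prey PSO sketch: prey particles follow the usual
# personal-best/global-best attraction plus a repulsion from one predator;
# the predator drifts toward the current global best to keep the swarm moving.
import math, random

def rastrigin(x):
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

DIM, POP, ITER = 2, 20, 200
W, C1, C2, FEAR, DIST = 0.7, 1.5, 1.5, 2.0, 1.0   # illustrative constants

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
vel = [[0.0] * DIM for _ in range(POP)]
pbest = [list(p) for p in pos]
gbest = min(pbest, key=rastrigin)
predator = [random.uniform(-5, 5) for _ in range(DIM)]

for _ in range(ITER):
    for i in range(POP):
        d = math.dist(pos[i], predator) + 1e-9
        for j in range(DIM):
            # repulsion grows exponentially as the predator closes in
            repulse = FEAR * math.exp(-d / DIST) * (pos[i][j] - predator[j]) / d
            vel[i][j] = (W * vel[i][j]
                         + C1 * random.random() * (pbest[i][j] - pos[i][j])
                         + C2 * random.random() * (gbest[j] - pos[i][j])
                         + repulse)
            pos[i][j] += vel[i][j]
        if rastrigin(pos[i]) < rastrigin(pbest[i]):
            pbest[i] = list(pos[i])
    gbest = min(pbest, key=rastrigin)
    predator = [p + 0.3 * (g - p) for p, g in zip(predator, gbest)]

print("best value:", rastrigin(gbest))
```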

    Hybridization of multi-objective deterministic particle swarm with derivative-free local searches

    The paper presents a multi-objective, derivative-free, and deterministic global/local hybrid algorithm for the efficient and effective solution of simulation-based design optimization (SBDO) problems. The objective is to show how the hybridization of two multi-objective derivative-free global and local algorithms achieves better performance than the separate use of the two algorithms in solving specific SBDO problems for hull-form design. The proposed method belongs to the class of memetic algorithms, where the global exploration capability of multi-objective deterministic particle swarm optimization is enriched by exploiting the local search accuracy of a derivative-free multi-objective line-search method. To the authors' best knowledge, studies of memetic, multi-objective, deterministic, derivative-free evolutionary algorithms for the effective and efficient solution of SBDO for hull-form design are still limited. The proposed formulation manages global and local searches based on the hypervolume metric. The hybridization scheme uses two parameters to control the local search activation and the number of function calls used by the local algorithm. The most promising values of these parameters were identified using forty analytical tests representative of the SBDO problem of interest. The resulting hybrid algorithm was finally applied to two SBDO problems for hull-form design. For both the analytical tests and the SBDO problems, the hybrid method achieves better performance than its global and local counterparts.
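    A small sketch of how a hypervolume metric can drive the switch between global and local search, assuming a two-objective minimisation problem and a fixed reference point; the helper names (`hypervolume_2d`, `should_refine`) and the stall threshold are hypothetical illustrations, not the two control parameters tuned in the paper.

```python
# Hedged sketch: activate the local search when the hypervolume of the
# current non-dominated front stops improving between iterations.
def hypervolume_2d(front, ref):
    """Hypervolume of a 2-objective (minimisation) non-dominated front
    with respect to a reference point `ref`."""
    pts = sorted(front)                  # ascending f1, hence descending f2
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

def should_refine(hv_history, stall=1e-3):
    """Trigger the derivative-free local search when the global search stalls."""
    return len(hv_history) >= 2 and (hv_history[-1] - hv_history[-2]) < stall

front = [(0.2, 0.8), (0.5, 0.4), (0.9, 0.1)]
print(hypervolume_2d(front, ref=(1.0, 1.0)))   # ~0.39 for this illustrative front
print(should_refine([0.35, 0.39]))             # False: still improving
```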

    Genetic learning particle swarm optimization

    Social learning in particle swarm optimization (PSO) helps collective efficiency, whereas individual reproduction in genetic algorithms (GA) facilitates global effectiveness. This observation has recently led to hybridizing PSO with GA for performance enhancement. However, existing work uses a mechanistic parallel superposition, and research has shown that constructing superior exemplars in PSO is more effective. Hence, this paper first develops a new framework that organically hybridizes PSO with another optimization technique for "learning." This leads to a generalized "learning PSO" paradigm, the *L-PSO. The paradigm is composed of two cascading layers, the first for exemplar generation and the second for particle updates as in a normal PSO algorithm. Using genetic evolution to breed promising exemplars for PSO, a specific novel *L-PSO algorithm, termed genetic learning PSO (GL-PSO), is proposed in the paper. In particular, genetic operators are used to generate exemplars from which particles learn and, in turn, the historical search information of particles guides the evolution of the exemplars. By performing crossover, mutation, and selection on the historical information of particles, the constructed exemplars are not only well diversified but also of high quality. Under such guidance, both the global search ability and the search efficiency of PSO are enhanced. The proposed GL-PSO is tested on 42 benchmark functions widely adopted in the literature. Experimental results verify the effectiveness, efficiency, robustness, and scalability of GL-PSO.
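    The exemplar-construction idea can be sketched as follows: genetic operators breed an exemplar for each particle from its personal best and the global best, and the velocity update then pulls the particle toward its exemplar only. The operators below are simplified stand-ins under a toy objective, not the exact GL-PSO formulation.

```python
# Simplified exemplar-learning PSO sketch: crossover with gbest, random
# mutation, and greedy selection build each particle's exemplar, which then
# replaces the usual pbest/gbest attraction in the velocity update.
import random

def sphere(x):
    return sum(v * v for v in x)

DIM, POP, ITER = 10, 20, 300
W, C, PM = 0.7, 1.5, 0.1                 # inertia, acceleration, mutation rate

pos = [[random.uniform(-10, 10) for _ in range(DIM)] for _ in range(POP)]
vel = [[0.0] * DIM for _ in range(POP)]
pbest = [list(p) for p in pos]
exemplar = [list(p) for p in pos]

for _ in range(ITER):
    gbest = min(pbest, key=sphere)
    for i in range(POP):
        # crossover: blend personal best with the global best, dimension-wise
        child = []
        for d in range(DIM):
            r = random.random()
            child.append(r * pbest[i][d] + (1 - r) * gbest[d])
        # mutation: occasionally reset a dimension at random
        child = [random.uniform(-10, 10) if random.random() < PM else c for c in child]
        # selection: keep the better of the old and the new exemplar
        if sphere(child) < sphere(exemplar[i]):
            exemplar[i] = child
        # velocity update driven by the exemplar alone
        for d in range(DIM):
            vel[i][d] = W * vel[i][d] + C * random.random() * (exemplar[i][d] - pos[i][d])
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = list(pos[i])

print("best value:", sphere(min(pbest, key=sphere)))
```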

    Towards a Better Understanding of the Local Attractor in Particle Swarm Optimization: Speed and Solution Quality

    Particle Swarm Optimization (PSO) is a popular nature-inspired meta-heuristic for solving continuous optimization problems. Although this technique is widely used, the understanding of the mechanisms that make swarms so successful is still limited. We present the first substantial experimental investigation of the influence of the local attractor on the quality of exploration and exploitation. We compare in detail classical PSO with the social-only variant, in which local attractors are ignored. To measure exploration capability, we determine how frequently both variants return results in the neighborhood of the global optimum. We measure the quality of exploitation by considering only function values from runs that reached a search point sufficiently close to the global optimum and comparing in how many digits such values still deviate from the global minimum value. It turns out that the local attractor significantly improves exploration but sometimes reduces the quality of exploitation. As a compromise, we propose and evaluate a hybrid PSO that switches off its local attractors at a certain point in time. The effects mentioned can also be observed by measuring the potential of the swarm.
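    The contrast the study draws can be written down directly: the classical update keeps the cognitive (local-attractor) term, the social-only variant drops it, and the hybrid switches it off after some iteration. The sketch below uses standard constriction-style coefficients and an arbitrary switch point, both illustrative assumptions rather than the settings evaluated in the paper.

```python
# One-dimensional PSO velocity updates: classical, social-only, and a hybrid
# that drops the local attractor after a chosen iteration.
import random

W, C1, C2 = 0.72, 1.49, 1.49             # illustrative coefficient choices

def velocity_update(v, x, pbest, gbest, use_local_attractor=True):
    """Classical update if use_local_attractor is True, social-only otherwise."""
    cognitive = C1 * random.random() * (pbest - x) if use_local_attractor else 0.0
    social = C2 * random.random() * (gbest - x)
    return W * v + cognitive + social

SWITCH_OFF_AT = 500                      # hybrid: ignore local attractors from here on

def hybrid_update(v, x, pbest, gbest, t):
    return velocity_update(v, x, pbest, gbest, use_local_attractor=(t < SWITCH_OFF_AT))

print(hybrid_update(0.0, 1.0, 0.5, 0.0, t=100))   # cognitive term still active
print(hybrid_update(0.0, 1.0, 0.5, 0.0, t=900))   # social-only behaviour
```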

    An Investigation into the Merger of Stochastic Diffusion Search and Particle Swarm Optimisation

    This study reports early research aimed at applying the powerful resource allocation mechanism deployed in Stochastic Diffusion Search (SDS) to the Particle Swarm Optimiser (PSO) metaheuristic, effectively merging the two swarm intelligence algorithms. The results reported herein suggest that the hybrid algorithm, exploiting information sharing between particles, has the potential to improve the optimisation capability of conventional PSOs.
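    A hedged sketch of how an SDS-style test/diffusion cycle could be grafted onto a PSO population: each particle runs a cheap partial test of its position against a randomly chosen peer, and inactive particles then copy the hypothesis (position) of a randomly chosen active one. The decomposable `partial_fitness` function is a hypothetical assumption for illustration, not part of the reported hybrid.

```python
# SDS-flavoured resource allocation over a swarm's positions: a partial test
# phase marks particles active or inactive, and a diffusion phase lets
# inactive particles adopt an active particle's hypothesis.
import random

def sds_step(positions, partial_fitness):
    """One SDS-style test/diffusion cycle.

    partial_fitness(x, d) is assumed to return dimension d's contribution to
    the objective at x (a hypothetical, decomposable-objective assumption).
    """
    n, dim = len(positions), len(positions[0])
    # test phase: compare one random partial evaluation against a random peer's
    active = []
    for i in range(n):
        j, d = random.randrange(n), random.randrange(dim)
        active.append(partial_fitness(positions[i], d) <= partial_fitness(positions[j], d))
    # diffusion phase: inactive particles copy a randomly chosen active particle
    active_idx = [i for i, a in enumerate(active) if a]
    if not active_idx:
        return positions
    return [positions[i] if active[i] else list(positions[random.choice(active_idx)])
            for i in range(n)]

# usage with a separable objective, e.g. the sphere function
swarm = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
swarm = sds_step(swarm, partial_fitness=lambda x, d: x[d] * x[d])
print(swarm)
```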