
    A Comprehensive Survey on Particle Swarm Optimization Algorithm and Its Applications

    Particle swarm optimization (PSO) is a heuristic global optimization method, originally proposed by Kennedy and Eberhart in 1995, and is now one of the most commonly used optimization techniques. This survey presents a comprehensive investigation of PSO. On the one hand, we review advances in PSO, including its modifications (quantum-behaved PSO, bare-bones PSO, chaotic PSO, and fuzzy PSO), population topologies (fully connected, von Neumann, ring, star, random, etc.), hybridizations (with the genetic algorithm, simulated annealing, tabu search, artificial immune systems, the ant colony algorithm, artificial bee colony, differential evolution, harmony search, and biogeography-based optimization), extensions (to multiobjective, constrained, discrete, and binary optimization), theoretical analysis (parameter selection and tuning, and convergence analysis), and parallel implementations (on multicore, multiprocessor, GPU, and cloud computing platforms). On the other hand, we survey applications of PSO in fields including electrical and electronic engineering, automation control systems, communication theory, operations research, mechanical engineering, fuel and energy, medicine, chemistry, and biology. It is hoped that this survey will be beneficial for researchers studying PSO algorithms.
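
    The velocity and position update at the heart of canonical PSO can be summarized in a short sketch. The code below is a generic illustration, not code from the survey; the inertia weight w and acceleration coefficients c1, c2 are typical illustrative values.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal canonical PSO with inertia weight (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # particle positions
    v = np.zeros((n_particles, dim))                   # particle velocities
    pbest = x.copy()                                   # personal best positions
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()           # global best position

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

# Example: minimize the 10-dimensional sphere function.
best_x, best_val = pso(lambda p: float(np.sum(p * p)), dim=10)
```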

    A Tent Lévy Flying Sparrow Search Algorithm for Feature Selection: A COVID-19 Case Study

    The "curse of dimensionality" induced by the rapid growth of information science can have a negative impact when dealing with big datasets. In this paper, we propose a variant of the sparrow search algorithm (SSA), called the Tent Lévy flying sparrow search algorithm (TFSSA), and use it to select the best subset of features in a wrapper-based setting for classification purposes. SSA is a recently proposed algorithm that has not been systematically applied to feature selection problems. After verification on the CEC2020 benchmark functions, TFSSA is used to select the feature combination that maximizes classification accuracy and minimizes the number of selected features. The proposed TFSSA is compared with nine algorithms from the literature. Nine evaluation metrics are used to evaluate and compare the performance of these algorithms on twenty-one datasets from the UCI repository. Furthermore, the approach is applied to the coronavirus disease (COVID-19) dataset, yielding the best average classification accuracy of 93.47% with an average of 2.1 selected features. Experimental results confirm the advantages of the proposed algorithm in improving classification accuracy and reducing the number of selected features compared to other wrapper-based algorithms.
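
    As a point of reference for the wrapper-based setting described above, feature-selection metaheuristics commonly score a candidate feature subset by a weighted sum of classification error and the fraction of selected features. The sketch below is a generic example of such a fitness function, not the paper's exact formulation; the weight alpha and the k-NN classifier are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def wrapper_fitness(mask, X, y, alpha=0.99):
    """Score a feature subset (boolean mask) by weighted error + feature ratio.
    Lower is better; an empty subset gets the worst possible score."""
    mask = np.asarray(mask, dtype=bool)
    if not mask.any():
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask], y, cv=5).mean()
    feature_ratio = mask.sum() / mask.size
    return alpha * (1.0 - acc) + (1.0 - alpha) * feature_ratio
```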

    Enhancement of bees algorithm for global optimisation

    This research focuses on the improvement of the Bees Algorithm, a swarm-based, nature-inspired optimisation algorithm that mimics the foraging behaviour of honeybees. The algorithm balances exploitation and exploration, the two key elements of optimisation techniques that help to find the global optimum. This thesis presents three new approaches to the Bees Algorithm in pursuit of improved convergence speed and accuracy. The first proposed algorithm intensifies the local search by incorporating Hooke and Jeeves' method in the exploitation mechanism. This direct search method contains a pattern move that works well in the new variant, named "Bees Algorithm with Hooke and Jeeves" (BA-HJ). The second proposed algorithm replaces the random deployment of recruited bees with chaotic sequences generated by the well-known logistic map. This new variant, called "Bees Algorithm with Chaos" (ChaosBA), uses the characteristics of chaotic sequences to escape from local optima while maintaining the diversity of the population. The third improvement uses information from the current best solutions to create new candidate solutions probabilistically, following the Estimation of Distribution Algorithm (EDA) approach. This new version is called the Bees Algorithm with Estimation Distribution (BAED). Simulation results show that the proposed algorithms converge better than the standard BA, SPSO2011, and qABC on the majority of the tested benchmark functions. BA-HJ outperformed the standard BA on thirteen out of fifteen benchmark functions and was more effective than SPSO2011 and qABC on eleven out of fifteen. ChaosBA outperformed the standard BA on twelve out of fifteen benchmark functions and was significantly better than qABC and SPSO2011 on eleven out of fifteen. BAED found the optimal solution with the fewest evaluations in fourteen out of fifteen cases compared to the standard BA, and on eleven out of fifteen functions compared to SPSO2011 and qABC. Furthermore, results on a set of constrained mechanical design problems show that the performance of the proposed algorithms is comparable to that of the standard BA and other swarm-based algorithms from the literature.
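
    To illustrate the chaotic element in ChaosBA, the logistic map is a simple way to generate a deterministic, chaos-like sequence for placing recruited bees around a selected site. The sketch below is a generic interpretation under assumed parameters (x0 = 0.7, r = 4.0), not the thesis's exact update rule.

```python
import numpy as np

def logistic_sequence(n, x0=0.7, r=4.0):
    """Chaotic sequence from the logistic map x_{k+1} = r * x_k * (1 - x_k)."""
    values = np.empty(n)
    x = x0
    for k in range(n):
        x = r * x * (1.0 - x)
        values[k] = x
    return values

def chaotic_recruits(centre, radius, n_bees):
    """Place recruited bees around a site centre using chaotic offsets in
    [-radius, +radius] instead of uniform random offsets."""
    dim = centre.size
    chaos = logistic_sequence(n_bees * dim).reshape(n_bees, dim)
    return centre + radius * (2.0 * chaos - 1.0)
```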

    Particle Swarm Optimization: Basic Concepts, Variants and Applications in Power Systems

    Many areas in power systems require solving one or more nonlinear optimization problems. While analytical methods might suffer from slow convergence and the curse of dimensionality, heuristics-based swarm intelligence can be an efficient alternative. Particle swarm optimization (PSO), part of the swarm intelligence family, is known to solve large-scale nonlinear optimization problems effectively. This paper presents a detailed overview of the basic concepts of PSO and its variants. It also provides a comprehensive survey of the power system applications that have benefited from the power of PSO as an optimization technique. For each application, the technical details required for applying PSO, such as the PSO type, the particle formulation (solution representation), and the most effective fitness functions, are discussed.

    An improved data classification framework based on fractional particle swarm optimization

    Particle Swarm Optimization (PSO) is a population-based stochastic optimization technique consisting of particles that move collectively over iterations to search for the optimal solution. However, conventional PSO is prone to poor convergence and even stagnation in complex, high-dimensional search problems with multiple local optima. This research therefore proposes an improved Mutually-Optimized Fractional PSO (MOFPSO) algorithm based on fractional derivatives and small step lengths, which promotes convergence to global optima by providing a fine balance between exploration and exploitation. The proposed algorithm is tested and verified on ten benchmark functions against six established algorithms in terms of Mean of Error and Standard Deviation values. The proposed MOFPSO algorithm demonstrated the lowest Mean of Error values on all benchmark functions across all 30 runs (Ackley = 0.2, Rosenbrock = 0.2, Bohachevsky = 9.36E-06, Easom = -0.95, Griewank = 0.01, Rastrigin = 2.5E-03, Schaffer = 1.31E-06, Schwefel 1.2 = 3.2E-05, Sphere = 8.36E-03, Step = 0). Furthermore, the proposed MOFPSO algorithm is hybridized with Back-Propagation (BP), Elman Recurrent Neural Network (ERNN), and Levenberg-Marquardt (LM) Artificial Neural Networks (ANNs) to form an enhanced framework for data classification applications. The proposed classification framework is then evaluated for classification accuracy, computational time, and Mean Squared Error on five benchmark datasets against seven existing techniques. The simulation results show that the proposed MOFPSO-ERNN classification algorithm demonstrated good classification performance in terms of classification accuracy (Breast Cancer = 99.01%, EEG = 99.99%, PIMA Indian Diabetes = 99.37%, Iris = 99.6%, Thyroid = 99.88%) compared to the existing hybrid classification techniques. Hence, the proposed technique can be employed to improve overall classification accuracy and reduce computational time in data classification applications.
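
    The fractional-derivative idea mentioned above is commonly realized by replacing the inertia term of the velocity update with a truncated Grünwald-Letnikov fractional derivative over a few past velocities. The sketch below shows that common construction with a four-term memory; it is an assumption about the general fractional-PSO form, not MOFPSO's exact mutually-optimized update.

```python
import numpy as np

def fractional_velocity(v_hist, x, pbest, gbest,
                        alpha=0.6, c1=1.5, c2=1.5, rng=None):
    """Fractional-order velocity update: the inertia term is a truncated
    Grunwald-Letnikov derivative over the last four velocities.
    v_hist = [v_t, v_{t-1}, v_{t-2}, v_{t-3}] (arrays of equal shape)."""
    rng = rng or np.random.default_rng()
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    memory = (alpha * v_hist[0]
              + 0.5 * alpha * (1 - alpha) * v_hist[1]
              + (1 / 6) * alpha * (1 - alpha) * (2 - alpha) * v_hist[2]
              + (1 / 24) * alpha * (1 - alpha) * (2 - alpha) * (3 - alpha) * v_hist[3])
    return memory + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
```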

    Evolutionary Computation 2020

    Intelligent optimization uses mechanisms of computational intelligence to refine a suitable feature model, design an effective optimization algorithm, and then obtain an optimal or satisfactory solution to a complex problem. Intelligent algorithms are key tools for ensuring global optimization quality, fast optimization efficiency, and robust optimization performance. Intelligent optimization algorithms have been studied by many researchers, leading to improvements in the performance of algorithms such as the evolutionary algorithm, the whale optimization algorithm, the differential evolution algorithm, and particle swarm optimization. Studies in this area have also produced breakthroughs in solving complex problems, including the green shop scheduling problem, the severely nonlinear problem of one-dimensional geodesic electromagnetic inversion, error and bug finding in software, the 0-1 knapsack problem, the travelling salesman problem, and the logistics distribution center siting problem. The editors are confident that this book can open a new avenue for further improvement and discoveries in the area of intelligent algorithms. The book is a valuable resource for researchers interested in understanding the principles and design of intelligent algorithms.

    Advances in Spacecraft Attitude Control

    Spacecraft attitude maneuvers are governed by Euler's moment equations, a set of three nonlinear, coupled differential equations. The nonlinearities complicate the mathematical treatment of the seemingly simple action of rotating, and these complications have led to a robust lineage of research. This book is meant for scientifically inclined readers; it commences with a chapter on the basics of spaceflight and leverages that foundation to introduce very advanced topics to new spaceflight enthusiasts. The topics covered in this text will prepare students and faculty to investigate interesting spaceflight problems in an era in which cube satellites have made such investigations attainable even by small universities. It is the fondest hope of the editor and authors that readers enjoy this book.
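
    For reference, the three coupled nonlinear equations referred to above are Euler's moment equations for a rigid body expressed in principal axes:

```latex
\begin{align}
  I_1 \dot{\omega}_1 + (I_3 - I_2)\,\omega_2\,\omega_3 &= M_1 \\
  I_2 \dot{\omega}_2 + (I_1 - I_3)\,\omega_3\,\omega_1 &= M_2 \\
  I_3 \dot{\omega}_3 + (I_2 - I_1)\,\omega_1\,\omega_2 &= M_3
\end{align}
```

    Here the I_i are the principal moments of inertia, the omega_i are the body-frame angular rates, and the M_i are the applied torques; the products of angular rates are the source of the nonlinear coupling mentioned above.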

    Dual-Stage Hybrid Learning Particle Swarm Optimization Algorithm for Global Optimization Problems

    Particle swarm optimization (PSO) is a swarm intelligence algorithm frequently used to solve global optimization problems because of its rapid convergence and ease of implementation. However, PSO still has certain deficiencies, such as a poor trade-off between exploration and exploitation and premature convergence. Hence, this paper proposes a dual-stage hybrid learning particle swarm optimization (DHLPSO) algorithm, in which the iterative process is partitioned into two stages whose learning strategies emphasize exploration and exploitation, respectively. In the first stage, to increase population diversity, a Manhattan-distance-based learning strategy is proposed, in which each particle learns from the particle farthest from it in Manhattan distance and from a better-performing particle. In the second stage, an excellent-example learning strategy is adopted to perform local optimization on the population, in which each particle learns from the globally optimal particle and a better-performing particle. A Gaussian mutation strategy further enhances the algorithm's search ability on certain multimodal functions. DHLPSO is evaluated on benchmark functions from CEC 2013 alongside existing PSO variants. The comparison results clearly demonstrate that, compared to other state-of-the-art PSO variants, DHLPSO delivers highly competitive performance on global optimization problems.
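
    The first-stage exemplar choice described above can be illustrated with a short sketch: each particle picks, as a diversity-oriented exemplar, the swarm member farthest from it in Manhattan (L1) distance. This is a generic illustration of that selection step only; the full DHLPSO learning update is not reproduced here.

```python
import numpy as np

def farthest_manhattan_exemplar(i, positions):
    """Index of the particle farthest from particle i in Manhattan (L1) distance,
    used as the exploration-stage exemplar."""
    d = np.abs(positions - positions[i]).sum(axis=1)   # L1 distance to every particle
    d[i] = -np.inf                                     # exclude the particle itself
    return int(d.argmax())
```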

    Gender-Based Deep Learning Firefly Optimization Method for Test Data Generation

    Software testing is a widely used means of software quality assurance in industry, and intelligent optimization algorithms have proved to be an effective way to generate test data automatically. The firefly algorithm has received extensive attention and has been widely applied to optimization problems because it has few parameters and is simple to implement. To overcome the firefly algorithm's slow convergence and low accuracy, a novel firefly algorithm with deep learning is proposed to generate structural test data. Initially, the population is divided into a male subgroup and a female subgroup. Following a random-attraction model, each male firefly is attracted by a randomly selected female firefly, focusing on global search over the whole space. Each female firefly performs local search under the guidance of a general-center firefly constructed from historical experience using deep learning. In the final phase of the search, chaotic search is conducted near the best firefly to improve search accuracy. Simulation results show that the proposed algorithm achieves better performance in terms of success coverage rate, coverage time, and diversity of solutions.
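
    The random-attraction move described above builds on the standard firefly attraction step, in which a firefly moves toward a brighter one with an attractiveness that decays with distance, plus a small random perturbation. The sketch below shows only that baseline step under assumed coefficients; the gendered subgroups, the deep-learning center firefly, and the chaos search of the proposed method are not reproduced.

```python
import numpy as np

def firefly_attraction_step(x_i, x_j, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """Move firefly x_i toward the brighter firefly x_j (standard firefly update)."""
    rng = rng or np.random.default_rng()
    r2 = float(np.sum((x_i - x_j) ** 2))        # squared distance between fireflies
    beta = beta0 * np.exp(-gamma * r2)          # attractiveness decays with distance
    return x_i + beta * (x_j - x_i) + alpha * (rng.random(x_i.shape) - 0.5)
```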