
    Vigilant Salp Swarm Algorithm for Feature Selection

    Feature selection (FS) removes unwanted features that can mislead a classification algorithm into wrong predictions. Choosing an optimal feature subset from a given set of features is challenging due to the complex associations among the features. Under non-convex conditions, gradient-based algorithms can become trapped in local optima or at saddle points depending on the initial conditions, whereas swarm intelligence algorithms have a better chance of converging to the global optimum. The Salp Swarm Algorithm (SSA) proposed by Mirjalili et al. is based on the chaining behaviour of sea salps, but the algorithm lacks diversity in the exploration stage. Rectifying this exploratory behaviour and testing the algorithm on the FS problem is the motivation behind this work. Three variants of the algorithm are proposed: the Vigilant Salp Swarm Algorithm (VSSA) inherits the vigilance mechanism of the Grey Wolf Optimizer (GWO), while the second and third variants replace the follower's position update mechanism used in the VSSA with a simple crossover operator and a shuffle crossover operator, forming the Vanilla Crossover VSSA (VCVSSA) and the Shuffle Crossover VSSA (SCVSSA), respectively.
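
    A minimal Python sketch of the mechanism described above is given below, assuming only what the abstract states: the standard SSA leader/follower updates from Mirjalili et al. and a follower update replaced by a shuffle crossover. The particular shuffle-crossover operator, the parameter values, and the toy sphere objective are illustrative assumptions; the paper targets feature-selection problems and does not specify its exact operators here.

```python
import numpy as np

def sphere(x):
    # Toy objective for the demo; the paper evaluates feature-selection problems instead.
    return float(np.sum(x ** 2))

def shuffle_crossover(p1, p2, rng):
    # One possible shuffle crossover: permute positions, one-point cross, un-permute.
    # (The exact operator used in SCVSSA is not given in the abstract.)
    d = p1.size
    perm = rng.permutation(d)
    a, b = p1[perm], p2[perm]
    cut = rng.integers(1, d)
    child = np.concatenate([a[:cut], b[cut:]])
    return child[np.argsort(perm)]

def ssa(obj, lb, ub, dim=10, n_salps=30, iters=200, follower="chain", seed=0):
    # Minimal Salp Swarm Algorithm sketch.
    # follower="chain"   -> standard averaging follower update (Mirjalili et al.)
    # follower="shuffle" -> follower recombined with the food source by shuffle
    #                       crossover, loosely following the SCVSSA idea.
    rng = np.random.default_rng(seed)
    lb = np.full(dim, lb, dtype=float)
    ub = np.full(dim, ub, dtype=float)
    X = rng.uniform(lb, ub, (n_salps, dim))
    fit = np.array([obj(x) for x in X])
    best = fit.argmin()
    food, food_fit = X[best].copy(), fit[best]

    for l in range(1, iters + 1):
        c1 = 2 * np.exp(-(4 * l / iters) ** 2)  # exploration/exploitation balance
        for i in range(n_salps):
            if i == 0:  # leader oscillates around the food source
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                X[i] = np.where(c3 >= 0.5, food + step, food - step)
            elif follower == "chain":
                X[i] = 0.5 * (X[i] + X[i - 1])   # standard chained follower update
            else:
                X[i] = shuffle_crossover(X[i], food, rng)
            X[i] = np.clip(X[i], lb, ub)
            f = obj(X[i])
            if f < food_fit:
                food, food_fit = X[i].copy(), f
    return food, food_fit

best_pos, best_fit = ssa(sphere, lb=-5.0, ub=5.0, follower="shuffle")
print("best fitness:", best_fit)
```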

    Self-adaptive parameter and strategy based particle swarm optimization for large-scale feature selection problems with multiple classifiers

    This work was partially supported by the National Natural Science Foundation of China (61403206, 61876089, 61876185), the Natural Science Foundation of Jiangsu Province (BK20141005), the Natural Science Foundation of the Jiangsu Higher Education Institutions of China (14KJB520025), the Engineering Research Center of Digital Forensics, Ministry of Education, and the Priority Academic Program Development of Jiangsu Higher Education Institutions. Peer reviewed. Postprint.

    Parameters Identification of the Fractional-Order Permanent Magnet Synchronous Motor Models Using Chaotic Ensemble Particle Swarm Optimizer

    © 2021 by the authors. In this paper, novel variants of the Ensemble Particle Swarm Optimizer (EPSO) are proposed in which ten chaos maps are merged to enhance EPSO's performance by adaptively tuning its main parameters. The proposed Chaotic Ensemble Particle Swarm Optimizer variants (C.EPSO) are examined on complex nonlinear systems, namely equal-order and variable-order fractional models of the Permanent Magnet Synchronous Motor (PMSM). The variants' results are compared with those of the original version to recommend the most suitable variant for this non-linear optimization problem. A comparison between the introduced variants and previously published algorithms further validates the efficiency of the developed technique. The results show that the Chaotic Ensemble Particle Swarm variant with the Gauss/mouse map is the most suitable for estimating the parameters of equal-order and variable-order fractional PMSM models, as it achieves better accuracy, higher consistency, and faster convergence; this may help control the motor's unwanted chaotic behaviour and protect it from damage.
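
    The sketch below illustrates chaotic parameter tuning only, using the Gauss/mouse map to drive the inertia weight of a plain PSO. The ensemble mechanics of the actual EPSO/C.EPSO, the coefficient values, and the toy objective standing in for the fractional-order PMSM identification error are assumptions, since the abstract does not detail the implementation.

```python
import numpy as np

def gauss_mouse_map(z):
    # Gauss/mouse chaotic map: z_{k+1} = (1/z_k) mod 1, with 0 mapped to 0.
    return 0.0 if z == 0 else (1.0 / z) % 1.0

def chaotic_pso(obj, lb, ub, dim, n_particles=30, iters=200, seed=1):
    # Plain PSO with a chaotically modulated inertia weight (stand-in for C.EPSO).
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_particles, dim))
    V = np.zeros_like(X)
    pbest = X.copy()
    pbest_fit = np.array([obj(x) for x in X])
    g = pbest_fit.argmin()
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]

    c1 = c2 = 2.0   # cognitive / social coefficients (illustrative values)
    z = 0.7         # initial chaotic state
    for _ in range(iters):
        z = gauss_mouse_map(z)
        w = 0.4 + 0.5 * z  # inertia weight modulated chaotically within [0.4, 0.9]
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        X = np.clip(X + V, lb, ub)
        fit = np.array([obj(x) for x in X])
        improved = fit < pbest_fit
        pbest[improved] = X[improved]
        pbest_fit[improved] = fit[improved]
        g = pbest_fit.argmin()
        if pbest_fit[g] < gbest_fit:
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest, gbest_fit

# A toy quadratic stands in for the PMSM parameter-identification error function.
best_params, err = chaotic_pso(lambda p: float(np.sum((p - 1.5) ** 2)), lb=-5.0, ub=5.0, dim=4)
print("estimated parameters:", best_params, "error:", err)
```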

    Advances in Condition Monitoring, Optimization and Control for Complex Industrial Processes

    The book documents 25 papers collected from the Special Issue “Advances in Condition Monitoring, Optimization and Control for Complex Industrial Processes”, highlighting recent research trends in complex industrial processes. The book aims to stimulate the research field and to be of benefit to readers from both academic institutions and industrial sectors.

    Computational Optimizations for Machine Learning

    The present book contains the 10 articles finally accepted for publication in the Special Issue “Computational Optimizations for Machine Learning” of the MDPI journal Mathematics, which cover a wide range of topics connected to the theory and applications of machine learning, neural networks and artificial intelligence. These topics include, among others, various classes of machine learning, such as supervised, unsupervised and reinforcement learning, as well as deep neural networks, convolutional neural networks, GANs, decision trees, linear regression, SVM, K-means clustering, Q-learning, temporal difference learning, deep adversarial networks and more. It is hoped that the book will be interesting and useful both to those developing mathematical algorithms and applications in the domain of artificial intelligence and machine learning, and to those with the appropriate mathematical background who wish to become familiar with recent advances in the computational optimization mathematics of machine learning, which has nowadays permeated almost all sectors of human life and activity.

    Applied Metaheuristic Computing

    For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling perplexing engineering and business problems, such as scheduling, routing, ordering, bin packing, assignment, and facility layout planning, among others. This is partly because classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, by contrast, guides the course of low-level heuristics to search beyond the local optimality that impairs the capability of traditional computational methods. This topic series has collected quality papers proposing cutting-edge methodologies and innovative applications that drive the advances of AMC.
