65 research outputs found

    Feature selection method based on chaotic maps and butterfly optimization algorithm

    Feature selection (FS) is a challenging problem that has attracted the attention of many researchers. FS can be considered an NP-hard problem: a dataset with N features yields 2^N candidate feature subsets, so the complexity doubles with each additional feature. To address this, we reduce the dimensionality of the feature space by selecting the most important features. In this paper we integrate chaotic maps into the standard Butterfly Optimization Algorithm (BOA) to increase diversity and avoid trapping in local minima. The proposed algorithm is called the Chaotic Butterfly Optimization Algorithm (CBOA). The performance of the proposed CBOA is investigated by applying it to 16 benchmark datasets and comparing it against six meta-heuristic algorithms. The results show that invoking chaotic maps in the standard BOA can improve its performance, with accuracy above 95%.
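    As a rough illustration of the chaotic-map idea (not the authors' implementation, whose details are not given in the abstract), the Python sketch below uses the logistic map, one of the commonly used chaotic maps, to generate the pseudo-random values that drive bit flips on a candidate feature subset; the flip rule and the 0.1 threshold are hypothetical.

```python
import numpy as np

def logistic_map(x, r=4.0):
    """One step of the logistic chaotic map; r = 4.0 gives fully chaotic behaviour."""
    return r * x * (1.0 - x)

def chaotic_sequence(length, seed=0.7):
    """Generate a chaotic sequence in (0, 1) to replace uniform random numbers."""
    seq = np.empty(length)
    x = seed
    for i in range(length):
        x = logistic_map(x)
        seq[i] = x
    return seq

def perturb_subset(mask, chaos):
    """Flip each feature bit with a probability driven by the chaotic value
    instead of a uniform draw (hypothetical update rule, for illustration only)."""
    flips = chaos < 0.1          # small flip probability per bit
    return np.where(flips, 1 - mask, mask)

# Example: a 10-feature binary mask perturbed with a chaotic sequence
mask = np.random.randint(0, 2, size=10)
chaos = chaotic_sequence(10)
print(mask, perturb_subset(mask, chaos))
```

    Any other chaotic map (tent, sine, circle, and so on) could be substituted for the logistic map in chaotic_sequence without changing the rest of the sketch.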

    Volcano eruption algorithm for solving optimization problems

    This is an accepted manuscript of an article published by Springer in Neural Computing and Applications on 30/06/2020, available online at https://doi.org/10.1007/s00521-020-05124-x. The accepted version of the publication may differ from the final published version. Meta-heuristic algorithms have been proposed to solve several optimization problems in different research areas due to their unique attractive features. Traditionally, heuristic approaches are designed separately for discrete and continuous problems. This paper leverages a meta-heuristic algorithm for solving NP-hard problems in both continuous and discrete optimization fields, such as nonlinear and multi-level programming problems, through extensive simulation of the volcano eruption process. In particular, a new optimization method named the Volcano Eruption Algorithm (VEA) is proposed in this paper, inspired by the natural process of volcano eruption. The feasibility and efficiency of the algorithm are evaluated using numerical results obtained on several test problems reported in the state-of-the-art literature. Based on the solutions and the number of required iterations, we observed that the proposed meta-heuristic algorithm performs remarkably well in solving NP-hard problems. Furthermore, the proposed algorithm is applied to solve some large-size benchmark LP and Internet of Vehicles (IoV) problems efficiently.

    S-Shaped vs. V-Shaped Transfer Functions for Ant Lion Optimization Algorithm in Feature Selection Problem

    Feature selection is an important preprocessing step for classification problems. It deals with selecting near-optimal features in the original dataset. Feature selection is an NP-hard problem, so meta-heuristics can be more efficient than exact methods. In this work, the Ant Lion Optimizer (ALO), a recent meta-heuristic algorithm, is employed as a wrapper feature selection method. Six variants of ALO are proposed, where each employs a transfer function to map a continuous search space to a discrete search space. The performance of the proposed approaches is tested on eighteen UCI datasets and compared to a number of existing approaches in the literature: Particle Swarm Optimization, the Gravitational Search Algorithm, and two existing ALO-based approaches. Computational experiments show that the proposed approaches efficiently explore the feature space and select the most informative features, which helps to improve the classification accuracy.
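    For context on the two families mentioned above, here is a minimal Python sketch of one common S-shaped transfer function (the sigmoid) and one common V-shaped function (|tanh|), together with their usual binarization rules; the specific six variants used in the paper may differ from these.

```python
import numpy as np

def s_shaped(x):
    """Common S-shaped transfer function (sigmoid): probability that the bit is set to 1.
    Illustrative form; the paper's exact variants may differ."""
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    """Common V-shaped transfer function: probability of flipping the current bit."""
    return np.abs(np.tanh(x))

def binarize_s(step, rng):
    """S-shaped rule: each bit becomes 1 with probability s_shaped(step)."""
    return (rng.random(step.shape) < s_shaped(step)).astype(int)

def binarize_v(step, current_bits, rng):
    """V-shaped rule: each current bit is flipped with probability v_shaped(step)."""
    flip = rng.random(step.shape) < v_shaped(step)
    return np.where(flip, 1 - current_bits, current_bits)

rng = np.random.default_rng(0)
step = rng.normal(size=8)               # continuous step vector from the optimizer
bits = rng.integers(0, 2, size=8)       # current binary feature mask
print(binarize_s(step, rng))
print(binarize_v(step, bits, rng))
```

    The practical difference between the families: an S-shaped function assigns each bit directly from the step's sign and magnitude, whereas a V-shaped function only decides whether to flip the current bit, which tends to preserve good solutions once they are found.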

    A Robust Multi-Objective Feature Selection Model Based on Local Neighborhood Multi-Verse Optimization

    This work was supported in part by the Ministerio de Economia y Competitividad under Grant TIN2017-85727-C4-2P. Classification tasks often include, among the large number of features to be processed in the datasets, many irrelevant and redundant ones, which can even decrease the efficiency of classifiers. Feature Selection (FS) is the most common preprocessing technique utilized to overcome the drawbacks of the high dimensionality of datasets and often has two conflicting objectives: the first objective aims to maximize the classification performance or reduce the error rate of the classifier, while the second is designed to minimize the number of features. However, the majority of wrapper FS techniques are developed for single-objective scenarios. The Multi-Verse Optimizer (MVO) is considered one of the well-regarded optimization approaches of recent years. In this paper, a binary multi-objective variant of MVO (MOMVO) is proposed to deal with feature selection tasks. The standard MOMVO suffers from local optima stagnation, so we propose an improved binary MOMVO to deal with this issue using the memory concept and the personal best of the universes. The experimental results and comparisons indicate that the proposed binary MOMVO approach can effectively eliminate irrelevant and/or redundant features and maintain a minimum classification error rate when dealing with different datasets, compared with the most popular feature selection techniques. Furthermore, experiments on 14 benchmark datasets showed that the proposed approach outperforms state-of-the-art multi-objective optimization algorithms for feature selection.
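    As an illustration of the two conflicting objectives described above (classification error rate versus number of selected features), the sketch below evaluates both for a binary feature mask using a k-NN wrapper via scikit-learn; the classifier choice, the cross-validation setting, and the penalty for an empty subset are assumptions for the example, not details taken from the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.datasets import load_wine

def feature_selection_objectives(mask, X, y):
    """Return the two conflicting objectives for a binary feature mask:
    (classification error rate, fraction of selected features).
    Both values are to be minimized by the multi-objective optimizer."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:                      # empty subsets are treated as infeasible
        return 1.0, 0.0
    clf = KNeighborsClassifier(n_neighbors=5)   # wrapper classifier (illustrative choice)
    acc = cross_val_score(clf, X[:, selected], y, cv=5).mean()
    return 1.0 - acc, selected.size / mask.size

# Example with a toy dataset and a random candidate mask
X, y = load_wine(return_X_y=True)
mask = np.random.randint(0, 2, size=X.shape[1])
print(feature_selection_objectives(mask, X, y))
```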

    A decomposition and multi-objective evolutionary optimization model for suspended sediment load prediction in rivers


    Diffusion analysis with high and low concentration regions by the finite difference method, the adaptive network-based fuzzy inference system, and the bilayered neural network method


    Binary dragonfly optimization for feature selection using time-varying transfer functions

    The Dragonfly Algorithm (DA) is a recently proposed heuristic search algorithm that was shown to have excellent performance for numerous optimization problems. In this paper, a wrapper feature selection algorithm is proposed based on the Binary Dragonfly Algorithm (BDA). The key component of the BDA is the transfer function that maps a continuous search space to a discrete search space. In this study, eight transfer functions, categorized into two families (S-shaped and V-shaped functions), are integrated into the BDA and evaluated using eighteen benchmark datasets obtained from the UCI data repository. The main contribution of this paper is the proposal of time-varying S-shaped and V-shaped transfer functions to leverage the impact of the step vector on balancing exploration and exploitation. During the early stages of the optimization process, the probability of changing the position of an element is high, which facilitates the exploration of new solutions starting from the initial population. On the other hand, the probability of changing the position of an element becomes lower towards the end of the optimization process. This behavior is obtained by considering the current iteration number as a parameter of the transfer functions. The performance of the proposed approaches is compared with that of other state-of-the-art approaches, including the DA, the binary grey wolf optimizer (bGWO), the binary gravitational search algorithm (BGSA), the binary bat algorithm (BBA), particle swarm optimization (PSO), and the genetic algorithm, in terms of classification accuracy, sensitivity, specificity, area under the curve, and number of selected attributes. Results show that the time-varying S-shaped BDA approach outperforms the compared approaches.
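    To make the time-varying idea concrete, the sketch below shows an S-shaped transfer function whose control parameter tau shrinks linearly over the run: a large tau early flattens the curve toward 0.5, so bits are re-sampled almost at random (exploration), while a small tau late makes the curve step-like, so bits settle according to the sign of the step (exploitation). The functional form, the linear tau schedule, and the bounds (4.0 down to 0.01) are illustrative assumptions rather than the exact functions used in the paper.

```python
import numpy as np

def time_varying_s(step, t, max_iter, tau_max=4.0, tau_min=0.01):
    """S-shaped transfer function with a time-varying control parameter tau.
    tau shrinks linearly from tau_max to tau_min as iterations progress
    (illustrative schedule; the paper's exact form may differ)."""
    tau = tau_max - (tau_max - tau_min) * t / max_iter
    return 1.0 / (1.0 + np.exp(-step / tau))

def update_bits(step, t, max_iter, rng):
    """S-shaped binarization rule: each bit becomes 1 with probability T(step, t)."""
    prob_one = time_varying_s(step, t, max_iter)
    return (rng.random(step.shape) < prob_one).astype(int)

rng = np.random.default_rng(1)
step = rng.normal(size=6)                        # continuous step vector from the DA
print(time_varying_s(step, t=0, max_iter=100))   # early: values near 0.5, frequent bit changes
print(time_varying_s(step, t=99, max_iter=100))  # late: values near 0 or 1, bits stabilize
print(update_bits(step, t=99, max_iter=100, rng=rng))
```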