    Improved Reptile Search Optimization Algorithm using Chaotic map and Simulated Annealing for Feature Selection in Medical Field

    The increased volume of medical datasets has produced high-dimensional feature spaces, which negatively affect machine learning (ML) classifiers. In ML, the feature selection process is fundamental for selecting the most relevant features and removing redundant and irrelevant ones. Optimization algorithms have demonstrated their capability to solve feature selection problems. The Reptile Search Algorithm (RSA) is a new nature-inspired optimization algorithm that simulates the encircling and hunting behavior of crocodiles. The unique search mechanism of RSA obtains promising results compared to other optimization algorithms. However, when applied to high-dimensional feature selection problems, RSA suffers from limited population diversity and stagnation in local optima. An improved metaheuristic optimizer, the Improved Reptile Search Algorithm (IRSA), is proposed to overcome these limitations and adapt RSA to the feature selection problem. Two main improvements add value to the standard RSA: the first applies chaos theory in the initialization phase of RSA to enhance its exploration of the search space; the second combines the Simulated Annealing (SA) algorithm with the exploitation search to avoid the local optima problem. IRSA was evaluated on 20 medical benchmark datasets from the UCI machine learning repository and compared with the standard RSA and state-of-the-art optimization algorithms, including Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), the Grasshopper Optimization Algorithm (GOA) and Slime Mould Optimization (SMO). The evaluation metrics include the number of selected features, classification accuracy, fitness value, the Wilcoxon statistical test (p-value), and convergence curves. Based on the results obtained, IRSA confirmed its superiority over the original RSA and the other compared algorithms on the majority of the medical datasets.
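
    As a rough illustration of the two improvements described above, the Python sketch below shows a logistic chaotic map used to seed the initial population and a simulated-annealing acceptance test applied during exploitation. The abstract does not specify which chaotic map, temperature schedule, or fitness formulation IRSA uses, so the map choice, the 0.5 threshold, and the minimization assumption here are illustrative assumptions rather than the authors' implementation.

        import numpy as np

        def chaotic_init(pop_size, dim, x0=0.7, mu=4.0):
            """Seed a binary population with a logistic chaotic map
            (an assumed map choice; the paper's exact map is not stated)."""
            pop = np.empty((pop_size, dim))
            x = x0
            for i in range(pop_size):
                for j in range(dim):
                    x = mu * x * (1.0 - x)        # logistic map iteration
                    pop[i, j] = x                 # chaotic value in (0, 1)
            return (pop > 0.5).astype(int)        # threshold to a binary feature mask

        def sa_accept(curr_fit, cand_fit, temperature, rng):
            """Simulated-annealing acceptance: always keep improvements and
            occasionally accept worse candidates to escape local optima
            (fitness minimization assumed)."""
            if cand_fit <= curr_fit:
                return True
            return rng.random() < np.exp(-(cand_fit - curr_fit) / max(temperature, 1e-12))

        # Example: initialize a population and test one acceptance decision
        rng = np.random.default_rng(42)
        population = chaotic_init(pop_size=10, dim=30)
        print(population.shape, sa_accept(0.25, 0.27, temperature=0.1, rng=rng))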

    A hybrid swarm intelligence feature selection approach based on time-varying transition parameter

    Feature selection aims to reduce the dimensionality of a dataset by removing superfluous attributes. This paper proposes a hybrid approach to the feature selection problem that combines particle swarm optimization (PSO), grey wolf optimization (GWO), and a tournament selection (TS) mechanism. Particle swarm enhances diversification at the beginning of the search, grey wolf enhances intensification at the end of the search, while tournament selection maintains diversification not only at the beginning but also at the end of the search process to achieve local optima avoidance. A time-varying transition parameter and a random variable are used to select either the particle swarm, grey wolf, or tournament selection technique during the search process. The paper proposes different variants of this approach based on S-shaped and V-shaped transfer functions (TFs) that convert continuous solutions to binary ones. These variants are named hybrid tournament grey wolf particle swarm (HTGWPS), followed by the letter S or V to indicate the TF type and by the TF's number. The variants were evaluated on nine high-dimensional datasets. The results revealed that HTGWPS-V1 outperformed the other V-shaped variants, PSO, and GWO on 78% of the datasets in terms of the maximum classification accuracy obtained with a minimal feature subset. HTGWPS-V1 also outperformed six well-known metaheuristics on 67% of the datasets.
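
    The operator-switching rule and the V-shaped binarization step lend themselves to a short sketch. The snippet below assumes a linear time-varying transition parameter, a fixed tournament-selection probability, and |tanh(x)| as the V-shaped transfer function; none of these specific choices are given in the abstract, so they should be read as illustrative assumptions rather than the HTGWPS-V1 definition.

        import math
        import random

        def choose_operator(t, t_max, ts_prob=0.2, rng=random):
            """Pick the operator for iteration t using a time-varying transition
            parameter (linear schedule assumed): PSO dominates early
            (diversification), GWO dominates late (intensification), and
            tournament selection can fire at any stage."""
            if rng.random() < ts_prob:
                return "tournament_selection"
            transition = t / t_max                  # grows from 0 to 1 over the run
            return "grey_wolf" if rng.random() < transition else "particle_swarm"

        def v_shaped_tf(x):
            """A V-shaped transfer function (|tanh(x)| chosen for illustration)."""
            return abs(math.tanh(x))

        def binarize_v(prev_bits, velocities, rng=random):
            """Usual V-shaped rule: flip each bit of the previous binary solution
            with probability given by the transfer function of its velocity."""
            return [1 - b if rng.random() < v_shaped_tf(v) else b
                    for b, v in zip(prev_bits, velocities)]

        # Example: operator choice at mid-run and one binarization step
        print(choose_operator(t=50, t_max=100))
        print(binarize_v([1, 0, 1, 0], [0.3, -1.2, 2.5, 0.0]))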

    Neighborhood search methods with Moth Optimization algorithm as a wrapper method for feature selection problems

    Feature selection methods select a subset of features from the data so that only useful information is mined from the samples, improving both the accuracy and the computational efficiency of the learning model. The Moth-flame Optimization (MFO) algorithm is a population-based approach that simulates the behavior of real moths in nature. One drawback of MFO is that the solutions move toward the best solution and can easily become stuck in local optima, as we investigate in this paper. We therefore propose an MFO algorithm combined with a neighborhood search method for feature selection problems, in order to prevent MFO from getting trapped in local optima and to help avoid premature convergence. The neighborhood search method is applied after a predefined number of unimproved iterations (the number of attempts that fail to improve the current solution). As a result, the proposed algorithm shows good performance compared with the original MFO algorithm and with state-of-the-art approaches.
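
    To make the stagnation-triggered escape concrete, the sketch below wraps a generic MFO update with a counter of unimproved iterations and a simple bit-flip neighborhood search. The mfo_step callable, the flip rate, the neighborhood size, and the minimization assumption are all hypothetical placeholders, since the abstract does not describe the exact neighborhood operator.

        import random

        def neighborhood_search(best_mask, fitness_fn, n_neighbors=10,
                                flip_rate=0.1, rng=random):
            """Probe the neighborhood of the incumbent by flipping a small
            fraction of feature bits (an illustrative move operator)."""
            best_fit = fitness_fn(best_mask)
            for _ in range(n_neighbors):
                cand = [1 - b if rng.random() < flip_rate else b for b in best_mask]
                if any(cand):                       # keep at least one feature
                    f = fitness_fn(cand)
                    if f < best_fit:                # minimization assumed
                        best_mask, best_fit = cand, f
            return best_mask, best_fit

        def mfo_with_escape(fitness_fn, mfo_step, init_mask, iters=200,
                            max_stall=15, rng=random):
            """Run the (externally supplied, hypothetical) MFO update and trigger
            the neighborhood search once max_stall iterations pass unimproved."""
            best, best_fit, stall = init_mask, fitness_fn(init_mask), 0
            for _ in range(iters):
                cand = mfo_step(best)               # hypothetical MFO flame update
                f = fitness_fn(cand)
                if f < best_fit:
                    best, best_fit, stall = cand, f, 0
                else:
                    stall += 1
                if stall >= max_stall:              # suspected local optimum
                    best, best_fit = neighborhood_search(best, fitness_fn, rng=rng)
                    stall = 0
            return best, best_fit

        # Example: a toy fitness (fraction of selected features) and a random step
        rng = random.Random(1)
        toy_fit = lambda m: sum(m) / len(m) if any(m) else 1.0
        toy_step = lambda m: [rng.randint(0, 1) for _ in m]
        print(mfo_with_escape(toy_fit, toy_step, [1] * 10, iters=50, rng=rng))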

    Improving Design Optimization and Optimization-based Design Knowledge Discovery

    The use of design optimization in the early stages of the architectural design process has attracted a high volume of research in recent years. However, traditional design optimization requires a significant amount of computing time, especially when there are multiple design objectives to achieve. Moreover, there is a lack of studies in current research on the automatic generation of architectural design knowledge from optimization results. This paper presents computational methods for creating and improving a closed loop of design optimization and knowledge discovery in architecture. It first introduces a design knowledge-assisted optimization improvement method, using offline simulation and Divide & Conquer (D&C) techniques, to reduce computing time and improve the efficiency of the design optimization process by exploiting architectural domain knowledge. It then describes a new design knowledge discovery system in which design knowledge is discovered from optimization results through an automatic data mining approach. The discovered knowledge has the potential to further improve the efficiency of the optimization method, thus forming a closed loop of optimization improvement and knowledge discovery. Both methods are validated in a case study of parametric form-finding for a nursing unit design with two design objectives: minimizing the nurses' travel distance and maximizing daylighting performance in patient rooms.

    A Survey on Natural Inspired Computing (NIC): Algorithms and Challenges

    Nature-inspired Computing (NIC) draws on inspirations from nature to incorporate end users' awareness and inference capability into statistical and algorithmic data analysis procedures. NIC is an active research field with applications in various areas, such as optimization, computational intelligence, evolutionary computation, multi-objective optimization, data mining, resource management, robotics, transportation and vehicle routing. The field focuses on managing large, heterogeneous and dynamic volumes of information through the integration of human insight by means of inspiration and interaction methods in the analysis process. In addition, it combines related research areas, including bio-inspired computing, artificial intelligence and machine learning, into a coherent field of study. This article aims at providing an overview of Nature-inspired Computing, its scope and concepts, and details the most significant algorithms in the field.

    Hybrid feature selection method based on particle swarm optimization and adaptive local search method

    Machine learning has been extensively examined, with data classification being the most popularly researched subject. The accuracy of prediction is affected by the data provided to the classification algorithm, while utilizing a large amount of data may incur costs, especially in data collection and preprocessing. Studies on feature selection have mainly aimed to establish techniques that decrease the number of features (attributes) used in classification while still yielding accurate predictions. Hence, a particle swarm optimization (PSO) algorithm is suggested in this article for selecting the ideal set of features. The PSO algorithm has proven superior in exploring the search space across different domains, while local search algorithms are good at exploiting promising search regions. We therefore propose a hybrid PSO algorithm with an adaptive local search technique that works based on the current PSO search state and is used to decide whether to accept candidate solutions. This combination balances local intensification with global diversification of the search process. As a result, the suggested algorithm surpasses the original PSO algorithm and other comparable approaches in terms of performance.
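
    A minimal sketch of the acceptance step is given below, assuming swarm diversity (mean distance to the centroid) as the proxy for the current PSO search state: improvements are always kept, and slightly worse local-search candidates are occasionally accepted once the swarm has converged. This adaptive rule is an assumption for illustration, not the exact criterion used in the paper.

        import numpy as np

        def swarm_diversity(positions):
            """Mean particle distance from the swarm centroid, used here as a
            simple proxy for the current PSO search state."""
            centroid = positions.mean(axis=0)
            return np.linalg.norm(positions - centroid, axis=1).mean()

        def adaptive_accept(curr_fit, cand_fit, diversity, div_max, rng,
                            base_prob=0.1):
            """Accept a local-search candidate if it improves the fitness, or,
            when diversity is low (swarm converged), occasionally accept a
            worse one to keep exploring (illustrative rule)."""
            if cand_fit < curr_fit:
                return True
            explore_prob = 1.0 - diversity / max(div_max, 1e-12)
            return rng.random() < base_prob * max(explore_prob, 0.0)

        # Example: decide on a candidate produced around the global best
        rng = np.random.default_rng(0)
        positions = rng.random((20, 15))            # 20 particles, 15 features
        div = swarm_diversity(positions)
        print(adaptive_accept(0.21, 0.22, diversity=div, div_max=2.0, rng=rng))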