
    A QSAR classification model of skin sensitization potential based on improving binary crow search algorithm

    Classifying skin sensitization using a quantitative structure-activity relationship (QSAR) model is important, and descriptor selection is essential to improve the performance of the classification task. Recently, a binary crow search algorithm (BCSA) was proposed and successfully applied to variable selection. In this work, a new time-varying transfer function is proposed to improve the exploration and exploitation capability of the BCSA in selecting the most relevant descriptors for a QSAR classification model with high classification accuracy and short computing time. The results demonstrate that the proposed method is reliable and can reasonably separate the compounds into sensitizers and non-sensitizers with high classification accuracy.
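    A minimal Python sketch of the kind of time-varying transfer function the abstract describes: a sigmoid whose slope parameter shrinks with the iteration count, so descriptor probabilities stay near 0.5 early (exploration) and become near-deterministic later (exploitation). The functional form, parameter names, and constants are illustrative assumptions, not the formula from the paper.

```python
import numpy as np

def time_varying_sigmoid(x, t, t_max, tau_max=4.0, tau_min=0.01):
    """Illustrative time-varying S-shaped transfer function.

    With a large tau (early iterations) the curve is flat, so descriptor
    probabilities stay near 0.5 and the search explores; as tau shrinks
    the curve steepens and selections become near-deterministic.
    """
    tau = tau_max - (tau_max - tau_min) * t / t_max
    return 1.0 / (1.0 + np.exp(-x / tau))

def select_descriptors(position, t, t_max, rng):
    """Turn a crow's continuous position into a 0/1 descriptor mask."""
    prob = time_varying_sigmoid(position, t, t_max)
    return (rng.random(position.shape) < prob).astype(int)

rng = np.random.default_rng(42)
position = rng.normal(size=12)          # continuous position of one crow (illustrative)
for t in (0, 50, 99):
    print(t, select_descriptors(position, t, 100, rng))
```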

    Classification of Parkinson's disease in brain MRI images using deep residual convolutional neural network

    In our aging society, neurodegenerative disorders such as Parkinson's disease (PD) are among the most serious health issues. PD is a neurological condition with social and economic effects on individuals; it arises because the brain's dopamine-producing cells can no longer produce enough of the chemical to support the body's motor functions. The main symptoms of the illness are problems with eyesight, excretion, speech, and mobility, followed by depression, anxiety, sleep disturbances, and panic attacks. The main aim of this research is to develop a workable clinical decision-making framework that aids the physician in diagnosing patients affected by PD. In this research, we propose a technique to classify Parkinson's disease from brain MRI images. Initially, the input data are normalized using min-max normalization, and noise is removed from the input images with a median filter. The Binary Dragonfly Algorithm is then used to select features, and the diseased region is segmented from the MRI brain images with Dense-UNet. Finally, each case is classified as Parkinson's disease or healthy control using a Deep Residual Convolutional Neural Network (DRCNN) combined with the Enhanced Whale Optimization Algorithm (EWOA) to obtain better classification accuracy. We use the public Parkinson's Progression Markers Initiative (PPMI) dataset of Parkinson's MRI images. Accuracy, sensitivity, specificity, and precision are used, together with manually gathered data, to assess the efficacy of the proposed methodology.
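    A small sketch of the two preprocessing steps named in the abstract (min-max normalization followed by median filtering), assuming a 2-D MRI slice held in a NumPy array; the filter size and data layout are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess_slice(img, filter_size=3):
    """Min-max normalize a slice to [0, 1], then apply a median filter
    to suppress impulse noise. Filter size is an assumed value."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    norm = (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)
    return median_filter(norm, size=filter_size)

slice_2d = np.random.rand(128, 128)     # stand-in for one MRI slice
clean = preprocess_slice(slice_2d)
print(clean.shape, clean.min(), clean.max())
```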

    A hybrid swarm intelligence feature selection approach based on time-varying transition parameter

    Feature selection aims to reduce the dimensionality of a dataset by removing superfluous attributes. This paper proposes a hybrid approach for the feature selection problem that combines particle swarm optimization (PSO), grey wolf optimization (GWO), and a tournament selection (TS) mechanism. Particle swarm enhances diversification at the beginning of the search, grey wolf enhances intensification at the end of the search, while tournament selection maintains diversification not only at the beginning but also at the end of the search process to achieve local-optima avoidance. A time-varying transition parameter and a random variable are used to select either the particle swarm, grey wolf, or tournament selection technique during the search process. This paper proposes different variants of this approach based on S-shaped and V-shaped transfer functions (TFs) to convert continuous solutions to binary ones. These variants are named hybrid tournament grey wolf particle swarm (HTGWPS), followed by the letter S or V to indicate the TF type and by the TF's number. The variants were evaluated on nine high-dimensional datasets. The results revealed that HTGWPS-V1 outperformed the other V variants, PSO, and GWO on 78% of the datasets based on the maximum classification accuracy obtained with a minimal feature subset. HTGWPS-V1 also outperformed six well-known metaheuristics on 67% of the datasets.
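    The sketch below illustrates one plausible reading of the time-varying transition rule: a parameter that grows linearly with the iteration makes the PSO update dominate early and the GWO update dominate late, while tournament selection is drawn with a small fixed probability throughout. The rule, the constants, and the fixed tournament probability are assumptions for illustration only.

```python
import random

def choose_operator(t, t_max, ts_prob=0.1):
    """Pick which update to apply at iteration t (illustrative rule).

    A transition parameter grows linearly with t, so the PSO update
    (diversification) dominates early and the GWO update (intensification)
    dominates late; tournament selection is applied with a small fixed
    probability throughout the run.
    """
    if random.random() < ts_prob:
        return "tournament"
    transition = t / t_max            # time-varying transition parameter
    return "gwo" if random.random() < transition else "pso"

t_max = 100
for t in (1, 50, 99):
    sample = [choose_operator(t, t_max) for _ in range(1000)]
    print(t, {op: sample.count(op) for op in ("pso", "gwo", "tournament")})
```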

    Self-adaptive parameter and strategy based particle swarm optimization for large-scale feature selection problems with multiple classifiers

    This work was partially supported by the National Natural Science Foundation of China (61403206, 61876089, 61876185), the Natural Science Foundation of Jiangsu Province (BK20141005), the Natural Science Foundation of the Jiangsu Higher Education Institutions of China (14KJB520025), the Engineering Research Center of Digital Forensics, Ministry of Education, and the Priority Academic Program Development of Jiangsu Higher Education Institutions. Peer reviewed. Postprint.

    A New Quadratic Binary Harris Hawk Optimization For Feature Selection

    Harris hawk optimization (HHO) is one of the recently proposed metaheuristic algorithms and has proven to work effectively on several challenging optimization tasks. However, the original HHO was developed to solve continuous optimization problems, not problems with binary variables. This paper proposes a binary version of HHO (BHHO) to solve the feature selection problem in classification tasks. The proposed BHHO is equipped with an S-shaped or V-shaped transfer function to convert the continuous variables into binary ones. Moreover, another variant of HHO, namely quadratic binary Harris hawk optimization (QBHHO), is proposed to enhance the performance of BHHO. In this study, twenty-two datasets collected from the UCI machine learning repository are used to validate the performance of the proposed algorithms. A comparative study is conducted to compare the effectiveness of QBHHO with other feature selection algorithms such as binary differential evolution (BDE), genetic algorithm (GA), binary multi-verse optimizer (BMVO), binary flower pollination algorithm (BFPA), and binary salp swarm algorithm (BSSA). The experimental results show the superiority of the proposed QBHHO in terms of classification performance, feature size, and fitness values compared to the other algorithms.
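    The S-shaped and V-shaped transfer functions mentioned in the abstract are commonly realized as below: the S-shaped value is read as the probability that a bit is set to 1, while the V-shaped value is read as the probability of flipping the current bit. This is a generic sketch of that binarization step; the specific quadratic transfer function that defines QBHHO is not reproduced here.

```python
import numpy as np

def s_shaped(x):
    """Classic sigmoid transfer function: value read as P(bit = 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    """Classic V-shaped transfer function: value read as P(flip current bit)."""
    return np.abs(np.tanh(x))

def binarize_s(x, rng):
    return (rng.random(x.shape) < s_shaped(x)).astype(int)

def binarize_v(x, current_bits, rng):
    flip = rng.random(x.shape) < v_shaped(x)
    return np.where(flip, 1 - current_bits, current_bits)

rng = np.random.default_rng(1)
x = rng.normal(size=8)                  # continuous hawk position (illustrative)
bits = rng.integers(0, 2, size=8)       # current binary feature mask
print(binarize_s(x, rng))
print(binarize_v(x, bits, rng))
```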

    Binary dragonfly optimization for feature selection using time-varying transfer functions

    The Dragonfly Algorithm (DA) is a recently proposed heuristic search algorithm that has shown excellent performance on numerous optimization problems. In this paper, a wrapper feature selection algorithm is proposed based on the Binary Dragonfly Algorithm (BDA). The key component of the BDA is the transfer function that maps a continuous search space to a discrete one. In this study, eight transfer functions, categorized into two families (S-shaped and V-shaped functions), are integrated into the BDA and evaluated using eighteen benchmark datasets obtained from the UCI data repository. The main contribution of this paper is the proposal of time-varying S-shaped and V-shaped transfer functions that leverage the impact of the step vector on balancing exploration and exploitation. During the early stages of the optimization process, the probability of changing the position of an element is high, which facilitates the exploration of new solutions starting from the initial population; towards the end of the optimization process, this probability becomes lower. This behavior is obtained by including the current iteration number as a parameter of the transfer functions. The performance of the proposed approaches is compared with that of other state-of-the-art approaches, including the DA, binary grey wolf optimizer (bGWO), binary gravitational search algorithm (BGSA), binary bat algorithm (BBA), particle swarm optimization (PSO), and a genetic algorithm, in terms of classification accuracy, sensitivity, specificity, area under the curve, and number of selected attributes. Results show that the time-varying S-shaped BDA approach outperforms the compared approaches.
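    A compact sketch of the behavior the abstract describes for the time-varying V-shaped functions: the flip probability produced for a given step is high in early iterations and decays as the iteration counter grows. The parameterization shown (a linearly shrinking multiplier on the step) is an assumption chosen to reproduce that behavior, not the paper's exact formula.

```python
import numpy as np

def time_varying_v(step, t, t_max, tau_max=4.0, tau_min=0.01):
    """Illustrative time-varying V-shaped transfer function.

    The multiplier tau shrinks as the iteration t grows, so the flip
    probability for a given step is high early (exploration) and low
    near the end (exploitation), as described in the abstract.
    """
    tau = tau_max - (tau_max - tau_min) * t / t_max
    return np.abs(np.tanh(tau * step))

def update_bit(bit, step, t, t_max, rng):
    """Flip the element with the time-varying probability, else keep it."""
    return 1 - bit if rng.random() < time_varying_v(step, t, t_max) else bit

rng = np.random.default_rng(0)
for t in (0, 50, 99):
    print(f"iteration {t:3d}: flip probability for step 0.5 = "
          f"{time_varying_v(0.5, t, 100):.3f}")
```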

    Binary Black Widow Optimization Algorithm for Feature Selection Problems

    This thesis addresses feature selection (FS) problems. FS is a primary stage in data mining and a significant pre-processing step that improves computation cost and accuracy and offers a better understanding of stored data by removing unnecessary and irrelevant features from the original dataset. However, because of the size of the problem, FS is known to be very challenging and has been classified as NP-hard, and traditional methods can only solve small instances. Metaheuristic algorithms (MAs) are therefore becoming powerful methods for addressing FS problems. Recently, a new metaheuristic algorithm known as the Black Widow Optimization (BWO) algorithm achieved strong results on a range of demanding engineering design problems, but it has not yet been applied to FS problems. In this thesis, we propose a modified Binary Black Widow Optimization (BBWO) algorithm to solve FS problems. The FS evaluation method used in this study is the wrapper method, designed to balance two significant goals: (i) minimizing the number of selected features and (ii) maintaining a high level of accuracy. To achieve this, we use the k-nearest-neighbor (KNN) machine learning algorithm in the learning stage to evaluate the accuracy of the solutions generated by the BBWO. The proposed method is applied to twenty-eight public datasets provided by UCI, and the results are compared with up-to-date FS algorithms. Our results show that the BBWO performs as well as, and in some cases better than, those FS algorithms. However, the results also show that the BBWO suffers from slow convergence due to the use of a population of solutions and the lack of local exploitation. To further improve the exploitation process and enhance the BBWO's performance, we propose an improvement to the BBWO that combines it with a local metaheuristic based on the hill-climbing algorithm (HCA). This improved method (IBBWO) is also tested on the twenty-eight UCI datasets and compared with the basic BBWO and the up-to-date FS algorithms. Results show that IBBWO produces better results than the basic BBWO in most cases and outperforms the most well-known FS algorithms in many cases.
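    A short sketch of the two ingredients the abstract leans on: a KNN wrapper fitness that trades classification error against subset size, and a bit-flip hill-climbing pass of the kind used to build IBBWO. The weighting, fold count, and step budget are assumptions; scikit-learn is used for the KNN evaluation.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(mask, X, y, alpha=0.99):
    """Wrapper fitness (lower is better): weigh KNN error against subset size.
    alpha and the 5-fold CV protocol are assumed values, not the thesis settings."""
    if mask.sum() == 0:
        return 1.0                                    # empty subsets are worst
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask.astype(bool)], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size

def hill_climb(mask, X, y, rng, n_steps=30):
    """Bit-flip hill climbing: a stand-in for the local search in IBBWO."""
    best, best_fit = mask.copy(), fitness(mask, X, y)
    for _ in range(n_steps):
        cand = best.copy()
        cand[rng.integers(cand.size)] ^= 1            # flip one random feature
        cand_fit = fitness(cand, X, y)
        if cand_fit < best_fit:
            best, best_fit = cand, cand_fit
    return best, best_fit

X, y = load_breast_cancer(return_X_y=True)            # stand-in UCI dataset
rng = np.random.default_rng(0)
mask = rng.integers(0, 2, size=X.shape[1])            # random initial subset
improved, fit = hill_climb(mask, X, y, rng)
print(improved.sum(), "features selected, fitness", round(fit, 4))
```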