
    An Evolutionary Optimization Algorithm for Automated Classical Machine Learning

    Machine learning is an evolving branch of computational algorithms that allows computers to learn from experience, make predictions, and solve different problems without being explicitly programmed. However, building a useful machine learning model is a challenging process that requires human expertise to perform various tasks properly and to ensure that machine learning's primary objective, determining the best and most predictive model, is achieved. These tasks include pre-processing, feature selection, and model selection. Many machine learning models developed by experts are designed manually, by trial and error; in other words, even experts need considerable time and resources to create good predictive models. The idea of automated machine learning (AutoML) is to automate the machine learning pipeline and relieve the burden of substantial development costs and manual processes. The algorithms leveraged in these systems have different hyper-parameters, and different input datasets have different features. In both cases, the final performance of the model is closely tied to the selected configuration of features and hyper-parameters, which is why feature selection and hyper-parameter tuning are considered crucial AutoML tasks. The computationally expensive nature of tuning hyper-parameters and optimally selecting features creates significant opportunities for filling research gaps in the AutoML field. This dissertation explores how to select features and tune the hyper-parameters of conventional machine learning algorithms efficiently and automatically. To address these challenges, novel algorithms for hyper-parameter tuning and feature selection are proposed. The hyper-parameter tuning algorithm aims to provide the optimal set of hyper-parameters for three conventional machine learning models (Random Forest, XGBoost, and Support Vector Machine) to obtain the best performance scores, while the feature selection algorithm searches for the optimal subset of features that achieves the highest performance. A hybrid framework is then designed for both hyper-parameter tuning and feature selection; it can discover a configuration of features and hyper-parameters close to the optimum. The framework includes the following components: (1) an automatic feature selection component based on artificial bee colony algorithms and machine learning training, and (2) an automatic hyper-parameter tuning component based on artificial bee colony algorithms and machine learning training, for faster training and convergence of the learning models. The whole framework has been evaluated on four real-world datasets from different applications. The framework is an attempt to alleviate the challenges of hyper-parameter tuning and feature selection through efficient algorithms; however, distributed processing, distributed learning, parallel computing, and other big data solutions are not considered in this framework.
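
    The abstract above describes the artificial bee colony (ABC) components only at a high level, so the snippet below is a minimal, hedged sketch of the core idea: ABC-style search over hyper-parameter candidates, each scored by cross-validated training. The dataset (scikit-learn's breast cancer data), the search bounds, the colony settings, and helper names such as `random_solution` and `neighbour` are illustrative assumptions, not the dissertation's actual implementation.

```python
# Minimal ABC-style hyper-parameter search for a Random Forest (illustrative sketch).
import random

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)                  # stand-in dataset
BOUNDS = {"n_estimators": (10, 150), "max_depth": (2, 20)}  # assumed search space
FOODS, LIMIT, CYCLES = 5, 3, 10       # colony size, abandonment limit, iterations

def random_solution():
    return {k: random.randint(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def fitness(params):
    model = RandomForestClassifier(**params, random_state=0)
    return cross_val_score(model, X, y, cv=3).mean()         # CV accuracy as the objective

def neighbour(params):
    # Perturb one randomly chosen hyper-parameter within its bounds.
    new = dict(params)
    k = random.choice(list(BOUNDS))
    lo, hi = BOUNDS[k]
    delta = (hi - lo) // 4
    new[k] = min(hi, max(lo, new[k] + random.randint(-delta, delta)))
    return new

sources = [random_solution() for _ in range(FOODS)]          # food sources = candidate configs
scores = [fitness(s) for s in sources]
trials = [0] * FOODS

for _ in range(CYCLES):
    # Employed/onlooker phase (collapsed for brevity): try a neighbour of each source.
    for i in range(FOODS):
        cand = neighbour(sources[i])
        cand_score = fitness(cand)
        if cand_score > scores[i]:
            sources[i], scores[i], trials[i] = cand, cand_score, 0
        else:
            trials[i] += 1
    # Scout phase: abandon sources that have stopped improving.
    for i in range(FOODS):
        if trials[i] > LIMIT:
            sources[i] = random_solution()
            scores[i] = fitness(sources[i])
            trials[i] = 0

best = max(range(FOODS), key=scores.__getitem__)
print("best hyper-parameters:", sources[best], "CV accuracy:", round(scores[best], 3))
```

    The feature-selection component could reuse the same loop by letting each food source encode a binary feature mask instead of a hyper-parameter dictionary.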

    Ant with Artificial Bee Colony Techniques in Vehicular Ad-hoc Networks

    A VANET faces many problems due to the dynamic nature of the network combined with requirements such as low delay, high packet delivery ratio (PDR), low routing overhead, and high throughput. Numerous routing protocols have been proposed to meet Quality of Service (QoS) demands, but none of them can consistently maintain the highest level of QoS across all metrics simultaneously. The proposed Ant with Artificial Bee Colony (AABC) technique provides better performance than existing techniques. This work compares AABC against recently developed VANET techniques for finding the best path, using several performance metrics, and presents a comparative analysis of the QoS achieved by these emerging techniques in order to provide the best solution to the problem of identifying the best path based on QoS performance. Simulation results show that the proposed AABC technique produces better results than the conventional method and Ant Colony Techniques (ACT) in terms of higher packet delivery ratio, lower end-to-end delay, and lower energy consumption. Performance is evaluated using the NS-2 simulator, and the results show that AABC successfully finds the optimal routes.
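
    As a rough illustration of how ant-style pheromone routing and a greedy best-route update might be combined, the toy sketch below searches a static graph for a low-delay path. The topology, link delays, and scoring are hypothetical; the actual AABC protocol is evaluated inside NS-2 on a mobile network, so this is only a conceptual sketch, not the paper's method.

```python
# Toy ant/ABC-style route search on a static, hypothetical VANET topology.
import random

# adjacency: node -> {neighbour: link delay in ms} (assumed values)
GRAPH = {
    "S": {"A": 4, "B": 7},
    "A": {"C": 3, "D": 6},
    "B": {"D": 2},
    "C": {"D": 5, "T": 8},
    "D": {"T": 3},
    "T": {},
}
pheromone = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}

def ant_walk(src="S", dst="T"):
    """Build one route by picking next hops proportional to pheromone / link delay."""
    path, node = [src], src
    while node != dst:
        options = [(v, pheromone[(node, v)] / d)
                   for v, d in GRAPH[node].items() if v not in path]
        if not options:
            return None                                   # dead end: discard this ant
        nodes, weights = zip(*options)
        node = random.choices(nodes, weights=weights)[0]  # roulette-wheel next hop
        path.append(node)
    return path

def route_delay(path):
    return sum(GRAPH[u][v] for u, v in zip(path, path[1:]))

best, best_delay = None, float("inf")
for _ in range(50):                                       # colony iterations
    path = ant_walk()
    if path is None:
        continue
    delay = route_delay(path)
    if delay < best_delay:                                # greedy "best food source" update
        best, best_delay = path, delay
    for edge in pheromone:                                # evaporate ...
        pheromone[edge] *= 0.95
    for u, v in zip(path, path[1:]):                      # ... then reinforce the sampled route
        pheromone[(u, v)] += 1.0 / delay

print("best route:", best, "end-to-end delay:", best_delay, "ms")
```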

    CCSA: Conscious Neighborhood-based Crow Search Algorithm for Solving Global Optimization Problems

    © 2019 Elsevier B.V. In this paper, a conscious neighborhood-based crow search algorithm (CCSA) is proposed for solving global optimization and engineering design problems. It is an improvement designed to tackle the imbalanced search strategy and premature convergence problems of the crow search algorithm. CCSA introduces three new search strategies, called neighborhood-based local search (NLS), non-neighborhood-based global search (NGS), and wandering-around-based search (WAS), to improve the movement of crows in different search spaces. Moreover, a neighborhood concept is defined to consciously select between the NLS and NGS movement strategies, which enhances the balance between local and global search. The proposed CCSA is evaluated on several benchmark functions and four applied engineering design problems. In all experiments, CCSA is compared with other state-of-the-art swarm intelligence algorithms: CSA, BA, CLPSO, GWO, EEGWO, WOA, KH, ABC, GABC, and Best-so-far ABC. The experimental and statistical results show that CCSA is very competitive, especially for large-scale optimization problems, and that it is significantly superior to the compared algorithms. Furthermore, the proposed algorithm also finds the best solutions for the applied engineering design problems.
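
    A hedged sketch of the "conscious" neighborhood idea follows: each crow checks whether any other crow lies inside a shrinking neighborhood radius and then takes a local (NLS-like) or global (NGS-like) step accordingly. The radius schedule, step sizes, and the sphere objective are assumptions made for illustration, and the WAS strategy is omitted; the paper's exact update equations are not reproduced here.

```python
# Hedged sketch of CCSA-style neighborhood-aware movement on a toy sphere objective.
import numpy as np

rng = np.random.default_rng(0)
DIM, CROWS, ITERS = 10, 20, 200

def sphere(x):
    return float(np.sum(x ** 2))                   # toy objective to minimise

flock = rng.uniform(-10, 10, size=(CROWS, DIM))    # current positions
memory = flock.copy()                              # best position each crow remembers
mem_fit = np.array([sphere(x) for x in memory])

for t in range(ITERS):
    radius = 10.0 * (1 - t / ITERS)                # shrinking neighbourhood radius (assumed schedule)
    for i in range(CROWS):
        dists = np.linalg.norm(flock - flock[i], axis=1)
        neighbours = np.where((dists > 0) & (dists < radius))[0]
        if len(neighbours) > 0:
            # NLS-like move: follow the best remembered position among the neighbours.
            j = neighbours[np.argmin(mem_fit[neighbours])]
            step = rng.random() * (memory[j] - flock[i])
        else:
            # NGS-like move: fly toward a randomly chosen crow's memory (global search).
            j = rng.integers(CROWS)
            step = 2 * rng.random(DIM) * (memory[j] - flock[i])
        flock[i] = np.clip(flock[i] + step, -10, 10)
        f = sphere(flock[i])
        if f < mem_fit[i]:                         # crows update their memory on improvement
            memory[i], mem_fit[i] = flock[i].copy(), f

print("best objective value found:", round(mem_fit.min(), 6))
```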

    Hybrid approach for metabolites production using differential evolution and minimization of metabolic adjustment

    Microbial strains can be optimized through metabolic engineering, which implements gene knockout techniques. These techniques manipulate potential genes to increase the yield of metabolites by restructuring metabolic networks. Several hybrid optimization algorithms have been proposed to optimize microbial strains; however, existing algorithms are often unable to obtain optimal strains because the non-essential genes that should be removed are hard to identify, owing to the high complexity of the metabolic network. The main goal of this study is therefore to overcome this limitation by proposing a hybrid of Differential Evolution and Minimization of Metabolic Adjustment (DEMOMA). Differential Evolution (DE) is a population-based stochastic search algorithm with few tunable control parameters. Minimization of Metabolic Adjustment (MOMA) is a constraint-based algorithm that simulates cellular metabolism after a perturbation (gene knockout) is applied to the metabolic model; its strength is the ability to simulate mutated strains more precisely than Flux Balance Analysis. The dataset used for fumaric acid production is the S. cerevisiae metabolic network model, whereas the dataset for lycopene production is the Y. lipolytica metabolic network model. Experimental results show that DEMOMA was able to improve the growth rate and fumaric acid production rate, while for lycopene production both the Biomass-Product Coupled Yield (BPCY) and the production rate were optimized.
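
    A minimal sketch of the differential evolution layer in a DEMOMA-style hybrid is given below: DE searches over candidate gene-knockout sets, and each candidate would be scored by a MOMA simulation of the perturbed metabolic model. Here `evaluate_moma` is a hypothetical placeholder (it simply prefers about three knockouts); a real implementation would solve MOMA's quadratic program against the S. cerevisiae or Y. lipolytica model, for example through a COBRA-style toolbox.

```python
# Hedged sketch of the DE layer in a DEMOMA-style hybrid over binary knockout sets.
import numpy as np

rng = np.random.default_rng(1)
N_GENES, POP, GENS = 30, 15, 40
F, CR = 0.8, 0.9                                  # DE mutation factor and crossover rate

def evaluate_moma(knockout_mask):
    # Hypothetical placeholder for a MOMA simulation of the knocked-out model.
    # A real fitness would return BPCY and/or the target production rate.
    return -abs(int(knockout_mask.sum()) - 3)     # toy objective: prefer ~3 knockouts

pop = rng.random((POP, N_GENES))                  # continuous genome in [0, 1]
fits = np.array([evaluate_moma(ind > 0.5) for ind in pop])  # >0.5 means "knock out"

for _ in range(GENS):
    for i in range(POP):
        idx = rng.choice([k for k in range(POP) if k != i], 3, replace=False)
        a, b, c = pop[idx]
        mutant = a + F * (b - c)                  # DE/rand/1 mutation
        cross = rng.random(N_GENES) < CR
        trial = np.where(cross, mutant, pop[i]).clip(0, 1)
        trial_fit = evaluate_moma(trial > 0.5)
        if trial_fit >= fits[i]:                  # greedy selection
            pop[i], fits[i] = trial, trial_fit

best = int(fits.argmax())
print("best knockout set (gene indices):", np.flatnonzero(pop[best] > 0.5))
```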