
    Metaheuristic design of feedforward neural networks: a review of two decades of research

    Over the past two decades, feedforward neural network (FNN) optimization has been a key interest among researchers and practitioners in multiple disciplines. FNN optimization is often viewed from various perspectives: the optimization of weights, network architecture, activation nodes, learning parameters, learning environment, and so on. Researchers have adopted these different viewpoints mainly to improve the FNN's generalization ability. Gradient-descent algorithms such as backpropagation have been widely applied to optimize FNNs, and their success is evident from the FNN's application to numerous real-world problems. However, due to the limitations of gradient-based optimization methods, metaheuristic algorithms, including evolutionary algorithms and swarm intelligence, are still being widely explored by researchers aiming to obtain a well-generalized FNN for a given problem. This article attempts to summarize a broad spectrum of FNN optimization methodologies, including conventional and metaheuristic approaches. It also tries to connect the various research directions that have emerged from FNN optimization practice, such as evolving neural networks (NNs), cooperative coevolution NNs, complex-valued NNs, deep learning, extreme learning machines, quantum NNs, etc. Additionally, it poses interesting research challenges for future work to cope with the present information-processing era.
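
As a purely illustrative aside, the kind of metaheuristic weight optimization surveyed above can be contrasted with gradient descent in a few lines. The sketch below uses a simple (1+1) evolution strategy as a stand-in for the broader family of evolutionary and swarm methods the review covers; the toy network, data, and step size are assumptions made for the example, not details taken from the article.

```python
# Illustrative sketch only: a minimal (1+1) evolution strategy tuning the weights of a
# tiny one-hidden-layer FNN, shown to contrast metaheuristic search with gradient descent.
# The network size, data, and step size are arbitrary choices, not taken from the review.
import numpy as np

rng = np.random.default_rng(0)

def fnn_predict(weights, X, n_hidden=5):
    """Unpack a flat weight vector into a one-hidden-layer network and run a forward pass."""
    n_in = X.shape[1]
    w1 = weights[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = weights[n_in * n_hidden : n_in * n_hidden + n_hidden]
    w2 = weights[n_in * n_hidden + n_hidden : -1]
    b2 = weights[-1]
    h = np.tanh(X @ w1 + b1)          # hidden-layer activations
    return h @ w2 + b2                # linear output unit

def mse(weights, X, y):
    return float(np.mean((fnn_predict(weights, X) - y) ** 2))

# Toy regression data (assumed for illustration).
X = rng.normal(size=(100, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

n_weights = 3 * 5 + 5 + 5 + 1         # w1 + b1 + w2 + b2 for the 3-input, 5-hidden-node net
parent = rng.normal(scale=0.5, size=n_weights)
sigma = 0.1                            # mutation step size

for generation in range(2000):
    child = parent + rng.normal(scale=sigma, size=n_weights)   # Gaussian perturbation
    if mse(child, X, y) <= mse(parent, X, y):                  # keep the better weight vector
        parent = child

print("final training MSE:", mse(parent, X, y))
```

Unlike backpropagation, the loop above needs only fitness evaluations of perturbed weight vectors, which is one reason such methods remain attractive when gradients are unreliable or unavailable.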

    A generic optimising feature extraction method using multiobjective genetic programming

    In this paper, we present a generic, optimising feature extraction method using multiobjective genetic programming. We re-examine the feature extraction problem and show that effective feature extraction can significantly enhance the performance of pattern recognition systems with simple classifiers. A framework is presented to evolve optimised feature extractors that transform an input pattern space into a decision space in which maximal class separability is obtained. We have applied this method to real-world datasets from the UCI Machine Learning and StatLog databases to verify our approach and compare our proposed method with other reported results. We conclude that our algorithm is able to produce classifiers of superior (or equivalent) performance to the conventional classifiers examined, suggesting that the need to exhaustively evaluate a large family of conventional classifiers on any new problem can be removed.
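
To make the two competing pressures in such a multiobjective setup concrete, the hedged sketch below scores a candidate feature transform on class separability in the decision space and on a complexity proxy. The Fisher-style scatter ratio and the tree_size stand-in are illustrative assumptions, not the paper's exact objective functions.

```python
# Hedged sketch: one way to score a candidate feature extractor on the two kinds of
# objectives described above (class separability of the transformed data plus a complexity
# penalty). The Fisher-ratio formula and the tree_size proxy are assumptions for illustration.
import numpy as np

def fisher_separability(Z, labels):
    """Ratio of between-class to within-class scatter in the transformed (decision) space Z."""
    overall_mean = Z.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(labels):
        Zc = Z[labels == c]
        between += len(Zc) * np.sum((Zc.mean(axis=0) - overall_mean) ** 2)
        within += np.sum((Zc - Zc.mean(axis=0)) ** 2)
    return between / (within + 1e-12)

def objectives(transform, tree_size, X, labels):
    """Two objectives to minimise: negative separability and expression-tree size."""
    Z = transform(X)
    return (-fisher_separability(Z, labels), tree_size)

# Example: a hypothetical evolved transform mapping 4 inputs to a 2-D decision space.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))
labels = (X[:, 0] + X[:, 1] > 0).astype(int)
candidate = lambda X: np.column_stack([X[:, 0] + X[:, 1], X[:, 2] * X[:, 3]])
print(objectives(candidate, tree_size=5, X=X, labels=labels))
```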

    Multi objective genetic algorithm for training three term backpropagation network

    Multi-objective evolutionary algorithms have been applied to the learning problem in artificial neural networks to improve generalization on training data and unseen test data. This paper proposes a simultaneous optimization method for training the Three-Term Back-Propagation Network (TTBPN) using a multi-objective genetic algorithm. The Non-dominated Sorting Genetic Algorithm II (NSGA-II) is applied to optimize the TTBPN structure by simultaneously reducing the error and the complexity, in terms of the number of hidden nodes, for better accuracy in classification problems. The methodology is applied to two multi-class datasets obtained from the University of California at Irvine repository. The results obtained for training and testing on these datasets show lower network error and better classification accuracy, while keeping the TTBPN architecture simple.
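
A minimal sketch of the trade-off being optimized may help: each candidate network is scored on two objectives (network error and number of hidden nodes), and NSGA-II's sorting rests on the Pareto-dominance test shown below. The candidate values are hypothetical; the three-term backpropagation training and the full NSGA-II machinery are not reproduced here.

```python
# Minimal sketch of the two objectives being traded off and of the Pareto-dominance test
# underlying NSGA-II's non-dominated sorting. The candidate (error, hidden nodes) values
# are hypothetical stand-ins, not results from the paper.
from typing import Tuple

def objectives(network_error: float, n_hidden_nodes: int) -> Tuple[float, int]:
    """Both objectives are minimised: training/test error and architectural complexity."""
    return (network_error, n_hidden_nodes)

def dominates(a: Tuple[float, int], b: Tuple[float, int]) -> bool:
    """a dominates b if it is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Hypothetical candidate network configurations: (error, hidden nodes).
candidates = [objectives(0.08, 12), objectives(0.08, 6), objectives(0.05, 20)]
pareto_front = [c for c in candidates
                if not any(dominates(other, c) for other in candidates if other != c)]
print(pareto_front)   # the non-dominated error/complexity trade-offs
```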

    Multiobjective optimization in bioinformatics and computational biology


    The influence of mutation on population dynamics in multiobjective genetic programming

    Using multiobjective genetic programming with a complexity objective to overcome tree bloat is usually very successful but can sometimes lead to undesirable collapse of the population to all single-node trees. In this paper we report a detailed examination of why and when collapse occurs. We have used different types of crossover and mutation operators (depth-fair and sub-tree), different evolutionary approaches (generational and steady-state), and different datasets (6-parity Boolean and a range of benchmark machine learning problems) to strengthen our conclusion. We conclude that mutation has a vital role in preventing population collapse by counterbalancing parsimony pressure and preserving population diversity. Also, mutation controls the size of the generated individuals, which tends to dominate the time needed for fitness evaluation and therefore the whole evolutionary process. Further, the average size of the individuals in a GP population depends on the evolutionary approach employed. We also demonstrate that mutation has a wider role than merely culling single-node individuals from the population; even within a diversity-preserving algorithm such as SPEA2, mutation has a role in preserving diversity.
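
For readers unfamiliar with the operator under discussion, the sketch below shows one conventional form of sub-tree mutation on a Boolean expression tree, the mechanism by which fresh genetic material is injected to counterbalance parsimony pressure. The function and terminal sets and the depth limit are arbitrary choices for illustration, not the settings used in the paper.

```python
# Illustrative sketch of sub-tree mutation: a random subtree of a GP individual is replaced
# by a freshly grown random subtree, reintroducing diversity and counteracting the parsimony
# pressure that otherwise shrinks trees towards single nodes. Function/terminal sets and the
# depth limit are arbitrary choices for the example.
import random

FUNCTIONS = ["and", "or", "xor"]     # binary Boolean functions (e.g. for a parity problem)
TERMINALS = ["x0", "x1", "x2"]

def random_tree(depth):
    """Grow a random expression tree as nested lists: [op, left, right] or a terminal string."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return [random.choice(FUNCTIONS), random_tree(depth - 1), random_tree(depth - 1)]

def all_positions(tree, path=()):
    """Enumerate paths to every node so one can be picked uniformly for mutation."""
    yield path
    if isinstance(tree, list):
        yield from all_positions(tree[1], path + (1,))
        yield from all_positions(tree[2], path + (2,))

def subtree_mutation(tree, max_depth=3):
    """Replace the subtree at a randomly chosen position with a new random subtree (in place)."""
    path = random.choice(list(all_positions(tree)))
    if not path:                       # the root itself was selected
        return random_tree(max_depth)
    node = tree
    for step in path[:-1]:
        node = node[step]
    node[path[-1]] = random_tree(max_depth)
    return tree

random.seed(42)
parent = random_tree(3)
print("before:", parent)
print("after: ", subtree_mutation(parent))   # note: mutates `parent` in place for brevity
```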

    Memetic Pareto Evolutionary Artificial Neural Networks for the determination of growth limits of Listeria Monocytogenes

    The main objective of this work is to automatically design neural network models with sigmoidal basis units for classification tasks, so that classifiers are obtained in the most balanced way possible in terms of CCR and Sensitivity (given by the lowest percentage of examples correctly predicted to belong to each class). We present a Memetic Pareto Evolutionary NSGA2 (MPENSGA2) approach based on the Pareto-NSGAII evolution (PNSGAII) algorithm. We propose to augment it with a local search using the improved Rprop (IRprop) algorithm for the prediction of growth/no growth of L. monocytogenes as a function of the storage temperature, pH, citric acid (CA) and ascorbic acid (AA). The results obtained show that the generalization ability can be more efficiently improved within a multi-objective framework than within a single-objective one.
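
As a point of reference for the two objectives named here, the sketch below computes CCR and the minimum per-class sensitivity from a confusion matrix; this is a standard reading of those terms rather than the authors' exact implementation, the MPENSGA2 search and IRprop local tuning are not reproduced, and the labels are toy data.

```python
# Hedged sketch of the two objectives named in the abstract: the correct classification
# rate (CCR) and the Sensitivity of the classifier, taken here as the lowest per-class
# recall. Confusion-matrix computation is a standard reading of those terms; the toy
# labels below are hypothetical, not data from the paper.
import numpy as np

def ccr_and_min_sensitivity(y_true, y_pred, n_classes):
    """Return (CCR, minimum per-class sensitivity) for a multi-class prediction."""
    confusion = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        confusion[t, p] += 1
    ccr = np.trace(confusion) / confusion.sum()
    per_class_recall = np.diag(confusion) / confusion.sum(axis=1)
    return ccr, per_class_recall.min()

# Toy growth/no-growth style labels (hypothetical two-class data).
y_true = [0, 0, 0, 1, 1, 1, 1, 1]
y_pred = [0, 0, 1, 1, 1, 1, 0, 1]
print(ccr_and_min_sensitivity(y_true, y_pred, n_classes=2))  # (0.75, 0.666...)
```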