
    Optimization of Evolutionary Neural Networks Using Hybrid Learning Algorithms

    Evolutionary artificial neural networks (EANNs) are a special class of artificial neural networks (ANNs) in which evolution is a second fundamental form of adaptation in addition to learning. Evolutionary algorithms are used to adapt the connection weights, network architecture and learning algorithm to the problem environment. Although evolutionary algorithms are well known as efficient global search methods, they often miss the best local solutions in a complex solution space. In this paper, we propose a hybrid meta-heuristic learning approach that combines evolutionary learning with local search methods (using first- and second-order error information) to improve learning and speed up convergence relative to a direct evolutionary approach. The proposed technique is tested on three different chaotic time series, and the results are compared with some popular neuro-fuzzy systems and a recently developed cutting-angle method of global optimization. Empirical results reveal that the proposed technique is efficient despite its computational complexity.
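
    A minimal sketch of the hybrid idea, assuming nothing from the paper beyond its outline: an evolutionary population performs the global search while the elite individuals are refined by a first-order local step. The paper optimizes neural-network weights on chaotic time series and also exploits second-order error information; here a generic multimodal objective (Rastrigin) and a numerical gradient stand in, so every name below is illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        def rastrigin(x):
            # Classic multimodal benchmark: many local minima, global minimum 0 at the origin.
            return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

        def num_grad(f, x, eps=1e-5):
            # Central-difference gradient; stands in for the paper's 1st/2nd-order error information.
            g = np.zeros_like(x)
            for i in range(x.size):
                d = np.zeros_like(x)
                d[i] = eps
                g[i] = (f(x + d) - f(x - d)) / (2 * eps)
            return g

        def hybrid_search(f, dim=5, pop=30, gens=100, elite=5, lr=0.005, local_steps=5):
            X = rng.uniform(-5.12, 5.12, size=(pop, dim))
            for _ in range(gens):
                fit = np.array([f(x) for x in X])
                X = X[np.argsort(fit)]                  # sort ascending: best individuals first
                for i in range(elite):                  # local refinement of the elites
                    for _ in range(local_steps):
                        X[i] -= lr * num_grad(f, X[i])
                for i in range(elite, pop):             # evolutionary variation refills the rest
                    X[i] = X[rng.integers(elite)] + rng.normal(0, 0.3, dim)
            fit = np.array([f(x) for x in X])
            return X[np.argmin(fit)], fit.min()

        best, val = hybrid_search(rastrigin)
        print(f"best value found: {val:.4f}")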

    A Hybrid Method for Searching Near-Optimal Artificial Neural Networks

    This paper describes a method for searching for near-optimal neural networks using genetic algorithms. The method uses an evolutionary search with simultaneous selection of initial weights, transfer functions, architectures and learning rules. Experimental results show that the method produces compact, efficient networks with satisfactory generalization power and shorter training times than other algorithms.
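
    The abstract does not give the encoding, but a genome for such a simultaneous search might look like the sketch below, where each individual jointly carries hidden-layer sizes, a transfer function, a learning rule and an initial-weight scale. All names and value ranges are illustrative assumptions, not the paper's chromosome.

        import random
        from dataclasses import dataclass, field

        TRANSFER_FNS = ["tanh", "sigmoid", "relu"]
        LEARNING_RULES = ["backprop", "rprop", "quickprop"]

        @dataclass
        class Genome:
            hidden_layers: list = field(default_factory=lambda: [random.randint(2, 16)])
            transfer_fn: str = field(default_factory=lambda: random.choice(TRANSFER_FNS))
            learning_rule: str = field(default_factory=lambda: random.choice(LEARNING_RULES))
            weight_scale: float = field(default_factory=lambda: random.uniform(0.1, 1.0))

        def mutate(g: Genome) -> Genome:
            # Copy the parent, then perturb exactly one of the four co-evolved traits.
            child = Genome(list(g.hidden_layers), g.transfer_fn, g.learning_rule, g.weight_scale)
            roll = random.random()
            if roll < 0.25:
                i = random.randrange(len(child.hidden_layers))
                child.hidden_layers[i] = max(1, child.hidden_layers[i] + random.choice([-2, -1, 1, 2]))
            elif roll < 0.5:
                child.transfer_fn = random.choice(TRANSFER_FNS)
            elif roll < 0.75:
                child.learning_rule = random.choice(LEARNING_RULES)
            else:
                child.weight_scale = min(2.0, max(0.01, child.weight_scale * random.uniform(0.5, 2.0)))
            return child

        parent = Genome()
        print(parent, "->", mutate(parent))

    Fitness would then come from briefly training each decoded network with its encoded learning rule and scoring validation error plus a size penalty, which is what would reward the compact networks the abstract reports.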

    Development of an Algorithm for Multicriteria Optimization of Deep Learning Neural Networks

    Nowadays, machine learning methods are actively used to process big data. A promising direction is neural networks whose structure is optimized on the principles of self-configuration; genetic algorithms are applied to solve this nontrivial problem. Most multicriteria evolutionary algorithms rank solutions with a procedure known as non-dominated sorting. However, the efficiency of procedures for adding points and updating rank values in non-dominated sorting (incremental non-dominated sorting) remains low. This research therefore improves the performance of these algorithms, including under the condition of asynchronous evaluation of individuals' fitness. The relevance of the research is determined by the fact that, although many scholars and specialists have studied the self-tuning of neural networks, a comprehensive solution to this problem has not yet been proposed. In particular, algorithms for efficient non-dominated sorting under incremental and asynchronous updates in evolutionary multicriteria optimization have not been fully developed to date. To this end, a hybrid co-evolutionary algorithm was developed that significantly outperforms each of its constituent algorithms, including error backpropagation and genetic algorithms operating separately. The novelty of the results lies in the minimal asymptotic complexity of the developed algorithms; their practical value is that they make it possible to solve applied problems of increased complexity in a practically acceptable time. DOI: 10.28991/HIJ-2023-04-01-011
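
    For reference, the ranking procedure the abstract builds on is non-dominated sorting; below is a sketch of the classic batch version (the O(MN²) fast non-dominated sort of NSGA-II). The paper's contribution, an incremental variant that updates ranks cheaply as asynchronously evaluated individuals arrive, is not reproduced here.

        def dominates(a, b):
            # a dominates b if it is no worse in every objective and strictly better in one (minimization).
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def non_dominated_sort(points):
            n = len(points)
            dominated_by = [[] for _ in range(n)]   # indices that i dominates
            dom_count = [0] * n                     # how many solutions dominate i
            fronts = [[]]
            for i in range(n):
                for j in range(n):
                    if i == j:
                        continue
                    if dominates(points[i], points[j]):
                        dominated_by[i].append(j)
                    elif dominates(points[j], points[i]):
                        dom_count[i] += 1
                if dom_count[i] == 0:
                    fronts[0].append(i)
            k = 0
            while fronts[k]:
                nxt = []
                for i in fronts[k]:
                    for j in dominated_by[i]:
                        dom_count[j] -= 1
                        if dom_count[j] == 0:
                            nxt.append(j)
                fronts.append(nxt)
                k += 1
            return fronts[:-1]  # drop the trailing empty front

        # Minimizing both objectives: front 0 is the Pareto-best rank.
        print(non_dominated_sort([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]))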

    EDEN: Evolutionary Deep Networks for Efficient Machine Learning

    Deep neural networks continue to show improved performance with increasing depth, an encouraging trend that implies an explosion in the possible permutations of network architectures and hyperparameters, for which there is little intuitive guidance. To address this increasing complexity, we propose Evolutionary DEep Networks (EDEN), a computationally efficient neuro-evolutionary algorithm that interfaces to any deep neural network platform, such as TensorFlow. We show that EDEN evolves simple yet successful architectures built from embedding, 1D and 2D convolutional, max-pooling and fully connected layers along with their hyperparameters. Evaluation of EDEN across seven image and sentiment classification datasets shows that it reliably finds good networks -- and in three cases achieves state-of-the-art results -- even on a single GPU, in just 6-24 hours. Our study provides a first attempt at applying neuro-evolution to the creation of 1D convolutional networks for sentiment analysis, including the optimisation of the embedding layer. Comment: 7 pages, 3 figures, 3 tables; see video https://vimeo.com/23451009
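
    A hedged sketch of the kind of chromosome such a neuro-evolution might use, built from the layer types the abstract names; the gene fields, value ranges and function names below are illustrative assumptions, not EDEN's actual encoding.

        import random

        def random_layer_gene():
            # One gene per layer, drawn from the layer types named in the abstract.
            kind = random.choice(["conv1d", "conv2d", "maxpool", "dense"])
            if kind in ("conv1d", "conv2d"):
                return {"type": kind,
                        "filters": random.choice([16, 32, 64]),
                        "kernel": random.choice([3, 5, 7])}
            if kind == "maxpool":
                return {"type": kind, "pool": random.choice([2, 3])}
            return {"type": kind, "units": random.choice([32, 64, 128])}

        def random_genome(max_layers=6):
            # Whole-network chromosome: embedding size, learning rate, layer genes.
            return {"embedding_dim": random.choice([50, 100, 200]),
                    "learning_rate": 10 ** random.uniform(-4, -2),
                    "layers": [random_layer_gene()
                               for _ in range(random.randint(1, max_layers))]}

        # Each genome would be decoded into a model on the chosen platform
        # (e.g. TensorFlow), trained briefly, and assigned fitness from
        # validation accuracy; mutation perturbs one gene, selection keeps
        # the fittest individuals.
        print(random_genome())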

    Metaheuristic design of feedforward neural networks: a review of two decades of research

    Over the past two decades, feedforward neural network (FNN) optimization has been a key interest among researchers and practitioners of multiple disciplines. FNN optimization is viewed from various perspectives: the optimization of weights, network architecture, activation nodes, learning parameters, learning environment, etc. Researchers have adopted these different viewpoints mainly to improve the FNN's generalization ability. Gradient-descent algorithms such as backpropagation have been widely applied to optimize FNNs, and their success is evident from the FNN's application to numerous real-world problems. However, due to the limitations of gradient-based optimization methods, metaheuristic algorithms, including evolutionary algorithms and swarm intelligence, are still being widely explored by researchers aiming to obtain generalized FNNs for a given problem. This article summarizes a broad spectrum of FNN optimization methodologies, both conventional and metaheuristic. It also connects the research directions that have emerged from FNN optimization practice, such as evolving neural networks (NNs), cooperative coevolution NNs, complex-valued NNs, deep learning, extreme learning machines, quantum NNs, etc. Additionally, it poses interesting research challenges for future work to cope with the present information-processing era.
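
    As one concrete instance of the metaheuristic weight optimization the review surveys, the sketch below uses particle swarm optimization to tune the weights of a tiny one-hidden-layer FNN on XOR, with no gradients involved; the swarm size and coefficients are conventional defaults, not values from the review.

        import numpy as np

        rng = np.random.default_rng(1)
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([0, 1, 1, 0], dtype=float)

        H = 4                      # hidden units
        DIM = 2 * H + H + H + 1    # W1 (2xH) + b1 (H) + W2 (H) + b2 (scalar)

        def forward(w, inputs):
            # Decode the flat particle into network weights and run the FNN.
            W1 = w[:2 * H].reshape(2, H)
            b1 = w[2 * H:3 * H]
            W2 = w[3 * H:4 * H]
            b2 = w[-1]
            h = np.tanh(inputs @ W1 + b1)
            return 1 / (1 + np.exp(-(h @ W2 + b2)))

        def mse(w):
            return np.mean((forward(w, X) - y) ** 2)

        def pso(f, dim, particles=30, iters=300, inertia=0.7, c1=1.5, c2=1.5):
            pos = rng.uniform(-1, 1, (particles, dim))
            vel = np.zeros_like(pos)
            pbest = pos.copy()
            pbest_val = np.array([f(p) for p in pos])
            gbest = pbest[pbest_val.argmin()].copy()
            for _ in range(iters):
                r1, r2 = rng.random((2, particles, dim))
                vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
                pos = pos + vel
                vals = np.array([f(p) for p in pos])
                improved = vals < pbest_val
                pbest[improved] = pos[improved]
                pbest_val[improved] = vals[improved]
                gbest = pbest[pbest_val.argmin()].copy()
            return gbest, pbest_val.min()

        weights, err = pso(mse, DIM)
        print(f"final MSE on XOR: {err:.4f}")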