    Metaheuristic design of feedforward neural networks: a review of two decades of research

    Over the past two decades, optimization of feedforward neural networks (FNNs) has been a key interest among researchers and practitioners across multiple disciplines. FNN optimization is viewed from various perspectives: optimization of weights, network architecture, activation nodes, learning parameters, the learning environment, etc. Researchers have adopted these different viewpoints mainly to improve the FNN's generalization ability. Gradient-descent algorithms such as backpropagation have been widely applied to optimize FNNs, and their success is evident from the FNN's application to numerous real-world problems. However, due to the limitations of gradient-based optimization methods, metaheuristic algorithms, including evolutionary algorithms, swarm intelligence, etc., are still being widely explored by researchers aiming to obtain a well-generalized FNN for a given problem. This article attempts to summarize a broad spectrum of FNN optimization methodologies, including conventional and metaheuristic approaches. It also tries to connect the various research directions that have emerged from FNN optimization practice, such as evolving neural networks (NNs), cooperative coevolution NNs, complex-valued NNs, deep learning, extreme learning machines, quantum NNs, etc. Additionally, it presents interesting research challenges for future work to cope with the present information-processing era.
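The contrast drawn above between gradient-based and metaheuristic weight optimization can be sketched with a toy (1+1) evolution strategy that tunes the weights of a tiny one-hidden-layer FNN on XOR. This is a minimal illustration, not a method from the review; the network shape, step count, and mutation scale are arbitrary assumptions:

```python
import numpy as np

def fnn_forward(w, X):
    """One-hidden-layer FNN; w packs both weight matrices."""
    W1 = w[:2 * 4].reshape(2, 4)      # 2 inputs -> 4 hidden
    W2 = w[2 * 4:].reshape(4, 1)      # 4 hidden -> 1 output
    return np.tanh(X @ W1) @ W2

def mse(w, X, y):
    return float(np.mean((fnn_forward(w, X) - y) ** 2))

def es_train(X, y, dim=12, steps=2000, sigma=0.3, seed=0):
    """(1+1) evolution strategy: keep a mutated weight vector only if it
    lowers the loss, so the loss never increases (no gradients needed)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0, 0.5, dim)
    best = mse(w, X, y)
    for _ in range(steps):
        cand = w + rng.normal(0, sigma, dim)
        loss = mse(cand, X, y)
        if loss < best:               # elitist acceptance
            w, best = cand, loss
    return w, best

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)   # XOR targets
w, final = es_train(X, y)
```

Because candidates are accepted only when they improve the loss, the final loss is guaranteed not to exceed the initial one; that monotonicity (rather than convergence speed) is what makes such metaheuristics attractive when gradients are unavailable or misleading.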

    Multilayer Perceptron: Architecture Optimization and Training

    The multilayer perceptron has a wide range of classification and regression applications in many fields: pattern recognition, voice processing and classification problems. However, the choice of architecture has a great impact on the convergence of these networks. In the present paper we introduce a new approach to optimizing the network architecture; to solve the resulting model we use a genetic algorithm, and we train the network with a backpropagation algorithm. The numerical results assess the effectiveness of the theoretical results shown in this paper and the advantages of the new modeling compared to the previous models in the literature.
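Architecture optimization by a genetic algorithm, as described above, can be illustrated with a toy GA searching over the hidden-node count. The fitness function here is an invented stand-in (an assumption); in a real run each candidate's fitness would come from training that network with backpropagation and measuring validation error:

```python
import random

def fitness(h):
    """Toy stand-in for validation accuracy: peaks near 8 hidden nodes,
    with a mild penalty for larger networks (an assumption, not the
    paper's actual objective)."""
    return -((h - 8) ** 2) - 0.1 * h

def ga_hidden_size(pop_size=20, gens=30, h_max=32, seed=1):
    rng = random.Random(seed)
    pop = [rng.randint(1, h_max) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) // 2              # "crossover": averaging
            if rng.random() < 0.2:            # mutation: +/- 1 node
                child = max(1, child + rng.choice([-1, 1]))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best_h = ga_hidden_size()
```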

    Invariant set of weight of perceptron trained by perceptron training algorithm

    In this paper, an invariant set of the weights of a perceptron trained by the perceptron training algorithm is defined and characterized. The dynamic range of the steady-state values of the perceptron's weights can be evaluated by finding the dynamic range of the weights inside the largest invariant set. In addition, the necessary and sufficient condition for the forward dynamics of the weights to be injective, as well as the condition for the invariant set of the weights to be attractive, are derived.
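The perceptron training algorithm the abstract refers to has a standard form: on each misclassified sample the weight vector is nudged toward (or away from) that sample, and once no sample is misclassified the weights stop changing, i.e. they have reached a fixed point of the dynamics. A minimal sketch (the AND-style toy data is an assumption for illustration):

```python
import numpy as np

def perceptron_train(X, y, epochs=20):
    """Classic perceptron rule with labels y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified (or on boundary)
                w += yi * xi             # move boundary toward correctness
                b += yi
                errors += 1
        if errors == 0:                  # fixed point: weights now invariant
            break
    return w, b

# Linearly separable toy data (logical AND with +/-1 labels)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
preds = np.sign(X @ w + b)
```

On this data the algorithm converges and classifies all four points correctly; the set of weight vectors that classify every training sample correctly is exactly the invariant set of this update map.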

    Evolutionary cellular configurations for designing feed-forward neural networks architectures

    Proceedings of: 6th International Work-Conference on Artificial and Natural Neural Networks, IWANN 2001, Granada, Spain, June 13–15, 2001.
    In recent years, interest in automatic methods for determining appropriate architectures of feed-forward neural networks has increased. Most of these methods are based on evolutionary computation paradigms. Some of the proposed methods rely on direct representations of the parameters of the network. Such representations do not scale: to represent large architectures, very large structures are required. A more interesting alternative is indirect schemes, which encode a compact representation of the neural network. In this work, an indirect constructive encoding scheme is presented. The scheme is based on cellular automata representations in order to increase the scalability of the method.
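The appeal of indirect encodings is that a genotype of fixed size can grow a connectivity pattern of any size. A toy sketch of the idea (not the paper's actual cellular encoding; the seed-plus-rule genotype and the elementary CA are assumptions for illustration):

```python
import numpy as np

def ca_step(row, rule=30):
    """One step of an elementary cellular automaton (wrap-around edges)."""
    left, right = np.roll(row, 1), np.roll(row, -1)
    idx = 4 * left + 2 * row + right          # 3-bit neighbourhood -> 0..7
    table = np.array([(rule >> i) & 1 for i in range(8)])
    return table[idx]

def grow_mask(n_in, n_hidden, seed_pos=0, rule=30):
    """Indirect encoding sketch: a tiny genotype (seed position + CA rule)
    grows an n_in x n_hidden connectivity mask, so genotype size is
    independent of network size -- the scalability argument above."""
    row = np.zeros(n_hidden, dtype=int)
    row[seed_pos % n_hidden] = 1
    mask = []
    for _ in range(n_in):
        row = ca_step(row, rule)
        mask.append(row.copy())
    return np.array(mask)                      # 1 = connection present

mask = grow_mask(n_in=4, n_hidden=8)
```

An evolutionary algorithm would then search over the small genotype (here just `seed_pos` and `rule`) rather than over the full weight or connectivity matrix.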

    Bibliometric Mapping of the Computational Intelligence Field

    In this paper, a bibliometric study of the computational intelligence field is presented. Bibliometric maps showing the associations between the main concepts in the field are provided for the periods 1996–2000 and 2001–2005. Both the current structure of the field and the evolution of the field over the last decade are analyzed. In addition, a number of emerging areas in the field are identified. It turns out that computational intelligence can best be seen as a field that is structured around four important types of problems, namely control problems, classification problems, regression problems, and optimization problems. Within the computational intelligence field, the neural networks and fuzzy systems subfields are fairly intertwined, whereas the evolutionary computation subfield has a relatively independent position.
    Keywords: neural networks; bibliometric mapping; fuzzy systems; bibliometrics; computational intelligence; evolutionary computation
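Bibliometric maps of the kind described above are typically built from concept co-occurrence counts normalized into a similarity score. A toy sketch using the association-strength measure (the paper set and keywords are invented, and the exact normalization used in the study is an assumption):

```python
from itertools import combinations
from collections import Counter

# Each "paper" is a set of concept keywords (invented toy data).
papers = [
    {"neural networks", "classification"},
    {"neural networks", "fuzzy systems"},
    {"fuzzy systems", "control"},
    {"evolutionary computation", "optimization"},
    {"neural networks", "classification"},
]

occ = Counter(k for p in papers for k in p)            # concept counts
co = Counter(frozenset(pair) for p in papers
             for pair in combinations(sorted(p), 2))   # co-occurrence counts

def association_strength(a, b):
    """Co-occurrences of a and b normalized by the product of their
    individual occurrence counts; high values mean the concepts
    appear together more often than their popularity alone predicts."""
    return co[frozenset((a, b))] / (occ[a] * occ[b])

s_nn_cls = association_strength("neural networks", "classification")
s_nn_evo = association_strength("neural networks", "evolutionary computation")
```

In this toy corpus, "neural networks" is strongly associated with "classification" but not with "evolutionary computation", mirroring the intertwined-versus-independent structure the abstract reports.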

    Neural networks in geophysical applications

    Neural networks are increasingly popular in geophysics. Because they are universal approximators, these tools can approximate any continuous function with arbitrary precision. Hence, they may yield important contributions to solving a variety of geophysical problems. However, knowledge of the many methods and techniques recently developed to increase performance and to facilitate the use of neural networks does not seem to be widespread in the geophysical community. Therefore, the power of these tools has not yet been explored to its full extent. In this paper, techniques are described for faster training, better overall performance (i.e., generalization), and automatic estimation of network size and architecture.
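One widely used generalization technique of the kind this abstract alludes to is early stopping: halt training when validation loss stops improving. A minimal sketch (the patience value and the loss curve are assumptions for illustration):

```python
def early_stopping(val_losses, patience=3):
    """Return the epoch of the best validation loss, scanning until the
    loss has failed to improve for `patience` consecutive epochs."""
    best, best_epoch, wait = float("inf"), 0, 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch, wait = loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:   # overfitting: stop and keep best epoch
                break
    return best_epoch

# Validation loss falls, then rises as the network starts to overfit.
losses = [1.0, 0.7, 0.5, 0.45, 0.44, 0.46, 0.49, 0.55, 0.6]
stop = early_stopping(losses)
```

Here training would be rolled back to epoch 4, where validation loss bottomed out at 0.44.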

    Improved Genetic Algorithm Multilayer Perceptron Network For Data Classification

    In general, the conventional genetic algorithm (GA) has several drawbacks, such as premature convergence, a high tendency to get trapped in local optima, and an inability to fine-tune around promising regions. Thus, an improved GA with new search, reproduction and elitism strategies is proposed in this study. The first improvement involves changes to the operational structure of the GA so that it concentrates the search on highly promising areas of the search region. Secondly, a novel reproduction technique called Segmented Multi-Chromosome Crossover (SMCC) is introduced. The proposed technique avoids the destruction of near-optimal information contained in gene segments and allows offspring to inherit important information from multiple parents. Thirdly, three new variations of the elitism scheme, named Best Among Normal and Improved Population (BANI), Best Between Similar Rank (BBSR) and Equally Contributed (EQ), are developed. They involve competition among the best individuals from the normal and improved populations to determine survival into the next generation. The improved GA is then applied to the optimization and automatic design of a multilayer perceptron (MLP) neural network for solving pattern classification problems.
    The hidden node count, initial weights and feature selection of the MLP, which play a significant role in classification performance, are selected to be automatically optimized by the improved GA. The performance of the improved GA was evaluated using highly complicated, multimodal benchmark test functions and compared with the standard GA. Based on how often each algorithm obtained the best result across the different test functions, the proposed method outperforms the standard GA: BANI, BBSR and EQ scored 30, 18 and 17 best results respectively, compared to only 3 for the standard GA. Meanwhile, the classification performance of the improved GA-MLP was evaluated using datasets that vary in input feature count and number of output classes. The results demonstrate the effectiveness of the new algorithms in terms of test accuracy, with overall improvements of 0.6%, 0.1% and 0.3% for BANI, BBSR and EQ respectively compared to the standard GA-MLP.
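The elitism idea described above, letting the best individuals from a normal and an improved population compete for survival, can be sketched generically. This is a toy illustration of elitist merging, not the thesis's exact BANI/BBSR/EQ schemes; the fitness function and populations are invented:

```python
import random

def elitist_merge(normal_pop, improved_pop, fitness, size):
    """Generic elitism sketch: individuals from both populations
    compete, and only the fittest `size` survive to the next generation."""
    merged = sorted(normal_pop + improved_pop, key=fitness, reverse=True)
    return merged[:size]

fitness = lambda x: -abs(x - 5)              # toy fitness peaking at 5
rng = random.Random(0)
normal = [rng.uniform(0, 10) for _ in range(6)]
improved = [x + rng.uniform(-0.5, 0.5) for x in normal]
survivors = elitist_merge(normal, improved, fitness, size=6)
```

Because survivors are the top individuals of the combined pool, the surviving generation is never worse than either parent population, which is the rationale for elitism in any GA variant.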