
    Invariant set of weight of perceptron trained by perceptron training algorithm

    In this paper, an invariant set of the weight of a perceptron trained by the perceptron training algorithm is defined and characterized. The dynamic range of the steady-state values of the weight can be evaluated by finding the dynamic range of the weight inside the largest invariant set. The necessary and sufficient condition for the forward dynamics of the weight to be injective, as well as the condition for the invariant set of the weight to be attractive, is also derived.
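
    For context, the training dynamics in question are those of the classical perceptron rule. The sketch below shows that rule under illustrative assumptions (fixed learning rate, labels in {-1, +1}, toy data); it is not the paper's analysis, only the map whose iterates it studies.

    ```python
    import numpy as np

    def perceptron_train(X, y, eta=1.0, max_epochs=100):
        """Iterate the perceptron update w <- w + eta * y_i * x_i on
        misclassified samples. The resulting sequence of weight vectors
        is the forward dynamics whose invariant set the paper studies."""
        w = np.zeros(X.shape[1])
        for _ in range(max_epochs):
            updated = False
            for xi, yi in zip(X, y):
                if yi * np.dot(w, xi) <= 0:  # sample is misclassified
                    w = w + eta * yi * xi    # standard perceptron update
                    updated = True
            if not updated:  # no update made: w has reached a fixed point
                break
        return w

    # Toy linearly separable data (last column is a bias term).
    X = np.array([[1.0, 2.0, 1.0], [2.0, 3.0, 1.0],
                  [-1.0, -1.5, 1.0], [-2.0, -1.0, 1.0]])
    y = np.array([1, 1, -1, -1])
    print(perceptron_train(X, y))
    ```

    Any weight vector that separates the data is a fixed point of this map, which is why the steady-state weights can be studied through invariant sets.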

    Parameter optimization of evolving spiking neural network with dynamic population particle swarm optimization

    Evolving Spiking Neural Network (ESNN) is widely used in classification problems. However, like other neural networks, ESNN cannot find optimum values for its own parameters, which are crucial for classification accuracy. Thus, in this study, ESNN is integrated with an improved Particle Swarm Optimization (PSO), known as Dynamic Population Particle Swarm Optimization (DPPSO), to optimize the ESNN parameters: the modulation factor (Mod), similarity factor (Sim) and threshold factor (C). To find the optimum ESNN parameter values, DPPSO uses a dynamic population that removes the particle with the lowest fitness value at every pre-defined iteration. The integration of ESNN with DPPSO facilitates the search for optimal ESNN parameters during the training stage. Performance is measured by classification accuracy and compared with the existing method. Five datasets obtained from the University of California Irvine (UCI) Machine Learning Repository are used in this study. The experimental results show better accuracy than the existing technique, thus improving how the ESNN method optimizes its parameter values.
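
    For illustration, here is a minimal sketch of the dynamic-population mechanism described above: a standard PSO loop over a three-dimensional parameter vector (Mod, Sim, C) that removes the worst particle at a fixed iteration interval. The fitness function, bounds and PSO coefficients are placeholder assumptions; in the study, the fitness would come from training and evaluating the ESNN.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fitness(params):
        # Hypothetical stand-in: in the study this would be the ESNN's
        # classification error for parameters params = (Mod, Sim, C).
        return np.sum((params - 0.5) ** 2)

    def dppso(n_particles=20, dims=3, iters=50, prune_every=10,
              w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=1.0):
        pos = rng.uniform(lo, hi, (n_particles, dims))
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_f = np.array([fitness(p) for p in pos])
        gbest = pbest[np.argmin(pbest_f)]
        for t in range(1, iters + 1):
            r1 = rng.random(pos.shape)
            r2 = rng.random(pos.shape)
            # Standard PSO velocity and position updates.
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            f = np.array([fitness(p) for p in pos])
            better = f < pbest_f
            pbest[better] = pos[better]
            pbest_f[better] = f[better]
            gbest = pbest[np.argmin(pbest_f)]
            # Dynamic population: drop the worst particle every
            # prune_every iterations, shrinking the swarm over time.
            if t % prune_every == 0 and len(pos) > 2:
                keep = np.arange(len(pos)) != np.argmax(pbest_f)
                pos, vel = pos[keep], vel[keep]
                pbest, pbest_f = pbest[keep], pbest_f[keep]
        return gbest

    mod, sim, c = dppso()
    print(mod, sim, c)  # approximate optimum of the placeholder fitness
    ```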

    An adaptive high-order neural tree for pattern recognition

    A new neural tree model, called the adaptive high-order neural tree (AHNT), is proposed for classifying large sets of multidimensional patterns. The AHNT is built by recursively dividing the training set into subsets and assigning each subset to a different child node. Each node is composed of a high-order perceptron (HOP) whose order is automatically tuned according to the complexity of the pattern set reaching that node. First-order nodes divide the input space with hyperplanes, while higher-order HOPs divide the input space arbitrarily, at the expense of increased complexity. Experimental results demonstrate that the AHNT generalizes better than trees with homogeneous nodes, produces small trees, and avoids complex comparative statistical tests and/or a priori selection of large parameter sets.
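
    The key mechanism is the per-node order tuning: a node first tries a low-order perceptron and raises the order only when the patterns reaching it are too complex. Below is a minimal sketch of that idea, assuming a polynomial feature expansion and "no training error" as the complexity criterion; the paper's exact tuning rule and the recursive tree-splitting step are not reproduced here.

    ```python
    import itertools
    import numpy as np

    def poly_features(X, order):
        """All monomials of the input features up to the given order,
        plus a constant bias term."""
        cols = [np.ones(len(X))]
        for d in range(1, order + 1):
            for idx in itertools.combinations_with_replacement(range(X.shape[1]), d):
                cols.append(X[:, list(idx)].prod(axis=1))
        return np.column_stack(cols)

    def train_hop(X, y, max_order=3, epochs=200, eta=0.1):
        """Fit perceptrons of increasing order and keep the lowest order
        that classifies this node's patterns without error."""
        for order in range(1, max_order + 1):
            Phi = poly_features(X, order)
            w = np.zeros(Phi.shape[1])
            for _ in range(epochs):
                errors = 0
                for phi, yi in zip(Phi, y):
                    if yi * np.dot(w, phi) <= 0:  # misclassified pattern
                        w += eta * yi * phi
                        errors += 1
                if errors == 0:
                    return order, w  # this order is complex enough
        return max_order, w  # fall back to the highest order tried

    # XOR-like patterns are not linearly separable, so a first-order node
    # fails and the sketch settles on a second-order (quadratic) perceptron.
    X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    y = np.array([-1, 1, 1, -1])
    print(train_hop(X, y)[0])  # prints 2
    ```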