5 research outputs found

    The Enhancement of Evolving Spiking Neural Network with Firefly Algorithm

    Get PDF
    This study presents the integration of the Evolving Spiking Neural Network (ESNN) with the Firefly Algorithm (FA) for optimizing the parameters of the ESNN model. Since ESNN lacks the ability to automatically select its optimum parameters, the Firefly Algorithm (FA), a nature-inspired metaheuristic, is used as a new parameter optimizer for ESNN. The proposed method, ESNN-FA, determines the optimum values of the ESNN parameters, namely the modulation factor (Mod), the similarity factor (Sim) and the threshold factor (C). Five standard datasets from the UCI Machine Learning Repository are used to measure the effectiveness of the proposed work. The classification results show an increase in accuracy over standard ESNN for all datasets except the iris dataset, where ESNN-FA achieves 84% against 89.33% for standard ESNN. The remaining datasets achieve higher classification accuracy than standard ESNN: breast cancer 92.12% against 66.18%, diabetes 68.25% against 38.46%, heart 78.15% against 66.3%, and wine 78.66% against 44.45%.
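
    The abstract above describes FA searching for three scalar ESNN parameters. As a minimal, hedged sketch of how such a search works (the fitness function below is a toy stand-in; in the paper it would be ESNN classification accuracy, and all coefficient values here are common FA defaults, not taken from the study):

```python
import math
import random

# Minimal Firefly Algorithm sketch for tuning three ESNN parameters
# (Mod, Sim, C), each assumed here to lie in [0, 1]. Brighter fireflies
# (higher fitness) attract dimmer ones; attractiveness decays with
# squared distance.
def firefly_optimize(fitness, n_fireflies=10, n_iters=50,
                     beta0=1.0, gamma=1.0, alpha=0.1, seed=0):
    rng = random.Random(seed)
    dim = 3  # Mod, Sim, C
    pop = [[rng.random() for _ in range(dim)] for _ in range(n_fireflies)]
    light = [fitness(p) for p in pop]
    for _ in range(n_iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] > light[i]:  # j is brighter: move i toward j
                    r2 = sum((pop[i][d] - pop[j][d]) ** 2 for d in range(dim))
                    beta = beta0 * math.exp(-gamma * r2)
                    for d in range(dim):
                        step = (beta * (pop[j][d] - pop[i][d])
                                + alpha * (rng.random() - 0.5))
                        pop[i][d] = min(1.0, max(0.0, pop[i][d] + step))
                    light[i] = fitness(pop[i])
    best = max(range(n_fireflies), key=lambda k: light[k])
    return pop[best], light[best]

# Toy fitness peaking at Mod=0.9, Sim=0.6, C=0.7 (illustrative only,
# not values from the paper).
def toy_fitness(p):
    target = [0.9, 0.6, 0.7]
    return -sum((p[d] - target[d]) ** 2 for d in range(3))

params, score = firefly_optimize(toy_fitness)
```

    Replacing `toy_fitness` with a function that trains an ESNN using the candidate (Mod, Sim, C) and returns its accuracy yields the ESNN-FA scheme the abstract outlines.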

    Parameter optimization of evolving spiking neural networks using improved firefly algorithm for classification tasks

    Get PDF
    Evolving Spiking Neural Network (ESNN) is a third-generation artificial neural network that has been widely used in numerous studies in recent years. However, there are issues with ESNN that need to be improved; one of them is that its parameters, namely the modulation factor (Mod), the similarity factor (Sim) and the threshold factor (C), have to be manually tuned to find optimal values for any particular problem. The objective of the proposed work is to automatically determine the optimum values of the ESNN parameters for various datasets by integrating the Firefly Algorithm (FA) optimizer into the ESNN training phase and adaptively searching for the best parameter values. In this study, FA has been modified and improved, and applied to improve the ESNN structure and its classification accuracy. Five benchmark datasets from the University of California, Irvine (UCI) Machine Learning Repository have been used to measure the effectiveness of the integrated model. Performance was analysed by calculating classification accuracy and comparing it with other parameter optimisation methods. The experimental results show that the proposed algorithm attains optimal parameter values for ESNN.
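
    To see why these three parameters matter, here is a hedged sketch of where each one acts in the ESNN one-pass learning rule, based on the commonly published Thorpe-model ESNN; the authors' exact variant may differ. Weights are set from input firing order via Mod, the firing threshold is the fraction C of the maximal postsynaptic potential, and a new output neuron merges with an existing one when their weight vectors lie within Sim:

```python
import math

# Hedged sketch of ESNN one-pass training (standard Thorpe-model form,
# assumed rather than taken verbatim from the paper).
# samples: list of firing-order lists, one integer rank per input neuron.
def train_esnn(samples, mod, sim, c):
    repository = []  # evolved output neurons as (weights, threshold)
    for orders in samples:
        w = [mod ** o for o in orders]                 # Mod: order-based weights
        psp_max = sum(wi * mod ** o for wi, o in zip(w, orders))
        theta = c * psp_max                            # C: firing threshold
        merged = False
        for k, (wk, tk) in enumerate(repository):
            if math.dist(w, wk) < sim:                 # Sim: merge similar neurons
                repository[k] = ([(a + b) / 2 for a, b in zip(w, wk)],
                                 (theta + tk) / 2)
                merged = True
                break
        if not merged:
            repository.append((w, theta))
    return repository
```

    Because all three quantities feed directly into the evolved network structure, small changes in Mod, Sim or C can change both the number of output neurons and the decision thresholds, which is why automatic tuning pays off.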

    Parameter optimization of evolving spiking neural network with dynamic population particle swarm optimization

    Get PDF
    Evolving Spiking Neural Network (ESNN) is widely used in classification problems. However, ESNN, like any other neural network, is incapable of finding the optimum values of its own parameters, which are crucial for classification accuracy. Thus, in this study, ESNN is integrated with an improved Particle Swarm Optimization (PSO) known as Dynamic Population Particle Swarm Optimization (DPPSO) to optimize the ESNN parameters: the modulation factor (Mod), the similarity factor (Sim) and the threshold factor (C). To find the optimum ESNN parameter values, DPPSO uses a dynamic population that removes the lowest-valued particle at every pre-defined iteration. The ESNN-DPPSO integration facilitates the search for ESNN parameters during the training stage. Performance is measured by classification accuracy and compared with the existing method. Five datasets from the University of California, Irvine (UCI) Machine Learning Repository are used in this study. The experimental results show better accuracy than the existing technique, thus improving how ESNN optimises its parameter values.
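
    The dynamic-population idea the abstract describes can be sketched as standard PSO with one addition: every fixed number of iterations, the particle with the worst personal best is dropped. This is a minimal illustration only; the inertia and acceleration coefficients are common defaults, not the paper's values:

```python
import random

# Hedged sketch of Dynamic Population PSO over (Mod, Sim, C) in [0, 1]:
# standard PSO, except the worst particle is culled every `cull_every`
# iterations, shrinking the population as the search converges.
def dppso(fitness, n_particles=12, n_iters=40, cull_every=10,
          w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    dim = 3
    pos = [[rng.random() for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda k: pbest_f[k])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for t in range(1, n_iters + 1):
        for i in range(len(pos)):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            f = fitness(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
        if t % cull_every == 0 and len(pos) > 2:  # drop the worst particle
            worst = min(range(len(pos)), key=lambda k: pbest_f[k])
            for lst in (pos, vel, pbest, pbest_f):
                lst.pop(worst)
    return gbest, gbest_f
```

    As in the FA case, substituting a fitness function that trains an ESNN with the candidate parameters and returns classification accuracy gives the ESNN-DPPSO integration the abstract describes.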

    Integrated Feature Selection and Parameter Optimization for Evolving Spiking Neural Networks Using Quantum Inspired Particle Swarm Optimization

    No full text