
    Incorporating Structural Plasticity Approaches in Spiking Neural Networks for EEG Modelling

    Structural Plasticity (SP) in the brain is a process that allows neuronal structure to change in response to learning. Spiking Neural Networks (SNN) are an emerging form of artificial neural networks that use brain-inspired techniques to learn. However, the application of SP in SNNs, and its impact on overall learning and network behaviour, is rarely explored. In the present study, we apply SP in an SNN with a single hidden layer to classify Electroencephalography (EEG) signals from two publicly available datasets. We used classification accuracy as the measure of learning capability and applied metaheuristics to derive the optimised number of hidden-layer neurons along with other network hyperparameters. The optimised structure was then compared with overgrown and undergrown structures in terms of accuracy, stability, and network behaviour. Networks with SP yielded ~94% and ~92% accuracies in classifying wrist positions and mental states (stressed vs relaxed), respectively. The same SNN developed for mental state classification produced ~77% and ~73% accuracies in classifying arousal and valence. Moreover, the networks with SP demonstrated superior performance stability across repeated random initialisations. Interestingly, these networks had fewer inactive neurons and a preference for lower neuron firing thresholds. This research highlights the importance of systematically selecting the number of hidden-layer neurons over arbitrary settings, particularly for SNNs using Spike-Timing-Dependent Plasticity (STDP) learning, and provides findings that may lead to the development of SP learning algorithms for SNNs.
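
    The abstract does not specify which metaheuristic or SNN library was used. The sketch below is only an illustration of the general idea it describes: searching over the hidden-layer size and firing threshold of a spiking layer and scoring each candidate by classification accuracy. The leaky integrate-and-fire layer, the random-search loop, the nearest-class-mean readout, and all function names (lif_layer_spike_counts, evaluate) are assumptions introduced here, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_layer_spike_counts(spike_trains, weights, threshold, leak=0.9):
    """Run a toy leaky integrate-and-fire hidden layer; return per-neuron spike counts.

    spike_trains: (time_steps, n_inputs) binary array of input spikes.
    weights:      (n_inputs, n_hidden) synaptic weights.
    """
    n_hidden = weights.shape[1]
    v = np.zeros(n_hidden)                 # membrane potentials
    counts = np.zeros(n_hidden)
    for t in range(spike_trains.shape[0]):
        v = leak * v + spike_trains[t] @ weights
        fired = v >= threshold
        counts += fired
        v[fired] = 0.0                     # reset membrane potential after a spike
    return counts

def evaluate(n_hidden, threshold, X_spikes, y):
    """Hypothetical fitness: accuracy of a nearest-class-mean readout on spike counts."""
    n_inputs = X_spikes.shape[2]
    W = rng.normal(0.0, 0.5, size=(n_inputs, n_hidden))
    feats = np.array([lif_layer_spike_counts(x, W, threshold) for x in X_spikes])
    means = np.array([feats[y == c].mean(axis=0) for c in np.unique(y)])
    preds = np.argmin(((feats[:, None, :] - means[None]) ** 2).sum(-1), axis=1)
    return (preds == y).mean()

# Toy spike-encoded "EEG" data: 40 trials, 50 time steps, 8 channels, 2 classes.
X_spikes = (rng.random((40, 50, 8)) < 0.2).astype(float)
y = rng.integers(0, 2, size=40)

best = None
for _ in range(30):                        # random search standing in for the metaheuristic
    n_hidden = int(rng.integers(5, 60))
    threshold = float(rng.uniform(0.5, 3.0))
    acc = evaluate(n_hidden, threshold, X_spikes, y)
    if best is None or acc > best[0]:
        best = (acc, n_hidden, threshold)

print(f"best accuracy={best[0]:.2f} with n_hidden={best[1]}, threshold={best[2]:.2f}")
```

    In the study itself, the fitness would come from the full STDP-trained SNN on the real EEG datasets, and the optimised configuration would then be contrasted with overgrown and undergrown hidden layers as described above.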