
    A Supervised STDP-based Training Algorithm for Living Neural Networks

    Neural networks have shown great potential in many applications such as speech recognition, drug discovery, image classification, and object detection. Neural network models are inspired by biological neural networks, but they are optimized to perform machine learning tasks on digital computers. The proposed work explores the possibility of using living neural networks in vitro as basic computational elements for machine learning applications. A new supervised STDP-based learning algorithm is proposed in this work, which takes neuron engineering constraints into account. An accuracy of 74.7% is achieved on the MNIST benchmark for handwritten digit recognition. Comment: 5 pages, 3 figures, Accepted by ICASSP 201
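The abstract above does not give the details of its supervised learning rule, but the core pair-based STDP mechanism it builds on can be sketched as follows. The parameter values (learning rates, time constant, weight bound) are illustrative assumptions, not values from the paper:

```python
import math

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0, w_max=1.0):
    """Pair-based STDP weight update (illustrative sketch).

    dt = t_post - t_pre in milliseconds. A pre-synaptic spike shortly
    before a post-synaptic spike (dt > 0) potentiates the synapse; the
    reverse order (dt < 0) depresses it. The weight is clipped to
    [0, w_max]. All parameters are hypothetical defaults.
    """
    if dt > 0:
        dw = a_plus * math.exp(-dt / tau)   # potentiation window
    else:
        dw = -a_minus * math.exp(dt / tau)  # depression window
    return min(max(w + dw, 0.0), w_max)
```

A supervised variant would typically gate or bias these updates with a teaching signal so that desired output neurons are potentiated and others depressed; the exact gating used in the paper is not specified in the abstract.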

    Review of medical data analysis based on spiking neural networks

    Medical data mainly include various types of biomedical signals and medical images, which professional doctors can use to judge patients' health conditions. However, interpreting medical data requires considerable human effort and is prone to misjudgment, so many researchers use neural networks and deep learning to classify and study medical data, which can improve doctors' efficiency and accuracy and enable early detection and diagnosis of diseases. The approach therefore has a wide range of application prospects. However, traditional neural networks have disadvantages such as high energy consumption and high latency (slow computation). This paper reviews recent research on signal classification and disease diagnosis based on a third-generation neural network, the spiking neural network, using medical data including EEG, ECG, and EMG signals and MRI images. The advantages and disadvantages of spiking neural networks compared with traditional networks are summarized, and their future development directions are discussed.
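The spiking neural networks surveyed above are commonly built from leaky integrate-and-fire (LIF) units, which communicate through discrete spikes rather than continuous activations. A minimal sketch of a single LIF neuron, with illustrative parameter values not taken from any surveyed paper:

```python
def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0, tau=10.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron (illustrative sketch).

    input_current: sequence of input values, one per time step of dt ms.
    Returns the list of time-step indices at which the neuron spiked.
    """
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leaky integration: membrane potential decays toward rest
        # with time constant tau while accumulating the input.
        v += dt * (-(v - v_rest) / tau + i_in)
        if v >= v_thresh:
            spikes.append(t)
            v = v_rest  # reset after the spike
    return spikes
```

Because information is carried by sparse spike times rather than dense activations, such units underlie the energy and latency advantages the review attributes to spiking networks.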

    Functional Implications of Synaptic Spike Timing Dependent Plasticity and Anti-Hebbian Membrane Potential Dependent Plasticity

    A central hypothesis of neuroscience is that changes in the strength of synaptic connections between neurons are the basis for learning in the animal brain. However, the rules underlying activity-dependent change, as well as their functional consequences, are not well understood. This thesis develops and investigates several quantitative models of synaptic plasticity. In the first part, the Contribution Dynamics model of Spike Timing Dependent Plasticity (STDP) is presented and shown to fit experimental data better than previous models. Additionally, investigation of the model synapse's response to oscillatory neuronal activity shows that synapses are sensitive to theta oscillations (4-10 Hz), which are known to boost learning in behavioral experiments. In the second part, a novel Membrane Potential Dependent Plasticity (MPDP) rule is developed that can train neurons to fire precisely timed output activity. Previously, this could only be achieved with artificial supervised learning rules, whereas MPDP is a local, activity-dependent mechanism that is supported by experimental results.

    Design and development of opto-neural processors for simulation of neural networks trained in image detection for potential implementation in hybrid robotics

    Neural networks have been employed for a wide range of processing applications such as image processing, motor control, and object detection. Living neural networks offer the advantages of lower power consumption, faster processing, and biological realism. Optogenetics provides high spatial and temporal control over biological neurons and shows promise for training live neural networks. This work proposes a simulated living neural network trained indirectly by backpropagating STDP-based algorithms, using precise optogenetic activation, and achieving accuracy comparable to traditional neural network training algorithms.