204,799 research outputs found

    Influence of synaptic depression on memory storage capacity

    Full text link
    Synaptic efficacy between neurons is known to change dynamically on a short time scale. Neurophysiological experiments show that high-frequency presynaptic input decreases synaptic efficacy between neurons; this phenomenon, called synaptic depression, is a form of short-term synaptic plasticity. Many researchers have investigated how synaptic depression affects the memory storage capacity, but noise has not been taken into account in these analyses. By introducing a "temperature", which controls the noise level, into the neuron update rule, we investigate the effects of synaptic depression on the memory storage capacity in the presence of noise. We compute the storage capacity analytically using a statistical-mechanics technique called Self-Consistent Signal-to-Noise Analysis (SCSNA). We find that synaptic depression decreases the storage capacity at finite temperature, in contrast to the low-temperature limit, where the storage capacity does not change.
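    A minimal sketch of the kind of model this abstract describes, assuming a Hopfield-type network with Hebbian couplings, a simple use-dependent depression variable, and stochastic updates whose noise level is set by a "temperature" parameter; the specific depression dynamics and all parameter values are assumptions for illustration, not the paper's model.

```python
import numpy as np

# Sketch: Hopfield-type associative memory with a per-neuron synaptic
# resource x (depression) and finite-temperature (Glauber) updates.
rng = np.random.default_rng(0)
N, P = 500, 25                      # neurons, stored patterns (assumed sizes)
T = 0.1                             # "temperature" controlling the noise level
U_SE, tau_rec, dt = 0.2, 5.0, 1.0   # assumed depression/recovery parameters

patterns = rng.choice([-1, 1], size=(P, N))
J = (patterns.T @ patterns) / N     # Hebbian couplings
np.fill_diagonal(J, 0.0)

s = patterns[0].copy()              # start near the first stored pattern
x = np.ones(N)                      # synaptic resources (1 = fully recovered)

for step in range(200):
    h = J @ (x * s)                               # depressed synaptic input
    p_fire = 0.5 * (1.0 + np.tanh(h / T))         # Glauber update at temperature T
    s = np.where(rng.random(N) < p_fire, 1, -1)
    active = (s == 1).astype(float)
    # resources are consumed by active presynaptic neurons and slowly recover
    x += dt * (1.0 - x) / tau_rec - U_SE * x * active

overlap = patterns[0] @ s / N       # retrieval quality of the stored pattern
print(f"overlap with the stored pattern: {overlap:.2f}")
```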

    Neural networks with dynamical synapses: from mixed-mode oscillations and spindles to chaos

    Full text link
    Understanding short-term synaptic depression (STSD) and other forms of synaptic plasticity is a topical problem in neuroscience. Here we study the role of STSD in the formation of complex patterns of brain rhythms. We use a cortical circuit model of neural networks composed of irregularly spiking excitatory and inhibitory neurons with type 1 and type 2 excitability and stochastic dynamics. In the model, neurons form a sparsely connected network and their spontaneous activity is driven by random spikes representing synaptic noise. Using simulations and analytical calculations, we find that in the absence of STSD the neural network shows either asynchronous behaviour or regular network oscillations, depending on the noise level. In networks with STSD, varying the parameters of synaptic plasticity and the noise level, we observe transitions to complex patterns of collective activity: mixed-mode and spindle oscillations, bursts of collective activity, and chaotic behaviour. Interestingly, these patterns are stable within a certain range of the parameters and separated by critical boundaries. Thus, the parameters of synaptic plasticity can play the role of control parameters, or switches, between different network states. However, changes of these parameters caused by disease may lead to dramatic impairment of ongoing neural activity. We analyze the chaotic neural activity using the 0-1 test for chaos (Gottwald, G. & Melbourne, I., 2004) and show that it has a collective nature.
    Comment: 7 pages, Proceedings of 12th Granada Seminar, September 17-21, 201
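    The 0-1 test mentioned at the end is a generic diagnostic that can be sketched independently of the network model: it maps a scalar time series onto a planar walk and asks whether the walk's mean-square displacement grows with time (K near 1, chaotic) or stays bounded (K near 0, regular). Below is a minimal implementation of one common variant (the correlation version); the parameter choices are assumptions, and it is demonstrated on the logistic map rather than on the paper's network.

```python
import numpy as np

def zero_one_test(phi, n_c=50, seed=None):
    """Median K statistic of the Gottwald-Melbourne 0-1 test for chaos.
    K close to 1 suggests chaos; K close to 0 suggests regular dynamics."""
    rng = np.random.default_rng(seed)
    phi = np.asarray(phi, dtype=float)
    N = len(phi)
    j = np.arange(1, N + 1)
    n = np.arange(1, N // 10 + 1)          # displacements up to N/10, as is customary
    K = []
    for c in rng.uniform(np.pi / 5, 4 * np.pi / 5, n_c):
        p = np.cumsum(phi * np.cos(j * c))  # planar walk driven by the series
        q = np.cumsum(phi * np.sin(j * c))
        M = np.array([np.mean((p[m:] - p[:-m]) ** 2 + (q[m:] - q[:-m]) ** 2)
                      for m in n])          # mean-square displacement
        K.append(np.corrcoef(n, M)[0, 1])   # linear growth -> K near 1
    return np.median(K)

# quick check on a known chaotic signal: the logistic map at r = 4
x = np.empty(5000)
x[0] = 0.3
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
print(zero_one_test(x, seed=0))             # expected to be close to 1
```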

    Analog hardware for learning neural networks

    Get PDF
    This is a recurrent or feedforward analog neural network processor with a multi-level neuron array and a synaptic matrix that stores weighted analog values of synaptic connection strengths. It is characterized by temporarily changing one connection strength at a time to determine its effect on the system output relative to the desired target; that connection strength is then adjusted based on the measured effect, so that the processor is taught the correct response to training examples connection by connection.
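    A rough software analogue of the connection-by-connection scheme described above, assuming a single tanh layer and a toy regression target: each weight is perturbed in isolation, its effect on the output error relative to the target is measured, and the weight is then adjusted in proportion to that effect (finite-difference weight perturbation). The layer shape, step sizes, and task are illustrative assumptions, not the hardware's actual circuitry.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(20, 3))                # toy training inputs
target = np.tanh(X @ np.array([0.5, -1.0, 2.0]))    # desired target responses

W = rng.normal(scale=0.1, size=3)                   # "synaptic matrix" (one layer)
delta, lr = 1e-3, 0.2                               # perturbation size, learning rate

def error(w):
    """Mean squared error of the network output against the target."""
    return np.mean((np.tanh(X @ w) - target) ** 2)

for epoch in range(500):
    for i in range(len(W)):                         # one connection at a time
        base = error(W)
        W[i] += delta                               # temporary perturbation
        effect = (error(W) - base) / delta          # its effect on the output error
        W[i] -= delta                               # restore the connection
        W[i] -= lr * effect                         # then adjust it accordingly

print(f"final training error: {error(W):.4f}")
```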

    A novel plasticity rule can explain the development of sensorimotor intelligence

    Full text link
    Grounding autonomous behavior in the nervous system is a fundamental challenge for neuroscience. In particular, self-organized behavioral development raises more questions than answers. Are there special functional units for curiosity, motivation, and creativity? This paper argues that these features can be grounded in synaptic plasticity itself, without requiring any higher-level constructs. We propose differential extrinsic plasticity (DEP) as a new synaptic rule for self-learning systems and apply it to a number of complex robotic systems as a test case. Without specifying any purpose or goal, seemingly purposeful and adaptive behavior develops, displaying a certain level of sensorimotor intelligence. These surprising results require no system-specific modifications of the DEP rule; rather, they arise from the underlying mechanism of spontaneous symmetry breaking due to the tight brain-body-environment coupling. The new synaptic rule is biologically plausible and would be an interesting target for neurobiological investigation. We also argue that this neuronal mechanism may have been a catalyst in natural evolution.
    Comment: 18 pages, 5 figures, 7 video
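    The abstract does not spell out the DEP update itself, so the sketch below is not the paper's rule; it only illustrates, under assumed toy dynamics, what a derivative-based ("differential") synaptic update in a closed sensorimotor loop can look like: a single sensor-to-motor weight is changed in proportion to the product of sensor and motor time-derivatives while a damped pendulum is driven by the resulting motor command.

```python
import numpy as np

# NOT the paper's DEP rule: a generic differential-Hebbian update in a toy
# sensorimotor loop. All dynamics and parameters here are assumptions.
dt, tau_w = 0.01, 5.0
w, x, v, y_prev = 0.5, 0.3, 0.0, 0.0        # weight, joint angle, velocity, last command

for step in range(50000):
    y = np.tanh(w * x)                       # motor command from the sensor reading
    a = -9.81 * np.sin(x) - 0.5 * v + 5.0 * y   # damped, motor-driven pendulum
    v += dt * a
    x_new = x + dt * v
    dx, dy = (x_new - x) / dt, (y - y_prev) / dt
    w += dt * (dx * dy - w) / tau_w          # derivative-based ("differential") update
    x, y_prev = x_new, y

print(f"weight after closed-loop adaptation: {w:.3f}")
```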

    Stochastic mean field formulation of the dynamics of diluted neural networks

    Get PDF
    We consider pulse-coupled Leaky Integrate-and-Fire neural networks with randomly distributed synaptic couplings. This random dilution induces fluctuations in the evolution of the macroscopic variables and deterministic chaos at the microscopic level. Our main aim is to mimic the effect of the dilution as a noise source acting on the dynamics of a globally coupled non-chaotic system. Indeed, the evolution of a diluted neural network can be well approximated by a fully pulse-coupled network in which each neuron is driven by a mean synaptic current plus additive noise. These terms represent the average and the fluctuations of the synaptic currents acting on the single neurons in the diluted system. The main microscopic and macroscopic dynamical features can be retrieved with this stochastic approximation. Furthermore, the microscopic stability of the diluted network can also be reproduced, as demonstrated by the near coincidence of the measured Lyapunov exponents in the deterministic and stochastic cases over a wide range of system sizes. Our results strongly suggest that the fluctuations in the synaptic currents are responsible for the emergence of chaos in this class of pulse-coupled networks.
    Comment: 12 Pages, 4 Figure
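    A minimal sketch of the stochastic approximation the abstract describes, reduced to a single leaky integrate-and-fire neuron driven by a constant mean synaptic current plus additive Gaussian noise; the parameter values are illustrative assumptions, not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
tau_m, v_th, v_reset = 20.0, 1.0, 0.0   # membrane time constant (ms), threshold, reset
mu, sigma = 1.2, 0.3                    # mean synaptic current and noise amplitude (assumed)
dt, t_max = 0.1, 5000.0                 # time step and simulated time (ms)

v, spike_times = 0.0, []
for step in range(int(t_max / dt)):
    # Euler-Maruyama step for tau_m * dv = (mu - v) dt + sigma * sqrt(tau_m) dW
    v += (dt / tau_m) * (mu - v) + sigma * np.sqrt(dt / tau_m) * rng.standard_normal()
    if v >= v_th:                       # threshold crossing: emit a spike and reset
        spike_times.append(step * dt)
        v = v_reset

rate = 1000.0 * len(spike_times) / t_max
print(f"firing rate under mean current + noise: {rate:.1f} Hz")
```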