    Competing synapses with two timescales: a basis for learning and forgetting

    Competitive dynamics are thought to occur in many learning processes involving synaptic plasticity. Here we show, in a game-theory-inspired model of synaptic interactions, that the competition between synapses in their weak and strong states gives rise to a natural framework for learning, with memory inherent in a timescale for 'forgetting' a learned signal. Among our main results is the prediction that memory is optimized if the weak synapses are very weak and the strong synapses are very strong. Our work admits of many extensions and possible experiments to test its validity, and in particular might complement an existing model of reaching, which has strong experimental support.
    Comment: 7 pages, 3 figures; to appear in Europhysics Letters
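
    A minimal Python sketch of the flavor of such a two-timescale model (not the authors' game-theoretic dynamics; the binary state space, switching rates, and efficacies below are illustrative assumptions): a fast learning event drives synapses into the strong state, after which slow stochastic competition between the two states relaxes the population signal on the 'forgetting' timescale.

        import numpy as np

        rng = np.random.default_rng(0)

        N = 1000                        # synapses
        w_weak, w_strong = 0.1, 1.0     # hypothetical weak/strong efficacies
        p_learn = 0.9                   # fast process: fraction potentiated by learning
        r_forget = 0.01                 # slow process: per-step switching rate

        strong = rng.random(N) < p_learn   # state right after the learning event

        signal = []
        for t in range(500):
            # Slow competition: strong synapses decay, weak ones occasionally win.
            flips_down = strong & (rng.random(N) < r_forget)
            flips_up = ~strong & (rng.random(N) < r_forget / 4)
            strong = (strong & ~flips_down) | flips_up
            signal.append(np.where(strong, w_strong, w_weak).mean())

        # The mean weight relaxes from its learned value toward baseline on the
        # slow timescale ~1/r_forget, which plays the role of the forgetting time.
        print(round(signal[0], 3), round(signal[-1], 3))

    Widening the gap between w_weak and w_strong raises the learned signal relative to the forgotten baseline, loosely echoing the prediction that memory is optimized when weak synapses are very weak and strong synapses very strong.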

    Eigenvalue Distributions for a Class of Covariance Matrices with Applications to Bienenstock-Cooper-Munro Neurons Under Noisy Conditions

    We analyze the effects of noise correlations in the input to, or among, BCM neurons, using the Wigner semicircular law to construct random, positive-definite symmetric correlation matrices and compute their eigenvalue distributions. In the finite-dimensional case, we compare our analytic results with numerical simulations and show the effects of correlations on the lifetimes of synaptic strengths in various visual environments. These correlations can be due either to correlations in the noise from the input LGN neurons or to correlations in the variability of lateral connections in a network of neurons. In particular, we find that for fixed dimensionality, a large noise variance can give rise to long lifetimes of synaptic strengths. This may be of physiological significance.
    Comment: 7 pages, 7 figures
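
    One ingredient of this construction can be sketched directly in Python (an illustration, not the paper's derivation; the dimension n and noise scale sigma are arbitrary choices): a positive-definite symmetric correlation matrix built as the identity plus a scaled Wigner matrix, whose eigenvalues follow a semicircle centered at 1.

        import numpy as np

        rng = np.random.default_rng(1)

        n = 500        # dimensionality of the correlation matrix
        sigma = 0.4    # noise scale; sigma < 0.5 keeps the matrix positive definite

        # Wigner matrix: symmetric, Gaussian entries, spectral radius -> 2.
        a = rng.normal(size=(n, n))
        w = (a + a.T) / np.sqrt(2.0 * n)

        # Correlation matrix: identity plus scaled Wigner noise; its eigenvalues
        # follow a semicircle on roughly [1 - 2*sigma, 1 + 2*sigma] > 0.
        c = np.eye(n) + sigma * w
        eigs = np.linalg.eigvalsh(c)
        print(round(eigs.min(), 3), round(eigs.max(), 3))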

    Random Walks for Spike-Timing Dependent Plasticity

    Random walk methods are used to calculate the moments of negative-image equilibrium distributions in synaptic weight dynamics governed by spike-timing dependent plasticity (STDP). The neural architecture of the model is based on the electrosensory lateral line lobe (ELL) of mormyrid electric fish, which forms a negative image of the reafferent signal from the fish's own electric discharge to optimize detection of sensory electric fields. Of particular behavioral importance to the fish is the variance of the equilibrium postsynaptic potential in the presence of noise, which is determined by the variance of the equilibrium weight distribution. Recurrence relations are derived for the moments of the equilibrium weight distribution, for arbitrary postsynaptic potential functions and arbitrary learning rules. For the case of homogeneous network parameters, explicit closed-form solutions are developed for the covariances of the synaptic weight and postsynaptic potential distributions.
    Comment: 18 pages, 8 figures, 15 subfigures; uses revtex4, subfigure, amsmath
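
    The analytic moment recurrences can be cross-checked against a Monte Carlo stand-in (a sketch under simplified assumptions: additive, uncorrelated potentiation and depression steps with hard weight bounds, not the ELL model's actual learning rule or architecture):

        import numpy as np

        rng = np.random.default_rng(2)

        n_syn, steps = 2000, 5000     # independent weight random walkers
        a_step = 0.005                # potentiation/depression step size (illustrative)
        p_plus = 0.5                  # probability a given pairing potentiates

        w = np.full(n_syn, 0.5)
        for _ in range(steps):
            up = rng.random(n_syn) < p_plus
            w += np.where(up, a_step, -a_step)   # additive STDP-like update
            np.clip(w, 0.0, 1.0, out=w)          # hard bounds on the weight

        # Sample moments of the equilibrium weight distribution, the quantities
        # the paper's recurrence relations deliver analytically.
        print(round(w.mean(), 3), round(w.var(), 3))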

    Fourier-Space Crystallography as Group Cohomology

    We reformulate Fourier-space crystallography in the language of group cohomology. Once the problem is understood as a classification of linear functions on the lattice, restricted by a particular group relation and identified under gauge transformations, the cohomological description becomes natural. We review Fourier-space crystallography and group cohomology, quote the fact that cohomology is dual to homology, and exhibit several results, previously established for special cases or by intricate calculation, that fall immediately out of the formalism. In particular, we prove that two phase functions are gauge equivalent if and only if they agree on all their gauge-invariant integral linear combinations, and we show how to find all these linear combinations systematically.
    Comment: plain TeX, 14 pages (replaced 5/8/01 to include archive preprint number for reference 22)
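
    The objects being classified can be stated compactly (a sketch of the standard Fourier-space relations in common notation, not the paper's full construction): a point-group element g acts on the Fourier coefficients of the density through a phase function, the group compatibility condition is a 1-cocycle condition, and gauge transformations add coboundaries.

        % rho(k): Fourier coefficients of the density; k in the reciprocal lattice L.
        % Phase functions Phi_g : L -> R/Z are linear on L.
        \rho(gk) = e^{2\pi i \Phi_g(k)}\,\rho(k)

        % Group compatibility condition (a 1-cocycle condition):
        \Phi_{gh}(k) \equiv \Phi_g(hk) + \Phi_h(k) \pmod{1}

        % Gauge transformation by a linear chi : L -> R/Z (a coboundary):
        \Phi'_g(k) \equiv \Phi_g(k) + \chi(gk - k) \pmod{1}

    Gauge-equivalence classes of phase functions are then cohomology classes of the point group with coefficients in the linear functions on the lattice, which is why criteria such as the gauge-invariance statement above fall out of the formalism.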

    An associative network with spatially organized connectivity

    We investigate the properties of an autoassociative network of threshold-linear units whose synaptic connectivity is spatially structured and asymmetric. Since the methods of equilibrium statistical mechanics cannot be applied to such a network, owing to the lack of a Hamiltonian, we approach the problem through a signal-to-noise analysis, which we adapt to spatially organized networks. We analyze the conditions for the appearance of stable, spatially non-uniform activity profiles with large overlaps with one of the stored patterns. We also show, with simulations and analytic results, that the storage capacity does not decrease much when the connectivity of the network becomes short-range. In addition, the method used here enables us to calculate exactly the storage capacity of a randomly connected network with an arbitrary degree of dilution.
    Comment: 27 pages, 6 figures; accepted for publication in JSTAT
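
    A toy version of such a network is easy to simulate in Python (a sketch, not the paper's signal-to-noise analysis; the sizes, ring geometry, connection range, covariance learning rule, and the activity normalization standing in for inhibition are all illustrative choices):

        import numpy as np

        rng = np.random.default_rng(3)

        N, P, a = 400, 5, 0.2        # units on a ring, stored patterns, sparsity
        conn_range = 40              # connections reach only this far on the ring

        xi = (rng.random((P, N)) < a).astype(float)   # sparse binary patterns

        # Spatially organized, asymmetric dilution: connect only nearby units.
        d = np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
        d = np.minimum(d, N - d)
        c = (d > 0) & (d <= conn_range) & (rng.random((N, N)) < 0.5)

        # Hebbian covariance-rule weights on the existing connections.
        J = ((xi - a).T @ (xi - a)) / (a * (1 - a) * N) * c

        # Threshold-linear retrieval from a noisy cue of pattern 0; the norm
        # rescaling stands in for global inhibition keeping activity fixed.
        v = xi[0] + 0.3 * rng.random(N)
        for _ in range(50):
            v = np.maximum(J @ v - 0.05, 0.0)
            v *= np.sqrt(N * a) / (np.linalg.norm(v) + 1e-12)

        overlap = v @ xi[0] / (np.linalg.norm(v) * np.linalg.norm(xi[0]) + 1e-12)
        print(round(overlap, 3))   # close to 1 when the cued pattern is retrieved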

    How Gibbs distributions may naturally arise from synaptic adaptation mechanisms. A model-based argumentation

    This paper addresses two questions in the context of neuronal network dynamics, using methods from dynamical systems theory and statistical physics: (i) how can one characterize the statistical properties of sequences of action potentials ("spike trains") produced by neuronal networks, and (ii) what are the effects of synaptic plasticity on these statistics? We introduce a framework in which spike trains are associated with a coding of membrane potential trajectories, and actually constitute a symbolic coding in important explicit examples (the so-called gIF models). On this basis, we use the thermodynamic formalism from ergodic theory to show how Gibbs distributions are natural probability measures with which to describe the statistics of spike trains, given the empirical averages of prescribed quantities. As a second result, we show that Gibbs distributions naturally arise when considering "slow" synaptic plasticity rules, where the characteristic time for synapse adaptation is much longer than the characteristic time for neuron dynamics.
    Comment: 39 pages, 3 figures
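
    The first role of Gibbs distributions, as maximum-entropy measures matching empirical averages, has a small static analogue that fits in a few lines (a sketch over spike words of three neurons, not the paper's thermodynamic-formalism construction over trajectories; the observables and target averages are invented):

        import numpy as np
        from itertools import product

        n = 3   # neurons; spike words omega in {0,1}^n, enumerated exactly
        words = np.array(list(product([0, 1], repeat=n)), dtype=float)

        # Observables: single-neuron firing rates and pairwise coincidences.
        pairs = [words[:, i] * words[:, j] for i in range(n) for j in range(i + 1, n)]
        F = np.column_stack([words] + pairs)

        # Prescribed empirical averages (invented but jointly realizable).
        target = np.array([0.3, 0.4, 0.2, 0.1, 0.05, 0.05])

        lam = np.zeros(F.shape[1])
        for _ in range(5000):
            p = np.exp(F @ lam)
            p /= p.sum()                     # Gibbs form P(w) ~ exp(sum lam_i f_i(w))
            lam += 0.1 * (target - F.T @ p)  # moment matching by gradient ascent

        p = np.exp(F @ lam)
        p /= p.sum()
        print(np.round(F.T @ p, 3))          # reproduces the prescribed averages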

    Formation of feedforward networks and frequency synchrony by spike-timing-dependent plasticity

    Spike-timing-dependent plasticity (STDP) with asymmetric learning windows is commonly found in the brain and is useful for a variety of spike-based computations such as input filtering and associative memory. A natural consequence of STDP is the establishment of causality, in the sense that a neuron learns to fire with a lag after specific presynaptic neurons have fired. The effect of STDP on synchrony is elusive because spike synchrony implies unitary spike events across different neurons rather than a causal, delayed relationship between them. We explore how synchrony can be facilitated by STDP in oscillator networks with a pacemaker. We show that STDP with asymmetric learning windows leads to the self-organization of feedforward networks starting from the pacemaker. As a result, STDP drastically facilitates frequency synchrony. Even though differences in spike times are lessened as a result of synaptic plasticity, a finite time lag remains, so that perfect spike synchrony is not realized. In contrast to traditional mechanisms of large-scale synchrony based on the mutual interaction of coupled neurons, the route to synchrony discovered here is the enslavement of downstream neurons by upstream ones. Facilitation of such feedforward synchrony does not occur for STDP with symmetric learning windows.
    Comment: 9 figures
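
    The selection mechanism can be isolated in a short Python sketch (deliberately stripped down: firing phases are held fixed rather than co-evolved with the oscillator dynamics, so only the STDP-driven selection of feedforward connections is shown; the window shape and parameters are illustrative):

        import numpy as np

        rng = np.random.default_rng(5)

        n = 10                  # oscillators; index 0 is the pacemaker, firing first
        tau, A = 5.0, 0.01      # STDP window time constant and learning rate

        # Asymmetric window: potentiate pre-before-post (dt > 0), depress the reverse.
        def stdp(dt):
            return np.where(dt > 0, A * np.exp(-dt / tau), -A * np.exp(dt / tau))

        # Fixed firing phases within the common cycle, increasing downstream.
        phase = np.sort(rng.uniform(0.0, 20.0, size=n))
        phase[0] = 0.0

        W = np.full((n, n), 0.5)            # W[i, j]: connection from j to i
        np.fill_diagonal(W, 0.0)
        for _ in range(200):                # one STDP update per firing cycle
            dt = np.subtract.outer(phase, phase)   # dt[i, j] = t_post(i) - t_pre(j)
            W += stdp(dt) * (W > 0)                # weights pruned to 0 stay pruned
            np.clip(W, 0.0, 1.0, out=W)

        # Connections running forward in firing order saturate; reverse ones vanish,
        # leaving a feedforward network rooted at the pacemaker.
        print(round(W[np.tril_indices(n, -1)].mean(), 2),
              round(W[np.triu_indices(n, 1)].mean(), 2))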

    Emergent complex neural dynamics

    A large repertoire of spatiotemporal activity patterns in the brain is the basis for adaptive behaviour. Understanding the mechanism by which the brain's hundred billion neurons and hundred trillion synapses manage to produce such a range of cortical configurations in a flexible manner remains a fundamental problem in neuroscience. One plausible solution is the involvement of universal mechanisms of emergent complex phenomena evident in dynamical systems poised near a critical point of a second-order phase transition. We review recent theoretical and empirical results supporting the notion that the brain is naturally poised near criticality, as well as the implications of this view for a better understanding of the brain.
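
    A standard toy model behind the criticality claim is a branching process, sketched here in Python (illustrative of what "poised near a critical point" means, not of the specific analyses reviewed): at branching ratio 1 the distribution of avalanche sizes develops a power-law tail, the signature reported for neuronal avalanches.

        import numpy as np

        rng = np.random.default_rng(6)

        def avalanche_sizes(sigma, n_trials=10000, cap=100000):
            """Total sizes of branching-process avalanches with branching ratio sigma."""
            sizes = []
            for _ in range(n_trials):
                active, total = 1, 1
                while active and total < cap:
                    # Each active unit triggers Poisson(sigma) units in the next step.
                    active = rng.poisson(sigma * active)
                    total += active
                sizes.append(total)
            return np.array(sizes)

        # Subcritical avalanches die out quickly; at sigma = 1, the critical point
        # of a second-order-like transition, rare avalanches span the system.
        for sigma in (0.8, 1.0):
            s = avalanche_sizes(sigma)
            print(sigma, int(np.median(s)), int(s.max()))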