    Movement Dependence and Layer Specificity of Entorhinal Phase Precession in Two-Dimensional Environments

    As a rat moves, grid cells in its entorhinal cortex (EC) discharge at multiple locations in the external world, and the firing fields of each grid cell span a hexagonal lattice. For movements on linear tracks, spikes tend to occur at successively earlier phases of the theta-band filtered local field potential during the traversal of a firing field, a phenomenon termed phase precession. The complex movement patterns observed in two-dimensional (2D) open-field environments may fundamentally alter phase precession. To study this question at the behaviorally relevant single-run level, we analyzed EC spike patterns as a function of the distance traveled by the rat along each trajectory. This analysis revealed that cells across all EC layers fire spikes that phase-precess; the rate and extent of phase precession were the same across layers, and only the correlation between spike phase and path length was weaker in EC layer III. Both slope and correlation of phase precession were surprisingly similar on linear tracks and in 2D open-field environments, despite strong differences in the movement statistics, including running speed. While the phase-precession slope did not correlate with the average running speed, it did depend on specific properties of the animal's path. The longer a curving path through a grid field in a 2D environment, the shallower the rate of phase precession, while runs that grazed a grid field tangentially led to a steeper phase-precession slope than runs through the field center. Oscillatory interference models for grid cells do not reproduce the observed phenomena.
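
    The single-run quantities above (phase-precession slope and the phase-versus-path-length correlation) can be illustrated with a toy fit on synthetic spikes. This is a minimal sketch with an ordinary linear regression; published analyses typically use circular-linear regression, and all numbers here are illustrative assumptions, not the paper's data.

```python
import numpy as np

def phase_precession_fit(distances, phases):
    """Least-squares fit of spike theta phase against distance traveled
    through a firing field; returns (slope, correlation).
    Toy linear fit -- a circular-linear regression would be used in a
    real analysis, which this sketch does not implement."""
    slope, intercept = np.polyfit(distances, phases, 1)
    r = np.corrcoef(distances, phases)[0, 1]
    return slope, r

# Synthetic single run: phase advances (precesses) ~180 degrees over the field.
rng = np.random.default_rng(0)
d = np.sort(rng.uniform(0.0, 1.0, 40))               # normalized path length
phi = 360.0 - 180.0 * d + rng.normal(0, 15, d.size)  # earlier phase as d grows
slope, r = phase_precession_fit(d, phi)
print(f"slope = {slope:.1f} deg per field, r = {r:.2f}")
```

    A shallower precession rate, as reported for long curving paths, would show up here as a less negative slope.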

    Stability of Negative Image Equilibria in Spike-Timing Dependent Plasticity

    We investigate the stability of negative image equilibria in mean synaptic weight dynamics governed by spike-timing dependent plasticity (STDP). The neural architecture of the model is based on the electrosensory lateral line lobe (ELL) of mormyrid electric fish, which forms a negative image of the reafferent signal from the fish's own electric discharge to optimize detection of external electric fields. We derive a necessary and sufficient condition for stability, for arbitrary postsynaptic potential functions and arbitrary learning rules. We then apply the general result to several examples of biological interest.
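
    The negative-image idea can be sketched at the rate level: anti-Hebbian updates drive learned weights toward the sign-inverted reafferent signal, so the summed response cancels. This is a minimal sketch in the spirit of the ELL model above; the learning rate, signal, and discrete-time update are illustrative assumptions, not the paper's spiking model.

```python
import numpy as np

def learn_negative_image(signal, eta=0.2, steps=200):
    """Anti-Hebbian formation of a negative image of a repeated signal."""
    w = np.zeros_like(signal)    # corollary-discharge synaptic weights
    for _ in range(steps):
        output = signal + w      # reafferent input plus learned image
        w -= eta * output        # anti-Hebbian update depresses the residual
    return w

signal = np.sin(np.linspace(0, 2 * np.pi, 50))  # toy reafferent waveform
w = learn_negative_image(signal)
residual = np.max(np.abs(signal + w))
print(f"max residual after learning: {residual:.2e}")
```

    In this reduced setting the residual contracts by a factor (1 - eta) per repetition, so the equilibrium w = -signal is stable for 0 < eta < 2; the paper derives the analogous condition for full spiking dynamics with arbitrary postsynaptic potentials and learning windows.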

    Markov analysis of stochastic resonance in a periodically driven integrate-fire neuron

    We model the dynamics of the leaky integrate-fire neuron under periodic stimulation as a Markov process with respect to the stimulus phase. This avoids the unrealistic assumption of a stimulus reset after each spike made in earlier work and thus solves the long-standing reset problem. The neuron exhibits stochastic resonance, both with respect to input noise intensity and stimulus frequency. The latter resonance arises by matching the stimulus frequency to the refractory time of the neuron. The Markov approach can be generalized to other periodically driven stochastic processes containing a reset mechanism.
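
    A minimal simulation shows the setup: the sinusoidal stimulus keeps running across spikes (no stimulus reset, only a membrane reset), so the stimulus phase recorded at successive firings forms the Markov chain studied above. All parameter values here are illustrative assumptions.

```python
import numpy as np

# Toy leaky integrate-and-fire neuron under sinusoidal drive plus noise.
rng = np.random.default_rng(1)
dt, tau, v_th, v_reset = 1e-3, 0.02, 1.0, 0.0
mu, eps, f, sigma = 0.9, 0.5, 20.0, 0.3  # bias, drive amplitude, Hz, noise

v, t, spike_phases = 0.0, 0.0, []
for _ in range(200_000):                 # 200 s of simulated time
    drive = mu + eps * np.sin(2 * np.pi * f * t)
    v += dt / tau * (-v + drive) + sigma * np.sqrt(dt / tau) * rng.normal()
    t += dt
    if v >= v_th:
        spike_phases.append((f * t) % 1.0)  # stimulus phase in [0, 1)
        v = v_reset                          # membrane reset only
print(f"{len(spike_phases)} spikes; mean phase = {np.mean(spike_phases):.2f}")
```

    The sequence of spike phases could then be binned into a transition matrix to study the phase-to-phase Markov dynamics and the resonances described in the abstract.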

    An associative memory of Hodgkin-Huxley neuron networks with Willshaw-type synaptic couplings

    We discuss an associative memory in neural networks consisting of N (=100) spiking Hodgkin-Huxley (HH) neurons with time-delayed couplings, which memorize P patterns in their synaptic weights. In addition to excitatory synapses whose strengths are modified according to the Willshaw-type learning rule with the 0/1 code for quiescent/active states, the network includes uniform inhibitory synapses, which are introduced to reduce cross-talk noise. Our simulations of the HH neuron network in the noise-free state yield fairly good performance, with a storage capacity of \alpha_c = P_{\rm max}/N \sim 0.4 - 2.4 for low neuron activity of f \sim 0.04 - 0.10. This storage capacity of our temporal-code network is comparable to that of the rate-code model with Willshaw-type synapses. Our HH neuron network turns out not to be vulnerable to the distribution of time delays in the couplings. The variability of the interspike interval (ISI) of output spike trains in the process of retrieving stored patterns is also discussed.
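
    The Willshaw rule with the 0/1 code mentioned above is easy to state in a rate-level sketch: binary patterns are stored by clipped Hebbian learning, and retrieval thresholds the weighted input at the number of active units. N, P, and the activity level below are illustrative and far smaller than in the paper, which embeds the rule in spiking HH dynamics.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P, k = 100, 10, 5                       # neurons, patterns, active units
patterns = np.zeros((P, N), dtype=int)
for p in range(P):
    patterns[p, rng.choice(N, k, replace=False)] = 1

W = np.zeros((N, N), dtype=int)
for xi in patterns:
    W = np.maximum(W, np.outer(xi, xi))    # Willshaw rule: weights clipped at 1

def retrieve(cue):
    # Threshold at the number of active units in the cue.
    return (W @ cue >= cue.sum()).astype(int)

recalled = sum(np.array_equal(retrieve(xi), xi) for xi in patterns)
print(f"{recalled}/{P} patterns recalled perfectly")
```

    With sparse patterns the clipped weight matrix stays dilute, so cross-talk rarely pushes a quiescent unit over threshold, which is why sparse coding (low f) is essential for Willshaw-type capacity.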

    Intrinsic Stability of Temporally Shifted Spike-Timing Dependent Plasticity

    Spike-timing dependent plasticity (STDP), a widespread synaptic modification mechanism, is sensitive to correlations between presynaptic spike trains and it generates competition among synapses. However, STDP has an inherent instability because strong synapses are more likely to be strengthened than weak ones, causing them to grow in strength until some biophysical limit is reached. Through simulations and analytic calculations, we show that a small temporal shift in the STDP window that causes synchronous, or nearly synchronous, pre- and postsynaptic action potentials to induce long-term depression can stabilize synaptic strengths. Shifted STDP also stabilizes the postsynaptic firing rate and can implement both Hebbian and anti-Hebbian forms of competitive synaptic plasticity. Interestingly, the overall level of inhibition determines whether plasticity is Hebbian or anti-Hebbian. Even a random symmetric jitter of a few milliseconds in the STDP window can stabilize synaptic strengths while retaining these features. The same results hold for a shifted version of the more recent “triplet” model of STDP. Our results indicate that the detailed shape of the STDP window function near the transition from depression to potentiation is of the utmost importance in determining the consequences of STDP, suggesting that this region warrants further experimental study.
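
    The key ingredient, a temporally shifted window, can be written down directly: shifting the usual exponential STDP window by a few milliseconds places near-synchronous pre-post pairs on the depression side. Amplitudes, time constants, and the shift below are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

def shifted_stdp(dt_ms, a_plus=1.0, a_minus=1.05, tau=20.0, d=2.0):
    """Weight change for post-minus-pre spike-time difference dt (ms),
    with the window shifted by d ms so that synchrony causes depression."""
    s = dt_ms - d                          # temporal shift of the window
    if s > 0:
        return a_plus * np.exp(-s / tau)   # potentiation branch
    return -a_minus * np.exp(s / tau)      # depression branch

# Synchronous pairing now depresses, while late-post pairing potentiates:
print(shifted_stdp(0.0), shifted_stdp(10.0))
```

    Because strong synapses tend to fire the postsynaptic cell at short latency, they land near the shifted depression region, which is the intuition behind the stabilization reported above.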

    Emergence of Connectivity Motifs in Networks of Model Neurons with Short- and Long-term Plastic Synapses

    Recent evidence in rodent cerebral cortex and olfactory bulb suggests that short-term dynamics of excitatory synaptic transmission is correlated to stereotypical connectivity motifs. It was observed that neurons with short-term facilitating synapses form predominantly reciprocal pairwise connections, while neurons with short-term depressing synapses form unidirectional pairwise connections. The cause of these structural differences in synaptic microcircuits is unknown. We propose that these connectivity motifs emerge from the interactions between short-term synaptic dynamics (SD) and long-term spike-timing dependent plasticity (STDP). While the impact of STDP on SD was shown in vitro, the mutual interactions between STDP and SD in large networks are still the subject of intense research. We formulate a computational model by combining SD and STDP, which faithfully captures short- and long-term dependence on both spike times and frequency. As a proof of concept, we simulate recurrent networks of spiking neurons with random initial connection efficacies and where synapses are either all short-term facilitating or all depressing. For identical background inputs, and as a direct consequence of internally generated activity, we find that networks with depressing synapses evolve unidirectional connectivity motifs, while networks with facilitating synapses evolve reciprocal connectivity motifs. This holds for heterogeneous networks including both facilitating and depressing synapses. Our study highlights the conditions under which SD-STDP might explain the correlation between facilitation and reciprocal connectivity motifs, as well as between depression and unidirectional motifs. We further suggest experiments for the validation of the proposed mechanism.
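
    The short-term dynamics (SD) side of the model can be sketched with Tsodyks-Markram-style resource and facilitation variables, a common way to model the facilitating versus depressing synapses discussed above. The parameter sets labeled "facilitating" and "depressing" below are illustrative assumptions, and the update ordering is one of several published variants.

```python
import numpy as np

def tm_psc_amplitudes(spike_times, U, tau_rec, tau_fac):
    """Per-spike PSC amplitudes of a Tsodyks-Markram-style synapse."""
    u, x, last = U, 1.0, None
    amps = []
    for t in spike_times:
        if last is not None:
            dt = t - last
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)  # resource recovery
            u = U + (u - U) * np.exp(-dt / tau_fac)      # facilitation decay
        amps.append(u * x)       # released fraction sets the PSC amplitude
        x -= u * x               # deplete synaptic resources
        u += U * (1.0 - u)       # facilitate release probability
        last = t
    return amps

spikes = np.arange(0.0, 0.5, 0.05)  # 20 Hz presynaptic train, times in s
fac = tm_psc_amplitudes(spikes, U=0.1, tau_rec=0.1, tau_fac=0.5)
dep = tm_psc_amplitudes(spikes, U=0.6, tau_rec=0.8, tau_fac=0.01)
print(f"facilitating: {fac[0]:.3f} -> {fac[-1]:.3f}")
print(f"depressing:   {dep[0]:.3f} -> {dep[-1]:.3f}")
```

    In the full model, these frequency-dependent amplitudes gate which spike pairings STDP sees, which is how SD can bias the network toward reciprocal or unidirectional motifs.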

    The spike-timing-dependent learning rule to encode spatiotemporal patterns in a network of spiking neurons

    We study associative memory neural networks based on Hodgkin-Huxley-type spiking neurons. We introduce a spike-timing-dependent learning rule in which a time window with both a negative and a positive part is used to describe biologically plausible synaptic plasticity. The learning rule is applied to encode a number of periodic spatiotemporal patterns, which are successfully reproduced in the periodic firing patterns of the spiking neurons during memory retrieval. Global inhibition is incorporated into the model so as to induce gamma oscillation. The gamma oscillation turns out to provide appropriate spike timings for memory retrieval of discrete spatiotemporal patterns. A theoretical analysis of the stationary properties of the perfect-retrieval state, conducted in the limit of an infinite number of neurons, shows good agreement with the results of numerical simulations. This analysis indicates that the presence of both the negative and positive parts of the time window helps reduce the size of the crosstalk term, implying that such a time window is well suited to encode a number of spatiotemporal patterns. We draw phase diagrams, in which we find various types of phase transitions as the intensity of global inhibition is varied. (Accepted for publication in Physical Review.)

    Formation of feedforward networks and frequency synchrony by spike-timing-dependent plasticity

    Spike-timing-dependent plasticity (STDP) with asymmetric learning windows is commonly found in the brain and useful for a variety of spike-based computations such as input filtering and associative memory. A natural consequence of STDP is the establishment of causality, in the sense that a neuron learns to fire with a lag after specific presynaptic neurons have fired. The effect of STDP on synchrony is elusive because spike synchrony implies unitary spike events of different neurons rather than a causal delayed relationship between neurons. We explore how synchrony can be facilitated by STDP in oscillator networks with a pacemaker. We show that STDP with asymmetric learning windows leads to self-organization of feedforward networks starting from the pacemaker. As a result, STDP drastically facilitates frequency synchrony. Even though differences in spike times are lessened as a result of synaptic plasticity, the finite time lag remains, so that perfect spike synchrony is not realized. In contrast to traditional mechanisms of large-scale synchrony based on mutual interaction of coupled neurons, the route to synchrony discovered here is enslavement of downstream neurons by upstream ones. Facilitation of such feedforward synchrony does not occur for STDP with symmetric learning windows.

    How Structure Determines Correlations in Neuronal Networks

    Networks are becoming a ubiquitous metaphor for the understanding of complex biological systems, spanning the range between molecular signalling pathways, neural networks in the brain, and interacting species in a food web. In many models, we face an intricate interplay between the topology of the network and the dynamics of the system, which is generally very hard to disentangle. One dynamical feature that has been the subject of intense research in various fields is the correlation between the noisy activity of nodes in a network. We consider a class of systems where discrete signals are sent along the links of the network. Such systems are of particular relevance in neuroscience, because they provide models for networks of neurons that use action potentials for communication. We study correlations in dynamic networks with arbitrary topology, assuming linear pulse coupling. With our novel approach, we are able to understand in detail how specific structural motifs affect pairwise correlations. Based on a power series decomposition of the covariance matrix, we describe the conditions under which very indirect interactions will have a pronounced effect on correlations and population dynamics. In random networks, we find that indirect interactions may lead to a broad distribution of activation levels with low average but highly variable correlations. This phenomenon is even more pronounced in networks with distance dependent connectivity. In contrast, networks with highly connected hubs or patchy connections often exhibit strong average correlations. Our results are particularly relevant in view of new experimental techniques that enable the parallel recording of spiking activity from a large number of neurons, an appropriate interpretation of which is hampered by the currently limited understanding of structure-dynamics relations in complex networks.
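
    The power-series idea can be sketched for a linearly (Hawkes-like) coupled network: with effective connectivity G of spectral radius below one and diagonal baseline variances D, the covariance C = (I - G)^{-1} D (I - G)^{-T}, and expanding (I - G)^{-1} = I + G + G^2 + ... organizes C by path length, i.e. by increasingly indirect interactions. The matrices G and D below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
G = 0.2 * rng.random((n, n))           # weak random effective couplings
np.fill_diagonal(G, 0.0)
D = np.diag(rng.random(n) + 0.5)       # baseline (auto)variances

I = np.eye(n)
C_exact = np.linalg.inv(I - G) @ D @ np.linalg.inv(I - G).T

# Truncate the Neumann series after paths of length <= K:
K = 30
S = sum(np.linalg.matrix_power(G, k) for k in range(K + 1))
C_series = S @ D @ S.T
print(f"max truncation error: {np.abs(C_exact - C_series).max():.2e}")
```

    Truncating at small K isolates the contribution of short paths (direct connections, shared inputs, short chains), which is what makes specific structural motifs legible in the pairwise correlations.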