
    Stochastic transitions of attractors in associative memory models with correlated noise

    We investigate the dynamics of recurrent neural networks with correlated noise in order to analyze the effect of that noise. The mechanism of correlated firing has been analyzed in various models, but its functional roles have not been discussed in sufficient detail. Aoyagi and Aoki have shown that the state transition of a network can be invoked by synchronous spikes. We introduce two types of noise to each neuron: independent thermal noise and correlated noise. Because of the correlated noise, the correlation between neural inputs cannot be ignored, so the behavior of the network is sample dependent. We discuss two types of associative memory models: one with auto- and weak cross-correlation connections, and one with hierarchically correlated patterns. The former is similar in structure to Aoyagi and Aoki's model, and we show that stochastic transitions can be induced by the correlated noise rather than by the thermal noise. In the latter, we show a stochastic transition from a memory state to a mixture state driven by the correlated noise. To analyze these stochastic transitions, we derive a macroscopic description of the dynamics in the form of a recurrence relation for the probability density function in the presence of correlated noise. Computer simulations agree with the theoretical results.
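
    As a rough illustration of the ingredients described above (an editorial sketch, not the authors' formulation or their transition mechanism), the snippet below simulates a Hopfield-style binary associative memory with Hebbian couplings in which every neuron receives both independent thermal noise and a single noise sample shared by the whole network, and tracks the overlaps with the stored patterns; the network size, number of patterns and noise amplitudes are all assumed for illustration. Because the shared term shifts every input by the same amount, its effect does not average away across neurons, which is the sample dependence the abstract refers to.

```python
# Editorial sketch: Hebbian attractor network with independent ("thermal")
# noise and a noise term shared across all neurons ("correlated" noise).
import numpy as np

rng = np.random.default_rng(0)
N, P = 500, 3                                    # neurons, stored patterns (illustrative)
xi = rng.choice([-1, 1], size=(P, N))            # random +/-1 memory patterns
J = (xi.T @ xi) / N                              # Hebbian (auto-correlation) couplings
np.fill_diagonal(J, 0.0)

def step(s, sigma_th=0.1, sigma_c=0.3):
    """One synchronous update under thermal and correlated noise."""
    h = J @ s
    thermal = sigma_th * rng.standard_normal(N)  # independent sample per neuron
    common = sigma_c * rng.standard_normal()     # one sample shared by every neuron
    return np.sign(h + thermal + common)

s = xi[0].copy()                                 # start in the first memory state
for _ in range(200):
    s = step(s)
print(np.round(xi @ s / N, 2))                   # overlap with each stored pattern
```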

    Theory of Interaction of Memory Patterns in Layered Associative Networks

    A synfire chain is a network that can generate repeated spike patterns with millisecond precision. Although synfire chains with only one activity propagation mode have been intensively analyzed with several neuron models, those with several stable propagation modes have not been thoroughly investigated. Using the leaky integrate-and-fire neuron model, we constructed a layered associative network embedded with memory patterns and analyzed its dynamics with the Fokker-Planck equation. First, we addressed the stability of one memory pattern as a propagating spike volley and showed that memory patterns propagate as pulse packets. Second, we investigated the activity when two different memory patterns are activated. Simultaneous activation of two memory patterns with the same strength led the propagating pattern to a mixed state; in contrast, when the activations had different strengths, the pulse packet converged to a two-peak state. Finally, we studied the effect of a preceding pulse packet on the following pulse packet. The following pulse packet was modified away from its originally activated memory pattern and converged to a two-peak state, a mixed state or a non-spike state depending on the time interval between the two packets.
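
    The layer-by-layer pulse-packet propagation described above can be sketched roughly as follows; this is an editorial illustration of a generic synfire-style simulation with leaky integrate-and-fire units (each neuron limited to one spike per layer for simplicity), not the paper's model, and every parameter value is assumed.

```python
# Editorial sketch: a Gaussian pulse packet propagated through layers of
# leaky integrate-and-fire neurons that share a common feedforward drive.
import numpy as np

rng = np.random.default_rng(1)
N, L = 100, 10                                  # neurons per layer, number of layers
dt, tau, v_th, v_reset = 0.1, 10.0, 1.0, 0.0    # step (ms), membrane tau (ms), threshold, reset
w = 0.025                                       # depolarisation per incoming spike (illustrative)

def run_layer(in_spike_times):
    """Drive one layer with the previous layer's spike times; return its spike times."""
    t_grid = np.arange(0.0, 50.0, dt)
    drive = np.zeros_like(t_grid)
    idx = np.clip((in_spike_times / dt).astype(int), 0, len(t_grid) - 1)
    np.add.at(drive, idx, w / dt)               # delta-like input current shared by the layer
    v = np.zeros(N)
    spiked = np.zeros(N, dtype=bool)            # at most one spike per neuron per layer
    out = []
    for i, t in enumerate(t_grid):
        v += dt * (-v / tau + drive[i]) + 0.02 * np.sqrt(dt) * rng.standard_normal(N)
        fired = (v >= v_th) & ~spiked
        spiked |= fired
        out.extend([t] * int(fired.sum()))
        v[fired] = v_reset
    return np.array(out)

packet = rng.normal(5.0, 1.0, N)                # initial packet: N spikes with 1 ms jitter
for _ in range(L):
    packet = run_layer(packet)
    if packet.size == 0:
        break
print(f"{packet.size} spikes, jitter {packet.std():.2f} ms" if packet.size else "packet died")
```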

    Thermal history of the string universe

    The thermal history of the string universe based on Brandenberger and Vafa's scenario is examined, and the analysis thereby provides a theoretical foundation for the string-universe scenario. In particular, the picture of the initial oscillating phase is shown to be natural from the thermodynamic point of view. A new tool is employed to evaluate the multi-state density of the string gas. This analysis points out that the well-known functional form of the multi-state density is not applicable in the important region T ≤ T_H, and derives a correct form of it.
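
    For orientation, an editorial note not taken from the paper: the asymptotic single-string density of states usually quoted near the Hagedorn temperature, from which multi-state densities of the string gas are commonly assembled, has the schematic form below, with model-dependent constants C and a; the abstract's point is that the familiar multi-state density built on such asymptotics is not applicable in the region T ≤ T_H.

```latex
% Editorial illustration (not from the paper): commonly quoted asymptotic
% single-string density of states near the Hagedorn temperature T_H,
% with model-dependent constants C and a.
\[
  \omega(\epsilon) \;\simeq\; C\,\epsilon^{-a}\, e^{\beta_H \epsilon},
  \qquad \beta_H = \frac{1}{T_H}.
\]
```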

    Sparse and Dense Encoding in Layered Associative Network of Spiking Neurons

    A synfire chain is a simple neural network model that can propagate stable synchronous spikes, called a pulse packet, and it has been widely studied. However, how synfire chains coexist in one network remains to be elucidated. We have studied the activity of a layered associative network of leaky integrate-and-fire neurons in whose connections we embed memory patterns by Hebbian learning, and we analyzed the activity with the Fokker-Planck method. In a previous report we showed that, when half of the neurons belong to each memory pattern (memory pattern rate F = 0.5), the temporal profile of the network activity splits into temporally clustered groups called sublattices under certain input conditions. In this study, we show that when the network is sparsely connected (F < 0.5), synchronous firing of the memory pattern is promoted, whereas a densely connected network (F > 0.5) inhibits synchronous firing. The sparseness and denseness also affect the basin of attraction and the storage capacity of the embedded memory patterns: sparsely (densely) connected networks enlarge (shrink) the basin of attraction and increase (decrease) the storage capacity.
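
    The embedding of patterns with a given coding rate F can be pictured with the following editorial sketch, a covariance-style Hebbian rule between two layers; it is not the paper's equations, and the layer size, pattern count and read-out threshold are assumed. Driving the first layer with one stored pattern retrieves its partner pattern in the next layer.

```python
# Editorial sketch: Hebbian embedding of binary patterns with coding rate F
# into the feedforward weights between two layers, followed by one retrieval.
import numpy as np

rng = np.random.default_rng(2)
N, P, F = 200, 4, 0.3                             # units per layer, patterns, coding rate
xi_pre = (rng.random((P, N)) < F).astype(float)   # a fraction F of units active per pattern
xi_post = (rng.random((P, N)) < F).astype(float)

# Covariance Hebbian rule: correlate deviations from the mean activity F.
W = ((xi_post - F).T @ (xi_pre - F)) / (N * F * (1.0 - F))

h = W @ xi_pre[0]                               # drive the second layer with pattern 0
retrieved = (h > 0.5 * h.max()).astype(float)   # crude threshold read-out
overlap = (retrieved * xi_post[0]).sum() / xi_post[0].sum()
print(f"fraction of the target pattern recovered: {overlap:.2f}")
```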

    Structure of Spontaneous UP and DOWN Transitions Self-Organizing in a Cortical Network Model

    Synaptic plasticity is considered to play a crucial role in the experience-dependent self-organization of local cortical networks. In the absence of sensory stimuli, the cerebral cortex exhibits spontaneous membrane-potential transitions between an UP and a DOWN state. To reveal how cortical networks develop spontaneous activity, or conversely, how spontaneous activity structures cortical networks, we analyze the self-organization of a recurrent network model of excitatory and inhibitory neurons, which is realistic enough to replicate UP–DOWN states, with spike-timing-dependent plasticity (STDP). The individual neurons in the self-organized network exhibit a variety of temporal patterns in the two-state transitions. In addition, the model develops a feed-forward network-like structure that produces a diverse repertoire of precise sequences of the UP state. Our model shows that the self-organized activity closely resembles the spontaneous activity of cortical networks if STDP is accompanied by the pruning of weak synapses. These results suggest that the two-state membrane-potential transitions play an active role in structuring local cortical circuits.
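
    A compact way to picture the ingredient highlighted above, STDP combined with pruning of weak synapses, is the editorial sketch below of a pair-based additive STDP rule; the time constants, amplitudes and pruning threshold are assumed for illustration, not taken from the model.

```python
# Editorial sketch: pair-based additive STDP with pruning of weak synapses.
import numpy as np

def stdp_update(w, dt_spike, a_plus=0.01, a_minus=0.012, tau=20.0,
                w_max=1.0, prune_below=0.02):
    """Update one weight given dt_spike = t_post - t_pre in ms.

    Pre-before-post potentiates, post-before-pre depresses; weights that
    fall below prune_below are pruned (set to zero).
    """
    if dt_spike > 0:
        w += a_plus * np.exp(-dt_spike / tau)
    else:
        w -= a_minus * np.exp(dt_spike / tau)
    w = min(max(w, 0.0), w_max)
    return 0.0 if w < prune_below else w

w = 0.5
for dt_spike in (+5.0, +3.0, -8.0, +10.0, -2.0):
    w = stdp_update(w, dt_spike)
    print(f"dt = {dt_spike:+.0f} ms -> w = {w:.3f}")
```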

    Signal Propagation in Feedforward Neuronal Networks with Unreliable Synapses

    In this paper, we systematically investigate both synfire propagation and firing-rate propagation in a feedforward neuronal network coupled in an all-to-all fashion. In contrast to most earlier work, in which only reliable synaptic connections are considered, here we mainly examine the effects of unreliable synapses on both types of neural activity propagation. We first study networks composed of purely excitatory neurons. Our results show that both the successful transmission probability and the excitatory synaptic strength largely influence the propagation of these two types of neural activity, and that suitable tuning of these synaptic parameters allows the network to support stable signal propagation. It is also found that noise has significant but different impacts on the two types of propagation: additive Gaussian white noise tends to reduce the precision of the synfire activity, whereas noise of appropriate intensity can enhance the performance of firing-rate propagation. Further simulations indicate that the propagation dynamics of the network is not simply determined by the average amount of neurotransmitter each neuron receives at a given instant, but is also largely influenced by the stochastic nature of neurotransmitter release. Second, we compare our results with those obtained in corresponding feedforward neuronal networks connected with reliable synapses but coupled randomly, and confirm that the two feedforward network models behave differently. Finally, we study signal propagation in feedforward neuronal networks consisting of both excitatory and inhibitory neurons, and demonstrate that inhibition also plays an important role in signal propagation in these networks.
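
    The unreliable-synapse ingredient can be caricatured as Bernoulli transmitter release, as in the editorial sketch below (a generic illustration, not the authors' simulation code; all numbers are assumed). Scaling the weight by 1/p keeps the mean drive identical across release probabilities, so any difference a downstream neuron sees comes purely from the stochasticity of release, which is the abstract's point that propagation is not fixed by the average amount of neurotransmitter alone.

```python
# Editorial sketch: feedforward drive through synapses that release
# transmitter with probability p (Bernoulli release).
import numpy as np

rng = np.random.default_rng(3)

def unreliable_drive(n_pre_spiking, p_release, weight):
    """Summed input from n_pre_spiking active afferents with unreliable synapses."""
    successes = rng.binomial(n_pre_spiking, p_release)   # stochastic release events
    return weight * successes

for p in (1.0, 0.5):
    # Weight scaled by 1/p so the mean drive matches; only the fluctuations differ.
    drives = [unreliable_drive(100, p, 0.1 / p) for _ in range(1000)]
    print(f"p = {p:.1f}: mean drive = {np.mean(drives):.2f}, std = {np.std(drives):.2f}")
```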

    Mechanisms explaining transitions between tonic and phasic firing in neuronal populations as predicted by a low dimensional firing rate model

    Several firing patterns experimentally observed in neural populations have been successfully correlated with animal behavior. Population bursting, here regarded as a period of high firing rate followed by a period of quiescence, is typically observed in groups of neurons during behavior. Biophysical membrane-potential models of single-cell bursting involve at least three equations, and extending such models to study the collective behavior of neural populations involves thousands of equations and can be computationally very expensive. For this reason, low-dimensional population models that capture biophysical aspects of networks are needed. The present paper uses a firing-rate model to study mechanisms that trigger and stop transitions between tonic and phasic population firing. These mechanisms are captured by a two-dimensional system, which can potentially be extended to include interactions between different areas of the nervous system with a small number of equations. The typical behavior of midbrain dopaminergic neurons in the rodent is used as an example to illustrate and interpret our results. The model presented here can be used as a building block to study interactions between networks of neurons. This theoretical approach may help contextualize and understand the factors involved in regulating burst firing in populations and how it may modulate distinct aspects of behavior.
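
    To make the idea of a two-variable rate model concrete, here is an editorial sketch, not the paper's equations: a population rate r coupled to a slow adaptation variable a, integrated with Euler steps. With the invented parameter values below, strong drive gives steady tonic firing while weak drive gives alternating bursts and quiescence; the numbers are purely illustrative.

```python
# Editorial sketch: two-dimensional firing-rate system (rate + slow adaptation)
# switching between tonic and phasic (bursting) behaviour as the drive I varies.
import numpy as np

def simulate(I, T=2000.0, dt=0.5, tau_r=10.0, tau_a=200.0, g=1.5, k=1.0, r_max=1.0):
    r, a, rs = 0.0, 0.0, []
    for _ in range(int(T / dt)):
        drive = I + g * r - k * a                            # recurrent excitation minus adaptation
        r += dt / tau_r * (-r + np.clip(drive, 0.0, r_max))  # saturating rate dynamics
        a += dt / tau_a * (-a + r)                           # slow activity-dependent adaptation
        rs.append(r)
    return np.array(rs)

for I in (0.2, 1.0):
    tail = simulate(I)[2000:]                                # discard the transient
    print(f"I = {I}: rate stays within [{tail.min():.2f}, {tail.max():.2f}]")
```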

    Balancing Feed-Forward Excitation and Inhibition via Hebbian Inhibitory Synaptic Plasticity

    It has been suggested that excitatory and inhibitory inputs to cortical cells are balanced, and that this balance is important for the highly irregular firing observed in the cortex. There are two hypotheses as to the origin of this balance. The first assumes that it results from a stable solution of the recurrent neuronal dynamics; this model can account for a balance of steady-state excitation and inhibition without fine tuning of parameters, but not for transient inputs. The second hypothesis suggests that the feed-forward excitatory and inhibitory inputs to a postsynaptic cell are already balanced, and it thus does account for the balance of transient inputs. However, it remains unclear what mechanism underlies the fine tuning required to balance feed-forward excitatory and inhibitory inputs. Here we investigated whether inhibitory synaptic plasticity is responsible for the balance of transient feed-forward excitation and inhibition. We address this issue in the framework of a model characterizing the stochastic dynamics of temporally anti-symmetric Hebbian spike-timing-dependent plasticity of feed-forward excitatory and inhibitory synaptic inputs to a single postsynaptic cell. Our analysis shows that inhibitory Hebbian plasticity generates ‘negative feedback’ that balances excitation and inhibition, in contrast to the ‘positive feedback’ of excitatory Hebbian synaptic plasticity. As a result, this balance may increase the sensitivity of the learning dynamics to the correlation structure of the excitatory inputs.
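
    The negative-feedback character of inhibitory Hebbian plasticity can be seen in a deliberately crude rate-based caricature (an editorial sketch, not the paper's stochastic spike-based analysis; every quantity is assumed): the postsynaptic rate grows with excitation minus inhibition, and a Hebbian update strengthens the inhibitory weight in proportion to presynaptic and postsynaptic activity, so excess excitation recruits exactly the inhibition that cancels it.

```python
# Editorial sketch: Hebbian growth of an inhibitory weight acts as negative
# feedback, pulling the postsynaptic rate down until inhibition tracks excitation.
g_exc = 2.0          # fixed excitatory drive
r_pre_inh = 1.0      # inhibitory presynaptic rate
w_inh = 0.0          # plastic inhibitory weight
eta = 0.05           # learning rate

for step in range(61):
    r_post = max(g_exc - w_inh * r_pre_inh, 0.0)   # threshold-linear postsynaptic rate
    w_inh += eta * r_pre_inh * r_post              # Hebbian update on the inhibitory synapse
    if step % 20 == 0:
        print(f"step {step:2d}: w_inh = {w_inh:.2f}, r_post = {r_post:.2f}")
# w_inh converges toward g_exc / r_pre_inh, i.e. inhibition balances excitation;
# an excitatory synapse under the same unbounded rule would instead keep growing
# (positive feedback), which is the contrast drawn in the abstract.
```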

    Dendritic Slow Dynamics Enables Localized Cortical Activity to Switch between Mobile and Immobile Modes with Noisy Background Input

    Mounting evidence suggests that a single neuron empowered by active dendritic dynamics has significant computational ability. This motivates us to study what functionality can be acquired by a network of such neurons. The present paper studies how such rich single-neuron dendritic dynamics affects the network dynamics, a question that has scarcely been studied specifically to date. We simulate neurons with active dendrites networked locally like cortical pyramidal neurons, and find that naturally arising localized activity, called a bump, can be in two distinct modes, mobile or immobile. The mode can be switched back and forth by transient input to the cortical network. Interestingly, this functionality arises only if each neuron is equipped with the observed slow dendritic dynamics and with in vivo-like noisy background input. If the bump activity is taken to indicate a point of attention in sensory areas, or a representation of memory in the storage areas of the cortex, this implies that the flexible mode switching could be of great use to the brain as an information-processing device. We derive these conclusions using a natural extension of the conventional field model, defined by combining two distinct fields, one representing the somatic population and the other the dendritic population. With this tool, we analyze the spatial distribution of the degree of after-spike adaptation and explain the presence of the two distinct modes and the switching between them. We also discuss the possible functional impact of this mode-switching ability.
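
    A generic two-field caricature of this mechanism is sketched below; it is an editorial illustration, not the authors' model. A ring of rate units with local excitation and broad inhibition supports a bump, and a second, slow field standing in for dendritic adaptation either leaves the bump pinned or pushes it along the ring depending on how strongly it feeds back; all parameters are assumed, and noisy input is included in the spirit of the in vivo-like background described above.

```python
# Editorial sketch: ring attractor (somatic field) plus a slow second field
# playing the role of dendritic adaptation; with weak slow feedback the bump
# tends to stay pinned, with strong slow feedback it tends to travel.
import numpy as np

rng = np.random.default_rng(4)
N, dt, tau_s, tau_d = 128, 0.5, 10.0, 200.0
x = np.arange(N)
dist = np.minimum(np.abs(x[:, None] - x[None, :]), N - np.abs(x[:, None] - x[None, :]))
W = 1.2 * np.exp(-(dist / 6.0) ** 2) - 0.3       # local excitation, broad inhibition

def run(g_slow, steps=4000):
    u_s = np.exp(-(dist[0] / 6.0) ** 2)          # somatic field: bump seeded at index 0
    u_d = np.zeros(N)                            # slow "dendritic" field
    pos, net = 0, 0
    for _ in range(steps):
        r = np.tanh(np.maximum(u_s, 0.0))        # saturating firing rate
        u_s += dt / tau_s * (-u_s + W @ r - g_slow * u_d) + 0.05 * np.sqrt(dt) * rng.standard_normal(N)
        u_d += dt / tau_d * (-u_d + r)
        new = int(np.argmax(u_s))
        net += (new - pos + N // 2) % N - N // 2  # signed circular displacement per step
        pos = new
    return net

for g_slow in (0.0, 3.0):
    print(f"g_slow = {g_slow}: net bump displacement = {run(g_slow)} grid points")
```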