3,706 research outputs found

    Snakes and ladders in an inhomogeneous neural field model

    Continuous neural field models with inhomogeneous synaptic connectivities are known to support traveling fronts as well as stable bumps of localized activity. We analyze stationary localized structures in a neural field model with periodic modulation of the synaptic connectivity kernel and find that they are arranged in a snakes-and-ladders bifurcation structure. In the case of Heaviside firing rates, we analytically construct symmetric and asymmetric states and hence derive closed-form expressions for the corresponding bifurcation diagrams. We show that the ideas proposed by Beck and co-workers to analyze snaking solutions to the Swift-Hohenberg equation remain valid for the neural field model, even though the corresponding spatial-dynamical formulation is non-autonomous. We investigate how the modulation amplitude affects the bifurcation structure and compare numerical calculations for steep sigmoidal firing rates with analytic predictions valid in the Heaviside limit.
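The Heaviside-limit setting the abstract refers to can be illustrated numerically. Below is a minimal sketch, not the paper's model: an Amari-type field u_t = -u + ∫ w0(x-y)(1 + A cos(ky)) H(u(y) - θ) dy with an assumed base kernel w0(d) = (1-|d|)e^{-|d|}, relaxed by forward Euler to a stationary bump pinned by the periodic modulation; all parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's model): Amari-type neural field with
# a periodically modulated synaptic kernel, relaxed to a stationary bump.
L_dom, N = 20.0, 512
x = np.linspace(-L_dom / 2, L_dom / 2, N, endpoint=False)
dx = x[1] - x[0]

def w0(d):
    # Assumed "Mexican-hat" base kernel: local excitation, lateral inhibition
    return (1.0 - np.abs(d)) * np.exp(-np.abs(d))

A, k = 0.3, 2.0 * np.pi / 4.0        # modulation amplitude and wavenumber
W = w0(x[:, None] - x[None, :]) * (1.0 + A * np.cos(k * x[None, :]))

theta = 0.3                          # Heaviside firing threshold
u = np.exp(-x**2)                    # localized initial condition
dt = 0.1
for _ in range(600):                 # u_t = -u + W @ H(u - theta)
    u += dt * (-u + dx * W @ (u > theta).astype(float))
```

Tracking how such bumps appear and disappear as the modulation amplitude or threshold is varied is the kind of computation behind the snakes-and-ladders diagrams.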

    Sleep-like slow oscillations improve visual classification through synaptic homeostasis and memory association in a thalamo-cortical model

    The occurrence of sleep passed through the evolutionary sieve and is widespread in animal species. Sleep is known to be beneficial to cognitive and mnemonic tasks, while chronic sleep deprivation is detrimental. Despite the importance of the phenomenon, a complete understanding of its functions and underlying mechanisms is still lacking. In this paper, we show interesting effects of deep-sleep-like slow oscillation activity on a simplified thalamo-cortical model which is trained to encode, retrieve and classify images of handwritten digits. During slow oscillations, spike-timing-dependent plasticity (STDP) produces a differential homeostatic process. It is characterized by both a specific unsupervised enhancement of connections among groups of neurons associated with instances of the same class (digit) and a simultaneous down-regulation of stronger synapses created by the training. This hierarchical organization of post-sleep internal representations favours higher performance in retrieval and classification tasks. The mechanism is based on the interaction between top-down cortico-thalamic predictions and bottom-up thalamo-cortical projections during deep-sleep-like slow oscillations. Indeed, when learned patterns are replayed during sleep, cortico-thalamo-cortical connections favour the activation of other neurons coding for similar thalamic inputs, promoting their association. Such a mechanism hints at possible applications to artificial learning systems. Comment: 11 pages, 5 figures; v5 is the final version published in Scientific Reports
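The STDP rule invoked above can be made concrete with the standard pair-based window. The window shape, amplitudes, and the slight dominance of depression (a crude stand-in for the net down-regulation described in the abstract) are illustrative assumptions, not the paper's thalamo-cortical implementation.

```python
import numpy as np

# Pair-based STDP window (illustrative assumption): pre-before-post
# pairings potentiate, post-before-pre pairings depress; a_minus
# slightly exceeds a_plus, so uncorrelated jitter yields net depression.
def stdp(delta_t, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Weight change for a post-minus-pre spike time difference (ms)."""
    if delta_t > 0:                           # causal pairing: potentiation
        return a_plus * np.exp(-delta_t / tau)
    return -a_minus * np.exp(delta_t / tau)   # anti-causal: depression
```

Under such a rule, replayed patterns that reliably drive causal pre-post pairings strengthen, while synapses hit by uncorrelated activity slowly weaken.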

    Dynamical Synapses Enhance Neural Information Processing: Gracefulness, Accuracy and Mobility

    Experimental data have revealed that neuronal connection efficacy exhibits two forms of short-term plasticity, namely short-term depression (STD) and short-term facilitation (STF). These have time constants residing between fast neural signaling and rapid learning, and may serve as substrates for neural systems manipulating temporal information on the relevant time scales. The present study investigates the impact of STD and STF on the dynamics of continuous attractor neural networks (CANNs) and their potential roles in neural information processing. We find that STD endows the network with slow-decaying plateau behaviors: a network initially stimulated to an active state decays back to a silent state very slowly, on the time scale of STD rather than on the time scale of neural signaling. This provides a mechanism for neural systems to hold sensory memory easily and shut off persistent activities gracefully. With STF, we find that the network can hold a memory trace of external inputs in the facilitated neuronal interactions, which provides a way to stabilize the network response to noisy inputs, leading to improved accuracy in population decoding. Furthermore, we find that STD increases the mobility of the network states. The increased mobility enhances the tracking performance of the network in response to time-varying stimuli, leading to anticipative neural responses. In general, we find that STD and STF tend to have opposite effects on network dynamics and complementary computational advantages, suggesting that the brain may employ a strategy of weighting them differentially depending on the computational purpose. Comment: 40 pages, 17 figures
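The slow-decaying plateau can be reproduced in a space-free caricature: one recurrent population with a Heaviside rate and a depressing synaptic resource p. The equations and all parameter values below are illustrative assumptions, not the paper's CANN model.

```python
# Space-free caricature of a recurrent population with short-term
# depression (STD); equations and parameters are illustrative
# assumptions, not the paper's CANN model.
dt, tau, tau_d = 0.001, 0.1, 5.0     # fast neural vs. slow STD timescale
J, beta, theta = 2.0, 0.8, 0.5       # recurrence, depression rate, threshold
u, p = 0.0, 1.0                      # activity, synaptic resource
off_time = None
for step in range(int(40.0 / dt)):
    t = step * dt
    I = 1.0 if t < 0.5 else 0.0      # brief external stimulus
    r = 1.0 if u > theta else 0.0    # Heaviside firing rate
    u += dt * (-u + J * p * r + I) / tau
    p += dt * ((1.0 - p) / tau_d - beta * p * r)
    if off_time is None and t > 0.5 and r == 0.0:
        off_time = t                 # moment the plateau collapses
```

The population stays active well beyond both the stimulus and the neural time constant tau, then shuts off once the resource p is sufficiently depleted: sensory memory held easily, persistent activity terminated gracefully.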

    Spatially structured oscillations in a two-dimensional excitatory neuronal network with synaptic depression

    We study the spatiotemporal dynamics of a two-dimensional excitatory neuronal network with synaptic depression. Coupling between populations of neurons is taken to be nonlocal, while depression is taken to be local and presynaptic. We show that the network supports a wide range of spatially structured oscillations, which are suggestive of phenomena seen in cortical slice experiments and in vivo. The particular form of the oscillations depends on initial conditions and the level of background noise. Given an initial, spatially localized stimulus, activity evolves to a spatially localized oscillating core that periodically emits target waves. Low levels of noise can spontaneously generate several pockets of oscillatory activity that interact via their target patterns. Periodic activity in space can also organize into spiral waves, provided that there is some source of rotational symmetry breaking due to external stimuli or noise. In the high gain limit, no oscillatory behavior exists, but a transient stimulus can lead to a single, outward propagating target wave.

    Short term synaptic depression improves information transfer in perceptual multistability

    Competitive neural networks are often used to model the dynamics of perceptual bistability. Switching between percepts can occur through fluctuations and/or a slow adaptive process. Here, we analyze switching statistics in competitive networks with short-term synaptic depression and noise. We start by analyzing a ring model that yields spatially structured solutions and complement this with a study of a space-free network whose populations are coupled with mutual inhibition. Dominance times arising from depression-driven switching can be approximated using a separation of timescales in both the ring and the space-free model. For purely noise-driven switching, we use energy arguments to show that dominance times depend exponentially on input strength. We also show that a combination of depression and noise generates realistic distributions of dominance times. Unimodal distributions of dominance times are more easily differentiated from one another using Bayesian sampling, suggesting that depression-induced switching transfers more information about stimuli than noise-driven switching does. Finally, we analyze a competitive network model of perceptual tristability, showing that depression generates a memory of previous percepts based on their ordering. Comment: 26 pages, 15 figures
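The energy argument for noise-driven switching can be caricatured by the usual double-well reduction: each well stands for a percept, and dominance durations are noise-driven escape times over the barrier. The potential, noise level, and thresholds below are illustrative assumptions, not the paper's competitive network.

```python
import numpy as np

# Double-well caricature of noise-driven perceptual switching:
# V(x) = x^4/4 - x^2/2, wells at x = -1 and x = +1 are the two
# percepts; switches are barrier crossings (Euler-Maruyama).
rng = np.random.default_rng(1)
dt, sigma, T = 0.01, 0.5, 2000.0
n = int(T / dt)
xi = sigma * np.sqrt(dt) * rng.standard_normal(n)  # pre-drawn noise
x, percept, switches = -1.0, -1, 0
for step in range(n):
    x += dt * (x - x**3) + xi[step]   # drift = -V'(x)
    if percept == -1 and x > 0.5:     # hysteresis avoids double counting
        percept, switches = 1, switches + 1
    elif percept == 1 and x < -0.5:
        percept, switches = -1, switches + 1
```

In such reductions the barrier height varies with input strength, and the mean escape time scales like exp(ΔV/D), which is the exponential dependence of dominance times on input strength that the abstract refers to.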

    Sisyphus Effect in Pulse Coupled Excitatory Neural Networks with Spike-Timing Dependent Plasticity

    The collective dynamics of excitatory pulse-coupled neural networks with spike-timing dependent plasticity (STDP) is studied. Depending on the model parameters, stationary states characterized by high or low synchronization can be observed. In particular, at the transition between these two regimes, persistent irregular low-frequency oscillations between strongly and weakly synchronized states are observable; these can be identified as infraslow oscillations with frequencies of 0.02-0.03 Hz. Their emergence can be explained in terms of the Sisyphus Effect, a mechanism caused by a continuous feedback between the evolution of the coherent population activity and that of the average synaptic weight. Due to this effect, the synaptic weights have oscillating equilibrium values, which prevents the neuronal population from relaxing into a stationary macroscopic state. Comment: 18 pages, 24 figures; submitted to Physical Review

    Phase Diagram of Spiking Neural Networks

    In computer simulations of spiking neural networks, it is often assumed that any two neurons of the network are connected with a probability of 2%, and that 20% of neurons are inhibitory and 80% excitatory. These common values are based on experiments, observations, and trial and error. Here, I take a different perspective, inspired by evolution: I systematically simulate many networks, each with a different set of parameters, and then try to figure out what makes the common values desirable. I stimulate networks with pulses and then measure their dynamic range, the dominant frequency of population activity, the total duration of activity, the maximum population rate, and the occurrence time of that maximum rate. The results are organized in phase diagrams, which give insight into the space of parameters: the excitatory-to-inhibitory ratio, the sparseness of connections, and the synaptic weights. These phase diagrams can be used to decide the parameters of a model. They show that networks configured according to the common values have a good dynamic range in response to an impulse, that this dynamic range is robust with respect to the synaptic weights, and that for some synaptic weights the networks oscillate at α or β frequencies even in the absence of external stimuli. Comment: oscillations are studied in this version
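The "common values" have a direct constructive reading. The sketch below wires a random network accordingly; the weight magnitudes are assumptions, and only the wiring is shown, not the spiking dynamics.

```python
import numpy as np

# Wiring with the "common values": 80% excitatory / 20% inhibitory
# neurons, 2% connection probability.  Weight magnitudes are assumed.
rng = np.random.default_rng(0)
N, p_conn, frac_exc = 1000, 0.02, 0.8
n_exc = int(frac_exc * N)
A = rng.random((N, N)) < p_conn          # adjacency: ~2% of pairs connected
np.fill_diagonal(A, False)               # no self-connections
exc = np.arange(N) < n_exc               # first 80% are excitatory
w = np.where(exc[:, None], 1.0, -4.0)    # row = presynaptic neuron's sign
W = A * w                                # signed weight matrix
```

Sweeping p_conn, frac_exc, and the weight magnitudes over such constructions, and simulating each network's response to a pulse, is the parameter-space exploration the phase diagrams summarize.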

    Moving bumps in theta neuron networks

    We consider large networks of theta neurons on a ring, synaptically coupled with an asymmetric kernel. Such networks support stable "bumps" of activity, which move along the ring when the coupling kernel is asymmetric. We investigate the effects of the kernel asymmetry on the existence, stability and speed of these moving bumps, using continuum equations formally describing infinite networks. Depending on the level of heterogeneity within the network, we find complex sequences of bifurcations as the amount of asymmetry is varied, in strong contrast to the behaviour of a classical neural field model. Comment: To appear in Chaos
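A single theta neuron, the unit these ring networks are built from, obeys dθ/dt = 1 - cos θ + (1 + cos θ)I and, for constant drive I > 0, fires periodically with frequency √I/π. The sketch below (drive, step size, and duration are illustrative) integrates one such neuron and counts spikes as crossings of θ = π.

```python
import math

# One theta neuron: dθ/dt = 1 - cos θ + (1 + cos θ) I.
# For constant I > 0 the firing frequency is sqrt(I) / pi.
I, dt, T = 0.25, 1e-4, 50.0
th, spikes = -math.pi, 0                 # start just after a spike
for _ in range(int(T / dt)):
    prev = th
    th += dt * (1 - math.cos(th) + (1 + math.cos(th)) * I)
    if prev < math.pi <= th:             # crossing θ = π is a spike
        th -= 2 * math.pi                # wrap phase back to [-π, π)
        spikes += 1
```

In the network setting, each neuron's drive I is replaced by a synaptic input summed over the ring through the (asymmetric) coupling kernel, which is what sets the bumps in motion.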