
    Sisyphus Effect in Pulse Coupled Excitatory Neural Networks with Spike-Timing Dependent Plasticity

    The collective dynamics of excitatory pulse-coupled neural networks with spike-timing dependent plasticity (STDP) are studied. Depending on the model parameters, stationary states characterized by high or low synchronization can be observed. In particular, at the transition between these two regimes, persistent irregular low-frequency oscillations between strongly and weakly synchronized states are observable; these can be identified as infraslow oscillations with frequencies of 0.02-0.03 Hz. Their emergence can be explained in terms of the Sisyphus Effect, a mechanism caused by a continuous feedback between the evolution of the coherent population activity and that of the average synaptic weight. Due to this effect, the synaptic weights have oscillating equilibrium values, which prevents the neuronal population from relaxing into a stationary macroscopic state. Comment: 18 pages, 24 figures, submitted to Physical Review
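    The pairwise STDP rule at the heart of this feedback can be sketched with the standard exponential window; the amplitudes and time constant below are illustrative assumptions, not values from the paper:

    ```python
    import numpy as np

    def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
        """Pair-based STDP window: dt = t_post - t_pre in ms.
        Pre-before-post pairings (dt > 0) potentiate the synapse;
        post-before-pre pairings (dt < 0) depress it."""
        return np.where(dt > 0,
                        a_plus * np.exp(-dt / tau),
                        -a_minus * np.exp(dt / tau))

    # A strongly synchronized population produces many near-coincident
    # pairings on both sides of dt = 0, so the net weight drift depends on
    # the balance of a_plus and a_minus -- the handle through which the
    # population activity feeds back onto the average synaptic weight.
    print(stdp_dw(10.0), stdp_dw(-10.0))
    ```

    With a_minus slightly larger than a_plus (as here), symmetric pairings produce a small net depression, which is one common way such a feedback loop avoids settling at a fixed point.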

    Temporal ordering of input modulates connectivity formation in a developmental neuronal network model of the cortex

    Preterm infant brain activity is discontinuous: bursts of activity recorded using EEG (electroencephalography), thought to be driven by subcortical regions, display scale-free properties and exhibit a complex temporal ordering known as long-range temporal correlations (LRTCs). During brain development, activity-dependent mechanisms are essential for synaptic connectivity formation, and abolishing burst activity in animal models leads to weak, disorganised synaptic connectivity. Moreover, synaptic pruning shares similar mechanisms with spike-timing dependent plasticity (STDP), suggesting that the timing of activity may play a critical role in connectivity formation. We investigated, in a computational model of leaky integrate-and-fire neurones, whether the temporal ordering of burst activity within an external driving input could modulate connectivity formation in the network. Connectivity evolved over the course of the simulations, from initial random connectivity, using an approach analogous to STDP. Small-world connectivity and hub neurones, characteristic properties of mature brain networks, emerged in the network structure. Notably, driving the network with an external input which exhibited LRTCs in the temporal ordering of burst activity facilitated the emergence of these network properties, increasing the speed with which they emerged compared with the same input with the bursts randomly ordered in time. Moreover, the emergence of small-world properties was dependent on the strength of the LRTCs. These results suggest that the temporal ordering of burst activity could play an important role in synaptic connectivity formation and the emergence of small-world topology in the developing brain.
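    LRTCs in a burst sequence are typically quantified with detrended fluctuation analysis (DFA), where a scaling exponent above 0.5 indicates long-range correlations and shuffling the sequence destroys them. A minimal sketch, with illustrative window sizes and signal length (not the paper's analysis pipeline):

    ```python
    import numpy as np

    def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
        """Detrended fluctuation analysis (DFA) of a 1-D signal.
        alpha ~ 0.5 for uncorrelated noise; alpha > 0.5 indicates LRTCs."""
        y = np.cumsum(x - np.mean(x))            # integrated profile
        flucts = []
        for s in scales:
            n = len(y) // s
            segments = y[:n * s].reshape(n, s)
            t = np.arange(s)
            # root-mean-square residual after linear detrending per segment
            resid = [seg - np.polyval(np.polyfit(t, seg, 1), t)
                     for seg in segments]
            flucts.append(np.sqrt(np.mean(np.square(resid))))
        # slope of log-fluctuation vs. log-scale is the DFA exponent
        alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return alpha

    rng = np.random.default_rng(0)
    white = rng.standard_normal(4096)   # no temporal structure
    print(dfa_exponent(white))          # expected near 0.5
    ```

    Applying the same estimator to the temporally ordered versus shuffled burst sequences would distinguish the two driving conditions described above.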

    Characterization and Compensation of Network-Level Anomalies in Mixed-Signal Neuromorphic Modeling Platforms

    Advancing the size and complexity of neural network models leads to an ever-increasing demand for computational resources for their simulation. Neuromorphic devices offer a number of advantages over conventional computing architectures, such as high emulation speed and low power consumption, but this usually comes at the price of reduced configurability and precision. In this article, we investigate the consequences of several such factors common to neuromorphic devices: limited hardware resources, limited parameter configurability, and parameter variations. Our final aim is to provide an array of methods for coping with such inevitable distortion mechanisms. As a platform for testing our proposed strategies, we use an executable system specification (ESS) of the BrainScaleS neuromorphic system, which has been designed as a universal emulation back-end for neuroscientific modeling. We address the most essential limitations of this device in detail and study their effects on three prototypical benchmark network models within a well-defined, systematic workflow. For each network model, we start by defining quantifiable functionality measures, by which we then assess the effects of typical hardware-specific distortion mechanisms, both in idealized software simulations and on the ESS. For those effects that cause unacceptable deviations from the original network dynamics, we suggest generic compensation mechanisms and demonstrate their effectiveness. Both the suggested workflow and the investigated compensation mechanisms are largely back-end independent and do not require additional hardware configurability beyond that required to emulate the benchmark networks in the first place. We hereby provide a generic methodological environment for configurable neuromorphic devices that are targeted at emulating large-scale, functional neural networks.
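    The kind of distortion and compensation involved can be sketched in miniature; the quantization depth, variation spread, and global gain-matching below are illustrative assumptions, not the BrainScaleS calibration procedure:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def to_hardware(w, bits=4, w_max=1.0, spread=0.2):
        """Emulate two generic hardware distortions: limited weight
        resolution (quantization to 2**bits - 1 levels) and
        multiplicative fixed-pattern parameter variation."""
        levels = 2 ** bits - 1
        q = np.round(np.clip(w, 0.0, w_max) / w_max * levels) / levels * w_max
        return q * rng.normal(1.0, spread, size=np.shape(w))

    target = rng.uniform(0.2, 0.8, size=1000)   # ideal model weights
    distorted = to_hardware(target)             # what the substrate realizes
    # crude compensation: a single global gain restoring the mean weight
    gain = np.mean(target) / np.mean(distorted)
    compensated = distorted * gain
    ```

    Real compensation mechanisms act on functionality measures of the network dynamics rather than on the weights directly, but the structure is the same: quantify the deviation, then adjust the accessible parameters to counteract it.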

    Experimental analysis and computational modeling of interburst intervals in spontaneous activity of cortical neuronal culture

    Rhythmic bursting is the most striking behavior of cultured cortical networks and may start in the second week after plating. In this study, we focus on the intervals between spontaneously occurring bursts, and compare experimentally recorded values with model simulations. In the models, we use standard neurons and synapses, with physiologically plausible parameters taken from the literature. All networks had a random recurrent architecture with sparsely connected neurons. The number of neurons varied between 500 and 5,000. We find that network models with homogeneous synaptic strengths produce asynchronous spiking or stable regular bursts. The latter, however, fall in a range not seen in recordings. By increasing the synaptic strength in a (randomly chosen) subset of neurons, our simulations show interburst intervals (IBIs) that agree better with in vitro experiments. In this regime, called weakly synchronized, the models produce irregular network bursts, which are initiated by neurons with relatively stronger synapses. In some noise-driven networks, a subthreshold, deterministic input is applied to neurons with strong synapses, to mimic pacemaker network drive. We show that models with such "intrinsically active neurons" (pacemaker-driven models) tend to generate IBIs that are determined by the frequency of the fastest pacemaker and do not resemble experimental data. Alternatively, noise-driven models yield realistic IBIs. Generally, we found that large-scale noise-driven neuronal network models required synaptic strengths with a bimodal distribution to reproduce the experimentally observed IBI range. Our results imply that results obtained from small network models cannot simply be extrapolated to models of more realistic size. Synaptic strengths in large-scale neuronal network simulations need readjustment to a bimodal distribution, whereas small networks do not require such a change.
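    The bimodal strength distribution found necessary here can be sketched as a simple two-component mixture; the fraction of strong synapses and the weight values below are illustrative, not fitted to the data:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def bimodal_weights(n_syn, frac_strong=0.1, w_weak=0.5,
                        w_strong=2.0, sd=0.1):
        """Two-component mixture of synaptic strengths: a weak majority
        plus a small strong subset (candidate burst-initiating neurons)."""
        strong = rng.random(n_syn) < frac_strong
        w = rng.normal(np.where(strong, w_strong, w_weak), sd)
        return np.clip(w, 0.0, None)   # strengths stay non-negative

    w = bimodal_weights(10_000)
    ```

    Assigning the strong component to a randomly chosen subset of neurons, as in the simulations above, yields the weakly synchronized regime with irregular network bursts initiated by that subset.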