Intrinsically-generated fluctuating activity in excitatory-inhibitory networks
Recurrent networks of non-linear units display a variety of dynamical regimes
depending on the structure of their synaptic connectivity. A particularly
remarkable phenomenon is the appearance of strongly fluctuating, chaotic
activity in networks of deterministic, but randomly connected rate units. How
this type of intrinsically generated fluctuation appears in more realistic
networks of spiking neurons has been a long-standing question. To ease the
comparison between rate and spiking networks, recent works investigated the
dynamical regimes of randomly connected rate networks with segregated
excitatory and inhibitory populations, and firing rates constrained to be
positive. These works derived general dynamical mean field (DMF) equations
describing the fluctuating dynamics, but solved these equations only in the
case of purely inhibitory networks. Using a simplified excitatory-inhibitory
architecture in which DMF equations are more easily tractable, here we show
that the presence of excitation qualitatively modifies the fluctuating activity
compared to purely inhibitory networks. In the presence of excitation,
intrinsically generated fluctuations induce a strong increase in mean firing
rates, a phenomenon that is much weaker in purely inhibitory networks.
Excitation moreover induces two distinct fluctuating regimes: for moderate
overall coupling, recurrent inhibition is sufficient to stabilize fluctuations;
for strong coupling, firing rates are stabilized solely by the upper bound
imposed on activity, even when inhibition is stronger than excitation. These
results extend to more general network architectures, and to rate networks
receiving noisy inputs mimicking spiking activity. Finally, we show that
signatures of the second dynamical regime appear in networks of
integrate-and-fire neurons.
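The type of dynamics studied above can be illustrated with a minimal sketch of a randomly connected excitatory-inhibitory rate network with positive, bounded firing rates. All parameter values, the uniform coupling statistics, and the threshold-linear transfer function are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 400                       # half excitatory, half inhibitory units
g_E, g_I = 4.0, 5.0           # illustrative couplings, inhibition-dominated
J = np.empty((N, N))
J[:, :N//2] = g_E * rng.random((N, N//2)) / np.sqrt(N)    # excitatory columns
J[:, N//2:] = -g_I * rng.random((N, N//2)) / np.sqrt(N)   # inhibitory columns

def phi(x):
    # positive, bounded transfer function: firing rates confined to [0, 1]
    return np.clip(x, 0.0, 1.0)

x = rng.standard_normal(N)    # random initial state
dt, T = 0.1, 3000
rates = np.empty((T, N))
for t in range(T):
    x += dt * (-x + J @ phi(x))   # rate dynamics dx/dt = -x + J phi(x)
    rates[t] = phi(x)
```

The upper bound in `phi` plays the role discussed in the abstract: at strong coupling it is what keeps the fluctuating rates finite even when inhibition dominates.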
Noise-induced behaviors in neural mean field dynamics
The collective behavior of cortical neurons is strongly affected by the
presence of noise at the level of individual cells. In order to study these
phenomena in large-scale assemblies of neurons, we consider networks of
firing-rate neurons with linear intrinsic dynamics and nonlinear coupling,
belonging to a few types of cell populations and receiving noisy currents.
Asymptotic equations as the number of neurons tends to infinity (mean field
equations) are rigorously derived based on a probabilistic approach. These
equations are implicit on the probability distribution of the solutions which
generally makes their direct analysis difficult. However, in our case, the
solutions are Gaussian, and their moments satisfy a closed system of nonlinear
ordinary differential equations (ODEs), which are much easier to study than the
original stochastic network equations, and the statistics of the empirical
process uniformly converge towards the solutions of these ODEs. Based on this
description, we analytically and numerically study the influence of noise on
the collective behaviors, and compare these asymptotic regimes to simulations
of the network. We observe that the mean field equations provide an accurate
description of the solutions of the network equations for network sizes as
small as a few hundred neurons. In particular, we observe that the level of
noise in the system qualitatively modifies its collective behavior, producing
for instance synchronized oscillations of the whole network, desynchronization
of oscillating regimes, and stabilization or destabilization of stationary
solutions. These results shed new light on the role of noise in shaping the
collective dynamics of neurons and give us clues for understanding similar
phenomena observed in biological networks.
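The moment-closure idea described above can be sketched for a single homogeneous population. This is a hypothetical one-population caricature with illustrative parameters, not the paper's multi-population system: since the mean-field law is Gaussian, the mean `m` and variance `v` obey closed ODEs, with the Gaussian expectation of the sigmoid computed by Gauss-Hermite quadrature:

```python
import numpy as np

# probabilists' Gauss-Hermite nodes/weights (weights sum to sqrt(2*pi))
nodes, weights = np.polynomial.hermite_e.hermegauss(40)

def S(x):
    return 1.0 / (1.0 + np.exp(-x))   # sigmoid coupling

def gaussian_mean_S(m, v):
    # E[S(x)] for x ~ N(m, v)
    return np.sum(weights * S(m + np.sqrt(max(v, 0.0)) * nodes)) / np.sqrt(2.0 * np.pi)

# closed moment ODEs: dm/dt = -m + J E[S(x)],  dv/dt = -2 v + sigma^2
J, sigma = 2.0, 0.5       # illustrative coupling and noise level
m, v = 0.0, 0.0
dt = 0.01
for _ in range(5000):
    m += dt * (-m + J * gaussian_mean_S(m, v))
    v += dt * (-2.0 * v + sigma**2)
# m, v now approximate the stationary mean and variance of the Gaussian law
```

Because the noise level `sigma` enters the mean equation only through the smoothed sigmoid, these ODEs make the noise dependence of the collective state explicit, which is what enables the bifurcation analyses described in the text.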
Mean-field equations for stochastic firing-rate neural fields with delays: Derivation and noise-induced transitions
In this manuscript we analyze the collective behavior of mean-field limits of
large-scale, spatially extended stochastic neuronal networks with delays.
Rigorously, the asymptotic regime of such systems is characterized by a very
intricate stochastic delayed integro-differential McKean-Vlasov equation that
remains impenetrable, leaving the stochastic collective dynamics of such
networks poorly understood. In order to study these macroscopic dynamics, we
analyze networks of firing-rate neurons, i.e. with linear intrinsic dynamics
and sigmoidal interactions. In that case, we prove that the solution of the
mean-field equation is Gaussian, hence characterized by its first two moments,
and that these two quantities satisfy a set of coupled delayed
integro-differential equations. These equations are similar to usual neural
field equations, and incorporate noise levels as a parameter, allowing analysis
of noise-induced transitions. We identify through bifurcation analysis several
qualitative transitions due to noise in the mean-field limit. In particular,
stabilization of spatially homogeneous solutions, synchronized oscillations,
bumps, chaotic dynamics, wave or bump splitting are exhibited and arise from
static or dynamic Turing-Hopf bifurcations. These surprising phenomena invite
further exploration of the role of noise in the nervous system.
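A scalar toy version of such a noise-induced transition can be sketched with a single delayed mean equation. This is a deliberately reduced caricature with hypothetical parameters, not the spatially extended model of the text: the noise level flattens the effective gain of the Gaussian-averaged sigmoid, which can quench the oscillations produced by delayed inhibition:

```python
import numpy as np

nodes, weights = np.polynomial.hermite_e.hermegauss(60)

def mean_S(m, v):
    # E[S(x)] for x ~ N(m, v): larger v smooths the sigmoid, lowering its slope
    vals = 1.0 / (1.0 + np.exp(-(m + np.sqrt(v) * nodes)))
    return np.sum(weights * vals) / np.sqrt(2.0 * np.pi)

def run(sigma, J=-20.0, tau=2.0, dt=0.01, T=80.0):
    # scalar delayed mean-field: dm/dt = -m + J E[S(m(t - tau))], v = sigma^2 / 2
    nsteps, d = int(T / dt), int(tau / dt)
    m = np.zeros(nsteps)
    v = sigma**2 / 2.0
    for t in range(1, nsteps):
        m_del = m[t - 1 - d] if t - 1 - d >= 0 else 0.0
        m[t] = m[t - 1] + dt * (-m[t - 1] + J * mean_S(m_del, v))
    return m

amp_low = np.ptp(run(sigma=0.1)[-1000:])    # weak noise: delay-induced oscillation
amp_high = np.ptp(run(sigma=10.0)[-1000:])  # strong noise: oscillation quenched
```

Here the transition is a delay-induced Hopf bifurcation crossed by tuning the noise level alone, the scalar analogue of the Turing-Hopf transitions identified in the abstract.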
Balanced neural architecture and the idling brain
A signature feature of cortical spike trains is their trial-to-trial variability. This variability is large in the spontaneous state and is reduced when cortex is driven by a stimulus or task. Models of recurrent cortical networks with unstructured, yet balanced, excitation and inhibition generate variability consistent with evoked conditions. However, these models produce spike trains which lack the long timescale fluctuations and large variability exhibited during spontaneous cortical dynamics. We propose that global network architectures which support a large number of stable states (attractor networks) allow balanced networks to capture key features of neural variability in both spontaneous and evoked conditions. We illustrate this using balanced spiking networks with clustered assembly, feedforward chain, and ring structures. By assuming that global network structure is related to stimulus preference, we show that signal correlations are related to the magnitude of correlations in the spontaneous state. Finally, we contrast the impact of stimulation on the trial-to-trial variability in attractor networks with that of strongly coupled spiking networks with chaotic firing rate instabilities, recently investigated by Ostojic (2014). We find that only attractor networks replicate an experimentally observed stimulus-induced quenching of trial-to-trial variability. In total, the comparison of the trial-variable dynamics of single neurons or neuron pairs during spontaneous and evoked activity can be a window into the global structure of balanced cortical networks. © 2014 Doiron and Litwin-Kumar
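The clustered-assembly architecture invoked above can be sketched in a few lines. Cluster count, sizes, and connection probabilities are hypothetical, and only the excitatory block is shown: the sole structural ingredient is that within-cluster connections are denser than between-cluster ones, which is what supports multiple stable attractor states:

```python
import numpy as np

rng = np.random.default_rng(1)
N_E, n_clusters = 400, 8
p_in, p_out = 0.5, 0.1     # within- vs between-cluster connection probabilities
labels = np.repeat(np.arange(n_clusters), N_E // n_clusters)

same = labels[:, None] == labels[None, :]       # True for same-cluster pairs
p = np.where(same, p_in, p_out)                 # pairwise connection probabilities
W = (rng.random((N_E, N_E)) < p).astype(float)  # binary adjacency matrix
np.fill_diagonal(W, 0.0)                        # no self-connections

within = W[same].mean()      # realized within-cluster density
between = W[~same].mean()    # realized between-cluster density
```

Slow trial-to-trial variability then arises from stochastic switching between clusters, and it is this switching that stimulation quenches in the attractor networks discussed above.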
Synchronously-pumped OPO coherent Ising machine: benchmarking and prospects
The coherent Ising machine (CIM) is a network of optical parametric oscillators (OPOs) that solves for the ground state of Ising problems through OPO bifurcation dynamics. Here, we present experimental results comparing the performance of the CIM to quantum annealers (QAs) on two classes of NP-hard optimization problems: ground state calculation of the Sherrington-Kirkpatrick (SK) model and MAX-CUT. While the two machines perform comparably on sparsely-connected problems such as cubic MAX-CUT, on problems with dense connectivity, the QA shows an exponential performance penalty relative to CIMs. We attribute this to the embedding overhead required to map dense problems onto the sparse hardware architecture of the QA, a problem that can be overcome in photonic architectures such as the CIM
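The bifurcation-dynamics principle behind the CIM can be caricatured with a classical toy model; the graph, pump schedule, and coupling strength are illustrative, and this is not the experimental OPO network. Soft spin amplitudes are pumped through a pitchfork bifurcation while antiferromagnetic Ising couplings steer the sign each amplitude settles into; the final signs give a candidate MAX-CUT partition:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
A = (rng.random((n, n)) < 0.2).astype(float)   # random sparse graph
A = np.triu(A, 1); A = A + A.T                 # symmetric adjacency, zero diagonal
J = -A                                         # antiferromagnetic couplings encode MAX-CUT

x = 0.01 * rng.standard_normal(n)              # soft spin amplitudes near zero
dt = 0.02
for step in range(4000):
    pump = -1.0 + 2.0 * step / 4000.0          # pump ramped through the bifurcation
    # amplitude dynamics: linear gain, saturating cubic loss, Ising coupling
    x += dt * (pump * x - x**3 + 0.5 * (J @ np.tanh(x)))

s = np.where(x >= 0, 1.0, -1.0)                # measured spin signs
cut = 0.25 * (A.sum() - s @ A @ s)             # number of edges cut by the partition
```

The cut formula uses cut = (1/2) Σ_{i<j} A_ij (1 - s_i s_j): each edge contributes exactly when its endpoints take opposite signs.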
Inferring network properties of cortical neurons with synaptic coupling and parameter dispersion
Computational models at different space-time scales allow us to understand the fundamental mechanisms that govern neural processes and to relate these processes uniquely to neuroscience data. In this work, we propose a novel neurocomputational unit (a mesoscopic model which tells us about the interaction between local cortical nodes in a large-scale neural mass model) of bursters that qualitatively captures the complex dynamics exhibited by a full network of parabolic bursting neurons. We observe that the temporal dynamics and fluctuation of the mean synaptic action term exhibit a high degree of correlation with the spike/burst activity of our population. With heterogeneity in the applied drive and mean synaptic coupling derived from fast excitatory synapse approximations, we observe long-term behavior in our population dynamics such as partial oscillations, incoherence, and synchrony. In order to understand the origin of multistability at the population level as a function of mean synaptic coupling and heterogeneity in the firing-rate threshold, we employ a simple generative model for parabolic bursting recently proposed by Ghosh et al. (2009). Further, we use a mean coupling formulated for fast spiking neurons for our analysis of the generic model. Stability analysis of this mean-field network allows us to identify all the relevant network states found in the detailed biophysical model. We derive analytically several boundary solutions, a result which holds for any number of spikes per burst. These findings illustrate the role of oscillations occurring at slow time scales (bursts) on the global behavior of the network. EC/FP7/269921/EU/Brain-inspired multiscale computation in neuromorphic hybrid systems/BrainScale
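Parabolic bursting of the kind modeled above can be sketched with the canonical theta neuron under a slow oscillatory drive. The drive waveform and all parameters are illustrative stand-ins for the slow process in the paper's model: spikes occur when the drive is positive and are silenced when it is negative, producing the burst envelope:

```python
import numpy as np

dt, T = 0.001, 60.0
theta = 0.0
spike_times = []
for k in range(int(T / dt)):
    t = k * dt
    I = -0.5 + 1.5 * np.sin(2.0 * np.pi * t / 20.0)   # slow drive sets the burst envelope
    # canonical theta-neuron dynamics: excitable for I < 0, spiking for I > 0
    theta += dt * ((1.0 - np.cos(theta)) + (1.0 + np.cos(theta)) * I)
    if theta > np.pi:           # phase crossing pi counts as one spike
        spike_times.append(t)
        theta -= 2.0 * np.pi
spike_times = np.array(spike_times)
```

Grouping many such units and averaging their synaptic output is the step that yields the mean synaptic action term whose fluctuations the abstract correlates with spike/burst activity.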
Noise-enhanced spatial-photonic Ising machine
Ising machines are novel computing devices for the energy minimization of Ising models. These combinatorial optimization problems are of paramount importance for science and technology, but remain difficult to tackle on a large scale by conventional electronics. Recently, various photonics-based Ising machines demonstrated fast computing of an Ising ground state by data processing through multiple temporal or spatial optical channels. Experimental noise acts as a detrimental effect in many of these devices. On the contrary, here we demonstrate that an optimal noise level enhances the performance of spatial-photonic Ising machines on frustrated spin problems. By controlling the error rate at the detection, we introduce a noisy-feedback mechanism in an Ising machine based on spatial light modulation. We investigate the device performance on systems with hundreds of individually addressable spins with all-to-all couplings and find an increased success probability at a specific noise level. The optimal noise amplitude depends on graph properties and size, thus indicating an additional tunable parameter helpful in exploring complex energy landscapes and in avoiding getting stuck in local minima. Our experimental results identify noise as a potentially valuable resource for optical computing. This concept, which also holds in different nanophotonic neural networks, may be crucial in developing novel hardware with optics-enabled parallel architectures for large-scale optimization.
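The noisy-feedback idea can be mimicked in a toy digital setting. This is a greedy single-spin descent with a controllable error rate standing in for detection noise; the couplings, sizes, and rates are illustrative and this is not the optical experiment:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 30
J = rng.choice([-1.0, 1.0], size=(n, n))
J = np.triu(J, 1); J = J + J.T      # all-to-all frustrated +/-1 couplings

def energy(s):
    return -0.5 * s @ J @ s          # Ising energy E = -1/2 sum_ij J_ij s_i s_j

def greedy(error_rate, sweeps=500):
    # sweep over spins, flipping any spin that lowers the energy; each
    # accept/reject decision is inverted with probability error_rate,
    # mimicking a controllable detection-noise level
    s = rng.choice([-1.0, 1.0], size=n)
    for _ in range(sweeps):
        for i in range(n):
            dE = 2.0 * s[i] * (J[i] @ s)   # energy change if spin i flips
            accept = dE < 0
            if rng.random() < error_rate:
                accept = not accept        # noisy-feedback error
            if accept:
                s[i] = -s[i]
    return s

s_clean = greedy(0.0)    # noiseless descent halts in a local minimum
```

With `error_rate = 0` the descent provably ends in a local minimum; a small nonzero rate lets the state escape shallow minima, which is the tunable resource the experiment exploits, with too much noise eventually destroying convergence.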
Combinatorial Optimization with Photonics-Inspired Clock Models
NP-hard combinatorial optimization problems are problems whose computational complexity is believed to grow faster than polynomially with the size of the problem. Thus, over the years there has been great interest in developing unconventional methods and algorithms for solving such problems. Here, inspired by the nonlinear optical process of q-photon down-conversion, in which a photon is converted into q degenerate lower-energy photons, we introduce a nonlinear dynamical model that builds on coupled single-variable phase oscillators and allows for efficiently approximating the ground state of the classical q-state planar Potts Hamiltonian. This reduces the exhaustive search in the large discrete solution space of a large class of combinatorial problems that are represented by the Potts Hamiltonian to solving a system of coupled dynamical equations. To reduce trapping in local minima, we introduce two different mechanisms: utilizing controlled chaotic dynamics and dynamically forming the cost function through adiabatic parameter tuning. The proposed algorithm is applied to graph q-partitioning problems on several complex graphs.
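A minimal sketch of the phase-oscillator idea follows; the graph and schedule are illustrative, and the sin(q·theta) locking term is a classical stand-in for the q-photon down-conversion nonlinearity. Gradient dynamics push neighboring phases apart while a slowly ramped locking term quantizes each phase onto one of q clock states, yielding a candidate q-partition:

```python
import numpy as np

rng = np.random.default_rng(4)
n, q = 12, 3                        # oscillators and Potts states
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1); A = A + A.T      # random graph to q-partition

theta = 2.0 * np.pi * rng.random(n) # random initial phases
dt, K = 0.05, 3.0
for step in range(6000):
    # antiferromagnetic XY-like force: neighboring phases repel each other
    repel = np.sum(A * np.sin(theta[:, None] - theta[None, :]), axis=1)
    # adiabatically ramped q-state locking term (down-conversion analogue)
    lock = K * min(1.0, step / 3000.0) * np.sin(q * theta)
    theta = (theta + dt * (repel - lock)) % (2.0 * np.pi)

states = np.rint(q * theta / (2.0 * np.pi)).astype(int) % q   # discrete Potts labels
```

The adiabatic ramp of `K` is the second of the two escape mechanisms named in the abstract: the clock potential is switched on slowly so the continuous phases settle before the discrete cost landscape fully forms.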