The Little-Hopfield model on a Random Graph
We study the Hopfield model on a random graph in scaling regimes where the
average number of connections per neuron is a finite number and where the spin
dynamics is governed by a synchronous execution of the microscopic update rule
(Little-Hopfield model). We solve this model within replica symmetry and, by
using bifurcation analysis, we prove that the spin-glass/paramagnetic and the
retrieval/paramagnetic transition lines of our phase diagram are identical to
those of sequential dynamics. The first-order retrieval/spin-glass transition
line follows by direct evaluation of our observables using population dynamics.
Within the accuracy of numerical precision and for sufficiently small values of
the connectivity parameter we find that this line coincides with the
corresponding sequential one. Comparison with simulation experiments shows
excellent agreement. Comment: 14 pages, 4 figures
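As a concrete illustration (not the paper's code), the synchronous Little-Hopfield rule updates all spins in parallel each step, in contrast to sequential single-spin dynamics. A minimal sketch with standard Hebbian couplings; network size, number of patterns, and noise level are arbitrary illustrative choices:

```python
import numpy as np

# Minimal Little-Hopfield sketch: Hebbian couplings, parallel updates.
rng = np.random.default_rng(0)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))   # P = 3 stored patterns
J = patterns.T @ patterns / N                 # Hebbian coupling matrix
np.fill_diagonal(J, 0.0)                      # no self-coupling

# Start from a corrupted copy of pattern 0 (10% of spins flipped).
s = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
s[flip] *= -1

# Little-Hopfield dynamics: every spin is updated at once each step,
# unlike sequential (one-spin-at-a-time) dynamics.
for _ in range(10):
    s = np.sign(J @ s)
    s[s == 0] = 1                             # break ties deterministically

overlap = float(s @ patterns[0]) / N          # retrieval order parameter
print(f"overlap with stored pattern: {overlap:.2f}")
```

At this low storage load the parallel dynamics quickly recovers the stored pattern, so the overlap approaches 1.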
Transition to chaos in random neuronal networks
Firing patterns in the central nervous system often exhibit strong temporal
irregularity and heterogeneity in their time averaged response properties.
Previous studies suggested that these properties are the outcome of intrinsic
chaotic dynamics. Indeed, simplified rate-based large neuronal networks with
random synaptic connections are known to exhibit a sharp transition from a
fixed point to chaotic dynamics when the synaptic gain is increased. However, the
existence of a similar transition in neuronal circuit models with more
realistic architectures and firing dynamics has not been established.
In this work we investigate the rate-based dynamics of neuronal circuits composed
of several subpopulations with random connectivity. Nonzero connections are
either positive (for excitatory neurons) or negative (for inhibitory ones), while
single-neuron output is strictly positive, in line with known constraints in
many biological systems. Using Dynamic Mean Field Theory, we find the phase
diagram depicting the regimes of stable fixed point, unstable dynamics and
chaotic rate fluctuations. We characterize the properties of systems near the
chaotic transition and show that dilute excitatory-inhibitory architectures
exhibit the same onset to chaos as a network with Gaussian connectivity.
Interestingly, the critical properties near transition depend on the shape of
the single-neuron input-output transfer function near firing threshold.
Finally, we investigate network models with spiking dynamics. When synaptic
time constants are slow relative to the mean inverse firing rates, the network
undergoes a sharp transition from fast spiking fluctuations and static firing
rates to a state with slow chaotic rate fluctuations. When the synaptic time
constants are finite, the transition becomes smooth and obeys scaling
properties, similar to crossover phenomena in statistical mechanics. Comment: 28 pages, 12 figures, 5 appendices
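The sharp fixed-point-to-chaos transition in Gaussian random rate networks mentioned above can be demonstrated in a few lines. A minimal sketch (not the paper's model) of the classic dynamics dx/dt = -x + g·J·tanh(x) with couplings of variance 1/N; the network size, integration step, and gain values are illustrative:

```python
import numpy as np

# Random rate network: below g = 1 activity decays to the zero fixed
# point; above g = 1 the network sustains chaotic fluctuations.
rng = np.random.default_rng(1)
N = 500
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # variance 1/N

def run(g, steps=2000, dt=0.1):
    x = rng.normal(0, 1, N)
    for _ in range(steps):                 # forward-Euler integration
        x += dt * (-x + g * J @ np.tanh(x))
    return np.sqrt(np.mean(x ** 2))        # RMS activity at the end

below = run(0.5)   # subcritical gain: activity decays away
above = run(1.5)   # supercritical gain: self-sustained fluctuations
print(f"g=0.5: rms={below:.3g}   g=1.5: rms={above:.3g}")
```

The contrast between the two runs is the signature of the transition: essentially zero residual activity below the critical gain, order-one fluctuating activity above it.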
Chaos in neural networks with a nonmonotonic transfer function
Time evolution of diluted neural networks with a nonmonotonic transfer
function is analytically described by flow equations for macroscopic variables.
The macroscopic dynamics shows a rich variety of behaviours: fixed-point,
periodicity and chaos. We examine in detail the structure of the strange
attractor and in particular we study the main features of the stable and
unstable manifolds, the hyperbolicity of the attractor and the existence of
homoclinic intersections. We also discuss the problem of the robustness of the
chaos and we prove that in the present model chaotic behaviour is fragile
(chaotic regions are densely intercalated with periodicity windows), according
to a recently discussed conjecture. Finally we perform an analysis of the
microscopic behaviour and in particular we examine the occurrence of damage
spreading by studying the time evolution of two almost identical initial
configurations. We show that for any choice of the parameters the two initial
states remain microscopically distinct. Comment: 12 pages, 11 figures. Accepted
for publication in Physical Review E. Originally submitted to the neuro-sys
archive, which was never publicly announced (was 9905001
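The damage-spreading analysis described above can be sketched directly: evolve two copies of the same network from almost identical initial states and track their Hamming distance. The diluted architecture, the particular nonmonotonic transfer function F, and all parameters below are illustrative stand-ins, not the paper's:

```python
import numpy as np

# Damage spreading in a diluted network with a nonmonotonic transfer rule.
rng = np.random.default_rng(2)
N, K, theta = 400, 20, 0.6            # neurons, in-degree, reversal threshold

# Each neuron receives input from K randomly chosen presynaptic neurons.
idx = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
w = rng.normal(0, 1 / np.sqrt(K), size=(N, K))

def F(h):
    # Nonmonotonic: follows sign(h) for weak fields, reverses for strong ones.
    return np.where(np.abs(h) <= theta, np.sign(h), -np.sign(h))

def step(s):
    h = np.sum(w * s[idx], axis=1)    # local fields from the K inputs
    out = F(h)
    return np.where(out == 0, 1, out).astype(int)

s1 = rng.choice([-1, 1], size=N)
s2 = s1.copy()
s2[0] *= -1                           # single-spin "damage"

for _ in range(100):
    s1, s2 = step(s1), step(s2)

distance = np.mean(s1 != s2)          # fraction of differing spins
print(f"Hamming distance after 100 steps: {distance:.2f}")
```

A nonzero saturated distance indicates that the initial microscopic damage has spread through the network rather than healed.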
Stochastic mean field formulation of the dynamics of diluted neural networks
We consider pulse-coupled Leaky Integrate-and-Fire neural networks with
randomly distributed synaptic couplings. This random dilution induces
fluctuations in the evolution of the macroscopic variables and deterministic
chaos at the microscopic level. Our main aim is to mimic the effect of the
dilution as a noise source acting on the dynamics of a globally coupled
non-chaotic system. Indeed, the evolution of a diluted neural network can be
well approximated by that of a fully pulse-coupled network, where each neuron is
driven by a mean synaptic current plus additive noise. These terms represent the
average and the fluctuations of the synaptic currents acting on the single
neurons in the diluted system. The main microscopic and macroscopic dynamical
features can be retrieved with this stochastic approximation. Furthermore, the
microscopic stability of the diluted network can also be reproduced, as
demonstrated from the almost coincidence of the measured Lyapunov exponents in
the deterministic and stochastic cases for an ample range of system sizes. Our
results strongly suggest that the fluctuations in the synaptic currents are
responsible for the emergence of chaos in this class of pulse coupled networks. Comment: 12 pages, 4 figures
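The stochastic approximation described above, in its simplest single-neuron form, is a leaky integrate-and-fire unit driven by a constant mean synaptic current plus additive Gaussian noise standing in for the dilution-induced fluctuations. A minimal sketch; all parameter values (tau, mu, sigma, thresholds) are illustrative, not taken from the paper:

```python
import numpy as np

# LIF neuron driven by mean current mu plus white noise of strength sigma.
rng = np.random.default_rng(3)
dt, T = 1e-4, 20.0                   # time step and total simulated time (s)
tau, v_th, v_reset = 0.02, 1.0, 0.0  # membrane time constant, threshold, reset
mu, sigma = 1.2, 0.5                 # suprathreshold mean drive, noise strength

v, spikes = 0.0, 0
for _ in range(int(T / dt)):
    # Euler-Maruyama step of tau*dv/dt = -v + mu + sigma*sqrt(tau)*xi(t)
    v += (dt / tau) * (mu - v) + sigma * np.sqrt(dt / tau) * rng.normal()
    if v >= v_th:                    # fire-and-reset rule
        v, spikes = v_reset, spikes + 1

rate = spikes / T
print(f"mean firing rate: {rate:.1f} spikes/s")
```

In the paper's construction, mu and sigma would be matched to the average and the fluctuations of the synaptic currents measured in the diluted network.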
Collective oscillations in disordered neural networks
We investigate the onset of collective oscillations in a network of
pulse-coupled leaky-integrate-and-fire neurons in the presence of quenched and
annealed disorder. We find that the disorder induces a weak form of chaos that
is analogous to that arising in the Kuramoto model for a finite number N of
oscillators [O.V. Popovych et al., Phys. Rev. E 71, 065201(R) (2005)]. In fact,
the maximum Lyapunov exponent turns out to scale to zero for N going to
infinity, with an exponent that is different for the two types of disorder. In
the thermodynamic limit, the random-network dynamics reduces to that of a fully
homogeneous system with a suitably scaled coupling strength. Moreover, we show
that the Lyapunov spectrum of the periodic collective state scales to zero
as 1/N^2, analogously to the scaling found for the `splay state'. Comment: 8.5 pages, 12 figures, submitted to Physical Review
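The maximum Lyapunov exponent invoked above is typically estimated numerically by tracking the divergence of two nearby trajectories with periodic renormalization (Benettin's method). A sketch for a finite-N Kuramoto model, the comparison system cited in the abstract; frequencies, coupling, and integration settings are illustrative:

```python
import numpy as np

# Benettin-style estimate of the maximum Lyapunov exponent, finite-N Kuramoto.
rng = np.random.default_rng(4)
N, K, dt = 20, 1.5, 0.01

omega = rng.normal(0, 1, N)          # natural frequencies

def f(theta):
    # Kuramoto velocity field: omega_i + (K/N) * sum_j sin(theta_j - theta_i)
    return omega + (K / N) * np.sum(np.sin(theta[None, :] - theta[:, None]), axis=1)

theta = rng.uniform(0, 2 * np.pi, N)
pert = theta + 1e-8 * rng.normal(0, 1, N)
d0 = np.linalg.norm(pert - theta)

lam_sum, steps = 0.0, 5000
for _ in range(steps):
    theta += dt * f(theta)           # forward-Euler step, reference trajectory
    pert += dt * f(pert)             # same step, perturbed trajectory
    d = np.linalg.norm(pert - theta)
    lam_sum += np.log(d / d0)        # accumulate log expansion rate
    pert = theta + (d0 / d) * (pert - theta)   # renormalize the separation

lyap = lam_sum / (steps * dt)
print(f"estimated maximum Lyapunov exponent: {lyap:.3f}")
```

Repeating this estimate for increasing N would exhibit the scaling of the exponent toward zero discussed in the abstract.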
Slowly evolving geometry in recurrent neural networks I: extreme dilution regime
We study extremely diluted spin models of neural networks in which the
connectivity evolves in time, although adiabatically slowly compared to the
neurons, according to stochastic equations which on average aim to reduce
frustration. The (fast) neurons and (slow) connectivity variables equilibrate
separately, but at different temperatures. Our model is exactly solvable in
equilibrium. We obtain phase diagrams upon making the condensed ansatz (i.e.
recall of one pattern). These show that, as the connectivity temperature is
lowered, the volume of the retrieval phase diverges and the fraction of
mis-aligned spins is reduced. Still, one always retains a region in the
retrieval phase where recall states other than the one corresponding to the
`condensed' pattern are locally stable, so the associative memory character of
our model is preserved. Comment: 18 pages, 6 figures
An associative network with spatially organized connectivity
We investigate the properties of an autoassociative network of
threshold-linear units whose synaptic connectivity is spatially structured and
asymmetric. Since the methods of equilibrium statistical mechanics cannot be
applied to such a network due to the lack of a Hamiltonian, we approach the
problem through a signal-to-noise analysis, which we adapt to spatially
organized networks. We analyze the conditions for the appearance of stable,
spatially non-uniform profiles of activity with large overlaps with one of the
stored patterns. It is also shown, with simulations and analytic results, that
the storage capacity does not decrease much when the connectivity of the
network becomes short range. In addition, the method used here enables us to
calculate exactly the storage capacity of a randomly connected network with
arbitrary degree of dilution. Comment: 27 pages, 6 figures; Accepted for publication in JSTA
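The signal-to-noise viewpoint used above can be illustrated in one step: clamp the network to a stored pattern and compare the afferent fields on active versus quiescent units. A minimal sketch assuming a covariance ("Hebbian") learning rule over sparse binary patterns; the sparseness, sizes, and pattern count are illustrative, not the paper's:

```python
import numpy as np

# One-step signal-to-noise check for a threshold-linear associative network.
rng = np.random.default_rng(5)
N, P, a = 500, 5, 0.2                          # units, patterns, sparseness
xi = (rng.random((P, N)) < a).astype(float)    # sparse binary patterns

C = xi - a                                     # mean-subtracted patterns
J = C.T @ C / (N * a * (1 - a))                # covariance coupling matrix
np.fill_diagonal(J, 0.0)                       # no self-coupling

# Clamp the network to pattern 0: units active in the pattern should receive
# a clearly larger field ("signal") than the quiescent units, whose fields
# are pure cross-talk from the other stored patterns ("noise").
h = J @ (xi[0] - a)
signal = h[xi[0] == 1].mean()
crosstalk_mean = h[xi[0] == 0].mean()
crosstalk_sd = h[xi[0] == 0].std()
print(f"signal={signal:.2f}  cross-talk: {crosstalk_mean:.2f} +/- {crosstalk_sd:.2f}")
```

The gap between the signal and the cross-talk distribution is what lets a suitable firing threshold separate the two groups, which is the basic mechanism behind the storage-capacity estimates.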
Finite Connectivity Attractor Neural Networks
We study a family of diluted attractor neural networks with a finite average
number of (symmetric) connections per neuron. As in finite connectivity spin
glasses, their equilibrium properties are described by order parameter
functions, for which we derive an integral equation in replica symmetric (RS)
approximation. A bifurcation analysis of this equation reveals the locations of
the paramagnetic to recall and paramagnetic to spin-glass transition lines in
the phase diagram. The line separating the retrieval phase from the spin-glass
phase is calculated at zero temperature. All phase transitions are found to be
continuous. Comment: 17 pages, 4 figures
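Order-parameter functions in finite-connectivity models are usually evaluated numerically by population dynamics, the technique also mentioned in the first abstract. A rough sketch for a +/-J spin glass with Poisson connectivity, iterating the replica-symmetric cavity recursion h = sum_k atanh(tanh(beta*J_k)*tanh(beta*h_k)); the population size, connectivity c, and temperature are illustrative, and this generic spin-glass recursion stands in for the paper's attractor-network equations:

```python
import numpy as np

# Population dynamics for the RS cavity-field distribution at connectivity c.
rng = np.random.default_rng(6)
M, c, beta, sweeps = 2000, 3.0, 2.0, 30
fields = rng.normal(0, 1, M)          # population of cavity fields

for _ in range(sweeps):
    for i in range(M):
        k = rng.poisson(c)            # number of neighbours of a cavity spin
        h_nb = fields[rng.integers(0, M, size=k)]
        J_nb = rng.choice([-1.0, 1.0], size=k)
        # RS cavity recursion: each neighbour contributes a bounded message.
        fields[i] = np.sum(np.arctanh(np.tanh(beta * J_nb) * np.tanh(beta * h_nb)))

q = np.mean(np.tanh(beta * fields) ** 2)   # Edwards-Anderson order parameter
print(f"q_EA ~ {q:.3f}")
```

Observables such as the Edwards-Anderson parameter are then read off as averages over the converged population, which is how transition lines like the retrieval/spin-glass boundary are located numerically.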