Optimal Dynamical Range of Excitable Networks at Criticality
A recurrent idea in the study of complex systems is that optimal information
processing is to be found near bifurcation points or phase transitions.
However, this heuristic hypothesis has few (if any) concrete realizations where
a standard and biologically relevant quantity is optimized at criticality. Here
we give a clear example of such a phenomenon: a network of excitable elements
has its sensitivity and dynamic range maximized at the critical point of a
non-equilibrium phase transition. Our results are compatible with the essential
role of gap junctions in olfactory glomeruli and retinal ganglion cell
output. Synchronization and global oscillations also appear in the network
dynamics. We propose that the main functional role of electrical coupling is to
provide an enhancement of dynamic range, therefore allowing the coding of
information spanning several orders of magnitude. The mechanism could provide a
microscopic neural basis for psychophysical laws.
Comment: 2 figures, 6 pages
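The dynamic range invoked here has a standard operational definition in this
literature: the width, in decibels, of the stimulus window whose responses lie
between 10% and 90% of the way from baseline to saturation. A minimal sketch of
that computation, exercised on an illustrative saturating curve rather than on
the excitable-network model of the paper:

```python
import numpy as np

def dynamic_range(h, F, low=0.1, high=0.9):
    """Dynamic range (in dB) of a response curve F(h).

    Finds the stimuli whose responses lie a fraction `low` and `high` of the
    way from the baseline F0 = F(h -> 0) to the saturation value Fmax, and
    returns Delta = 10 * log10(h_high / h_low).
    """
    F0, Fmax = F[0], F[-1]
    # linearly interpolate the inverse response h(F) at the two target levels
    h_low = np.interp(F0 + low * (Fmax - F0), F, h)
    h_high = np.interp(F0 + high * (Fmax - F0), F, h)
    return 10.0 * np.log10(h_high / h_low)

# toy saturating response curve, used only to exercise the function
h = np.logspace(-4, 2, 200)
F = h / (1.0 + h)
print(f"dynamic range of the toy curve: {dynamic_range(h, F):.1f} dB")
```

At the critical point the network's response curve becomes a slowly saturating
power law, which stretches this stimulus window and is what maximizes the
dynamic range.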
Long Term and Short Term Effects of Perturbations in an Immune Network Model
In this paper we review the trajectory of a model proposed by Stauffer and
Weisbuch in 1992 to describe the evolution of the immune repertoire and present
new results about its dynamical behavior. Ten years later this model, which is
based on the ideas of the immune network as proposed by Jerne, has been able to
describe a multi-connected network and could be used to reproduce immunization
and aging experiments performed with mice. Besides its biological implications,
the physical aspects of the complex dynamics of this network are very
interesting per se. The immunization protocol is simulated by introducing
small and large perturbations (damages), and in this work we discuss the role
of both. In a very recent paper we studied the aging effects by using
auto-correlation functions, and the results obtained apparently indicated that
the small perturbations would be more important than the large ones, since
their cumulative effects may change the attractor of the dynamics. However,
our new results indicate that both types of perturbations are important: it is
the cooperative effect of the two that leads to the complex behavior which
makes it possible to reproduce the experimental results.
Comment: 15 pages, 5 figures
Can dynamical synapses produce true self-organized criticality?
Neuronal networks can present activity described by power-law distributed
avalanches presumed to be a signature of a critical state. Here we study a
random-neighbor network of excitable cellular automata coupled by dynamical
synapses. The model exhibits behavior very similar to that of conservative
self-organized criticality (SOC) models, even though its bulk dynamics is
dissipative. This
occurs because in the stationary regime the model is conservative on average,
and, in the thermodynamic limit, the probability distribution for the global
branching ratio converges to a delta-function centered at its critical value.
This non-conservative model therefore belongs to the same universality class as
conservative SOC models, in contrast with other dynamical-synapse models, which
present only self-organized quasi-criticality (SOqC). Analytical results show
very good agreement with simulations of the model and enable us to study the
emergence of SOC as a function of the parametric derivatives of the stationary
branching ratio.
Comment: 14 pages, 6 figures
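As a rough illustration of the ingredients named in this abstract, the sketch
below puts excitable automata on a random-neighbor graph, depletes a synaptic
weight each time it is used, lets all weights recover toward a baseline, and
tracks the branching ratio (the expected number of excitations delivered per
firing neuron). All update rules and parameter values are assumptions chosen
for readability, not the paper's model, and nothing guarantees that this
particular parameter set settles exactly at the critical value:

```python
import numpy as np

rng = np.random.default_rng(0)

# illustrative parameters (not taken from the paper)
N, K = 10000, 10            # neurons and outgoing random neighbors per neuron
n_states = 3                # 0 quiescent, 1 firing, 2 refractory
u, tau_rec = 0.1, 500.0     # depression fraction and synaptic recovery time
W0 = 1.2 / K                # baseline transmission probability (supercritical start)
h = 1e-3                    # external Poisson drive per neuron per time step

neighbors = rng.integers(0, N, size=(N, K))  # random-neighbor graph
W = np.full((N, K), W0)                      # per-synapse transmission probabilities
state = np.zeros(N, dtype=int)
sigmas = []

for t in range(5000):
    sources = np.flatnonzero(state == 1)
    if sources.size:
        # branching ratio: expected excitations delivered per firing neuron
        sigmas.append(K * W[sources].mean())

    # probabilistic propagation from firing neurons, plus external drive
    excite = rng.random(N) < h
    if sources.size:
        hits = rng.random((sources.size, K)) < W[sources]
        excite[neighbors[sources][hits]] = True

    # excitable-automaton update: quiescent fires if excited, then recovers
    new_state = np.where(state > 0, (state + 1) % n_states, 0)
    new_state[(state == 0) & excite] = 1

    # dynamical synapses: deplete what was used, relax everything toward W0
    W[sources] *= 1.0 - u
    W += (W0 - W) / tau_rec
    state = new_state

print("late-time branching ratio:", np.mean(sigmas[-1000:]))
```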
Diversity improves performance in excitable networks
As few real systems comprise indistinguishable units, diversity is a hallmark
of nature. Diversity among interacting units shapes properties of collective
behavior such as synchronization and information transmission. However, the
benefits of diversity for information processing at the edge of a phase
transition, ordinarily assumed to emerge from identical elements, remain
largely unexplored. Analyzing a general model of excitable systems with
heterogeneous excitability, we find that diversity can greatly enhance optimal
performance (by two orders of magnitude) when distinguishing incoming inputs.
Heterogeneous systems possess a subset of specialized elements whose capability
greatly exceeds that of the nonspecialized elements. Nonetheless, the behavior
of the whole network can outperform all subgroups. We also find that diversity
can yield multiple percolation, with performance optimized at tricriticality.
Our results are robust in specific and more realistic neuronal systems
comprising a combination of excitatory and inhibitory units, and indicate that
diversity-induced amplification can be harnessed by neuronal systems for
evaluating stimulus intensities.
Comment: 17 pages, 7 figures
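A toy way to picture the effect of heterogeneous excitability (not the model
analyzed in this work) is to give each unit its own activation threshold and
compare the population response curve, and its dynamic range, with that of an
identical-unit population; the saturating single-unit curve and the log-uniform
threshold spread below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
h = np.logspace(-3, 3, 400)                 # common input intensity

def population_response(thresholds):
    """Mean response of saturating units, each with its own threshold."""
    return np.mean(h[None, :] / (h[None, :] + thresholds[:, None]), axis=0)

def dynamic_range(F):
    F0, Fmax = F[0], F[-1]
    h10 = np.interp(F0 + 0.1 * (Fmax - F0), F, h)
    h90 = np.interp(F0 + 0.9 * (Fmax - F0), F, h)
    return 10 * np.log10(h90 / h10)

homogeneous = np.full(1000, 1.0)                      # identical units
heterogeneous = 10 ** rng.uniform(-2, 2, size=1000)   # thresholds over 4 decades

for name, thresholds in [("homogeneous", homogeneous),
                         ("heterogeneous", heterogeneous)]:
    F = population_response(thresholds)
    print(f"{name}: dynamic range ~ {dynamic_range(F):.1f} dB")
```

The heterogeneous population responds over a much wider band of inputs, which
is the intuition behind diversity-induced amplification of performance.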
Dynamic range of hypercubic stochastic excitable media
We study the response properties of d-dimensional hypercubic excitable
networks to a stochastic stimulus. Each site, modelled either by a three-state
stochastic susceptible-infected-recovered-susceptible system or by the
probabilistic Greenberg-Hastings cellular automaton, is continuously and
independently stimulated by an external Poisson rate h. The response function
(mean density of active sites rho versus h) is obtained via simulations (for
d=1, 2, 3, 4) and mean field approximations at the single-site and pair levels
(for all d). In any dimension, the dynamic range of the response function is
maximized precisely at the nonequilibrium phase transition to self-sustained
activity, in agreement with a recently proposed argument. Moreover, the
maximum dynamic range attained at a given dimension d is a decreasing function
of d.
Comment: 7 pages, 4 figures
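The probabilistic Greenberg-Hastings rule mentioned above is compact enough to
state in code: per unit time step, a quiescent site becomes active with
probability 1 - exp(-h) from the external Poisson drive or via each active
nearest neighbor, and then passes through refractory states before becoming
quiescent again. The sketch below does this on a two-dimensional square lattice
only; the per-neighbor transmission probability p, the lattice size, and the
number of refractory states are illustrative assumptions, not values from the
paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def gh_density(L=100, p=0.25, h=1e-2, n_states=3, T=2000, transient=500):
    """Stationary density of active sites for a probabilistic
    Greenberg-Hastings automaton on an L x L square lattice (d = 2 here, for
    brevity), driven by an external Poisson rate h with unit time step."""
    state = np.zeros((L, L), dtype=int)      # 0 quiescent, 1 active, 2 refractory
    densities = []
    for t in range(T):
        active = state == 1
        # number of active nearest neighbors, with periodic boundaries
        n_act = sum(np.roll(active, s, axis)
                    for s, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)])
        # quiescent site fires via the drive or via each active neighbor
        p_fire = 1.0 - np.exp(-h) * (1.0 - p) ** n_act
        fire = (state == 0) & (rng.random((L, L)) < p_fire)
        state = np.where(state > 0, (state + 1) % n_states, 0)
        state[fire] = 1
        if t >= transient:
            densities.append((state == 1).mean())
    return np.mean(densities)

# a few points of the response curve rho(h)
for h in [1e-4, 1e-3, 1e-2, 1e-1, 1.0]:
    print(f"h = {h:g}: rho ~ {gh_density(h=h):.4f}")
```

Sweeping h over several decades and feeding the resulting curve rho(h) into a
dynamic-range calculation like the one sketched earlier reproduces the kind of
response function analyzed in the paper.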
On the Aging Dynamics in an Immune Network Model
Recently we have used a cellular automata model which describes the dynamics
of a multi-connected network to reproduce the refractory behavior and aging
effects obtained in immunization experiments performed with mice when subjected
to multiple perturbations. In this paper we investigate the similarities
between the aging dynamics observed in this multi-connected network and the one
observed in glassy systems, by using the usual tools applied to analyze the
latter. An interesting feature we show here is that the model reproduces the
biological aspects observed in the experiments during the long transient time
it takes to reach the stationary state. Depending on the initial conditions,
and without any perturbation, the system may reach one of a family of
long-period attractors. The perturbations may drive the system from its natural
attractor to other attractors of the same family. We discuss the different
roles played by the small random perturbations (noise) and by the large
periodic perturbations (immunizations).
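The glassy-systems tools alluded to above are two-time quantities: one records
the site configuration at a waiting time t_w and correlates it with the
configuration at later times, so that aging appears as an explicit dependence
on t_w rather than on the time difference alone. A minimal sketch of that
measurement, applied here to placeholder random data rather than to the
immune-network trajectories:

```python
import numpy as np

def two_time_autocorrelation(traj, t_waits, t_max):
    """C(t_w + t, t_w) = <s_i(t_w + t) s_i(t_w)>_i / <s_i(t_w)^2>_i for a
    trajectory traj of shape (time, n_sites). Aging shows up as an explicit
    dependence of these curves on the waiting time t_w, not only on t."""
    curves = {}
    for t_w in t_waits:
        ref = traj[t_w]
        n_t = min(t_max, traj.shape[0] - t_w)
        C = np.array([np.mean(traj[t_w + t] * ref) for t in range(n_t)])
        curves[t_w] = C / np.mean(ref * ref)
    return curves

# placeholder data: random +/-1 site variables, NOT the immune-network dynamics
rng = np.random.default_rng(3)
traj = rng.choice([-1, 1], size=(5000, 200))
curves = two_time_autocorrelation(traj, t_waits=[100, 1000, 4000], t_max=500)
for t_w, C in curves.items():
    print(f"t_w = {t_w:4d}: C(t_w + 10, t_w) = {C[10]:+.3f}")
```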
Anticipated Synchronization in a Biologically Plausible Model of Neuronal Motifs
Two identical autonomous dynamical systems coupled in a master-slave
configuration can exhibit anticipated synchronization (AS) if the slave also
receives a delayed negative self-feedback. Recently, AS was shown to occur in
systems of simplified neuron models, requiring the coupling of the neuronal
membrane potential with its delayed value. However, this coupling has no
obvious biological correlate. Here we propose a canonical neuronal microcircuit
with standard chemical synapses, where the delayed inhibition is provided by an
interneuron. In this biologically plausible scenario, a smooth transition from
delayed synchronization (DS) to AS typically occurs when the inhibitory
synaptic conductance is increased. The phenomenon is shown to be robust when
model parameters are varied within a physiological range. Since the DS-AS
transition amounts to an inversion in the timing of the pre- and post-synaptic
spikes, our results could have a bearing on models of spike-timing-dependent
plasticity.
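The coupling scheme described in the first sentences can be written compactly
as a master x with dx/dt = f(x) and a slave y with dy/dt = f(y) + K[x(t) -
y(t - tau)], the last term providing the delayed negative self-feedback. Below
is a minimal Euler-integration sketch using FitzHugh-Nagumo units as f; the
parameter values (K, tau, the FHN constants) are ad hoc illustrative choices,
this is not the conductance-based microcircuit of the paper, and whether the
slave anticipates or lags depends on K and tau:

```python
import numpy as np

# FitzHugh-Nagumo parameters in the spiking regime (ad hoc illustrative values)
a, b, eps, I = 0.7, 0.8, 0.08, 0.5

def fhn(v, w):
    """FitzHugh-Nagumo vector field."""
    return v - v**3 / 3.0 - w + I, eps * (v + a - b * w)

dt, T = 0.01, 60000           # Euler step and number of steps
K, tau = 0.5, 2.0             # coupling strength and self-feedback delay
d = int(round(tau / dt))      # delay in integration steps

vm, wm = -1.0, 0.0            # master
vs, ws = -1.2, 0.1            # slave, started slightly apart
vs_hist = np.full(d, vs)      # ring buffer holding the slave voltage tau ago

vm_trace, vs_trace = [], []
for t in range(T):
    dvm, dwm = fhn(vm, wm)
    dvs, dws = fhn(vs, ws)
    coupling = K * (vm - vs_hist[t % d])   # master input minus delayed self-feedback
    vs_hist[t % d] = vs                    # store the current slave voltage
    vm, wm = vm + dt * dvm, wm + dt * dwm
    vs, ws = vs + dt * (dvs + coupling), ws + dt * dws
    vm_trace.append(vm)
    vs_trace.append(vs)

# crude diagnostic: which shift of the slave trace best matches the master?
vm_t, vs_t = np.array(vm_trace[-20000:]), np.array(vs_trace[-20000:])
lags = range(-400, 401, 10)
best = max(lags, key=lambda L: np.corrcoef(vm_t[400:-400],
                                           np.roll(vs_t, L)[400:-400])[0, 1])
print(f"best-matching lag: {best * dt:+.1f} time units "
      "(positive means the slave leads, i.e. anticipates, the master)")
```

In the paper's microcircuit the role of the -K y(t - tau) term is played by the
inhibitory interneuron, which is what makes the scheme biologically plausible.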