Coherent periodic activity in excitatory Erdős–Rényi neural networks: The role of network connectivity
We consider an excitatory random network of leaky integrate-and-fire pulse-coupled neurons. The neurons are connected as in a directed Erdős–Rényi graph, with an average connectivity that scales as a power law with the number of neurons in the network. The scaling is controlled by a parameter that allows one to pass from massively connected to sparse networks, and therefore to modify the topology of the system. At a macroscopic level we observe two distinct dynamical phases: an Asynchronous State (AS), corresponding to desynchronized dynamics of the neurons, and a Partial Synchronization (PS) regime, associated with a coherent periodic activity of the network. At low connectivity the system is in an AS, while PS emerges above a certain critical average connectivity. For sufficiently large networks, this critical connectivity saturates to a constant value, suggesting that a minimal average connectivity is sufficient to observe coherent activity in systems of any size, irrespective of the kind of network considered: sparse or massively connected. However, this value depends on the nature of the synapses: reliable or unreliable. For unreliable synapses, the critical value required to observe the onset of macroscopic behavior is noticeably smaller than for reliable synaptic transmission. Due to the disorder present in the system, for a finite number of neurons there are inhomogeneities in the neuronal behavior, inducing a weak form of chaos, which vanishes in the thermodynamic limit. In that limit the disordered systems exhibit regular (non-chaotic) dynamics, and their properties correspond to those of a homogeneous fully connected network for any value of the scaling parameter, with the peculiar exception of sparse networks, which remain intrinsically inhomogeneous at any system size.
Comment: 7 pages, 11 figures, submitted to Chaos
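A minimal sketch of the kind of model this abstract describes: a directed Erdős–Rényi network of excitatory pulse-coupled LIF neurons whose mean connectivity scales as a power of the network size. All parameter values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's values).
N = 200            # number of neurons
alpha = 0.5        # power-law exponent: mean connectivity K ~ N**alpha
K = N ** alpha     # average connectivity
a = 1.3            # suprathreshold drive of each LIF neuron
g = 0.4            # total excitatory synaptic strength, shared among K inputs
dt = 1e-3
steps = 5000

# Directed Erdős–Rényi adjacency: each edge present with probability K/N.
A = (rng.random((N, N)) < K / N).astype(float)
np.fill_diagonal(A, 0.0)

v = rng.random(N)            # membrane potentials in [0, 1)
spike_counts = np.zeros(N)

for _ in range(steps):
    v += dt * (a - v)        # leaky integration toward the drive a
    fired = v >= 1.0
    if fired.any():
        v[fired] = 0.0                 # reset fired neurons
        v += (g / K) * (A @ fired)     # excitatory pulse to their targets
        spike_counts += fired

rate = spike_counts.mean() / (steps * dt)
print(f"mean firing rate: {rate:.2f}")
```

Sweeping the exponent (here `alpha`) from 1 toward 0 moves the topology from massively connected toward sparse, which is the regime the abstract contrasts.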
A guided tour of asynchronous cellular automata
Research on asynchronous cellular automata has received a great amount of attention in recent years and has turned into a thriving field. We survey the recent research that has been carried out on this topic and present a broad state of the art in which both computing and modelling issues are represented.
Comment: To appear in the Journal of Cellular Automata
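One widely studied scheme in this literature is alpha-asynchronous updating, in which each cell applies its local rule with probability alpha at every step and keeps its state otherwise (alpha = 1 recovers the synchronous case). A minimal sketch for elementary cellular automata; the rule number and parameters are chosen arbitrarily for illustration:

```python
import random

def eca_rule(rule_number):
    """Lookup table for an elementary cellular automaton rule."""
    return {(l, c, r): (rule_number >> (l * 4 + c * 2 + r)) & 1
            for l in (0, 1) for c in (0, 1) for r in (0, 1)}

def alpha_async_step(cells, table, alpha, rng):
    """alpha-asynchronous update: each cell applies the rule with
    probability alpha, otherwise keeps its state (periodic boundary)."""
    n = len(cells)
    return [table[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            if rng.random() < alpha else cells[i]
            for i in range(n)]

rng = random.Random(42)
table = eca_rule(110)                        # rule 110 as an example
cells = [rng.randint(0, 1) for _ in range(64)]
for _ in range(100):
    cells = alpha_async_step(cells, table, 0.5, rng)
print("".join(map(str, cells)))
```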
Power-law statistics and universal scaling in the absence of criticality
Critical states are sometimes identified experimentally through power-law statistics or universal scaling functions. We show here that such features naturally emerge from networks in self-sustained irregular regimes away from criticality. In these regimes, the statistical physics theory of large interacting systems predicts a regime in which the nodes have independent and identically distributed dynamics. We thus investigated the statistics of a system in which units are replaced by independent stochastic surrogates, and found the same power-law statistics, indicating that these are not sufficient to establish criticality. We suggest instead that these are universal features of large-scale networks when considered macroscopically. These results urge caution in the interpretation of scaling laws found in nature.
Comment: in press in Phys. Rev.
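The surrogate construction can be sketched as follows: fully independent Bernoulli units (so no interactions and, by construction, no criticality), with avalanches defined in the standard way as runs of consecutive nonempty time bins. All parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Independent Bernoulli surrogate units: no interactions, hence no
# criticality by construction (illustrative parameters).
n_units, n_bins, p = 100, 50_000, 0.005
activity = rng.random((n_bins, n_units)) < p
counts = activity.sum(axis=1)              # population activity per bin

# Standard avalanche definition: a run of consecutive nonempty bins;
# the avalanche size is the total number of events in the run.
sizes, current = [], 0
for c in counts:
    if c > 0:
        current += int(c)
    elif current > 0:
        sizes.append(current)
        current = 0
if current > 0:
    sizes.append(current)

sizes = np.array(sizes)
print(f"{len(sizes)} avalanches, max size {sizes.max()}")
```

Fitting the resulting size histogram (e.g. on log-log axes) is what can produce apparently power-law statistics even though the units are independent, which is the abstract's cautionary point.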
Noise-induced synchronization and anti-resonance in excitable systems: Implications for information processing in Parkinson's Disease and Deep Brain Stimulation
We study the statistical physics of a surprising phenomenon arising in large
networks of excitable elements in response to noise: while at low noise,
solutions remain in the vicinity of the resting state and large-noise solutions
show asynchronous activity, the network displays orderly, perfectly
synchronized periodic responses at intermediate noise levels. We show that
this phenomenon is fundamentally stochastic and collective in nature. Indeed,
for noise and coupling within specific ranges, an asymmetry in the transition
rates between a resting and an excited regime progressively builds up, leading
to an increase in the fraction of excited neurons eventually triggering a chain
reaction associated with a macroscopic synchronized excursion and a collective
return to rest where this process starts afresh, thus yielding the observed
periodic synchronized oscillations. We further uncover a novel anti-resonance
phenomenon: noise-induced synchronized oscillations disappear when the system
is driven by periodic stimulation with frequency within a specific range. In
that anti-resonance regime, the system is optimal for measures of information
capacity. This observation provides a new hypothesis accounting for the
efficiency of Deep Brain Stimulation therapies in Parkinson's disease, a
neurodegenerative disease characterized by an increased synchronization of
brain motor circuits. We further discuss the universality of these phenomena in
the class of stochastic networks of excitable elements with confining coupling,
and illustrate this universality by analyzing various classical models of
neuronal networks. Altogether, these results uncover some universal mechanisms
supporting a regularizing impact of noise in excitable systems, reveal a novel
anti-resonance phenomenon in these systems, and propose a new hypothesis for
the efficiency of high-frequency stimulation in Parkinson's disease.
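The chain-reaction mechanism described above (an asymmetry in rest-to-excited transition rates that builds up with the excited fraction) can be caricatured with a deliberately crude two-state Markov model. This is not the paper's model; all rates and parameters below are invented.

```python
import math
import random

rng = random.Random(7)

# Two-state caricature of an excitable unit (illustrative parameters):
# rest -> excited at a noise- and coupling-dependent rate,
# excited -> rest at a fixed rate.
N, dt, steps = 500, 0.01, 3000
sigma = 0.8          # noise intensity
J = 2.0              # recurrent coupling strength
barrier = 2.0        # activation barrier at rest

state = [0] * N
trace = []
for _ in range(steps):
    f = sum(state) / N                       # fraction excited
    # Kramers-like escape rate, lowered by recurrent excitation:
    # the more units are excited, the easier it is to excite the rest.
    up = math.exp(-(barrier - J * f) / sigma)
    for i in range(N):
        if state[i] == 0 and rng.random() < up * dt:
            state[i] = 1
        elif state[i] == 1 and rng.random() < 1.0 * dt:
            state[i] = 0
    trace.append(sum(state) / N)

print(f"peak excited fraction: {max(trace):.2f}")
```

Scanning `sigma` in such a sketch is the natural way to probe the low-noise quiescent, intermediate-noise collective, and high-noise asynchronous regimes the abstract contrasts.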
Transition to chaos in random neuronal networks
Firing patterns in the central nervous system often exhibit strong temporal irregularity and heterogeneity in their time-averaged response properties. Previous studies suggested that these properties are the outcome of intrinsic chaotic dynamics. Indeed, simplified rate-based large neuronal networks with random synaptic connections are known to exhibit a sharp transition from fixed-point to chaotic dynamics when the synaptic gain is increased. However, the existence of a similar transition in neuronal circuit models with more realistic architectures and firing dynamics has not been established.
In this work we investigate the rate-based dynamics of neuronal circuits composed of several subpopulations with random connectivity. Nonzero connections are either positive (for excitatory neurons) or negative (for inhibitory ones), while single-neuron output is strictly positive, in line with known constraints in many biological systems. Using Dynamic Mean Field Theory, we find the phase diagram depicting the regimes of stable fixed point, unstable dynamics, and chaotic rate fluctuations. We characterize the properties of systems near the chaotic transition and show that dilute excitatory-inhibitory architectures exhibit the same onset to chaos as a network with Gaussian connectivity. Interestingly, the critical properties near the transition depend on the shape of the single-neuron input-output transfer function near the firing threshold. Finally, we investigate network models with spiking dynamics. When synaptic time constants are slow relative to the mean inverse firing rates, the network undergoes a sharp transition from fast spiking fluctuations and static firing rates to a state with slow chaotic rate fluctuations. When the synaptic time constants are finite, the transition becomes smooth and obeys scaling properties, similar to crossover phenomena in statistical mechanics.
Comment: 28 pages, 12 figures, 5 appendices
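The sharp transition in Gaussian rate networks referred to above is classically located by the spectral radius of the random connectivity matrix: for dx/dt = -x + J·tanh(x) with i.i.d. Gaussian entries of variance g²/N, the zero fixed point destabilizes when the radius crosses 1, i.e. at gain g = 1 (the circular law gives radius ≈ g). A minimal numerical check, using the standard Gaussian ensemble rather than the paper's excitatory-inhibitory one:

```python
import numpy as np

rng = np.random.default_rng(3)

# Random rate network connectivity: Gaussian J_ij with variance g^2 / N.
# The fixed point x = 0 of dx/dt = -x + J @ tanh(x) loses stability when
# the spectral radius of J exceeds 1, i.e. at synaptic gain g = 1.
N = 400

def spectral_radius(g, rng):
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    return np.abs(np.linalg.eigvals(J)).max()

r_low = spectral_radius(0.5, rng)   # subcritical gain: radius ~ 0.5
r_high = spectral_radius(1.5, rng)  # supercritical gain: radius ~ 1.5
print(f"g=0.5: radius {r_low:.2f}; g=1.5: radius {r_high:.2f}")
```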
Sequential and asynchronous processes driven by stochastic or quantum grammars and their application to genomics: a survey
We present the formalism of sequential and asynchronous processes defined in terms of random or quantum grammars and argue that these processes are relevant to genomics. To make the article accessible to non-mathematicians, we keep the mathematical exposition as elementary as possible, focusing on some general ideas behind the formalism and stating the implications of the known mathematical results. We close with a set of challenging open problems.
Comment: Presented at the European Congress on Mathematical and Theoretical Biology, Dresden, 18--22 July 200
Spatio-Temporal Patterns act as Computational Mechanisms governing Emergent Behavior in Robotic Swarms
Our goal is to control a robotic swarm without removing its swarm-like nature; in other words, we aim to intrinsically control a robotic swarm's emergent behavior. Past attempts at governing robotic swarms, or their self-coordinating emergent behavior, have proven ineffective, largely due to the swarm's inherent randomness (making it difficult to predict) and utter simplicity (swarms lack a leader, any kind of centralized control, long-range communication, global knowledge, and complex internal models, and operate on only a couple of basic, reactive rules). The main problem is that emergent phenomena themselves are not fully understood, despite being at the forefront of current research. Research into 1D and 2D cellular automata has uncovered a hidden computational layer which bridges the micro-macro gap (i.e., how individual behaviors at the micro-level influence global behaviors at the macro-level). We hypothesize that embedded computational mechanisms also lie at the heart of a robotic swarm's emergent behavior. To test this hypothesis, we simulated robotic swarms (represented as both particles and dynamic networks) and designed local rules to induce various types of intelligent, emergent behavior (as well as genetic algorithms to evolve robotic swarms with emergent behaviors). Finally, we analyzed these robotic swarms and confirmed our hypothesis: analyzing their developments and interactions over time revealed various forms of embedded spatiotemporal patterns which store, propagate, and process information in parallel across the swarm according to an internal, collision-based logic (solving the mystery of how simple robots are able to self-coordinate and allow global behaviors to emerge across the swarm).
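A canonical illustration of local reactive rules producing global order, in the spirit of the swarms described here, is the Vicsek alignment model; this is a textbook example, not the paper's simulator, and every parameter below is an assumption.

```python
import math
import random

rng = random.Random(5)

# Vicsek-style alignment: each agent steers toward the mean heading of
# its neighbours within radius r, plus angular noise (periodic box).
N, L, r, eta, v, steps = 100, 10.0, 1.5, 0.3, 0.1, 200

x = [rng.uniform(0, L) for _ in range(N)]
y = [rng.uniform(0, L) for _ in range(N)]
th = [rng.uniform(-math.pi, math.pi) for _ in range(N)]

for _ in range(steps):
    new_th = []
    for i in range(N):
        sx = sy = 0.0
        for j in range(N):
            dx = min(abs(x[i] - x[j]), L - abs(x[i] - x[j]))
            dy = min(abs(y[i] - y[j]), L - abs(y[i] - y[j]))
            if dx * dx + dy * dy < r * r:       # j is a neighbour of i
                sx += math.cos(th[j])
                sy += math.sin(th[j])
        new_th.append(math.atan2(sy, sx) + rng.uniform(-eta, eta))
    th = new_th
    x = [(x[i] + v * math.cos(th[i])) % L for i in range(N)]
    y = [(y[i] + v * math.sin(th[i])) % L for i in range(N)]

# Polar order parameter: 1 = fully aligned, ~0 = disordered.
order = math.hypot(sum(map(math.cos, th)), sum(map(math.sin, th))) / N
print(f"order parameter: {order:.2f}")
```

No agent knows the global heading, yet alignment can spread through overlapping neighbourhoods, which is the micro-macro bridging the abstract appeals to.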