Transmitting a signal by amplitude modulation in a chaotic network
We discuss the ability of a network with nonlinear relays and chaotic
dynamics to transmit signals, on the basis of a linear response theory
developed by Ruelle \cite{Ruelle} for dissipative systems. We show in
particular how the dynamics interfere with the graph topology to produce an
effective transmission network, whose topology depends on the signal and
cannot be directly read off the ``wired'' network. This leads one to reconsider
notions such as ``hubs''. Then, we show examples where, with a suitable choice
of the carrier frequency (resonance), one can transmit a signal from one node to
another by amplitude modulation, \textit{in spite of chaos}. We also give
an example where a signal, transmitted to every node via different paths, can
only be recovered by a couple of \textit{specific} nodes. This opens the
possibility of encoding data such that recovering the signal
requires knowledge of the carrier frequency \textit{and} can be performed
only at some specific node.
Comment: 19 pages, 13 figures, submitted (03-03-2005)
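The sketch below is a minimal numerical illustration of this idea, not the model of the paper: a generic chaotic network of tanh units (the random coupling matrix W, gain g, node indices src and dst, and all frequencies are assumptions made for illustration) receives a weak, slowly modulated carrier at one node, and the envelope is recovered at another node by lock-in demodulation at the carrier frequency.

```python
# Minimal sketch (generic chaotic coupled-map network, not the paper's exact
# model): inject a weakly modulated carrier at a source node and recover the
# envelope at a receiving node by lock-in demodulation.
import numpy as np

rng = np.random.default_rng(0)
N, T = 20, 40000
W = rng.normal(0.0, 1.5 / np.sqrt(N), size=(N, N))   # assumed random coupling
g, eps = 3.0, 1e-3                                   # gain (chaotic regime assumed), drive amplitude
w_c, w_m = 2 * np.pi * 0.11, 2 * np.pi * 0.002       # carrier and modulation frequencies (arbitrary)
src, dst = 0, 7                                      # emitting / receiving nodes (arbitrary)

x = rng.uniform(-1, 1, N)
out = np.zeros(T)
for t in range(T):
    envelope = 1.0 + 0.5 * np.cos(w_m * t)           # the "message" riding on the carrier
    drive = np.zeros(N)
    drive[src] = eps * envelope * np.cos(w_c * t)
    x = np.tanh(g * (W @ x) + drive)                 # chaotic nonlinear update
    out[t] = x[dst]

# Lock-in demodulation: shift the carrier down to DC, then smooth.
tt = np.arange(T)
mixed = out * np.exp(-1j * w_c * tt)
kernel = np.ones(500) / 500.0                        # crude low-pass filter
recovered = 2 * np.abs(np.convolve(mixed, kernel, mode="same"))
print("recovered envelope range:", recovered.min(), recovered.max())
```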
Entropy-based parametric estimation of spike train statistics
We consider the evolution of a network of neurons, focusing on the asymptotic
behavior of spike dynamics instead of membrane potential dynamics. In this
context, the spike response is sought not as a deterministic response but as a
conditional probability: "reading out the code" consists of inferring such a
probability. This probability is computed from empirical raster plots, using
the framework of thermodynamic formalism in ergodic theory. This gives us a
parametric statistical model where the probability has the form of a Gibbs
distribution. In this respect, the approach generalizes the seminal and
profound work of Schneidman and collaborators. A minimal presentation of the
formalism is reviewed here, and a general algorithmic estimation method is
proposed, yielding fast, convergent implementations. It is also made explicit how
several spike observables (entropy, rate, synchronizations, correlations) are
given in closed form by the parametric estimation. This paradigm not only
allows us to estimate the spike statistics, given a design choice, but also
to compare different models, thus answering comparative questions about the
neural code such as: "are correlations (or time synchrony, or a given set of
spike patterns, ...) significant with respect to rate coding only?" A numerical
validation of the method is proposed, and the perspectives regarding spike-train
code analysis are also discussed.
Comment: 37 pages, 8 figures, submitted
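As a purely illustrative companion (much simpler than the paper's spatio-temporal Gibbs framework), the sketch below fits a spatial Ising-like maximum-entropy model to a binary raster by matching empirical firing rates and pairwise correlations; the raster, network size, and learning rate are all assumptions.

```python
# Illustrative sketch: fit an Ising-like Gibbs model to a binary raster by
# matching empirical rates and pairwise correlations (Schneidman-style maximum
# entropy). The paper's framework is more general; this only shows the
# moment-matching idea on a fake raster.
import itertools
import numpy as np

rng = np.random.default_rng(1)
N, T = 5, 5000
raster = (rng.random((T, N)) < 0.2).astype(float)    # assumed data

# Empirical observables: rates <s_i> and pairwise products <s_i s_j>.
emp_rates = raster.mean(axis=0)
emp_pairs = (raster.T @ raster) / T

patterns = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)
h = np.zeros(N)                                      # fields
J = np.zeros((N, N))                                 # couplings (upper triangle used)

for step in range(2000):                             # gradient ascent on the log-likelihood
    energies = patterns @ h + np.einsum('ki,ij,kj->k', patterns, np.triu(J, 1), patterns)
    p = np.exp(energies - energies.max())
    p /= p.sum()
    model_rates = p @ patterns
    model_pairs = (patterns * p[:, None]).T @ patterns
    h += 0.1 * (emp_rates - model_rates)
    J += 0.1 * np.triu(emp_pairs - model_pairs, 1)

print("rate mismatch:", np.abs(emp_rates - model_rates).max())
```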
Self-Organized Criticality and Thermodynamic formalism
We introduce a dissipative version of Zhang's model of Self-Organized
Criticality, where a parameter allows one to tune the local energy dissipation. We
analyze the main dynamical features of the model and relate, in particular, the
Lyapunov spectrum to the transport properties in the stationary regime. We
develop a thermodynamic formalism in which we define a formal Gibbs measure,
partition function, and pressure characterizing the avalanche distributions. We
discuss the infinite-size limit in this setting. We show in particular that a
Lee-Yang phenomenon occurs in this model, only in the conservative case. This
suggests new connections to classical critical phenomena.
Comment: 35 pages, 15 figures, submitted
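A toy implementation of a dissipative Zhang-type sandpile conveys the role of the dissipation parameter; the threshold, input law, and lattice size below are assumptions, not the paper's values.

```python
# Toy sketch of a dissipative Zhang-type sandpile: continuous energies on a 2D
# lattice; a site above threshold topples and redistributes a fraction
# (1 - eps) of its energy to its neighbours, so eps tunes the local
# dissipation (eps = 0 is the conservative case). Energy crossing the boundary
# is lost.
import numpy as np

rng = np.random.default_rng(2)
L, E_c, eps = 20, 1.0, 0.05
E = np.zeros((L, L))
sizes = []

for _ in range(5000):
    i, j = rng.integers(0, L, 2)
    E[i, j] += rng.uniform(0.0, 0.5)                 # random energy input
    size = 0
    while (E >= E_c).any():                          # relax until no site is critical
        for x, y in zip(*np.where(E >= E_c)):
            share = (1.0 - eps) * E[x, y] / 4.0
            E[x, y] = 0.0
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < L and 0 <= ny < L:      # otherwise dissipated at the boundary
                    E[nx, ny] += share
            size += 1
    sizes.append(size)

print("mean avalanche size:", np.mean(sizes))
```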
Expectation-driven interaction: a model based on Luhmann's contingency approach
We introduce an agent-based model of interaction, drawing on the contingency
approach from Luhmann's theory of social systems. The agent interactions are
defined by the exchange of distinct messages. Message selection is based on the
history of the interaction and developed within the confines of the problem of
double contingency. We examine interaction strategies in the light of the
message-exchange description using analytical and computational methods.
Comment: 37 pages, 16 figures, to appear in Journal of Artificial Societies
and Social Simulation
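Purely as a toy illustration of message selection driven by interaction history (and not the model analyzed in the paper), the sketch below has two agents exchanging messages from a small alphabet; each keeps counts of how the other tended to respond to its own messages and samples its next message from those expectations plus exploration noise. Alphabet size, noise level, and the counting scheme are all assumptions.

```python
# Illustrative toy only: history-based message selection between two agents.
import numpy as np

rng = np.random.default_rng(3)
M, STEPS, NOISE = 4, 200, 0.1
counts = [np.ones((M, M)), np.ones((M, M))]          # expectation counts per agent (Laplace prior)
last = [rng.integers(M), rng.integers(M)]

for t in range(STEPS):
    speaker = t % 2
    hearer = 1 - speaker
    expect = counts[speaker][last[hearer]]           # how the other usually answered this message
    probs = (1 - NOISE) * expect / expect.sum() + NOISE / M
    msg = rng.choice(M, p=probs)
    counts[hearer][last[speaker], msg] += 1          # hearer updates its expectations
    last[speaker] = msg

print("agent 0 expectation matrix:\n", counts[0] / counts[0].sum(axis=1, keepdims=True))
```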
Stable resonances and signal propagation in a chaotic network of coupled units
We apply the linear response theory developed in \cite{Ruelle} to analyze how
a periodic signal of weak amplitude, superimposed upon a chaotic background, is
transmitted in a network of nonlinearly interacting units. We numerically
compute the complex susceptibility and show the existence of specific poles
(stable resonances) corresponding to the response to perturbations transverse
to the attractor. Unlike the poles of correlation functions, they depend on
the emitting/receiving pair of units. This dynamic differentiation, induced by
nonlinearities, reveals the differing abilities of units to transmit a
signal in this network.
Comment: 10 pages, 3 figures, to appear in Phys. Rev.
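To make the numerical procedure concrete, the following sketch estimates a complex susceptibility for a generic chaotic coupled-map network (not the paper's model): a weak cosine drive is applied at a source node and the receiver's response is projected onto the driving frequency; peaks of |chi| over the frequency grid signal resonances of that particular emitting/receiving pair. Coupling, gain, and node choices are assumptions.

```python
# Sketch: complex susceptibility chi_{dst,src}(omega) of a chaotic tanh network
# estimated by weak periodic driving and Fourier projection of the response.
import numpy as np

rng = np.random.default_rng(4)
N, T, g, eps = 20, 20000, 3.0, 1e-3
W = rng.normal(0.0, 1.5 / np.sqrt(N), size=(N, N))   # assumed random coupling
src, dst = 0, 7

def susceptibility(omega):
    x = rng.uniform(-1, 1, N)
    acc = 0.0 + 0.0j
    for t in range(T):
        drive = np.zeros(N)
        drive[src] = eps * np.cos(omega * t)
        x = np.tanh(g * (W @ x) + drive)
        acc += x[dst] * np.exp(-1j * omega * t)      # lock-in at the drive frequency
    return 2.0 * acc / (eps * T)

freqs = np.linspace(0.02, 0.5, 20) * 2 * np.pi
chi = np.array([susceptibility(w) for w in freqs])
print("frequency of largest response:", freqs[np.argmax(np.abs(chi))] / (2 * np.pi))
```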
Back-engineering of spiking neural networks parameters
We consider the deterministic evolution of a time-discretized spiking network
of neurons with connection weights having delays, modeled as a discretized
neural network of the generalized integrate-and-fire (gIF) type. The purpose is
to study a class of algorithmic methods allowing one to calculate the proper
parameters to reproduce exactly a given spike train generated by a hidden
(unknown) neural network. This standard problem is known to be NP-hard when
delays are to be calculated. We propose here a reformulation, now expressed as
a Linear Programming (LP) problem, which admits an efficient resolution. This
allows us to "back-engineer" a neural network, i.e. to find out, given a set of
initial conditions, which parameters (here, the connection weights) allow us to
reproduce the network spike dynamics. More precisely, we make explicit the fact
that, for a gIF model, back-engineering a spike train is a linear (L) problem
if the membrane potentials are observed and an LP problem if only spike times
are observed. Numerical robustness is discussed. We also explain how it is the
use of a generalized IF neuron model, instead of a leaky IF model, that allows
us to derive this algorithm. Furthermore, we point out that the L or LP
adjustment mechanism is local to each unit and has the same structure as a
"Hebbian" rule. Going a step further, this paradigm generalizes easily to the
design of input-output spike train transformations. This means that we have a
practical method to "program" a spiking network, i.e. to find a set of
parameters allowing us to exactly reproduce the network output, given an input.
Numerical verifications and illustrations are provided.
Comment: 30 pages, 17 figures, submitted
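The following sketch shows the flavour of the LP formulation for a simplified discrete-time LIF update without delays (the paper's gIF setting is richer): given only the observed raster, each neuron's potential is a linear function of its incoming weights, so the spike/no-spike conditions become linear inequalities that can be solved per neuron. All model constants here are assumptions.

```python
# Sketch: back-engineer weights from spikes alone for the simplified update
# V(t+1) = gamma*V(t)*(1 - s(t)) + W.s(t) + I, via one LP per neuron.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)
N, T, gamma, theta, I_ext, margin = 8, 300, 0.8, 1.0, 0.25, 1e-6

# "Hidden" network used only to generate the observed raster.
W_true = rng.normal(0.0, 0.6, (N, N))
V = np.zeros(N)
raster = np.zeros((T, N))
for t in range(T):
    s = (V >= theta).astype(float)
    raster[t] = s
    V = gamma * V * (1 - s) + W_true @ s + I_ext

def estimate_row(i):
    """Recover incoming weights of neuron i from the raster alone (LP feasibility)."""
    a = np.zeros(N)            # coefficients of V_i(t) w.r.t. the weight row
    b = 0.0                    # weight-independent part of V_i(t)
    A_ub, b_ub = [], []
    for t in range(T - 1):
        s = raster[t]
        a = gamma * a * (1 - s[i]) + s               # potential stays linear in the weights
        b = gamma * b * (1 - s[i]) + I_ext
        if raster[t + 1, i] == 1:                    # spike: a.w + b >= theta
            A_ub.append(-a.copy()); b_ub.append(b - theta)
        else:                                        # no spike: a.w + b <= theta - margin
            A_ub.append(a.copy()); b_ub.append(theta - margin - b)
    res = linprog(np.zeros(N), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(-5, 5)] * N, method="highs")
    return res.x if res.success else np.full(N, np.nan)

W_est = np.array([estimate_row(i) for i in range(N)])
print("spike-consistent weight matrix found:", W_est.shape)
```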
A discrete time neural network model with spiking neurons II. Dynamics with noise
We provide rigorous and exact results characterizing the statistics of spike
trains in a network of leaky integrate-and-fire neurons, where time is discrete
and where neurons are subject to noise, without restriction on the synaptic
weights. We show the existence and uniqueness of an invariant measure of Gibbs
type and discuss its properties. We also discuss Markovian approximations and
relate them to the approaches currently used in computational neuroscience to
analyse experimental spike train statistics.
Comment: 43 pages, revised version, to appear in Journal of Mathematical
Biology
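As a small illustration of the setting (simulation only; the paper's results are rigorous and analytic), the sketch below runs a discrete-time noisy LIF network and builds the empirical one-step Markov approximation of its spike statistics. Leak, threshold, noise level, and weights are assumptions.

```python
# Sketch: discrete-time leaky integrate-and-fire network with Gaussian noise,
# plus an empirical one-step Markov approximation of the spike statistics.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(6)
N, T, gamma, theta, sigma = 5, 50000, 0.7, 1.0, 0.3
W = rng.normal(0.0, 0.5, (N, N))
I_ext = 0.3

V = np.zeros(N)
raster = np.zeros((T, N), dtype=int)
for t in range(T):
    s = (V >= theta).astype(float)
    raster[t] = s
    V = gamma * V * (1 - s) + W @ s + I_ext + sigma * rng.standard_normal(N)

# Empirical one-step transition counts between spike patterns.
counts = defaultdict(lambda: defaultdict(int))
for t in range(T - 1):
    prev, nxt = tuple(raster[t]), tuple(raster[t + 1])
    counts[prev][nxt] += 1

some_prev = next(iter(counts))
total = sum(counts[some_prev].values())
print(f"pattern {some_prev}: {len(counts[some_prev])} observed successors,",
      f"estimated from {total} samples")
```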
How Gibbs distributions may naturally arise from synaptic adaptation mechanisms. A model-based argumentation
This paper addresses two questions in the context of neuronal network
dynamics, using methods from dynamical systems theory and statistical physics:
(i) how to characterize the statistical properties of sequences of action
potentials ("spike trains") produced by neuronal networks, and (ii) what are
the effects of synaptic plasticity on these statistics? We introduce a
framework in which spike trains are associated with a coding of membrane
potential trajectories and, in important explicit examples (the so-called gIF
models), actually constitute a symbolic coding. On this basis, we use the
thermodynamic formalism from ergodic theory to show how Gibbs distributions are
natural probability measures for describing the statistics of spike trains, given
the empirical averages of prescribed quantities. As a second result, we show
that Gibbs distributions naturally arise when considering "slow" synaptic
plasticity rules, where the characteristic time for synapse adaptation is much
longer than the characteristic time of the neuron dynamics.
Comment: 39 pages, 3 figures
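A generic two-timescale sketch (not the paper's specific plasticity rule) illustrates what "slow" adaptation means here: the network runs for a long epoch with frozen weights, empirical spike averages are collected, and only then are the weights nudged by a small Hebbian-like update. All constants and the update rule itself are assumptions.

```python
# Sketch: slow, Hebbian-like synaptic adaptation on top of fast spiking dynamics.
import numpy as np

rng = np.random.default_rng(7)
N, EPOCHS, T_EPOCH = 10, 50, 2000
gamma, theta, sigma, eta = 0.7, 1.0, 0.3, 0.01       # eta small: slow adaptation
W = rng.normal(0.0, 0.5, (N, N))

V = np.zeros(N)
for epoch in range(EPOCHS):
    spikes = np.zeros((T_EPOCH, N))
    for t in range(T_EPOCH):                          # fast neuron dynamics, W frozen
        s = (V >= theta).astype(float)
        spikes[t] = s
        V = gamma * V * (1 - s) + W @ s + 0.3 + sigma * rng.standard_normal(N)
    rates = spikes.mean(axis=0)
    corr = (spikes[1:].T @ spikes[:-1]) / (T_EPOCH - 1)   # post at t+1, pre at t
    W += eta * (corr - np.outer(rates, rates))            # slow correlation-based nudge

print("final mean |W|:", np.abs(W).mean())
```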
Lyapunov exponents and transport in the Zhang model of Self-Organized Criticality
We discuss the role played by the Lyapunov exponents in the dynamics of
Zhang's model of Self-Organized Criticality. We show that a large part of the
spectrum (the slowest modes) is associated with energy transport in the lattice.
In particular, we give bounds on the first negative Lyapunov exponent in terms
of the energy flux dissipated at the boundaries per unit of time. We then
establish an explicit formula for the transport modes, which appear as diffusion
modes in a landscape where the metric is given by the density of active sites.
We use a finite-size scaling ansatz for the Lyapunov spectrum and relate the
scaling exponent to the scaling of quantities such as avalanche size, duration,
density of active sites, etc.
Comment: 33 pages, 6 figures, 1 table (to appear)
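For readers wanting to reproduce a full Lyapunov spectrum numerically, the standard Benettin/QR recipe is sketched below on a generic smooth coupled map (the Zhang model itself is piecewise affine, and the paper works with its specific Jacobians); the map and its parameters are assumptions made for illustration.

```python
# Sketch: full Lyapunov spectrum of a smooth map via repeated QR
# re-orthonormalisation of tangent vectors (Benettin-style).
import numpy as np

rng = np.random.default_rng(8)
N, T = 10, 20000
W = rng.normal(0.0, 1.5 / np.sqrt(N), (N, N))
g = 3.0

def step(x):
    return np.tanh(g * (W @ x))

def jacobian(x):
    y = g * (W @ x)
    return (1.0 - np.tanh(y) ** 2)[:, None] * (g * W)   # diag(1 - tanh^2) . gW

x = rng.uniform(-1, 1, N)
Q = np.eye(N)
log_r = np.zeros(N)
for t in range(T):
    Q = jacobian(x) @ Q                                  # evolve tangent vectors
    Q, R = np.linalg.qr(Q)                               # re-orthonormalise
    log_r += np.log(np.abs(np.diag(R)))
    x = step(x)

lyap = np.sort(log_r / T)[::-1]
print("Lyapunov spectrum (largest first):", np.round(lyap, 3))
```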
Anomalous scaling and Lee-Yang zeroes in Self-Organized Criticality
We show that the generating functions of avalanche observables in SOC models
exhibit a Lee-Yang phenomenon. This establishes a new link between the
classical theory of critical phenomena and SOC. A scaling theory of the
Lee-Yang zeroes is proposed, including finite-sampling effects.
Comment: 33 pages, 19 figures, submitted
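The Lee-Yang diagnostic itself is easy to sketch: the generating function Z(z) = sum_s P(s) z^s of an avalanche-size histogram is a polynomial in z, so its complex zeros can be located with numpy.roots and their distance to the positive real axis (here, to z = 1) monitored. The power-law-with-cutoff P(s) below is a stand-in for actual SOC data, not a result of the paper.

```python
# Sketch: locate the complex zeros of an avalanche-size generating function.
import numpy as np

s = np.arange(1, 200)
P = s ** -1.3 * np.exp(-s / 80.0)                    # assumed avalanche-size law (illustrative)
P /= P.sum()

# numpy.roots expects coefficients from the highest degree down; Z(z) has
# coefficient P[k-1] on z^k and 0 on z^0.
coeffs = np.concatenate(([0.0], P))[::-1]
zeros = np.roots(coeffs)
nontrivial = zeros[np.abs(zeros) > 1e-9]             # drop the trivial root at z = 0
closest = nontrivial[np.argmin(np.abs(nontrivial - 1.0))]
print("zero closest to z = 1:", closest)
```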