On Dynamics of Integrate-and-Fire Neural Networks with Conductance Based Synapses
We present a mathematical analysis of a network of Integrate-and-Fire
neurons with adaptive conductances. Taking into account the realistic fact that
the spike time is only known within some \textit{finite} precision, we propose
a model where spikes are effective at times that are multiples of a characteristic
time scale $\delta$, where $\delta$ can be \textit{arbitrarily} small (in particular,
well beyond the numerical precision). We make a complete mathematical
characterization of the model-dynamics and obtain the following results. The
asymptotic dynamics is composed of finitely many stable periodic orbits, whose
number and period can be arbitrarily large and can diverge in a region of the
synaptic-weight space, traditionally called the "edge of chaos", a notion
mathematically well defined in the present paper. Furthermore, except at the
edge of chaos, there is a one-to-one correspondence between the membrane
potential trajectories and the raster plot. This shows that the neural code is
entirely "in the spikes" in this case. As a key tool, we introduce an order
parameter, easy to compute numerically, and closely related to a natural notion
of entropy, providing a relevant characterization of the computational
capabilities of the network. This allows us to compare the computational
capabilities of leaky Integrate-and-Fire models and of conductance-based
models. The present study considers networks with constant input, and without
time-dependent plasticity, but the framework has been designed for both
extensions.
Comment: 36 pages, 9 figures
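To make the setting concrete, here is a minimal sketch of a discrete-time Integrate-and-Fire network in which spikes are only registered on a time grid of step dt, playing the role of the characteristic scale $\delta$. It is an illustrative toy under simplifying assumptions, not the paper's implementation: conductance adaptation is collapsed into a static weight matrix W, and all names and parameter values (N, dt, tau, theta, I_ext) are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10                        # number of neurons
dt = 0.1                      # spike-time resolution (the delta of the abstract)
tau = 1.0                     # membrane time constant
theta = 1.0                   # firing threshold
steps = 2000
W = rng.normal(0.0, 0.3, (N, N))   # static synaptic weights (adaptation ignored)
I_ext = 0.15                  # constant external input, as in the paper's setting

decay = np.exp(-dt / tau)     # leak factor accumulated over one grid step
V = rng.random(N)             # membrane potentials
raster = np.zeros((steps, N), dtype=int)

for t in range(steps):
    fired = V >= theta                        # spikes are effective only at times t*dt
    raster[t] = fired
    V[fired] = 0.0                            # reset the neurons that fired
    V = decay * V + W @ fired + dt * I_ext    # leaky integration + recurrent input

# After a transient, the trajectory settles onto one of finitely many stable
# periodic orbits; away from the "edge of chaos", the raster plot uniquely
# determines the membrane-potential trajectory.
```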
Entropy-based parametric estimation of spike train statistics
We consider the evolution of a network of neurons, focusing on the asymptotic
behavior of spike dynamics instead of membrane potential dynamics. The spike
response is not sought as a deterministic response in this context, but as a
conditional probability: "Reading out the code" consists of inferring such a
probability. This probability is computed from empirical raster plots, by using
the framework of thermodynamic formalism in ergodic theory. This gives us a
parametric statistical model where the probability has the form of a Gibbs
distribution. In this respect, this approach generalizes the seminal and
profound work of Schneidman and collaborators. A minimal presentation of the
formalism is reviewed here, while a general algorithmic estimation method is
proposed, yielding fast, convergent implementations. It is also made explicit how
several spike observables (entropy, rate, synchronizations, correlations) are
given in closed form from the parametric estimation. This paradigm allows us
not only to estimate the spike statistics, given a design choice, but also to
compare different models, thus answering comparative questions about the
neural code such as: "are correlations (or time synchrony, or a given set of
spike patterns, ...) significant with respect to rate coding only?" A numerical
validation of the method is proposed and the perspectives regarding spike-train
code analysis are also discussed.
Comment: 37 pages, 8 figures, submitted
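A toy version of this kind of parametric Gibbs estimation can be sketched in a few lines. The version below is a minimal sketch under strong assumptions: it handles only spatial observables (firing rates and pairwise coincidences, the Ising-like case studied by Schneidman and collaborators), whereas the paper's formalism also covers temporal constraints; it enumerates all 2^N spike words, so it is tractable only for small N; and the names fit_gibbs, lr, and iters are illustrative, not the paper's algorithm.

```python
import itertools
import numpy as np

def fit_gibbs(raster, lr=0.1, iters=500):
    """Fit a Gibbs distribution P(w) ~ exp(h.w + w'Jw) to binary spike words.

    raster: (T, N) array of 0/1 spikes; returns fields h, couplings J, entropy.
    """
    T, N = raster.shape
    # empirical observables: firing rates and pairwise coincidence frequencies
    mean_emp = raster.mean(axis=0)
    corr_emp = raster.T @ raster / T
    np.fill_diagonal(corr_emp, 0.0)

    words = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)
    h = np.zeros(N)
    J = np.zeros((N, N))

    for _ in range(iters):
        # Gibbs probabilities over all 2^N words (tractable only for small N)
        E = words @ h + np.einsum('ki,ij,kj->k', words, J, words)
        p = np.exp(E - E.max())
        p /= p.sum()
        # model expectations of the same observables
        mean_mod = p @ words
        corr_mod = (words * p[:, None]).T @ words
        np.fill_diagonal(corr_mod, 0.0)
        # gradient ascent on the log-likelihood matches model to data
        h += lr * (mean_emp - mean_mod)
        J += lr * (corr_emp - corr_mod)

    # observables such as the entropy come in closed form from the fitted model
    entropy = -(p * np.log(p + 1e-12)).sum()
    return h, J, entropy

# usage (illustrative): fit the model on a random raster
rng = np.random.default_rng(1)
raster = (rng.random((5000, 4)) < 0.2).astype(float)
h, J, S = fit_gibbs(raster)
```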
A Survey on Continuous Time Computations
We provide an overview of theories of continuous time computation. These
theories allow us to understand both the hardness of questions related to
continuous time dynamical systems and the computational power of continuous
time analog models. We survey the existing models, summarize the known
results, and point to relevant references in the literature.
Automatic differentiation in machine learning: a survey
Derivatives, mostly in the form of gradients and Hessians, are ubiquitous in
machine learning. Automatic differentiation (AD), also called algorithmic
differentiation or simply "autodiff", is a family of techniques similar to but
more general than backpropagation for efficiently and accurately evaluating
derivatives of numeric functions expressed as computer programs. AD is a small
but established field with applications in areas including computational fluid
dynamics, atmospheric sciences, and engineering design optimization. Until very
recently, the fields of machine learning and AD have largely been unaware of
each other and, in some cases, have independently discovered each other's
results. Despite its relevance, general-purpose AD has been missing from the
machine learning toolbox, a situation slowly changing with its ongoing adoption
under the names "dynamic computational graphs" and "differentiable
programming". We survey the intersection of AD and machine learning, cover
applications where AD has direct relevance, and address the main implementation
techniques. By precisely defining the main differentiation techniques and their
interrelationships, we aim to bring clarity to the usage of the terms
"autodiff", "automatic differentiation", and "symbolic differentiation" as
these are encountered more and more in machine learning settings.
Comment: 43 pages, 5 figures
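One of the implementation techniques the survey makes precise, forward-mode AD, can be demonstrated with dual numbers: numbers of the form a + b*eps with eps² = 0, whose second component carries the derivative through every arithmetic operation. The sketch below is a minimal illustration, not any particular library's API; Dual, deriv, and the lifted sin are hypothetical names introduced for the example.

```python
import math

class Dual:
    """A dual number a + b*eps with eps**2 == 0; `dot` carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (a + a'eps)(b + b'eps) = ab + (a'b + ab')eps
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    """sin lifted to dual numbers: the dot part is scaled by cos (chain rule)."""
    if isinstance(x, Dual):
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)
    return math.sin(x)

def deriv(f, x):
    """Evaluate df/dx at x by seeding the dual part of the input with 1."""
    return f(Dual(x, 1.0)).dot

# usage: differentiate f(x) = x*sin(x) + 2x at x = 1.5
f = lambda x: x * sin(x) + 2 * x
print(deriv(f, 1.5))   # matches sin(1.5) + 1.5*cos(1.5) + 2
```

Reverse-mode AD (the generalization of backpropagation mentioned above) instead records the computation and propagates adjoints backwards, which is cheaper when there are many inputs and few outputs.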