Path Integral Approach to Random Neural Networks
In this work we study the dynamics of large random neural networks.
Different methods have been developed to analyse their behavior; most of them
rely on heuristic arguments based on Gaussian assumptions about the
fluctuations in the infinite-size limit. These approaches, however, do not
justify the underlying assumptions systematically. Furthermore, they cannot
in general establish the stability of the derived mean field equations, and
they are not amenable to the analysis of finite-size corrections.
Here we present a systematic method based on Path Integrals which overcomes
these limitations. We apply the method to a large non-linear rate-based neural
network with a random asymmetric connectivity matrix. We derive the Dynamic
Mean Field (DMF) equations for the system and compute its Lyapunov exponent.
Although the main results are well known, here, for the first time, we
calculate the spectrum of fluctuations around the mean field equations, from
which we derive the general stability conditions for the DMF states. The
methods presented here can be applied to neural networks with more complex
dynamics and architectures. In addition, the theory can be used to compute
systematic finite-size corrections to the mean field equations.
Comment: 20 pages, 5 figures
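The abstract does not write out the rate model, but networks of this type are commonly taken to be of the form dx/dt = -x + J tanh(x) with Gaussian couplings of variance g^2/N; the following is an illustrative sketch under that assumption, not the paper's own calculation. It shows the two regimes that the DMF equations describe: decay to the zero fixed point for gain g < 1, sustained chaotic fluctuations for g > 1.

```python
import numpy as np

def simulate_rate_network(g, N=200, T=50.0, dt=0.05, seed=0):
    """Euler-integrate dx/dt = -x + J tanh(x), with couplings
    J_ij ~ N(0, g^2/N), from a random initial condition."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 1.0, size=N)
    for _ in range(int(T / dt)):
        x = x + dt * (-x + J @ np.tanh(x))
    return x

# Below the critical gain (g < 1) activity decays to the zero fixed point;
# above it (g > 1) the network sustains irregular rate fluctuations.
x_sub = simulate_rate_network(g=0.5)
x_sup = simulate_rate_network(g=2.0)
print(np.std(x_sub), np.std(x_sup))
```

The model, parameters, and function name here are assumptions chosen for illustration; the paper's systematic path-integral treatment operates on the same kind of dynamics analytically.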
Transition to chaos in random neuronal networks
Firing patterns in the central nervous system often exhibit strong temporal
irregularity and heterogeneity in their time-averaged response properties.
Previous studies suggested that these properties are the outcome of intrinsic
chaotic dynamics. Indeed, simplified rate-based large neuronal networks with
random synaptic connections are known to exhibit a sharp transition from a
fixed point to chaotic dynamics when the synaptic gain is increased. However,
the existence of a similar transition in neuronal circuit models with more
realistic architectures and firing dynamics has not been established.
In this work we investigate the rate-based dynamics of neuronal circuits
composed of several subpopulations with random connectivity. Nonzero
connections are either positive (for excitatory neurons) or negative (for
inhibitory ones), while single-neuron output is strictly positive, in line
with known constraints in many biological systems. Using Dynamic Mean Field
Theory, we find the phase diagram depicting the regimes of stable fixed point,
unstable dynamics, and chaotic rate fluctuations. We characterize the
properties of systems near the chaotic transition and show that dilute
excitatory-inhibitory architectures exhibit the same onset of chaos as a
network with Gaussian connectivity. Interestingly, the critical properties
near the transition depend on the shape of the single-neuron input-output
transfer function near the firing threshold.
Finally, we investigate network models with spiking dynamics. When synaptic
time constants are slow relative to the mean inverse firing rates, the network
undergoes a sharp transition from fast spiking fluctuations with static firing
rates to a state with slow chaotic rate fluctuations. When the synaptic time
constants are finite, the transition becomes smooth and obeys scaling
properties, similar to crossover phenomena in statistical mechanics.
Comment: 28 pages, 12 figures, 5 appendices
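The transition to chaos described above can be probed numerically by estimating the largest Lyapunov exponent. The sketch below again assumes the standard random rate model dx/dt = -x + J tanh(x) (the multi-population, excitatory-inhibitory model of the abstract is more elaborate) and uses the two-trajectory method with repeated renormalization of the separation:

```python
import numpy as np

def lyapunov_exponent(g, N=200, T=200.0, dt=0.05, eps=1e-6, seed=1):
    """Estimate the largest Lyapunov exponent of dx/dt = -x + J tanh(x)
    by tracking two nearby trajectories and renormalizing their
    separation back to eps at every step."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 1.0, size=N)
    # Discard a transient so the trajectory settles onto the attractor.
    for _ in range(2000):
        x = x + dt * (-x + J @ np.tanh(x))
    y = x + eps * rng.normal(size=N) / np.sqrt(N)
    log_growth = 0.0
    steps = int(T / dt)
    for _ in range(steps):
        x = x + dt * (-x + J @ np.tanh(x))
        y = y + dt * (-y + J @ np.tanh(y))
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / eps)
        y = x + (eps / d) * (y - x)   # renormalize the separation
    return log_growth / (steps * dt)

# Negative exponent below the transition, positive above it.
print(lyapunov_exponent(0.5), lyapunov_exponent(2.0))
```

All parameter values here are illustrative assumptions; the sign change of the exponent across g = 1 is the numerical counterpart of the sharp fixed-point-to-chaos transition discussed in the abstract.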
Classification and Geometry of General Perceptual Manifolds
Perceptual manifolds arise when a neural population responds to an ensemble
of sensory signals associated with different physical features (e.g.,
orientation, pose, scale, location, and intensity) of the same perceptual
object. Object recognition and discrimination require classifying the
manifolds in a manner that is insensitive to variability within a manifold. How
neuronal systems give rise to invariant object classification and recognition
is a fundamental problem in brain theory as well as in machine learning. Here
we study the ability of a readout network to classify objects from their
perceptual manifold representations. We develop a statistical mechanical theory
for the linear classification of manifolds with arbitrary geometry revealing a
remarkable relation to the mathematics of conic decomposition. Novel
geometrical measures of manifold radius and manifold dimension are introduced
which can explain the classification capacity for manifolds of various
geometries. The general theory is demonstrated on a number of representative
manifolds, including L2 ellipsoids prototypical of strictly convex manifolds,
L1 balls representing polytopes consisting of finite sample points, and
orientation manifolds which arise from neurons tuned to respond to a continuous
angle variable, such as object orientation. The effects of label sparsity on
the classification capacity of manifolds are elucidated, revealing a scaling
relation between label sparsity and manifold radius. Theoretical predictions
are corroborated by numerical simulations using recently developed algorithms
to compute maximum margin solutions for manifold dichotomies. Our theory and
its extensions provide a powerful and rich framework for applying statistical
mechanics of linear classification to data arising from neuronal responses to
object stimuli, as well as to artificial deep networks trained for object
recognition tasks.
Comment: 24 pages, 12 figures, Supplementary Material
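In the zero-radius limit, where each manifold collapses to a point, the manifold classification capacity reduces to the classical perceptron capacity of Cover (1965), alpha_c = P/N = 2. As a small self-contained reference point (not the paper's manifold theory itself), the sketch below evaluates Cover's exact counting formula for the fraction of linearly separable dichotomies of P points in general position in R^N:

```python
from math import comb

def cover_fraction(P, N):
    """Fraction of the 2^P dichotomies of P points in general position
    in R^N that are linearly separable (Cover, 1965):
        C(P, N) = 2 * sum_{k=0}^{N-1} binom(P-1, k),
        fraction = C(P, N) / 2^P."""
    return 2 * sum(comb(P - 1, k) for k in range(N)) / 2 ** P

N = 50
print(cover_fraction(N, N))        # alpha = 1: every dichotomy separable
print(cover_fraction(2 * N, N))    # alpha = 2: exactly one half
print(cover_fraction(4 * N, N))    # alpha = 4: essentially none
```

The sharp drop of this fraction around alpha = 2 is the point-manifold baseline that the geometrical measures of manifold radius and dimension generalize.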
Thermal properties of slow dynamics
The limit of small entropy production is reached in relaxing systems long
after preparation, and in stationary driven systems in the limit of small
driving power. Surprisingly, for extended systems this limit is not in general
the Gibbs-Boltzmann distribution, or a small departure from it. Interesting
cases in which it is not include glasses, phase separation, and certain driven
complex fluids.
We describe a scenario with several coexisting temperatures acting on
different timescales, and partial equilibrations at each time scale. This
scenario entails strong modifications of the fluctuation-dissipation equalities
and the existence of some unexpected reciprocity relations. Both predictions
are open to experimental verification, particularly the latter.
The construction is consistent in general, since it can be viewed as the
breaking of a symmetry down to a residual group. It does not assume the
presence of quenched disorder. It can be -- and to a certain extent has been --
tested numerically, while some experiments are under way. There is,
furthermore, the perspective that analytic arguments may be constructed to
prove or disprove its generality.
Comment: 11 pages, invited talk to be presented at STATPHYS 20, Paris
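As a point of reference for the modified fluctuation-dissipation equalities mentioned above, the sketch below verifies the unmodified equilibrium relation chi(t) = [C(0) - C(t)] / T for an Ornstein-Uhlenbeck particle; in the multi-temperature scenario described, a relation of this form would hold only within each timescale, with its own effective temperature. This is an illustrative baseline with assumed parameters, not the paper's calculation.

```python
import numpy as np

# Overdamped particle in a harmonic well: dx = -x dt + sqrt(2T) dW.
# A small field h is switched on at t = 0 in a second copy driven by the
# same noise; the integrated linear response chi(t) = <x_h - x_0> / h
# should match [C(0) - C(t)] / T in equilibrium.
rng = np.random.default_rng(0)
T_bath, h, dt, n_steps, n_traj = 1.0, 0.2, 0.01, 100, 20000

x0 = rng.normal(0.0, np.sqrt(T_bath), size=n_traj)  # stationary start
x, xh = x0.copy(), x0.copy()
for _ in range(n_steps):
    noise = rng.normal(size=n_traj) * np.sqrt(2 * T_bath * dt)
    x = x + dt * (-x) + noise            # unperturbed copy
    xh = xh + dt * (-xh + h) + noise     # perturbed copy, same noise

chi = np.mean(xh - x) / h                              # integrated response
fdt = (np.mean(x0 * x0) - np.mean(x0 * x)) / T_bath    # [C(0) - C(t)] / T
print(chi, fdt)   # both close to 1 - exp(-1) for t = 1
```

In a glassy system the two sides would instead differ by a timescale-dependent factor, which is what defines the effective temperatures of the scenario.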