State-Space Inference and Learning with Gaussian Processes
State-space inference and learning with Gaussian processes (GPs) is an unsolved problem. We propose a new, general methodology for inference and learning in nonlinear state-space models that are described probabilistically by non-parametric GP models. We apply the expectation-maximization algorithm to iterate between inference in the latent state-space and learning the parameters of the underlying GP dynamics model.
Copyright 2010 by the authors
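The GP dynamics model at the core of such an approach can be sketched with plain GP regression on state-transition pairs. This is a minimal illustration, not the paper's method: the transition function f = sin, the kernel hyperparameters, and the noise level are all assumed for the example, and the full method would alternate this learning step with latent-state inference inside EM.

```python
import numpy as np

def rbf_kernel(a, b, ell=1.0, sf=1.0):
    # Squared-exponential (RBF) kernel between 1-D input arrays a, b
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

# Hypothetical transition data for a 1-D latent dynamics x_{t+1} = f(x_t) + noise,
# with f = sin used purely for illustration.
rng = np.random.default_rng(0)
X_train = rng.uniform(-2.0, 2.0, 60)                         # states x_t
y_train = np.sin(X_train) + 0.05 * rng.standard_normal(60)   # next states x_{t+1}
sigma_n = 0.05                                               # assumed transition-noise std

# GP regression posterior for the dynamics model f
K = rbf_kernel(X_train, X_train) + sigma_n**2 * np.eye(60)
alpha = np.linalg.solve(K, y_train)

def gp_predict(x_star):
    # Posterior mean of f at test states x_star
    return rbf_kernel(np.atleast_1d(np.asarray(x_star, float)), X_train) @ alpha

print(gp_predict(0.5))   # posterior mean close to sin(0.5)
```

The non-parametric step is the key point: no parametric form for f is assumed; the learned "parameters" are the kernel hyperparameters and noise level.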
Random Recurrent Neural Networks Dynamics
This paper is a review dealing with the study of large-size random recurrent neural networks. The connection weights are selected according to a probability law, and it is possible to predict the network dynamics at a macroscopic scale using an averaging principle. After an introductory section, Section 1 reviews the various models from the points of view of single-neuron dynamics and of global network dynamics. A summary of notations is presented, which is helpful for the sequel. In Section 2, mean-field dynamics is developed, and the probability distribution characterizing the global dynamics is computed. In Section 3, applications of mean-field theory to the prediction of the chaotic regime for Analog Formal Random Recurrent Neural Networks (AFRRNN) are displayed. The case of AFRRNN with a homogeneous population of neurons is studied in Section 4, and a two-population model is studied in Section 5. The occurrence of cyclo-stationary chaos is displayed using the results of \cite{Dauce01}. In Section 6, an insight into the application of mean-field theory to integrate-and-fire (IF) networks is given using the results of \cite{BrunelHakim99}.
Comment: Review paper, 36 pages, 5 figures
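The basic object of such reviews, a rate network with i.i.d. random couplings, is easy to simulate directly. The sketch below uses an illustrative choice of gain g, network size, and tanh transfer function (not tied to the specific models in the review); for g above the critical value 1 the macroscopic activity variance predicted by mean-field theory is nonzero.

```python
import numpy as np

# Random recurrent rate network: dx/dt = -x + J @ tanh(x),
# with weights drawn i.i.d. with variance g^2 / N (illustrative parameters).
rng = np.random.default_rng(1)
N, g, dt, steps = 200, 1.5, 0.05, 2000

J = rng.standard_normal((N, N)) * g / np.sqrt(N)
x = 0.1 * rng.standard_normal(N)

for _ in range(steps):
    x = x + dt * (-x + J @ np.tanh(x))   # explicit Euler step

# Macroscopic observable: population variance of the activity; mean-field
# theory predicts a nonzero value in the chaotic regime (g > 1).
print(np.var(x))
```

The averaging principle mentioned in the abstract is what lets this single macroscopic number be predicted without reference to the particular realization of J.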
Nonlinear Compressive Particle Filtering
Many systems for which compressive sensing is used today are dynamical. The common approach is to neglect the dynamics and treat the problem as a sequence of independent problems. This approach has two disadvantages. First, the temporal dependency in the state could be used to improve the accuracy of the state estimates. Second, an estimate of the state and its support could be used to reduce the computational load of the subsequent step. In the linear Gaussian setting, compressive sensing was recently combined with the Kalman filter to mitigate the above disadvantages. In the nonlinear dynamical case, compressive sensing cannot be used and, if the state dimension is high, the particle filter performs poorly. In this paper we combine one of the most recent developments in compressive sensing, nonlinear compressive sensing, with the particle filter. We show that the marriage of the two is essential and that neither the particle filter nor nonlinear compressive sensing alone gives a satisfactory solution.
Comment: Accepted to CDC 201
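The particle-filter half of this combination can be sketched as a standard bootstrap filter on a toy scalar model. The model (sin dynamics, additive Gaussian noise) and all noise levels are assumed for illustration; the paper's contribution is layering nonlinear compressive sensing on top of this basic machinery for high-dimensional sparse states.

```python
import numpy as np

# Bootstrap particle filter for a toy scalar nonlinear state-space model.
rng = np.random.default_rng(2)
T, P = 50, 500                 # time steps, particle count
q, r = 0.1, 0.2                # process / measurement noise std

# Ground-truth trajectory and noisy observations
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = np.sin(x_true[t - 1]) + q * rng.standard_normal()
y = x_true + r * rng.standard_normal(T)

particles = rng.standard_normal(P)
est = np.zeros(T)
for t in range(T):
    particles = np.sin(particles) + q * rng.standard_normal(P)   # propagate
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)             # likelihood weights
    w /= w.sum()
    est[t] = w @ particles                                        # posterior mean
    particles = rng.choice(particles, size=P, p=w)                # resample

print(np.mean((est - x_true) ** 2))   # mean-squared tracking error
```

The curse of dimensionality the abstract refers to is exactly why P would have to grow impractically large for a high-dimensional state, which is the gap the sparsity structure is meant to close.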
Dynamical Functional Theory for Compressed Sensing
We introduce a theoretical approach for designing generalizations of the
approximate message passing (AMP) algorithm for compressed sensing which are
valid for large observation matrices that are drawn from an invariant random
matrix ensemble. By design, the fixed points of the algorithm obey the
Thouless-Anderson-Palmer (TAP) equations corresponding to the ensemble. Using a
dynamical functional approach we are able to derive an effective stochastic
process for the marginal statistics of a single component of the dynamics. This
allows us to design memory terms in the algorithm in such a way that the
resulting fields become Gaussian random variables allowing for an explicit
analysis. The asymptotic statistics of these fields are consistent with the
replica ansatz of the compressed sensing problem.
Comment: 5 pages, accepted for ISIT 201
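For reference, the baseline that this work generalizes, standard AMP with an i.i.d. Gaussian matrix, can be sketched in a few lines. The threshold schedule below (a multiple of the residual RMS) is a common heuristic assumed for the example, not the paper's derived memory terms, which target invariant non-i.i.d. ensembles.

```python
import numpy as np

def soft(v, thr):
    # Soft-thresholding denoiser eta(v; thr)
    return np.sign(v) * np.maximum(np.abs(v) - thr, 0.0)

# Standard AMP iteration for noiseless compressed sensing, i.i.d. Gaussian A.
rng = np.random.default_rng(3)
n, m, k = 400, 200, 20
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x0

x, z = np.zeros(n), y.copy()
for _ in range(50):
    r = x + A.T @ z                          # effective observation
    thr = 2.0 * np.sqrt(np.mean(z**2))       # heuristic threshold (assumed)
    x_new = soft(r, thr)
    # Residual with the Onsager correction term (count of nonzeros / m)
    z = y - A @ x_new + z * np.count_nonzero(x_new) / m
    x = x_new

print(np.linalg.norm(x - x0) / np.linalg.norm(x0))   # relative error
```

The Onsager term in the residual update is what makes the effective fields Gaussian for i.i.d. matrices; the dynamical functional approach is used to design the analogous memory terms when that ensemble assumption fails.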
Noise-induced behaviors in neural mean field dynamics
The collective behavior of cortical neurons is strongly affected by the
presence of noise at the level of individual cells. In order to study these
phenomena in large-scale assemblies of neurons, we consider networks of
firing-rate neurons with linear intrinsic dynamics and nonlinear coupling,
belonging to a few types of cell populations and receiving noisy currents.
Asymptotic equations as the number of neurons tends to infinity (mean field
equations) are rigorously derived based on a probabilistic approach. These
equations are implicit on the probability distribution of the solutions which
generally makes their direct analysis difficult. However, in our case, the
solutions are Gaussian, and their moments satisfy a closed system of nonlinear
ordinary differential equations (ODEs), which are much easier to study than the
original stochastic network equations, and the statistics of the empirical
process uniformly converge towards the solutions of these ODEs. Based on this
description, we analytically and numerically study the influence of noise on
the collective behaviors, and compare these asymptotic regimes to simulations
of the network. We observe that the mean field equations provide an accurate description of the solutions of the network equations for network sizes as small as a few hundred neurons. In particular, we observe that the level of
noise in the system qualitatively modifies its collective behavior, producing
for instance synchronized oscillations of the whole network, desynchronization
of oscillating regimes, and stabilization or destabilization of stationary
solutions. These results shed new light on the role of noise in shaping the collective dynamics of neurons, and give us clues for understanding similar phenomena observed in biological networks.
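The network class described here is straightforward to simulate with Euler-Maruyama. The sketch below uses an assumed single-population model with weak coupling, where the Gaussian mean-field prediction is easy to check: the stationary cross-neuron variance should match the Ornstein-Uhlenbeck value sigma^2/2. All parameters are illustrative, not taken from the paper.

```python
import numpy as np

# Euler-Maruyama simulation of a noisy firing-rate network with linear
# intrinsic dynamics and sigmoidal all-to-all coupling:
#   dx_i = (-x_i + J * mean_j tanh(x_j)) dt + sigma dW_i
rng = np.random.default_rng(4)
N, dt, steps = 1000, 0.01, 4000
J, sigma = 0.5, 0.5

x = np.zeros(N)
for _ in range(steps):
    coupling = J * np.mean(np.tanh(x))
    x += dt * (-x + coupling) + sigma * np.sqrt(dt) * rng.standard_normal(N)

# In this weak-coupling regime the Gaussian mean-field limit has
# stationary variance sigma^2 / 2 (the Ornstein-Uhlenbeck value).
print(np.var(x), sigma**2 / 2)
```

Stronger coupling or multiple populations would produce the richer noise-induced behaviors (synchronized oscillations, noise-induced stabilization) that the paper analyzes through the closed moment ODEs.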
Dynamic Compressive Sensing of Time-Varying Signals via Approximate Message Passing
In this work the dynamic compressive sensing (CS) problem of recovering
sparse, correlated, time-varying signals from sub-Nyquist, non-adaptive, linear
measurements is explored from a Bayesian perspective. While there has been a
handful of previously proposed Bayesian dynamic CS algorithms in the
literature, the ability to perform inference on high-dimensional problems in a
computationally efficient manner remains elusive. In response, we propose a
probabilistic dynamic CS signal model that captures both amplitude and support
correlation structure, and describe an approximate message passing algorithm
that performs soft signal estimation and support detection with a computational
complexity that is linear in all problem dimensions. The algorithm, DCS-AMP,
can perform either causal filtering or non-causal smoothing, and is capable of
learning model parameters adaptively from the data through an
expectation-maximization learning procedure. We provide numerical evidence that
DCS-AMP performs within 3 dB of oracle bounds on synthetic data under a variety
of operating conditions. We further describe the result of applying DCS-AMP to
two real dynamic CS datasets, as well as a frequency estimation task, to
bolster our claim that DCS-AMP is capable of offering state-of-the-art
performance and speed on real-world high-dimensional problems.
Comment: 32 pages, 7 figures
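The signal model described in the abstract, correlated amplitudes plus a slowly varying support, can be sketched as a Markov chain on the support combined with Gauss-Markov amplitude dynamics. All parameter names and values below are illustrative assumptions, not DCS-AMP's actual parameterization.

```python
import numpy as np

# Simulating a dynamic sparse signal with Markov support and
# Gauss-Markov amplitudes (illustrative parameters).
rng = np.random.default_rng(6)
n, T = 100, 30
lam = 0.1                      # stationary fraction of active coefficients
p01 = 0.01                     # P(inactive -> active) per frame
p10 = p01 * (1 - lam) / lam    # P(active -> inactive), keeps sparsity at lam
rho = 0.95                     # amplitude correlation across frames

s = rng.random(n) < lam                    # initial support at stationarity
theta = rng.standard_normal(n)             # underlying amplitude process
X = np.zeros((n, T))
for t in range(T):
    birth = rng.random(n) < p01
    death = rng.random(n) < p10
    s = np.where(s, ~death, birth)                                   # support Markov chain
    theta = rho * theta + np.sqrt(1 - rho**2) * rng.standard_normal(n)  # Gauss-Markov step
    X[:, t] = s * theta

print(X.shape, np.mean(X != 0))            # sparsity stays near lam
```

A filtering algorithm can exploit both correlations: the support chain makes last frame's support a strong prior, and the amplitude correlation makes last frame's estimate a warm start.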
Statistical physics-based reconstruction in compressed sensing
Compressed sensing is triggering a major evolution in signal acquisition. It
consists in sampling a sparse signal at low rate and later using computational
power for its exact reconstruction, so that only the necessary information is
measured. Currently used reconstruction techniques are, however, limited to
acquisition rates larger than the true density of the signal. We design a new
procedure which is able to reconstruct exactly the signal with a number of
measurements that approaches the theoretical limit in the limit of large
systems. It is based on the joint use of three essential ingredients: a
probabilistic approach to signal reconstruction, a message-passing algorithm
adapted from belief propagation, and a careful design of the measurement matrix
inspired from the theory of crystal nucleation. The performance of this new
algorithm is analyzed by statistical physics methods. The obtained improvement
is confirmed by numerical studies of several cases.
Comment: 20 pages, 8 figures, 3 tables. Related codes and data are available at http://aspics.krzakala.or
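The "careful design of the measurement matrix inspired from crystal nucleation" refers to seeded block structures: a chain of blocks where a first seed block is sampled at a higher rate and off-diagonal blocks couple neighbors, so reconstruction nucleates at the seed and propagates. The block sizes, rates, and coupling below are assumed for illustration.

```python
import numpy as np

# Sketch of a seeded block measurement matrix: block 0 is oversampled
# (the "seed"), later blocks are undersampled but coupled to their
# predecessor so correct reconstruction can propagate along the chain.
rng = np.random.default_rng(5)
L = 8                      # number of blocks along the chain
n_b = 50                   # signal entries per block
m_seed, m_bulk = 30, 20    # measurements per block (seed oversampled)
w = 1.0                    # coupling strength to the previous block (assumed)

blocks_m = [m_seed] + [m_bulk] * (L - 1)
rows = []
for i, m_i in enumerate(blocks_m):
    row = np.zeros((m_i, L * n_b))
    # Diagonal block: measure block i directly
    row[:, i * n_b:(i + 1) * n_b] = rng.standard_normal((m_i, n_b)) / np.sqrt(m_i)
    # Sub-diagonal coupling: weakly measure the previous block as well
    if i > 0:
        row[:, (i - 1) * n_b:i * n_b] = w * rng.standard_normal((m_i, n_b)) / np.sqrt(m_i)
    rows.append(row)
F = np.vstack(rows)
print(F.shape)   # total measurements vs. signal dimension
```

The overall rate is dominated by the bulk blocks, so it can sit close to the information-theoretic limit while the oversampled seed pays only an O(1/L) overhead.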