Spike train statistics and Gibbs distributions
This paper is based on a lecture given at the LACONEU summer school,
Valparaiso, January 2012. We introduce Gibbs distributions in a general setting,
including non-stationary dynamics, and then present three examples of such
Gibbs distributions in the context of neural network spike train statistics:
(i) Maximum entropy models with spatio-temporal constraints; (ii) Generalized
Linear Models; (iii) Conductance-based Integrate-and-Fire models with chemical
synapses and gap junctions.
Comment: 23 pages, submitted
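As a toy illustration of constraint class (i), the sketch below builds a small pairwise maximum-entropy (Ising-like) Gibbs model over binary spike words. The neuron count, biases, and couplings are made up for the example, and it uses purely spatial (not spatio-temporal) constraints, so it is a simplified stand-in for the models discussed in the paper rather than their construction.

```python
# Sketch: pairwise maximum-entropy Gibbs model over binary spike words.
# All parameter values are illustrative, not fitted to data.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N = 4                                    # number of neurons (small, so Z is tractable)
h = rng.normal(0.0, 0.5, size=N)         # per-neuron firing biases (made up)
J = np.triu(rng.normal(0.0, 0.3, size=(N, N)), k=1)  # pairwise couplings, i < j

def energy(sigma):
    """Negative log-probability of a spike word, up to log Z."""
    s = np.asarray(sigma, dtype=float)
    return -(h @ s + s @ J @ s)

# Enumerate all 2^N binary spike words to get the exact partition function.
words = list(itertools.product([0, 1], repeat=N))
Z = sum(np.exp(-energy(w)) for w in words)

def prob(sigma):
    """Gibbs probability P(sigma) = exp(-E(sigma)) / Z."""
    return np.exp(-energy(sigma)) / Z

print(prob((1, 0, 1, 0)), sum(prob(w) for w in words))  # second value ~ 1.0
```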
Parameter estimation for integer-valued Gibbs distributions
We consider Gibbs distributions, which are families of probability
distributions over a discrete space $\Omega$ with probability mass function
given by $\mu_\beta(\omega) = e^{\beta H(\omega)} / Z(\beta)$. Here
$H$ is a fixed integer-valued function (called a Hamiltonian),
$\beta$ is the parameter of the distribution, and the normalization factor
$Z(\beta) = \sum_{\omega \in \Omega} e^{\beta H(\omega)}$ is called
the partition function. We study how the function $Z(\beta)$ can be estimated
using an oracle that produces samples $\omega \sim \mu_\beta$ for a value $\beta$ in
a given interval $[\beta_{\min}, \beta_{\max}]$.
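A minimal sketch of the objects just defined may help fix notation; the discrete space (5-bit strings) and the Hamiltonian (popcount) are toy choices of ours, not from the paper.

```python
# Sketch: an integer-valued Gibbs family with a tractable toy space,
# its partition function, and a sampling oracle for mu_beta.
import numpy as np

rng = np.random.default_rng(1)
space = np.arange(32)                                 # toy discrete space Omega
H = np.array([bin(w).count("1") for w in space])      # Hamiltonian: popcount (integer-valued)

def Z(beta):
    """Partition function Z(beta) = sum_omega exp(beta * H(omega))."""
    return np.exp(beta * H).sum()

def sample_oracle(beta, size=1):
    """Oracle producing samples omega ~ mu_beta = exp(beta*H) / Z(beta)."""
    p = np.exp(beta * H) / Z(beta)
    return rng.choice(space, p=p, size=size)

print(Z(0.0), sample_oracle(1.0, size=5))   # Z(0) = |Omega| = 32
```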
We consider the problem of estimating the normalized coefficients $c_k$ for
indices $k$ satisfying $\max_\beta \mu_\beta(\{\omega : H(\omega) = k\}) \ge \mu_*$, where $\mu_*$ is a
given parameter and $k$ ranges over a given subset of the values of $H$. We solve this using
a number of samples that we show is optimal up
to logarithmic factors. We also improve the sample complexity
for applications where
the coefficients are log-concave (e.g. counting connected subgraphs of a given
graph).
As a key subroutine, we show how to estimate $Z(\beta)$ using fewer samples than a prior
algorithm of Kolmogorov (2018). We also show a "batched" version of this algorithm which
simultaneously estimates $Z(\beta)$ for many values of
$\beta$, at essentially the same cost as estimating $Z(\beta)$ for a single value
alone. We show matching lower bounds,
demonstrating that this complexity is optimal up to
logarithmic terms.
Comment: Superseded by arXiv:2007.1082
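For intuition, here is the classical fixed-schedule telescoping product estimator for $Z(\beta)/Z(0)$ on a toy space, using only the sampling oracle and the identity $Z(\beta')/Z(\beta) = \mathbb{E}_{\mu_\beta}[e^{(\beta'-\beta)H}]$. This is the baseline style of estimator that this line of work improves on, not the paper's algorithm.

```python
# Sketch: telescoping product estimator for Z(beta)/Z(0) on a toy space.
import numpy as np

rng = np.random.default_rng(2)
space = np.arange(32)                                 # same toy setup as above
H = np.array([bin(w).count("1") for w in space])

def sample_oracle(beta, size):
    p = np.exp(beta * H)
    return rng.choice(space, p=p / p.sum(), size=size)

def estimate_Q(beta, n_stages=10, n_samples=2000):
    """Telescoping product: Z(b')/Z(b) = E_{mu_b}[exp((b'-b) * H)]."""
    betas = np.linspace(0.0, beta, n_stages + 1)
    Q = 1.0
    for b, b_next in zip(betas[:-1], betas[1:]):
        draws = sample_oracle(b, size=n_samples)
        Q *= np.exp((b_next - b) * H[draws]).mean()   # one stage ratio
    return Q                                          # estimates Z(beta) / Z(0)

exact_Q = np.exp(1.0 * H).sum() / len(space)          # Z(1)/Z(0); Z(0) = |Omega|
print(estimate_Q(1.0), exact_Q)
```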
On Sampling from the Gibbs Distribution with Random Maximum A-Posteriori Perturbations
In this paper we describe how MAP inference can be used to sample efficiently
from Gibbs distributions. Specifically, we provide means for drawing either
approximate or unbiased samples from Gibbs distributions by introducing
low-dimensional perturbations and solving the corresponding MAP assignments. Our
approach also leads to new ways to derive lower bounds on partition functions.
We demonstrate empirically that our method excels in the typical "high signal -
high coupling" regime. This setting results in ragged energy landscapes that are
challenging for alternative approaches to sampling and/or deriving lower bounds.
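The idea can be previewed through the full, exponentially high-dimensional version of the trick: perturbing every configuration's potential with i.i.d. Gumbel noise and taking the argmax yields an exact Gibbs sample (the Gumbel-max trick). The paper's contribution is to approximate this with low-dimensional perturbations so the argmax remains a tractable MAP problem; the sketch below simply brute-forces a 16-state toy model.

```python
# Sketch: the Gumbel-max trick on a tiny state space. The argmax of
# Gumbel-perturbed potentials is distributed as softmax(theta).
import numpy as np

rng = np.random.default_rng(3)
theta = rng.normal(size=16)          # log-potentials of a 16-configuration toy model

def gibbs_sample_via_map():
    """Argmax of Gumbel-perturbed potentials is an exact Gibbs sample."""
    gumbel = rng.gumbel(size=theta.shape)   # one i.i.d. perturbation per configuration
    return int(np.argmax(theta + gumbel))   # MAP of the perturbed model

# Empirical argmax frequencies should match softmax(theta).
counts = np.bincount([gibbs_sample_via_map() for _ in range(20000)], minlength=16)
print(np.round(counts / counts.sum(), 3))
print(np.round(np.exp(theta) / np.exp(theta).sum(), 3))
```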
Convergence Rate of Riemannian Hamiltonian Monte Carlo and Faster Polytope Volume Computation
We give the first rigorous proof of the convergence of Riemannian Hamiltonian
Monte Carlo, a general (and practical) method for sampling Gibbs distributions.
Our analysis shows that the rate of convergence is bounded in terms of natural
smoothness parameters of an associated Riemannian manifold. We then apply the
method with the manifold defined by the log barrier function to the problems of
(1) uniformly sampling a polytope and (2) computing its volume, the latter by
extending Gaussian cooling to the manifold setting. In both cases, the total
number of steps needed is $O^{*}(mn^{2/3})$, improving the state of the
art. A key ingredient of our analysis is a proof of an analog of the KLS
conjecture for Gibbs distributions over manifolds.
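For readers new to the method, the sketch below runs plain Euclidean HMC with a leapfrog integrator on a toy Gaussian target. The paper's Riemannian variant replaces the identity mass matrix with a position-dependent metric (there, derived from the log-barrier of the polytope), which this sketch does not attempt.

```python
# Sketch: plain Hamiltonian Monte Carlo targeting exp(-f(x)).
import numpy as np

rng = np.random.default_rng(4)

def f(x):                                    # toy target: standard Gaussian energy
    return 0.5 * x @ x

def grad_f(x):
    return x

def hmc_step(x, step=0.1, n_leapfrog=20):
    """One HMC transition with leapfrog integration and Metropolis correction."""
    p = rng.normal(size=x.shape)                          # resample momentum
    x_new, p_new = x.copy(), p - 0.5 * step * grad_f(x)   # half kick
    for i in range(n_leapfrog):
        x_new = x_new + step * p_new                      # drift
        if i < n_leapfrog - 1:
            p_new = p_new - step * grad_f(x_new)          # full kick
    p_new = p_new - 0.5 * step * grad_f(x_new)            # final half kick
    dH = f(x_new) + 0.5 * p_new @ p_new - (f(x) + 0.5 * p @ p)
    return x_new if rng.random() < np.exp(-dH) else x     # accept/reject

x, samples = np.zeros(2), []
for _ in range(5000):
    x = hmc_step(x)
    samples.append(x)
print(np.cov(np.array(samples).T))           # should approach the identity
```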
Approximation algorithms for the normalizing constant of Gibbs distributions
Consider a family of distributions $\{\pi_\beta\}$ where $X \sim \pi_\beta$
means that $\mathbb{P}(X = x) = \exp(-\beta H(x)) / Z(\beta)$. Here $Z(\beta)$ is the
proper normalizing constant, equal to $\sum_x \exp(-\beta H(x))$. Then
$\{\pi_\beta\}$ is known as a Gibbs distribution, and $Z(\beta)$ is the
partition function. This work presents a new method for approximating the
partition function to a specified level of relative accuracy using only a
small number of samples. This is a sharp improvement over previous, similar approaches that
used a much more complicated algorithm, requiring substantially more samples.
Comment: Published at http://dx.doi.org/10.1214/14-AAP1015 in the Annals of
Applied Probability (http://www.imstat.org/aap/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
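A crude sketch of the role the cooling schedule plays in such estimators: advance $\beta$ adaptively so each stage ratio $Z(\beta')/Z(\beta) = \mathbb{E}_{\pi_\beta}[\exp(-(\beta'-\beta)H(X))]$ stays moderate, then multiply the stage estimates. The step rule below is a naive heuristic for illustration only, not Huber's method, and the space and Hamiltonian are toy choices.

```python
# Sketch: adaptive cooling schedule for estimating Z(beta_max)/Z(beta_min)
# in the exp(-beta*H) convention used by this abstract.
import numpy as np

rng = np.random.default_rng(5)
space = np.arange(32)                                    # toy discrete space
H = np.array([bin(w).count("1") for w in space], dtype=float)

def sample(beta, size):
    p = np.exp(-beta * H)                                # note the -beta*H convention
    return rng.choice(len(space), p=p / p.sum(), size=size)

def estimate_ratio(beta_min, beta_max, n_samples=2000):
    """Adaptive schedule: keep each stage ratio moderate, multiply the stages."""
    beta, log_ratio = beta_min, 0.0
    while beta < beta_max:
        draws = H[sample(beta, n_samples)]
        # Step so the average stage factor exp(-delta * H) is ~ exp(-1/2).
        delta = min(0.5 / max(draws.mean(), 1e-9), beta_max - beta)
        log_ratio += np.log(np.exp(-delta * draws).mean())
        beta += delta
    return np.exp(log_ratio)                             # ~ Z(beta_max) / Z(beta_min)

exact = np.exp(-2.0 * H).sum() / len(space)              # Z(2)/Z(0), since Z(0) = |Omega|
print(estimate_ratio(0.0, 2.0), exact)
```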
How Gibbs distributions may naturally arise from synaptic adaptation mechanisms. A model-based argumentation
This paper addresses two questions in the context of neuronal networks
dynamics, using methods from dynamical systems theory and statistical physics:
(i) How to characterize the statistical properties of sequences of action
potentials ("spike trains") produced by neuronal networks ? and; (ii) what are
the effects of synaptic plasticity on these statistics ? We introduce a
framework in which spike trains are associated to a coding of membrane
potential trajectories, and actually, constitute a symbolic coding in important
explicit examples (the so-called gIF models). On this basis, we use the
thermodynamic formalism from ergodic theory to show how Gibbs distributions are
natural probability measures to describe the statistics of spike trains, given
the empirical averages of prescribed quantities. As a second result, we show
that Gibbs distributions naturally arise when considering "slow" synaptic
plasticity rules where the characteristic time for synapse adaptation is quite
longer than the characteristic time for neurons dynamics.Comment: 39 pages, 3 figure
Gibbs distributions for random partitions generated by a fragmentation process
In this paper we study random partitions of $\{1, \ldots, n\}$, where every cluster of
size $j$ can be in any of $w_j$ possible internal states. The Gibbs $(n,k,w)$
distribution is obtained by sampling uniformly among such partitions with $k$
clusters. We provide conditions on the weight sequence $w$ allowing construction
of a partition-valued random process where at step $k$ the state has the Gibbs
$(n,k,w)$ distribution, so the partition is subject to irreversible fragmentation
as time evolves. For a particular one-parameter family of weight sequences
$w_j$, the time-reversed process is the discrete Marcus-Lushnikov coalescent
process with affine collision rate $K_{i,j} = a + b(i+j)$ for some real numbers $a$
and $b$. Under further restrictions on $a$ and $b$, the fragmentation process can be
realized by conditioning a Galton-Watson tree with suitable offspring
distribution to have $n$ nodes, and cutting the edges of this tree by random
sampling of edges without replacement, to partition the tree into a collection
of subtrees. Suitable offspring distributions include the binomial, negative
binomial and Poisson distributions.
Comment: 38 pages, 2 figures, version considerably modified. To appear in the
Journal of Statistical Physics
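The tree-cutting construction in the last sentences can be illustrated mechanically. The sketch below grows a generic random attachment tree (a stand-in, not the conditioned Galton-Watson tree of the paper), deletes $k-1$ uniformly sampled edges without replacement, and reads off the resulting partition of $\{1, \ldots, n\}$ from the subtree components.

```python
# Sketch: partition {1,...,n} by cutting k-1 random edges of a random tree.
import random

def random_tree_edges(n, seed=0):
    """Random attachment tree: node i attaches to a uniform earlier node."""
    rng = random.Random(seed)
    return [(i, rng.randrange(i)) for i in range(1, n)]

def cut_into_partition(n, k, seed=0):
    """Delete k-1 random edges without replacement; return the subtree partition."""
    edges = random_tree_edges(n, seed)
    cut_rng = random.Random(seed + 1)
    kept = set(edges) - set(cut_rng.sample(edges, k - 1))
    parent = list(range(n))                  # union-find over the kept edges
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]    # path halving
            v = parent[v]
        return v
    for u, v in kept:
        parent[find(u)] = find(v)
    blocks = {}
    for v in range(n):
        blocks.setdefault(find(v), []).append(v + 1)   # 1-based labels
    return sorted(blocks.values())

print(cut_into_partition(n=10, k=3, seed=42))   # a partition of {1,...,10} into 3 blocks
```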