Cutoff for non-backtracking random walks on sparse random graphs
A finite ergodic Markov chain is said to exhibit cutoff if its distance to
stationarity remains close to 1 over a certain number of iterations and then
abruptly drops to near 0 on a much shorter time scale. Discovered in the
context of card shuffling (Aldous-Diaconis, 1986), this phenomenon is now
believed to be rather typical among fast mixing Markov chains. Yet,
establishing it rigorously often requires a challengingly detailed
understanding of the underlying chain. Here we consider non-backtracking random
walks on random graphs with a given degree sequence. Under a general sparsity
condition, we establish the cutoff phenomenon, determine its precise window,
and prove that the (suitably rescaled) cutoff profile approaches a remarkably
simple, universal shape.
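As a rough illustration of the phenomenon described above (not the paper's proof technique), the following Python sketch evolves the exact distribution of a non-backtracking walk on a small random regular graph and records its total-variation distance to stationarity at each step. The graph size, degree, and random seed are arbitrary choices made for the sketch.

```python
import random

def random_regular_graph(n, d, seed=0):
    # Configuration model: shuffle half-edges, pair consecutive ones,
    # and retry until the resulting graph is simple.
    rng = random.Random(seed)
    while True:
        stubs = [v for v in range(n) for _ in range(d)]
        rng.shuffle(stubs)
        edges, ok = set(), True
        for i in range(0, len(stubs), 2):
            u, v = stubs[i], stubs[i + 1]
            if u == v or (u, v) in edges or (v, u) in edges:
                ok = False
                break
            edges.add((u, v))
        if ok:
            adj = {v: [] for v in range(n)}
            for u, v in edges:
                adj[u].append(v)
                adj[v].append(u)
            return adj

def nb_tv_curve(adj, t_max):
    # Exact distribution of the non-backtracking walk over directed edges
    # (u, v), started from one edge; on a regular graph its stationary
    # distribution is uniform over directed edges.
    edges = [(u, v) for u in adj for v in adj[u]]
    idx = {e: i for i, e in enumerate(edges)}
    m = len(edges)
    dist = [0.0] * m
    dist[0] = 1.0
    curve = []
    for _ in range(t_max):
        curve.append(0.5 * sum(abs(p - 1.0 / m) for p in dist))
        new = [0.0] * m
        for (u, v), i in idx.items():
            if dist[i]:
                nbrs = [w for w in adj[v] if w != u]  # no backtracking
                for w in nbrs:
                    new[idx[(v, w)]] += dist[i] / len(nbrs)
        dist = new
    return curve

adj = random_regular_graph(200, 3, seed=1)
curve = nb_tv_curve(adj, 30)
```

On an instance like this, the curve stays near 1 for roughly $\log_{d-1}$ of the number of directed edges steps and then drops sharply, which is the abrupt transition the abstract calls cutoff.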
Universality of cutoff for the Ising model
On any locally-finite geometry, the stochastic Ising model is known to be
contractive when the inverse-temperature $\beta$ is small enough, via classical
results of Dobrushin and of Holley in the 1970s. By a general principle
proposed by Peres, the dynamics is then expected to exhibit cutoff. However, so
far cutoff for the Ising model has been confirmed mainly for lattices, heavily
relying on amenability and log-Sobolev inequalities. Without these, cutoff was
unknown at any fixed $\beta > 0$, no matter how small, even in basic examples
such as the Ising model on a binary tree or a random regular graph.
We use the new framework of information percolation to show that, in any
geometry, there is cutoff for the Ising model at high enough temperatures.
Precisely, on any sequence of graphs with maximum degree $d$, the Ising model
has cutoff provided that $\beta < \kappa/d$ for some absolute constant $\kappa$
(a result which, up to the value of $\kappa$, is best possible). Moreover, the
cutoff location is established as the time at which the sum of squared
magnetizations drops to 1, and the cutoff window is $O(1)$, just as when
$\beta = 0$.
Finally, the mixing time from almost every initial state is not more than a
factor of $1+\epsilon_\beta$ faster than the worst one (with $\epsilon_\beta \to 0$
as $\beta \to 0$), whereas the uniform starting state is at
least $2-\epsilon_\beta$ times faster.
Comment: 26 pages, 2 figures. Companion paper to arXiv:1401.606
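For concreteness, here is a minimal heat-bath (Glauber) dynamics sketch for the Ising model on an arbitrary graph; it illustrates the chain the abstract studies but none of the information-percolation machinery. The cycle graph, the value $\beta = 0.2$, and the run length are arbitrary high-temperature choices for the sketch.

```python
import math
import random

def glauber_step(spins, adj, beta, rng):
    # One heat-bath update: pick a uniform vertex and resample its spin
    # from the Ising conditional distribution given its neighbours,
    # P(spin = +1 | field) = e^{beta*field} / (e^{beta*field} + e^{-beta*field}).
    v = rng.randrange(len(spins))
    field = sum(spins[w] for w in adj[v])
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
    spins[v] = 1 if rng.random() < p_plus else -1

# High-temperature run on a cycle, started from the all-plus state.
n, beta = 50, 0.2
adj = {v: [(v - 1) % n, (v + 1) % n] for v in range(n)}
spins = [1] * n
rng = random.Random(0)
for _ in range(200 * n):  # 200 sweeps
    glauber_step(spins, adj, beta, rng)
magnetization = sum(spins) / n
```

At this small $\beta$ the magnetization of the all-plus start decays towards its near-zero equilibrium value, consistent with the contractive high-temperature regime the abstract describes.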
Weighted dependency graphs
The theory of dependency graphs is a powerful toolbox to prove asymptotic
normality of sums of random variables. In this article, we introduce a more
general notion of weighted dependency graphs and give normality criteria in
this context. We also provide generic tools to prove that some weighted graph
is a weighted dependency graph for a given family of random variables.
To illustrate the power of the theory, we give applications to the following
objects: uniform random pair partitions, the random graph model $G(n,M)$,
uniform random permutations, the symmetric simple exclusion process and
multilinear statistics on Markov chains. The application to random permutations
gives a bivariate extension of a functional central limit theorem of Janson and
Barbour. On Markov chains, we answer positively an open question of Bourdon and
Vall\'ee on the asymptotic normality of subword counts in random texts
generated by a Markovian source.
Comment: 57 pages. Third version: minor modifications, after review process.
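As a toy version of the Markov-chain application (not the weighted-dependency-graph machinery itself), the sketch below compares the exact mean of a subword count in a stationary two-state Markov text against simulation; the alphabet, the transition matrix `P`, and the pattern "ab" are all illustrative choices.

```python
import random

# Two-state Markov source over {'a', 'b'}; transition probabilities are arbitrary.
P = {'a': {'a': 0.3, 'b': 0.7}, 'b': {'a': 0.6, 'b': 0.4}}

def stationary(P):
    # Stationary distribution of a two-state chain, in closed form:
    # pi_a = P(b->a) / (P(a->b) + P(b->a)).
    pa = P['b']['a'] / (P['a']['b'] + P['b']['a'])
    return {'a': pa, 'b': 1.0 - pa}

def sample_count(n, rng):
    # Count occurrences of the factor "ab" in a stationary Markov text
    # of length n.
    pi = stationary(P)
    x = 'a' if rng.random() < pi['a'] else 'b'
    count = 0
    for _ in range(n - 1):
        y = 'a' if rng.random() < P[x]['a'] else 'b'
        if x == 'a' and y == 'b':
            count += 1
        x = y
    return count

rng = random.Random(0)
n, trials = 200, 5000
exact_mean = (n - 1) * stationary(P)['a'] * P['a']['b']  # linearity of expectation
samples = [sample_count(n, rng) for _ in range(trials)]
emp_mean = sum(samples) / trials
```

The individual occurrence indicators are dependent, which is exactly the situation where a (weighted) dependency graph argument is needed to upgrade this mean computation to the asymptotic normality of the count; the sketch checks only the mean.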