In a Markov chain started at a state x, the hitting time τ(y) is the
first time that the chain reaches another state y. We study the probability
Px(τ(y)=t) that the first visit to y occurs precisely at a
given time t. Informally speaking, the event that a new state is visited at a
large time t may be considered a "surprise". We prove the following three
bounds:
1) In any Markov chain with n states, Px(τ(y)=t) ≤ n/t.
2) In a reversible chain with n states, Px(τ(y)=t) ≤ √(2n)/t for t ≥ 4n+4.
3) For random walk on a simple graph with n≥2 vertices,
Px(τ(y)=t) ≤ 4e log n / t.
We construct examples showing that these bounds are close to optimal. The
main feature of our bounds is that they require very little knowledge of the
structure of the Markov chain.
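As a rough illustration of how such inequalities can be checked numerically, here is a minimal Monte Carlo sketch (not part of the paper) for simple random walk on an n-cycle; the choice of graph, target vertex, time t, and trial count are assumptions made purely for illustration.

```python
# Monte Carlo sketch: estimate the "surprise" probability P_x(tau(y) = t) for
# simple random walk on the n-cycle and compare it with the bounds n/t and
# 4e*log(n)/t. All parameter values below are illustrative assumptions.
import math
import random

def surprise_probability(n, x, y, t, trials=100_000):
    """Estimate P_x(tau(y) = t) for simple random walk on the n-cycle."""
    hits_at_t = 0
    for _ in range(trials):
        state = x
        first_visit = None
        for step in range(1, t + 1):
            state = (state + random.choice((-1, 1))) % n
            if state == y:
                first_visit = step      # record the first visit to y
                break
        if first_visit == t:
            hits_at_t += 1
    return hits_at_t / trials

n, x, y, t = 20, 0, 10, 60
est = surprise_probability(n, x, y, t)
print(f"estimated P_x(tau(y)={t}) ~ {est:.4f}")
print(f"general bound  n/t          = {n / t:.4f}")
print(f"graph bound    4e*log(n)/t  = {4 * math.e * math.log(n) / t:.4f}")
```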
To prove the bound for random walk on graphs, we establish the following
estimate conjectured by Aldous, Ding and Oveis-Gharan (private communication):
For random walk on an n-vertex graph, for every initial vertex x,
\[ \sum_y \left( \sup_{t \ge 0} p^t(x, y) \right) = O(\log n). \]
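For intuition only, the following small numerical sketch (again not part of the paper's argument) evaluates the sum of sup-transition probabilities for simple random walk on a path graph, truncating the supremum at a finite horizon T; the graph, its size, and T are illustrative assumptions.

```python
# Numerical sketch: compute sum_y max_{0 <= t <= T} p^t(x, y) for simple random
# walk on a path graph and compare it with log n. The theorem's supremum is
# over all t >= 0; the finite horizon T here is an approximation.
import numpy as np

def sup_transition_sum(adjacency, x, T=5000):
    """Return sum_y max_{0 <= t <= T} p^t(x, y) for simple random walk."""
    degrees = adjacency.sum(axis=1)
    P = adjacency / degrees[:, None]              # transition matrix of the walk
    dist = np.zeros(len(adjacency)); dist[x] = 1.0  # p^0(x, .)
    best = dist.copy()                            # running max over t of p^t(x, y)
    for _ in range(T):
        dist = dist @ P
        best = np.maximum(best, dist)
    return best.sum()

n = 64
A = np.zeros((n, n))
for i in range(n - 1):                            # path graph on n vertices
    A[i, i + 1] = A[i + 1, i] = 1.0
print(f"sum_y sup_t p^t(0, y) ~ {sup_transition_sum(A, x=0):.2f}")
print(f"log n                 = {np.log(n):.2f}")
```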