
    A polynomial time approximation scheme for computing the supremum of Gaussian processes

    We give a polynomial time approximation scheme (PTAS) for computing the supremum of a Gaussian process. That is, given a finite set of vectors $V \subseteq \mathbb{R}^d$, we compute a $(1+\varepsilon)$-factor approximation to $\mathbb{E}_{X \leftarrow \mathcal{N}^d}[\sup_{v \in V} |\langle v, X \rangle|]$ deterministically in time $\operatorname{poly}(d) \cdot |V|^{O_{\varepsilon}(1)}$. Previously, only a constant-factor deterministic polynomial time approximation algorithm was known, due to the work of Ding, Lee and Peres [Ann. of Math. (2) 175 (2012) 1409-1471]. This answers an open question of Lee (2010) and Ding [Ann. Probab. 42 (2014) 464-496]. The study of suprema of Gaussian processes is of considerable importance in probability, with applications in functional analysis, convex geometry, and, in light of the recent breakthrough work of Ding, Lee and Peres [Ann. of Math. (2) 175 (2012) 1409-1471], random walks on finite graphs. As such, our result could be of use elsewhere. In particular, combined with the work of Ding [Ann. Probab. 42 (2014) 464-496], our result yields a PTAS for computing the cover time of bounded-degree graphs. Previously, such algorithms were known only for trees. Along the way, we also give an explicit oblivious estimator for semi-norms in Gaussian space with optimal query complexity. Our algorithm and its analysis are elementary in nature, using two classical comparison inequalities, Slepian's lemma and Kanter's lemma.
    Comment: Published at http://dx.doi.org/10.1214/13-AAP997 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org)
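    The quantity being approximated is easy to estimate by sampling; the point of the paper is doing it deterministically. As a point of reference only, here is a minimal Monte Carlo sketch of the randomized baseline (NumPy; the function name and parameters are illustrative), not the deterministic PTAS described above.

```python
import numpy as np

def mc_gaussian_sup(V, num_samples=20_000, seed=0):
    """Monte Carlo estimate of E_{X ~ N(0, I_d)} [ sup_{v in V} |<v, X>| ].

    V: array of shape (m, d), one vector of the finite index set per row.
    """
    rng = np.random.default_rng(seed)
    V = np.asarray(V, dtype=float)
    X = rng.standard_normal((num_samples, V.shape[1]))  # rows are samples of X ~ N(0, I_d)
    per_sample_sup = np.abs(X @ V.T).max(axis=1)        # sup over v in V of |<v, X>| per sample
    return per_sample_sup.mean()

# Sanity check: for V = standard basis of R^d the supremum is max_i |X_i|,
# whose expectation grows like sqrt(2 log d).
d = 16
print(mc_gaussian_sup(np.eye(d)))
```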

    The evolution of the cover time

    The cover time of a graph is a celebrated example of a parameter that is easy to approximate using a randomized algorithm, but for which no constant factor deterministic polynomial time approximation is known. A breakthrough due to Kahn, Kim, Lovasz and Vu yielded a $(\log \log n)^2$-factor polynomial time approximation. We refine this upper bound, and show that the resulting bound is sharp and explicitly computable in random graphs. Cooper and Frieze showed that the cover time of the largest component of the Erdos-Renyi random graph $G(n,c/n)$ in the supercritical regime with $c>1$ fixed is asymptotic to $f(c)\, n \log^2 n$, where $f(c)$ tends to 1 as $c$ tends to 1. However, our new bound implies that the cover time for the critical Erdos-Renyi random graph $G(n,1/n)$ has order $n$, and shows how the cover time evolves from the critical window to the supercritical phase. Our general estimate also yields the order of the cover time for a variety of other concrete graphs, including critical percolation clusters on the Hamming hypercube $\{0,1\}^n$, on high-girth expanders, and on tori $\mathbb{Z}_n^d$ for fixed large $d$. For the graphs we consider, our results show that the blanket time, introduced by Winkler and Zuckerman, is within a constant factor of the cover time. Finally, we prove that for any connected graph, adding an edge can increase the cover time by at most a factor of 4.
    Comment: 14 pages, to appear in CP
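    The randomized approximation mentioned at the start is simply simulation. As an illustration of the object being studied, the sketch below estimates the cover time of a simple random walk on the largest component of a supercritical $G(n,c/n)$ empirically (networkx; names and parameters are illustrative, and the starts are random rather than worst-case).

```python
import random
import networkx as nx

def cover_time_one_walk(G, start, rng):
    """Steps taken by one simple random walk on G, from `start`, until every vertex is visited."""
    unvisited = set(G.nodes)
    v = start
    unvisited.discard(v)
    steps = 0
    while unvisited:
        v = rng.choice(list(G.neighbors(v)))
        unvisited.discard(v)
        steps += 1
    return steps

def estimate_cover_time(G, runs=20, seed=0):
    """Average covering time over independent walks, each from a uniformly random start vertex."""
    rng = random.Random(seed)
    nodes = list(G.nodes)
    return sum(cover_time_one_walk(G, rng.choice(nodes), rng) for _ in range(runs)) / runs

# Largest component of a supercritical Erdos-Renyi graph G(n, c/n) with c > 1.
n, c = 500, 2.0
G = nx.erdos_renyi_graph(n, c / n, seed=1)
giant = G.subgraph(max(nx.connected_components(G), key=len)).copy()
print(giant.number_of_nodes(), estimate_cover_time(giant, runs=5))
```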

    Majorizing measures for the optimizer

    The theory of majorizing measures, extensively developed by Fernique, Talagrand and many others, provides one of the most general frameworks for controlling the behavior of stochastic processes. In particular, it can be applied to derive quantitative bounds on the expected suprema and the degree of continuity of sample paths for many processes. One of the crowning achievements of the theory is Talagrand's tight alternative characterization of the suprema of Gaussian processes in terms of majorizing measures. The proof of this theorem was difficult, and thus considerable effort was put into the task of developing proofs that are both shorter and easier to understand. A major reason for this difficulty was considered to be the theory of majorizing measures itself, which had the reputation of being opaque and mysterious. As a consequence, most recent treatments of the theory (including by Talagrand himself) have eschewed the use of majorizing measures in favor of a purely combinatorial approach (the generic chaining), where objects based on sequences of partitions provide roughly matching upper and lower bounds on the desired expected supremum. In this paper, we return to majorizing measures as a primary object of study, and give a viewpoint that we think is natural and clarifying from an optimization perspective. As our main contribution, we give an algorithmic proof of the majorizing measures theorem based on two parts: (1) we make the simple (but apparently new) observation that finding the best majorizing measure can be cast as a convex program, which also allows for efficiently computing the measure using off-the-shelf methods from convex optimization; (2) we obtain tree-based upper and lower bound certificates by rounding, in a series of steps, the primal and dual solutions to this convex program. While duality has conceptually been part of the theory since its beginnings, as far as we are aware, no explicit link to convex optimization has been previously made.
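    For a concrete handle on the object being optimized, the sketch below evaluates the classical majorizing-measure functional $\sup_{t \in T} \int_0^{\operatorname{diam}(T)} \sqrt{\log(1/\mu(B(t,r)))}\, dr$ on a finite metric space for a given probability measure $\mu$ (NumPy; illustrative names, crude discretization of the integral). The paper's contribution concerns minimizing this quantity over $\mu$ via a convex program and rounding; the snippet only evaluates it for a fixed $\mu$.

```python
import numpy as np

def majorizing_measure_functional(D, mu, num_radii=200):
    """Discretized evaluation of  sup_t  integral_0^diam  sqrt( log 1/mu(B(t, r)) )  dr
    for a finite metric space with distance matrix D and probability vector mu."""
    D = np.asarray(D, dtype=float)
    mu = np.asarray(mu, dtype=float)
    radii = np.linspace(0.0, D.max(), num_radii)
    dr = radii[1] - radii[0]
    best = 0.0
    for t in range(D.shape[0]):
        ball_mass = np.array([mu[D[t] <= r].sum() for r in radii])  # mu(B(t, r)) for each r
        ball_mass = np.clip(ball_mass, 1e-12, 1.0)                  # guard against log(0)
        integrand = np.sqrt(np.log(1.0 / ball_mass))
        best = max(best, integrand.sum() * dr)
    return best

# Toy example: 20 random points in R^3 with the uniform measure.
rng = np.random.default_rng(0)
P = rng.standard_normal((20, 3))
D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
mu = np.full(20, 1 / 20)
print(majorizing_measure_functional(D, mu))
```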

    A sharp estimate for cover times on binary trees

    We compute the second order correction for the cover time of the binary tree of depth $n$ by (continuous-time) random walk, and show that with probability approaching 1 as $n$ increases, $\sqrt{\tau_{\mathrm{cov}}} = \sqrt{|E|}\left[\sqrt{2\log 2}\cdot n - \frac{\log n}{\sqrt{2\log 2}} + O\bigl((\log\log n)^8\bigr)\right]$, thus showing that the second order correction differs from the corresponding one for the maximum of the Gaussian free field on the tree.
    Comment: 14 pages, no figures
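    Squaring the displayed expansion gives the corresponding statement for $\tau_{\mathrm{cov}}$ itself; this is a routine algebraic step in which the lower-order $(\log n)^2/(2\log 2)$ term is absorbed into the error:

```latex
\tau_{\mathrm{cov}}
  = |E|\Bigl(\sqrt{2\log 2}\cdot n - \tfrac{\log n}{\sqrt{2\log 2}} + O\bigl((\log\log n)^8\bigr)\Bigr)^2
  = |E|\Bigl(2\log 2 \cdot n^2 - 2\, n\log n + O\bigl(n(\log\log n)^8\bigr)\Bigr).
```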