189 research outputs found

    A Matrix Expander Chernoff Bound

    We prove a Chernoff-type bound for sums of matrix-valued random variables sampled via a random walk on an expander, confirming a conjecture due to Wigderson and Xiao. Our proof is based on a new multi-matrix extension of the Golden-Thompson inequality, which improves in some ways on the inequality of Sutter, Berta, and Tomamichel and may be of independent interest, as well as an adaptation of an argument for the scalar case due to Healy. Secondarily, we provide a generic reduction showing that any concentration inequality for vector-valued martingales implies a concentration inequality for the corresponding expander walk, with a weakening of parameters proportional to the squared mixing time.
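    For orientation, the sketch below records the scalar Golden-Thompson inequality and the general shape of a matrix Chernoff bound for expander walks. This is a hedged paraphrase from memory, not a verbatim statement of the paper's theorems; the exact constants, hypotheses, and the multi-matrix extension are as in the paper itself.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Scalar Golden--Thompson inequality, for Hermitian matrices A and B:
\[
  \operatorname{tr} e^{A+B} \;\le\; \operatorname{tr}\bigl(e^{A} e^{B}\bigr).
\]

% Shape of a matrix expander Chernoff bound (constants and exact hypotheses
% as in the paper): G is a regular graph with spectral expansion \lambda,
% f maps vertices to d-by-d Hermitian matrices with \|f(v)\| \le 1 and
% \sum_{v} f(v) = 0, and v_1, \dots, v_k is a stationary random walk on G.
\[
  \Pr\Bigl[\,\bigl\|\tfrac{1}{k}\textstyle\sum_{i=1}^{k} f(v_i)\bigr\| \ge \varepsilon\Bigr]
  \;\le\; 2d \cdot \exp\bigl(-\Omega\bigl(\varepsilon^{2}(1-\lambda)\,k\bigr)\bigr).
\]

\end{document}
```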

    Towards Stronger Counterexamples to the Log-Approximate-Rank Conjecture

    We give improved separations for the query complexity analogue of the log-approximate-rank conjecture, i.e., we show that there is a plethora of total Boolean functions on n input bits, each of which has approximate Fourier sparsity at most O(n^3) and randomized parity decision tree complexity Θ(n). This improves upon the recent work of Chattopadhyay, Mande and Sherif (JACM '20) both qualitatively (in terms of designing a large number of examples) and quantitatively (improving the gap from quartic to cubic). We leave open the problem of proving a randomized communication complexity lower bound for XOR compositions of our examples. A linear lower bound would lead to new and improved refutations of the log-approximate-rank conjecture. Moreover, if any of these compositions had even a sub-linear cost randomized communication protocol, it would demonstrate that randomized parity decision tree complexity does not lift to randomized communication complexity in general (with the XOR gadget).
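    As a small, self-contained illustration (not from the paper) of one quantity in the abstract, the brute-force sketch below computes the Fourier expansion of a total Boolean function over {-1,1}^n and counts its nonzero coefficients, i.e. its exact Fourier sparsity; the approximate Fourier sparsity discussed above relaxes this to the sparsest real polynomial that is pointwise close to f. The example function and the tolerance are illustrative choices only.

```python
# Minimal sketch: brute-force Fourier expansion of a Boolean function
# f: {-1,1}^n -> {-1,1}. Illustrative only; the functions in the paper are
# different and are built to have small approximate Fourier sparsity.
from itertools import product


def fourier_coefficients(f, n):
    """Return {S: hat_f(S)} with hat_f(S) = E_x[f(x) * prod_{i in S} x_i],
    where S is encoded as a 0/1 indicator tuple over the n coordinates."""
    points = list(product((-1, 1), repeat=n))
    coeffs = {}
    for S in product((0, 1), repeat=n):
        total = 0.0
        for x in points:
            chi = 1
            for i in range(n):
                if S[i]:
                    chi *= x[i]
            total += f(x) * chi
        coeffs[S] = total / len(points)
    return coeffs


def exact_sparsity(coeffs, tol=1e-9):
    """Number of Fourier coefficients with magnitude above tol."""
    return sum(1 for c in coeffs.values() if abs(c) > tol)


if __name__ == "__main__":
    n = 3
    # Example: a function depending only on the first two bits
    # (returns 1 exactly when both are -1); its exact sparsity is 4.
    f = lambda x: 1 if (x[0] == -1 and x[1] == -1) else -1
    coeffs = fourier_coefficients(f, n)
    print("exact Fourier sparsity:", exact_sparsity(coeffs))
```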