6 research outputs found
Fast and Powerful Hashing using Tabulation
Randomized algorithms are often enjoyed for their simplicity, but the hash
functions employed to yield the desired probabilistic guarantees are often too
complicated to be practical. Here we survey recent results on how simple
hashing schemes based on tabulation provide unexpectedly strong guarantees.
Simple tabulation hashing dates back to Zobrist [1970]. Keys are viewed as
consisting of c characters and we have precomputed character tables
h_1, ..., h_c mapping characters to random hash values. A key x = (x_1, ..., x_c)
is hashed to h_1[x_1] ⊕ h_2[x_2] ⊕ ... ⊕ h_c[x_c]. This scheme is
very fast with character tables in cache. While simple tabulation is not even
4-independent, it does provide many of the guarantees that are normally
obtained via higher independence, e.g., linear probing and Cuckoo hashing.
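To make the scheme concrete, here is a minimal sketch of simple tabulation hashing. The parameters (32-bit keys split into c = 4 characters of 8 bits each, 32-bit hash values, and the fixed seed) are illustrative assumptions, not parameters from the survey.

```python
import random

# Illustrative parameters: 32-bit keys as c = 4 characters of 8 bits each.
C, BITS = 4, 8
MASK = (1 << BITS) - 1

# Precomputed tables of random 32-bit hash values, one per character position.
random.seed(42)
tables = [[random.getrandbits(32) for _ in range(1 << BITS)] for _ in range(C)]

def simple_tabulation(x: int) -> int:
    """Hash key x by XORing one table lookup per 8-bit character."""
    h = 0
    for i in range(C):
        h ^= tables[i][(x >> (BITS * i)) & MASK]
    return h
```

With 8-bit characters, all four tables fit comfortably in cache, which is where the speed comes from: hashing costs four lookups and three XORs.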
Next we consider twisted tabulation where one input character is "twisted" in
a simple way. The resulting hash function has powerful distributional
properties: Chernoff-Hoeffding type tail bounds and a very small bias for
min-wise hashing. This also yields an extremely fast pseudo-random number
generator that is provably good for many classic randomized algorithms and
data structures.
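A hedged sketch of the "twist" idea: the lookups for the first c-1 characters also produce a small twister value that perturbs the last character before its final table lookup. The concrete parameters below (32-bit keys, 8-bit characters, c = 4) are assumptions chosen for illustration.

```python
import random

C, BITS = 4, 8
MASK = (1 << BITS) - 1

random.seed(7)
# Tables for positions 0..c-2 return (twister, hash value) pairs.
head = [[(random.getrandbits(BITS), random.getrandbits(32))
         for _ in range(1 << BITS)] for _ in range(C - 1)]
# The last position has an ordinary table of 32-bit hash values.
tail = [random.getrandbits(32) for _ in range(1 << BITS)]

def twisted_tabulation(x: int) -> int:
    """Simple tabulation on the first c-1 characters, with the XOR of
    their twisters applied to the last character before its lookup."""
    t, h = 0, 0
    for i in range(C - 1):
        ti, hi = head[i][(x >> (BITS * i)) & MASK]
        t ^= ti
        h ^= hi
    last = (x >> (BITS * (C - 1))) & MASK
    return h ^ tail[last ^ t]
```

The extra cost over simple tabulation is one XOR on an 8-bit value, which is why the scheme remains fast enough to serve as a pseudo-random number generator.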
Finally, we consider double tabulation where we compose two simple tabulation
functions, applying one to the output of the other, and show that this yields
very high independence in the classic framework of Carter and Wegman [1977]. In
fact, w.h.p., for a given set of size proportional to that of the space
consumed, double tabulation gives fully-random hashing. We also mention some
more elaborate tabulation schemes getting near-optimal independence for given
time and space.
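The composition itself is straightforward to sketch: the first simple tabulation function maps a key to a longer string of derived characters, and a second, independent simple tabulation function hashes that string. The sizes below (8-bit characters, 4 input and 6 derived characters, 32-bit output) are illustrative assumptions, not the parameters used in the analysis.

```python
import random

BITS = 8
MASK = (1 << BITS) - 1
C_IN, C_OUT = 4, 6  # input characters, derived characters (assumed sizes)

random.seed(0)
def make_tables(c, out_bits):
    return [[random.getrandbits(out_bits) for _ in range(1 << BITS)]
            for _ in range(c)]

# First function maps a 4-character key to 6 derived 8-bit characters.
first = make_tables(C_IN, C_OUT * BITS)
# Second function hashes the derived characters to a 32-bit value.
second = make_tables(C_OUT, 32)

def simple_tab(tables, x, c):
    h = 0
    for i in range(c):
        h ^= tables[i][(x >> (BITS * i)) & MASK]
    return h

def double_tabulation(x: int) -> int:
    y = simple_tab(first, x, C_IN)   # derived key of C_OUT characters
    return simple_tab(second, y, C_OUT)
```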
While these tabulation schemes are all easy to implement and use, their
analysis is not.
Quantum Speedup for Graph Sparsification, Cut Approximation and Laplacian Solving
Graph sparsification underlies a large number of algorithms, ranging from
approximation algorithms for cut problems to solvers for linear systems in the
graph Laplacian. In its strongest form, "spectral sparsification" reduces the
number of edges to near-linear in the number of nodes, while approximately
preserving the cut and spectral structure of the graph. In this work we
demonstrate a polynomial quantum speedup for spectral sparsification and many
of its applications. In particular, we give a quantum algorithm that, given a
weighted graph with n nodes and m edges, outputs a classical description of
an ε-spectral sparsifier in sublinear time Õ(√(mn)/ε). This contrasts with
the optimal classical complexity Õ(m). We also prove that our quantum
algorithm is optimal
up to polylog-factors. The algorithm builds on a string of existing results on
sparsification, graph spanners, quantum algorithms for shortest paths, and
efficient constructions for k-wise independent random strings. Our algorithm
implies a quantum speedup for solving Laplacian systems and for approximating a
range of cut problems such as min cut and sparsest cut.
Comment: v2: several small improvements to the text. An extended abstract will
appear in FOCS'20; v3: corrected a minor mistake in the Appendix
From Independence to Expansion and Back Again
We consider the following fundamental problems: (1) Constructing
k-independent hash functions with a space-time tradeoff close to Siegel's
lower bound. (2) Constructing representations of unbalanced expander graphs
having small size and allowing fast computation of the neighbor function. It is
not hard to show that these problems are intimately connected in the sense that
a good solution to one of them leads to a good solution to the other one. In
this paper we exploit this connection to present efficient, recursive
constructions of k-independent hash functions (and hence expanders with a
small representation). While the previously most efficient construction
(Thorup, FOCS 2013) needed time quasipolynomial in Siegel's lower bound, our
time bound is just a logarithmic factor from the lower bound.
Comment: An extended abstract of this paper was accepted to The 47th ACM
Symposium on Theory of Computing (STOC 2015). Copyright ACM
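As background for what k-independence means, the classic Carter-Wegman construction evaluates a degree-(k-1) polynomial with random coefficients over a prime field; the constructions in this paper achieve the same guarantee much faster, so the sketch below is only a baseline illustrating the definition. The choice p = 2^61 - 1 (a Mersenne prime) and k = 5 are assumptions for illustration.

```python
import random

P = (1 << 61) - 1  # assumed Mersenne prime modulus
K = 5              # degree of independence (assumed)

random.seed(1)
coeffs = [random.randrange(P) for _ in range(K)]

def k_independent_hash(x: int) -> int:
    """Evaluate the random degree-(K-1) polynomial at x via Horner's rule."""
    h = 0
    for a in coeffs:
        h = (h * x + a) % P
    return h
```

Evaluating K coefficients per key is what makes this baseline slow for large K; Siegel-style tabulation constructions replace the polynomial evaluation with a constant number of table lookups.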