Lower bounds for sparse recovery problems
Sparse recovery, or compressed sensing, is the problem of estimating a signal from noisy linear measurements of that signal. Sparse recovery has traditionally been used in areas such as image acquisition, streaming algorithms, and genetic testing, and, more recently, for image recovery tasks.
Over the last decade many techniques have been developed for sparse recovery under various guarantees. We develop new lower bound techniques and show the tightness of existing results for the following variants of the sparse recovery problem: (i) adaptive sparse recovery, (ii) sparse recovery under high SNR, (iii) deterministic L2 heavy hitters, and (iv) compressed sensing with generative models.
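The measurement model above (estimating a sparse signal from linear measurements y = Ax) can be illustrated with a minimal numerical sketch. The use of orthogonal matching pursuit, and all dimensions and parameters below, are illustrative choices for the demo, not the algorithms studied in the thesis:

```python
import numpy as np

# Minimal compressed-sensing demo (illustrative, not the thesis's algorithms):
# recover a k-sparse x from m < n linear measurements y = A x
# via orthogonal matching pursuit (OMP).
rng = np.random.default_rng(0)
n, m, k = 100, 60, 5                  # ambient dimension, measurements, sparsity

x = np.zeros(n)
true_support = rng.choice(n, size=k, replace=False)
x[true_support] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
y = A @ x                                       # noiseless measurements

# OMP: greedily add the column most correlated with the residual,
# then re-fit by least squares on the selected support.
sel = []
residual = y.copy()
for _ in range(k):
    sel.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, sel], y, rcond=None)
    residual = y - A[:, sel] @ coef

x_hat = np.zeros(n)
x_hat[sel] = coef
print(np.linalg.norm(x_hat - x))   # near zero when the support is recovered
```

With Gaussian measurements and m well above 2k log n, OMP recovers the support exactly with high probability in the noiseless case.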
High Probability Frequency Moment Sketches
We consider the problem of sketching the p-th frequency moment of a vector, p>2, with multiplicative error at most 1 +/- epsilon and with high confidence 1-delta. Despite the long sequence of work on this problem, tight bounds on this quantity are only known for constant delta. While one can obtain an upper bound with error probability delta by repeating a sketching algorithm with constant error probability O(log(1/delta)) times in parallel, and taking the median of the outputs, we show this is a suboptimal algorithm! Namely, we show optimal upper and lower bounds of Theta(n^{1-2/p} log(1/delta) + n^{1-2/p} log^{2/p} (1/delta) log n) on the sketching dimension, for any constant approximation. Our result should be contrasted with results for estimating frequency moments for 1 <= p <= 2, for which we show the optimal algorithm for general delta is obtained by repeating the optimal algorithm for constant error probability O(log(1/delta)) times and taking the median output. We also obtain a matching lower bound for this problem, up to constant factors.
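The repeat-and-take-the-median amplification discussed above can be sketched in its simplest setting: an AMS-style F_2 (p = 2) sign sketch as the constant-failure-probability base estimator, with the median of O(log(1/delta)) independent copies boosting the confidence. All parameters below are illustrative:

```python
import numpy as np

# Median amplification: run a constant-failure-probability estimator
# t = O(log(1/delta)) times independently and return the median.
# Base estimator: AMS-style F_2 sketch (mean of squared sign dot products).
rng = np.random.default_rng(1)

x = rng.standard_normal(1000)
F2 = float(np.sum(x ** 2))            # true second frequency moment

def base_estimate(reps: int = 200) -> float:
    """Estimate F_2 within a small relative error, with constant success prob."""
    sigma = rng.choice([-1.0, 1.0], size=(reps, x.size))   # random signs
    return float(np.mean((sigma @ x) ** 2))                # E[(sigma . x)^2] = F_2

t = 21                                 # O(log(1/delta)) independent copies
estimate = float(np.median([base_estimate() for _ in range(t)]))
rel_err = abs(estimate - F2) / F2
print(rel_err)                         # typically well below 0.2 here
```

Since each copy succeeds with probability noticeably above 1/2, a Chernoff bound makes the median fail only with probability exponentially small in t; the abstract's point is that for p > 2 this generic amplification is not the optimal dependence on delta.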
Tight Lower Bound for Linear Sketches of Moments
The problem of estimating frequency moments of a data stream has attracted a lot of attention since the onset of streaming algorithms [AMS99]. While the space complexity for approximately computing the p-th moment, for p <= 2, has been settled [KNW10], for p > 2 the exact complexity remains open. For p > 2 the current best algorithm uses O(n^{1-2/p} log n) words of space [AKO11, BO10], whereas the lower bound is of Omega(n^{1-2/p}) [BJKS04]. In this paper, we show a tight lower bound of Omega(n^{1-2/p} log n) words for the class of algorithms based on linear sketches, which store only a sketch Ax of the input vector x for some (possibly randomized) matrix A. We note that all known algorithms for this problem are linear sketches.
Comment: In Proceedings of the 40th International Colloquium on Automata, Languages and Programming (ICALP), Riga, Latvia, July 2013
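The defining property of the class of algorithms above, which store only a sketch Ax, is linearity: sketches compose under vector addition, which is what makes them usable for streams of additive updates and for merging distributed shards. A minimal illustration with an arbitrary Gaussian sketching matrix:

```python
import numpy as np

# A linear sketch stores only S @ x for a (randomized) matrix S.
rng = np.random.default_rng(2)
n, k = 1000, 50

S = rng.standard_normal((k, n)) / np.sqrt(k)   # random sketching matrix
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# Linearity: the sketch of a sum is the sum of the sketches.
assert np.allclose(S @ (x + y), S @ x + S @ y)

# So an additive stream update x[i] += v touches the stored sketch
# through a single column of S, without access to x itself.
i, v = 7, 3.0
sketch = S @ x
sketch += v * S[:, i]
x[i] += v
assert np.allclose(sketch, S @ x)
```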
Tight Bounds for Sketching the Operator Norm, Schatten Norms, and Subspace Embeddings
We consider the following oblivious sketching problem: given epsilon in (0,1/3) and n >= d/epsilon^2, design a distribution D over R^{k * nd} and a function f: R^k * R^{nd} -> R, so that for any n * d matrix A, Pr_{S sim D} [(1-epsilon) |A|_{op} <= f(S(A)) <= (1+epsilon) |A|_{op}] >= 2/3, where |A|_{op} = sup_{x:|x|_2 = 1} |Ax|_2 is the operator norm of A and S(A) denotes S * A, interpreting A as a vector in R^{nd}. We show a tight lower bound of k = Omega(d^2/epsilon^2) for this problem. Previously, Nelson and Nguyen (ICALP, 2014) considered the problem of finding a distribution D over R^{k * n} such that for any n * d matrix A, Pr_{S sim D}[forall x, (1-epsilon)|Ax|_2 <= |SAx|_2 <= (1+epsilon)|Ax|_2] >= 2/3, which is called an oblivious subspace embedding (OSE). Our result considerably strengthens theirs, as it (1) applies only to estimating the operator norm, which can be estimated given any OSE, and (2) applies to distributions over general linear operators S which treat A as a vector and compute S(A), rather than the restricted class of linear operators corresponding to matrix multiplication. Our technique also implies the first tight bounds for approximating the Schatten p-norm for even integers p via general linear sketches, improving the previous lower bound from k = Omega(n^{2-6/p}) [Regev, 2014] to k = Omega(n^{2-4/p}). Importantly, for sketching the operator norm up to a factor of alpha, where alpha - 1 = Omega(1), we obtain a tight k = Omega(n^2/alpha^4) bound, matching the upper bound of Andoni and Nguyen (SODA, 2013), and improving the previous k = Omega(n^2/alpha^6) lower bound. Finally, we also obtain the first lower bounds for approximating Ky Fan norms.
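The OSE guarantee above can be checked empirically for the classical Gaussian sketch: the worst-case distortion of |SAx|_2 / |Ax|_2 over all x equals the deviation of the singular values of S @ Q from 1, where Q is an orthonormal basis of A's column space. A rough numerical sketch, with all parameters illustrative:

```python
import numpy as np

# Empirical check of an oblivious subspace embedding (OSE):
# (1-eps)|Ax| <= |SAx| <= (1+eps)|Ax| holds for all x simultaneously
# iff every singular value of S @ Q lies in [1-eps, 1+eps],
# where Q is an orthonormal basis for the column space of A.
rng = np.random.default_rng(3)
n, d, k = 2000, 10, 400

A = rng.standard_normal((n, d))
S = rng.standard_normal((k, n)) / np.sqrt(k)   # oblivious: drawn independently of A

Q, _ = np.linalg.qr(A)                          # orthonormal basis of col(A)
sv = np.linalg.svd(S @ Q, compute_uv=False)
eps = max(abs(sv.max() - 1.0), abs(1.0 - sv.min()))
print(eps)   # roughly sqrt(d/k) ~ 0.16 for these parameters
```

Standard concentration for Gaussian matrices puts the distortion near sqrt(d/k), matching the k = O(d/epsilon^2) shape of the upper bounds the abstract's lower bounds are compared against.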