New Subset Selection Algorithms for Low Rank Approximation: Offline and Online
Subset selection for the rank-$k$ approximation of an $n \times d$ matrix
offers improvements in the interpretability of matrices, as well as a variety
of computational savings. This problem is well-understood when the error
measure is the Frobenius norm, with various tight algorithms known even in
challenging models such as the online model, where an algorithm must select the
column subset irrevocably when the columns arrive one by one. In contrast, for
other matrix losses, optimal trade-offs between the subset size and
approximation quality have not been settled, even in the offline setting. We
give a number of results towards closing these gaps.
In the offline setting, we achieve nearly optimal bicriteria algorithms in
two settings. First, we remove a $\sqrt{k}$ factor from a result of [SWZ19] when
the loss function is any entrywise loss with an approximate triangle inequality
and at least linear growth. Our result is tight for the $\ell_1$ loss. We give
a similar improvement for entrywise $\ell_p$ losses for $p > 1$, improving a
previous distortion of $\tilde{O}(k^{1-1/p})$ to $\tilde{O}(k^{1/2-1/p})$. Our results come from a
technique which replaces the use of a well-conditioned basis with a slightly
larger spanning set for which any vector can be expressed as a linear
combination with small Euclidean norm. We show that this technique also gives
the first oblivious $\ell_p$ subspace embeddings for $1 \le p < 2$ with $\tilde{O}(d^{1/p})$ distortion, which is nearly optimal and closes a long line of work.
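The core idea of the spanning-set technique — that every vector in the span should admit a representation with small Euclidean coefficient norm — can be illustrated numerically. The sketch below is our own toy example (random matrices, the minimum-norm representation via NumPy's pseudoinverse); it is not the paper's construction of the spanning set itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustration only: a redundant spanning set S (k > m columns spanning R^m)
# and a target vector v. The paper's point is that a slightly larger spanning
# set can be chosen so that every vector in the span has a representation with
# small Euclidean coefficient norm; here we simply compute the minimum-norm
# representation for a random spanning set.
m, k = 5, 8
S = rng.standard_normal((m, k))  # columns span R^m almost surely
v = rng.standard_normal(m)

# Minimum Euclidean-norm coefficients c with S @ c = v, via the pseudoinverse.
c = np.linalg.pinv(S) @ v

assert np.allclose(S @ c, v)  # c exactly represents v
print(np.linalg.norm(c))      # Euclidean norm of the coefficients
```

Among all coefficient vectors representing $v$, the pseudoinverse solution has the smallest Euclidean norm, which is the property the technique exploits in place of a well-conditioned basis.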
In the online setting, we give the first online subset selection algorithm
for $\ell_p$ subspace approximation and entrywise $\ell_p$ low rank
approximation by implementing sensitivity sampling online, which is challenging
due to the sequential nature of sensitivity sampling. Our main technique is an
online algorithm for detecting when an approximately optimal subspace changes
substantially.
Comment: To appear in STOC 2023; abstract shortened
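For intuition about the sampling primitive the abstract refers to, here is a toy offline sketch of sensitivity sampling for $p = 2$, where row sensitivities coincide with leverage scores. All names and parameters below are ours, and the paper's contribution is performing this sampling online, which this offline sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(1)

# Offline sensitivity sampling for p = 2: the sensitivity of row i of A is
# its leverage score, i.e. the squared i-th row norm of U where A = U @ R
# is a thin QR factorization. Leverage scores sum to rank(A) = d.
n, d = 200, 5
A = rng.standard_normal((n, d))
U, _ = np.linalg.qr(A)
scores = (U ** 2).sum(axis=1)

# Sample t rows with probability proportional to sensitivity and reweight by
# 1/sqrt(t * p_i), so that SA.T @ SA is an unbiased estimate of A.T @ A.
t = 100
p = scores / scores.sum()
idx = rng.choice(n, size=t, p=p)
SA = A[idx] / np.sqrt(t * p[idx, None])

# Relative spectral-norm error of the sampled quadratic form:
# for all x, ||SA x||^2 approximates ||A x||^2.
err = np.linalg.norm(SA.T @ SA - A.T @ A, 2) / np.linalg.norm(A.T @ A, 2)
print(err)
```

The online difficulty the abstract highlights is that these sampling probabilities depend on the whole matrix, which is not available when columns (or rows) arrive one by one.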
$\ell_p$-Regression in the Arbitrary Partition Model of Communication
We consider the randomized communication complexity of the distributed
$\ell_p$-regression problem in the coordinator model, for $p \in (0, 2]$. In this
problem, there is a coordinator and $s$ servers. The $i$-th server receives
$A^i \in \{-M, \ldots, M\}^{n \times d}$ and $b^i \in \{-M, \ldots, M\}^n$ and the coordinator would like to find a
$(1+\epsilon)$-approximate solution to $\min_x \|(\sum_i A^i)x - (\sum_i b^i)\|_p$. Here
$M \le \mathrm{poly}(nd)$ for convenience. This model, where the data is
additively shared across servers, is commonly referred to as the arbitrary
partition model.
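The arbitrary partition model can be made concrete with a small simulation for $p = 2$: each server holds an additive share of the instance, and the coordinator must solve regression on the summed data. The sketch below (variable names and sizes are ours) just ships all shares and solves centrally, i.e. it illustrates the problem setup, not the paper's communication-efficient protocol.

```python
import numpy as np

rng = np.random.default_rng(2)

# Each of s servers holds an additive share (A_i, b_i) with bounded integer
# entries; the global instance is A = sum_i A_i, b = sum_i b_i.
s, n, d = 4, 50, 3
shares = [(rng.integers(-10, 11, (n, d)), rng.integers(-10, 11, n))
          for _ in range(s)]

# Naive protocol: the coordinator collects every share, sums them, and solves
# least squares on the combined data. The paper's results concern how few
# bits of communication suffice instead of shipping the shares wholesale.
A = sum(Ai for Ai, _ in shares)
b = sum(bi for _, bi in shares)
x, *_ = np.linalg.lstsq(A, b, rcond=None)

# x solves the normal equations A^T A x = A^T b (up to numerical tolerance).
assert np.allclose(A.T @ A @ x, A.T @ b, atol=1e-6)
print(x)
```

Note that no single server's share $(A^i, b^i)$ need resemble the summed instance at all, which is what makes the arbitrary partition model harder than row-partitioned data.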
We obtain significantly improved bounds for this problem. For $p = 2$, i.e.,
least squares regression, we give the first optimal bound of
$\tilde{\Theta}(sd^2 + sd/\epsilon)$ bits.
For $p \in (1, 2)$, we obtain an $\tilde{O}(sd^2/\epsilon + sd/\mathrm{poly}(\epsilon))$ upper bound. Notably, for $d$ sufficiently large,
our leading order term only depends linearly on $1/\epsilon$ rather than
quadratically. We also show communication lower bounds of $\Omega(sd^2 + sd/\epsilon^2)$ for $p \in (0, 1]$ and $\Omega(sd^2 + sd/\epsilon)$ for $p \in (1, 2]$. Our bounds considerably improve previous bounds due to (Woodruff et al.,
COLT, 2013) and (Vempala et al., SODA, 2020).