Faster Linear Algebra for Distance Matrices

Abstract

The distance matrix of a dataset X of n points with respect to a distance function f represents all pairwise distances between points in X induced by f. Due to their wide applicability, distance matrices and related families of matrices have been the focus of many recent algorithmic works. We continue this line of research and take a broad view of algorithm design for distance matrices, with the goal of designing fast algorithms for fundamental linear algebraic primitives that are specifically tailored to distance matrices. Our results include efficient algorithms for computing matrix-vector products for a wide class of distance matrices, such as the ℓ_1 metric, for which we obtain a linear runtime, as well as an Ω(n^2) lower bound for any algorithm that computes a matrix-vector product in the ℓ_∞ case, showing a separation between the ℓ_1 and ℓ_∞ metrics. Our upper bounds, in conjunction with recent works on the matrix-vector query model, have many further downstream applications, including the fastest algorithm for computing a relative-error low-rank approximation of the distance matrix induced by the ℓ_1 and ℓ_2^2 functions and the fastest algorithm for computing an additive-error low-rank approximation for the ℓ_2 metric, in addition to applications to fast matrix multiplication, among others. We also give algorithms for constructing distance matrices and show that one can construct an approximate ℓ_2 distance matrix in time faster than the bound implied by the Johnson-Lindenstrauss lemma.

Comment: Selected as Oral for NeurIPS 2022
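To make the ℓ_1 matrix-vector result concrete, here is a minimal sketch of the kind of structure such algorithms exploit (not the paper's exact algorithm). For one-dimensional points, the matrix-vector product y_i = Σ_j |p_i − p_j| v_j can be computed in O(n log n) time via sorting and prefix sums instead of forming the n × n matrix; summing the result over coordinates handles the general ℓ_1 case. The function name `l1_distmat_matvec_1d` is our own.

```python
import numpy as np

def l1_distmat_matvec_1d(p, v):
    """Compute y_i = sum_j |p_i - p_j| * v_j without forming the n x n matrix.

    Sketch: after sorting the points, |p_i - p_j| splits into (p_i - p_j) for
    j <= i and (p_j - p_i) for j > i, so y is expressible through prefix sums
    of v and of p*v. Runtime O(n log n), dominated by the sort.
    """
    order = np.argsort(p)
    ps, vs = p[order], v[order]
    S = np.cumsum(vs)        # prefix sums of v (in sorted order)
    T = np.cumsum(ps * vs)   # prefix sums of p*v (in sorted order)
    # y_i = p_i*(2*S_i - S_total) + (T_total - 2*T_i), derived from the split above
    y_sorted = ps * (2.0 * S - S[-1]) + (T[-1] - 2.0 * T)
    y = np.empty_like(y_sorted)
    y[order] = y_sorted      # undo the sort permutation
    return y
```

The naive product costs O(n^2); this sketch illustrates how metric structure can be leveraged for subquadratic matrix-vector products, which is the primitive the paper's downstream low-rank approximation results build on.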
