Coresets-Methods and History: A Theoreticians Design Pattern for Approximation and Streaming Algorithms
We present a technical survey of the state-of-the-art approaches in data reduction and the coreset framework. These include geometric decompositions, gradient methods, random sampling, sketching, and random projections. We further outline their importance for the design of streaming algorithms and give a brief overview of lower-bounding techniques.
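To make the random-sampling idea behind coresets concrete, here is a minimal sketch (not from the survey itself): a uniform sample of m points, each reweighted by n/m, is an unbiased estimate of the clustering cost of the full dataset for any candidate center. Real coreset constructions use importance (sensitivity) sampling to get worst-case guarantees; this is only the simplest instance of the pattern.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 2))      # full dataset (synthetic, for illustration)

# Uniform-sampling "coreset": m points, each weighted n/m, so the weighted
# cost is an unbiased estimator of the full cost for ANY center c.
n, m = len(X), 500
idx = rng.choice(n, size=m, replace=False)
C, w = X[idx], np.full(m, n / m)

def cost(points, weights, c):
    """Weighted sum of squared distances to a center c."""
    return np.sum(weights * np.sum((points - c) ** 2, axis=1))

c = np.array([0.5, -0.5])             # arbitrary candidate center
full = cost(X, np.ones(n), c)
approx = cost(C, w, c)
print(abs(full - approx) / full)      # small relative error
```

The same evaluate-on-the-small-weighted-set interface is what streaming algorithms exploit: a coreset can be maintained under insertions and merged, while downstream solvers only ever see the summary.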
Multi-GCN: Graph Convolutional Networks for Multi-View Networks, with Applications to Global Poverty
With the rapid expansion of mobile phone networks in developing countries,
large-scale graph machine learning has gained sudden relevance in the study of
global poverty. Recent applications range from humanitarian response and
poverty estimation to urban planning and epidemic containment. Yet the vast
majority of computational tools and algorithms used in these applications do
not account for the multi-view nature of social networks: people are related in
myriad ways, but most graph learning models treat relations as binary. In this
paper, we develop a graph-based convolutional network for learning on
multi-view networks. We show that this method outperforms state-of-the-art
semi-supervised learning algorithms on three different prediction tasks using
mobile phone datasets from three different developing countries. We also show
that, while designed specifically for use in poverty research, the algorithm
also outperforms existing benchmarks on a broader set of learning tasks on
multi-view networks, including node labelling in citation networks.
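As a rough illustration of what "learning on multi-view networks" means mechanically, the sketch below runs one GCN-style propagation step over several adjacency matrices (views) and merges the results by averaging. The averaging rule and all sizes here are assumptions for illustration only; the Multi-GCN paper's actual merging strategy differs.

```python
import numpy as np

def normalize(A):
    """Symmetric GCN normalization: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(len(A))
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def multi_view_gcn_layer(views, X, W):
    """One layer: propagate features over each view's graph, then average.
    (An illustrative merging rule, not the paper's exact method.)"""
    propagated = [normalize(A) @ X @ W for A in views]
    return np.maximum(np.mean(propagated, axis=0), 0.0)  # ReLU

rng = np.random.default_rng(1)
n, f, h = 6, 4, 3                     # nodes, input features, hidden units
views = [(rng.random((n, n)) < 0.4).astype(float) for _ in range(3)]
views = [np.triu(A, 1) + np.triu(A, 1).T for A in views]  # make undirected
X = rng.normal(size=(n, f))
W = rng.normal(size=(f, h))
H = multi_view_gcn_layer(views, X, W)
print(H.shape)  # (6, 3)
```

The point of the multi-view setup is visible in the signature: each relation type (calls, texts, mobility, ...) contributes its own adjacency matrix rather than being collapsed into one binary graph.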
Dimensionality Reduction for k-Means Clustering and Low Rank Approximation
We show how to approximate a data matrix A with a much smaller sketch that can be used to solve a general class of constrained k-rank approximation problems to within (1+ε) error. Importantly, this class of problems includes k-means clustering and unconstrained low rank approximation (i.e. principal component analysis). By reducing data points to just O(k) dimensions, our methods generically accelerate any exact, approximate, or heuristic algorithm for these ubiquitous problems.
For k-means dimensionality reduction, we provide (1+ε) relative error results for many common sketching techniques, including random row projection, column selection, and approximate SVD. For approximate principal component analysis, we give a simple alternative to known algorithms that has applications in the streaming setting. Additionally, we extend recent work on column-based matrix reconstruction, giving column subsets that not only 'cover' a good subspace for A, but can be used directly to compute this subspace.
Finally, for k-means clustering, we show how to achieve a (9+ε) approximation by Johnson-Lindenstrauss projecting data points to just O(log k / ε²) dimensions. This gives the first result that leverages the specific structure of k-means to achieve dimension independent of input size and sublinear in k.
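A Johnson-Lindenstrauss projection of the kind used above is easy to sketch: a random Gaussian matrix, scaled by the square root of the target dimension, approximately preserves pairwise distances. The target dimension and data below are arbitrary choices for illustration; this does not implement the paper's specific bounds.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 200, 1000
X = rng.normal(size=(n, d))           # high-dimensional points (synthetic)

# Johnson-Lindenstrauss: a random Gaussian map scaled by 1/sqrt(target)
# preserves pairwise distances up to (1 ± eps) with high probability.
target = 100                          # illustrative target dimension
G = rng.normal(size=(d, target)) / np.sqrt(target)
Y = X @ G

i, j = 0, 1
orig = np.linalg.norm(X[i] - X[j])
proj = np.linalg.norm(Y[i] - Y[j])
print(proj / orig)                    # close to 1
```

Because k-means cost is a sum of squared pairwise-style distances, any k-means algorithm can be run on the projected points Y instead of X, which is what makes the dimension reduction a generic accelerator.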