Maximum Margin Multiclass Nearest Neighbors
We develop a general framework for margin-based multicategory classification
in metric spaces. The basic work-horse is a margin-regularized version of the
nearest-neighbor classifier. We prove generalization bounds that match the
state of the art in sample size n and significantly improve the dependence on
the number of classes k. Our point of departure is a nearly Bayes-optimal
finite-sample risk bound independent of k. Although k-free, this bound is
unregularized and non-adaptive, which motivates our main result: Rademacher and
scale-sensitive margin bounds with a logarithmic dependence on k. As the best
previous risk estimates in this setting were of order √k, our bound is
exponentially sharper. From the algorithmic standpoint, in doubling metric
spaces our classifier may be trained on n examples in O(n^2 log n) time and
evaluated on new points in O(log n) time.
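
The work-horse described above is a nearest-neighbor rule in a general metric space. As a rough illustration of the ingredients involved (not the paper's margin-regularized learner, nor its fast training and evaluation algorithms for doubling metrics), a plain 1-NN prediction together with a margin-style gap could look like the following Python sketch; the nn_predict helper and the toy data are purely hypothetical.

    import numpy as np

    def nn_predict(x, X_train, y_train, metric):
        """Plain 1-NN prediction in a metric space, returning the predicted
        label together with a margin-style gap: distance to the nearest
        differently labelled point minus distance to the nearest point."""
        dists = np.array([metric(x, xi) for xi in X_train])
        nearest = np.argmin(dists)
        label = y_train[nearest]
        other = dists[y_train != label].min()  # nearest point with another label
        return label, other - dists[nearest]

    # Usage with the Euclidean metric on a toy 3-class sample.
    rng = np.random.default_rng(0)
    centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
    X_train = np.repeat(centers, 20, axis=0) + rng.standard_normal((60, 2))
    y_train = np.repeat(np.arange(3), 20)
    euclid = lambda a, b: np.linalg.norm(a - b)
    print(nn_predict(np.array([3.5, 0.5]), X_train, y_train, euclid))
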
Dimensionality reduction with subgaussian matrices: a unified theory
We present a theory for Euclidean dimensionality reduction with subgaussian
matrices which unifies several restricted isometry property and
Johnson-Lindenstrauss type results obtained earlier for specific data sets. In
particular, we recover and, in several cases, improve results for sets of
sparse and structured sparse vectors, low-rank matrices and tensors, and smooth
manifolds. In addition, we establish a new Johnson-Lindenstrauss embedding for
data sets taking the form of an infinite union of subspaces of a Hilbert space
- …
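
The basic object behind such results is a random projection with subgaussian entries that approximately preserves Euclidean norms. A minimal Python sketch of this idea (a Gaussian random matrix applied to a generic point set, not the paper's unified theorem or its structured data sets) might look as follows; the jl_embed helper and the chosen dimensions are illustrative assumptions.

    import numpy as np

    def jl_embed(X, target_dim, seed=0):
        """Project the rows of X to target_dim dimensions with a Gaussian
        (hence subgaussian) random matrix, scaled so squared norms are
        preserved in expectation (Johnson-Lindenstrauss style)."""
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        # i.i.d. N(0, 1/target_dim) entries give E[||A x||^2] = ||x||^2.
        A = rng.standard_normal((target_dim, d)) / np.sqrt(target_dim)
        return X @ A.T

    # Usage: embed 1,000 points from R^500 into R^50 and inspect distortion.
    X = np.random.default_rng(1).standard_normal((1000, 500))
    Y = jl_embed(X, 50)
    ratio = np.linalg.norm(Y, axis=1) / np.linalg.norm(X, axis=1)
    print(ratio.min(), ratio.max())  # typically close to 1
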