Regression on fixed-rank positive semidefinite matrices: a Riemannian approach
The paper addresses the problem of learning a regression model parameterized
by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear
nature of the search space and on scalability to high-dimensional problems. The
mathematical developments rely on the theory of gradient descent algorithms
adapted to the Riemannian geometry that underlies the set of fixed-rank
positive semidefinite matrices. In contrast with previous contributions in the
literature, no restrictions are imposed on the range space of the learned
matrix. The resulting algorithms maintain a linear complexity in the problem
size and enjoy important invariance properties. We apply the proposed
algorithms to the problem of learning a distance function parameterized by a
positive semidefinite matrix. Good performance is observed on classical
benchmarks.
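A minimal sketch of the underlying idea, assuming the standard low-rank factorization S = G Gᵀ with G of size n × r (the regression loss, step size, and function names below are illustrative assumptions, not the paper's exact formulation): gradient steps on the factor G keep the iterate exactly on the set of rank-r positive semidefinite matrices, with per-step cost linear in the dimension n.

```python
import numpy as np

def pairwise_loss_grad(G, x, y, target):
    """Gradient w.r.t. the factor G of a squared-error loss on the
    distance d(x, y) = (x - y).T @ S @ (x - y), with S = G @ G.T."""
    diff = x - y                              # (n,)
    v = G.T @ diff                            # (r,), costs O(n r)
    d = v @ v                                 # learned squared distance
    residual = d - target
    # d/dG of diff.T @ G @ G.T @ diff = 2 * outer(diff, v)
    grad = 4.0 * residual * np.outer(diff, v)  # (n, r)
    return grad, residual ** 2

def fixed_rank_sgd(pairs, targets, n, r, lr=1e-3, epochs=10, seed=0):
    """Stochastic gradient descent on the rank-r factor G (n x r).
    The update G -= lr * grad is a valid retraction on the factor,
    so the iterate never leaves the fixed-rank PSD set."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((n, r)) / np.sqrt(n)
    for _ in range(epochs):
        for (x, y), t in zip(pairs, targets):
            grad, _ = pairwise_loss_grad(G, x, y, t)
            G -= lr * grad
    return G @ G.T                            # learned fixed-rank PSD matrix

# Illustrative usage on random pairs in n = 100 dimensions with rank r = 5.
rng = np.random.default_rng(1)
pairs = [(rng.standard_normal(100), rng.standard_normal(100)) for _ in range(50)]
S = fixed_rank_sgd(pairs, targets=[1.0] * 50, n=100, r=5)
```

Because only the n × r factor is stored and updated, each step costs O(nr) rather than O(n²), matching the linear complexity claimed above; the factorization is also unchanged under G → GO for any rotation O, which is one source of the invariance properties the abstract alludes to.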
A Survey on Metric Learning for Feature Vectors and Structured Data
The need for appropriate ways to measure the distance or similarity between
data is ubiquitous in machine learning, pattern recognition and data mining,
but handcrafting such good metrics for specific problems is generally
difficult. This has led to the emergence of metric learning, which aims at
automatically learning a metric from data and has attracted considerable
interest in machine learning and related fields over the past decade. This survey
paper proposes a systematic review of the metric learning literature,
highlighting the pros and cons of each approach. We pay particular attention to
Mahalanobis distance metric learning, a well-studied and successful framework,
but additionally present a wide range of methods that have recently emerged as
powerful alternatives, including nonlinear metric learning, similarity learning
and local metric learning. Recent trends and extensions, such as
semi-supervised metric learning, metric learning for histogram data and the
derivation of generalization guarantees, are also covered. Finally, this survey
addresses metric learning for structured data, in particular edit distance
learning, and attempts to give an overview of the remaining challenges in
metric learning for the years to come.
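For readers new to the Mahalanobis framework highlighted above, a minimal sketch (the helper name is illustrative and not drawn from any specific method in the survey): the learned PSD matrix M is commonly factored as M = Lᵀ L, so the learned metric is simply a Euclidean distance computed after the linear map L.

```python
import numpy as np

def mahalanobis(x, y, L):
    """Mahalanobis distance with M = L.T @ L, i.e. Euclidean distance
    after the linear projection L (M is PSD by construction)."""
    z = L @ (x - y)
    return np.sqrt(z @ z)

# Illustrative usage: with L = identity this reduces to Euclidean distance.
L = np.eye(3)
x, y = np.array([1.0, 0.0, 2.0]), np.array([0.0, 1.0, 2.0])
print(mahalanobis(x, y, L))   # sqrt(2)
```

Metric learning methods in this family differ mainly in how they fit L (or M directly) from supervision such as similar/dissimilar pairs or relative comparisons.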
A continuous-time analysis of distributed stochastic gradient
We analyze the effect of synchronization on distributed stochastic gradient
algorithms. By exploiting an analogy with dynamical models of biological quorum
sensing -- where synchronization between agents is induced through
communication with a common signal -- we quantify how synchronization can
significantly reduce the magnitude of the noise felt by the individual
distributed agents and by their spatial mean. This noise reduction is in turn
associated with a reduction in the smoothing of the loss function imposed by
the stochastic gradient approximation. Through simulations on model non-convex
objectives, we demonstrate that coupling can stabilize higher noise levels and
improve convergence. We provide a convergence analysis for strongly convex
functions by deriving a bound on the expected deviation of the spatial mean of
the agents from the global minimizer for an algorithm based on quorum sensing,
the same algorithm with momentum, and the Elastic Averaging SGD (EASGD)
algorithm. We discuss extensions to new algorithms that allow each agent to
broadcast its current measure of success and shape the collective computation
accordingly. We supplement our theoretical analysis with numerical experiments
on convolutional neural networks trained on the CIFAR-10 dataset, where we note
a surprising regularizing property of EASGD even when applied to the
non-distributed case. This observation suggests alternative second-order
in-time algorithms for non-distributed optimization that are competitive with
momentum methods.
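A minimal sketch of the EASGD-style coupling analyzed above, assuming a toy strongly convex quadratic objective (the step size, coupling strength, and noise level are illustrative assumptions, not the paper's experimental setup): each agent takes a noisy gradient step plus an elastic pull toward a shared center variable, which in turn relaxes toward the agents' spatial mean.

```python
import numpy as np

def easgd(grad, x0, n_agents=8, steps=2000, lr=0.05, alpha=0.05,
          noise=0.5, seed=0):
    """Elastic Averaging SGD sketch: n_agents noisy copies of SGD,
    each elastically coupled to a shared center variable x_tilde."""
    rng = np.random.default_rng(seed)
    xs = np.tile(np.asarray(x0, dtype=float), (n_agents, 1))
    x_tilde = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        # Noisy gradient per agent (models the stochastic gradient).
        g = np.array([grad(x) + noise * rng.standard_normal(x.shape)
                      for x in xs])
        # Agent update: gradient step plus elastic pull toward the center.
        xs = xs - lr * g - alpha * (xs - x_tilde)
        # Center update: relax toward the agents' spatial mean.
        x_tilde = x_tilde + alpha * (xs - x_tilde).sum(axis=0)
    return x_tilde, xs

# Toy objective f(x) = 0.5 * ||x||**2, with minimizer at the origin.
center, agents = easgd(grad=lambda x: x, x0=np.ones(2))
print(np.linalg.norm(center), np.linalg.norm(agents, axis=1).mean())
# The center (and the spatial mean) typically fluctuates less than any
# individual agent, illustrating the noise reduction quantified above.
```

Note that the coupling term remains active even with n_agents = 1, which is consistent with the regularizing effect the abstract reports for EASGD in the non-distributed case.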