28 research outputs found
Multi-view Metric Learning in Vector-valued Kernel Spaces
We consider the problem of metric learning for multi-view data and present a
novel method for learning within-view as well as between-view metrics in
vector-valued kernel spaces, as a way to capture the multi-modal structure of
data. We formulate two convex optimization problems to jointly learn the metric
and the classifier or regressor in kernel feature spaces. An iterative
three-step multi-view metric learning algorithm is derived from the
optimization problems. In order to scale the computation to large training
sets, a block-wise Nyström approximation of the multi-view kernel matrix is
introduced. We justify our approach theoretically and experimentally, and show
its performance on real-world datasets against relevant state-of-the-art
methods.
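The block-wise Nyström approximation mentioned above can be illustrated with a minimal sketch: each view's kernel block is compressed independently through a shared set of landmark points, so the full multi-view kernel matrix never has to be formed. The RBF kernel, the uniform landmark sampling, and all function names here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    # squared Euclidean distances via broadcasting, then Gaussian kernel
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_block(X, landmark_idx, kernel):
    """Nyström approximation of one view's kernel block: K ~ C @ pinv(W) @ C.T."""
    C = kernel(X, X[landmark_idx])   # n x m cross-kernel against landmarks
    W = C[landmark_idx, :]           # m x m kernel among the landmarks
    return C, np.linalg.pinv(W)

rng = np.random.default_rng(0)
# Two views of the same 100 samples, with different feature dimensions
views = [rng.normal(size=(100, 3)), rng.normal(size=(100, 5))]
idx = rng.choice(100, size=10, replace=False)  # shared landmark indices

# Approximate each within-view kernel block independently (block-wise)
approx_blocks = []
for X in views:
    C, Winv = nystrom_block(X, idx, rbf)
    approx_blocks.append(C @ Winv @ C.T)
```

Storing only `C` and `Winv` per view costs O(nm) memory instead of O(n^2), which is the point of the approximation for large training sets.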
Parametric machines: a fresh approach to architecture search
Using tools from category theory, we provide a framework where artificial
neural networks, and their architectures, can be formally described. We first
define the notion of machine in a general categorical context, and show how
simple machines can be combined into more complex ones. We explore finite- and
infinite-depth machines, which generalize neural networks and neural ordinary
differential equations. Borrowing ideas from functional analysis and kernel
methods, we build complete, normed, infinite-dimensional spaces of machines,
and discuss how to find optimal architectures and parameters -- within those
spaces -- to solve a given computational problem. In our numerical experiments,
these kernel-inspired networks can outperform classical neural networks when
the training dataset is small.Comment: 31 pages, 4 figure