Kernels for Vector-Valued Functions: a Review
Kernel methods are among the most popular techniques in machine learning.
From a frequentist/discriminative perspective they play a central role in
regularization theory as they provide a natural choice for the hypothesis space
and the regularization functional through the notion of reproducing kernel
Hilbert spaces. From a Bayesian/generative perspective they are the key in the
context of Gaussian processes, where the kernel function is also known as the
covariance function. Traditionally, kernel methods have been used in supervised
learning problems with scalar outputs, and indeed there has been a considerable
amount of work devoted to designing and learning kernels. More recently there
has been an increasing interest in methods that deal with multiple outputs,
motivated partly by frameworks like multitask learning. In this paper, we
review different methods to design or learn valid kernel functions for multiple
outputs, paying particular attention to the connection between probabilistic
and functional methods.
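One of the simplest valid constructions for multi-output kernels discussed in this literature is the separable (intrinsic coregionalization) form, in which a scalar kernel on inputs is coupled with a positive semi-definite matrix over the outputs. A minimal NumPy sketch, with illustrative function names and parameters not taken from the paper itself:

```python
import numpy as np

def rbf_kernel(X, Z, lengthscale=1.0):
    """Scalar RBF (squared-exponential) kernel between rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def icm_kernel(X, Z, B, lengthscale=1.0):
    """Separable multi-output kernel: K = B kron k(X, Z), where B is a
    PSD D x D coregionalization matrix coupling the D outputs."""
    return np.kron(B, rbf_kernel(X, Z, lengthscale))

# Two outputs, three inputs: the joint covariance is 6 x 6.
X = np.random.default_rng(0).normal(size=(3, 2))
A = np.array([[1.0, 0.5], [0.3, 1.0]])
B = A @ A.T                      # PSD by construction
K = icm_kernel(X, X, B)
print(K.shape)                   # (6, 6)
print(np.all(np.linalg.eigvalsh(K) > -1e-10))  # True: Kronecker product of PSD factors is PSD
```

Because the Kronecker product of two positive semi-definite matrices is positive semi-definite, this construction is a valid covariance function for a vector-valued Gaussian process, which is one way the probabilistic and functional views connect.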
Multi-view Metric Learning in Vector-valued Kernel Spaces
We consider the problem of metric learning for multi-view data and present a
novel method for learning within-view as well as between-view metrics in
vector-valued kernel spaces, as a way to capture multi-modal structure of the
data. We formulate two convex optimization problems to jointly learn the metric
and the classifier or regressor in kernel feature spaces. An iterative
three-step multi-view metric learning algorithm is derived from the
optimization problems. In order to scale the computation to large training
sets, a block-wise Nyström approximation of the multi-view kernel matrix is
introduced. We justify our approach theoretically and experimentally, and show
its performance on real-world datasets against relevant state-of-the-art
methods.
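The scaling device mentioned above is a Nyström approximation of the kernel matrix. The paper's version is block-wise over views; the sketch below shows only the standard single-view form it builds on, with illustrative names and parameters: sample m landmark points, and approximate the full n x n Gram matrix from its n x m and m x m sub-blocks.

```python
import numpy as np

def rbf(X, Z, ls=1.0):
    """Scalar RBF kernel between rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def nystrom(X, m, ls=1.0, rng=None):
    """Rank-m Nystrom approximation of K(X, X): choose m landmarks L,
    form C = K(X, L) and W = K(L, L), return C W^+ C^T."""
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=m, replace=False)
    L = X[idx]
    C = rbf(X, L, ls)
    W = rbf(L, L, ls)
    return C @ np.linalg.pinv(W) @ C.T

X = np.random.default_rng(1).normal(size=(200, 5))
K = rbf(X, X, ls=3.0)                      # full 200 x 200 Gram matrix
K_hat = nystrom(X, m=50, ls=3.0, rng=1)    # built from 200 x 50 and 50 x 50 blocks
rel_err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
print(rel_err < 1.0)  # True: the residual K - K_hat is a PSD Schur complement
```

The payoff is in the cost: forming and storing the full Gram matrix is O(n^2), while the Nyström factors cost O(nm) with m much smaller than n, which is what makes large training sets tractable.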
- …