Online Learning with Multiple Operator-valued Kernels
We consider the problem of learning a vector-valued function f in an online
learning setting. The function f is assumed to lie in a reproducing Hilbert
space of operator-valued kernels. We describe two online algorithms for
learning f while taking into account the output structure. A first contribution
is an algorithm, ONORMA, that extends the standard kernel-based online learning
algorithm NORMA from the scalar-valued to the operator-valued setting. We report a
cumulative error bound that holds both for classification and regression. We
then define a second algorithm, MONORMA, which addresses the limitation of
pre-defining the output structure in ONORMA by learning sequentially a linear
combination of operator-valued kernels. Our experiments show that the proposed
algorithms achieve good performance at low computational cost.
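The NORMA-style update the abstract describes can be illustrated for the common separable case, where the operator-valued kernel factors as K(x, z) = k(x, z)·A with a scalar kernel k and a PSD matrix A encoding the output structure. The following is a minimal sketch under that assumption; function and parameter names (`onorma_sketch`, `eta`, `lam`) are illustrative, not taken from the paper.

```python
import numpy as np

def gaussian(x, z, sigma=1.0):
    """Scalar Gaussian kernel on the input space."""
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def onorma_sketch(stream, A, eta=0.5, lam=0.01, sigma=1.0):
    """NORMA-style online updates for the separable operator-valued kernel
    K(x, z) = gaussian(x, z) * A, where the PSD matrix A encodes the output
    structure.  A sketch with squared loss; parameter names are illustrative."""
    centers, coefs, errors = [], [], []
    for x, y in stream:
        # predict f_t(x) = sum_i k(x_i, x) * A @ c_i
        pred = np.zeros_like(y, dtype=float)
        for xi, ci in zip(centers, coefs):
            pred += gaussian(xi, x, sigma) * (A @ ci)
        err = y - pred
        errors.append(np.linalg.norm(err))
        # regularization shrinks past coefficients; the squared-loss gradient
        # step adds one new expansion term per observed example
        coefs = [(1.0 - eta * lam) * c for c in coefs]
        centers.append(x)
        coefs.append(eta * err)
    return centers, coefs, errors
```

As in scalar NORMA, the hypothesis grows by one kernel term per round, so practical variants typically truncate or budget the expansion.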
Extension of Wirtinger's Calculus to Reproducing Kernel Hilbert Spaces and the Complex Kernel LMS
Over the last decade, kernel methods for nonlinear processing have
successfully been used in the machine learning community. The primary
mathematical tool employed in these methods is the notion of the Reproducing
Kernel Hilbert Space. However, so far, the emphasis has been on batch
techniques. It is only recently that online techniques have been considered in
the context of adaptive signal processing tasks. Moreover, these efforts have
focused only on real-valued data sequences. To the best of our knowledge,
no adaptive kernel-based strategy has been developed, so far, for complex
valued signals. Furthermore, although real reproducing kernels are used in
an increasing number of machine learning problems, complex kernels have not
yet been used, despite their potential interest in applications that deal
with complex signals, communications being a typical example. In this
paper, we present a general framework to attack the problem of adaptive
filtering of complex signals, using either real reproducing kernels, taking
advantage of a technique called \textit{complexification} of real RKHSs, or
complex reproducing kernels, highlighting the use of the complex Gaussian
kernel. In order to derive gradients of operators that need to be defined on
the associated complex RKHSs, we employ the powerful tool of Wirtinger's
Calculus, which has recently attracted attention in the signal processing
community. To this end, we extend Wirtinger's calculus, for the first time, to
complex RKHSs and use it to derive several realizations of the Complex Kernel
Least-Mean-Square (CKLMS) algorithm.
Experiments verify that the CKLMS offers significant performance improvements
over several linear and nonlinear algorithms, when dealing with nonlinearities.Comment: 15 pages (double column), preprint of article accepted in IEEE Trans.
Sig. Pro
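The complexification route the abstract mentions can be sketched compactly: a complex input is mapped to the stacked vector of its real and imaginary parts, a real Gaussian kernel is applied there, and the expansion coefficients are allowed to be complex. The Wirtinger gradient of the squared error |e|² then yields an LMS-style coefficient update. This is a simplified illustration under those assumptions, not the paper's exact algorithm; names such as `cklms_complexified` and `mu` are hypothetical.

```python
import numpy as np

def rbf(u, v, sigma=1.0):
    """Real Gaussian kernel on the stacked real/imaginary representation."""
    return np.exp(-np.linalg.norm(u - v) ** 2 / (2 * sigma ** 2))

def cklms_complexified(inputs, desired, mu=0.5, sigma=1.0):
    """Kernel LMS for complex signals via complexification of a real RKHS.
    A sketch: complex inputs are represented as real vectors [Re z; Im z],
    while the expansion coefficients stay complex.  Names are illustrative."""
    centers, coefs, errors = [], [], []
    for z, d in zip(inputs, desired):
        u = np.concatenate([z.real, z.imag])  # real representation of z
        y = sum(c * rbf(ui, u, sigma) for ui, c in zip(centers, coefs))
        e = d - y                              # complex a-priori error
        errors.append(abs(e))
        centers.append(u)
        coefs.append(mu * e)  # LMS-style step from the Wirtinger gradient
    return errors
```

Using the complex Gaussian kernel directly (the paper's second route) replaces `rbf` with a complex-valued kernel on the original inputs; the update structure stays the same.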
Learning Output Kernels for Multi-Task Problems
Simultaneously solving multiple related learning tasks is beneficial under a
variety of circumstances, but the prior knowledge necessary to correctly model
task relationships is rarely available in practice. In this paper, we develop a
novel kernel-based multi-task learning technique that automatically reveals
structural inter-task relationships. Building on the framework of output
kernel learning (OKL), we introduce a method that jointly learns multiple
functions and a low-rank multi-task kernel by solving a non-convex
regularization problem. Optimization is carried out via a block coordinate
descent strategy, where each subproblem is solved using suitable conjugate
gradient (CG) type iterative methods for linear operator equations. The
effectiveness of the proposed approach is demonstrated on pharmacological and
collaborative filtering data.
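The alternating scheme described above can be illustrated on a deliberately simplified objective, 0.5‖Y − K C L‖² + 0.5·λ·tr(L Cᵀ K C), where K is the input kernel matrix, C the coefficients, and L the learned output (inter-task) kernel. This toy sketch updates each block with plain gradient steps rather than the paper's CG-type solvers, and restores positive semidefiniteness of L by eigenvalue clipping; all names and the exact objective are illustrative assumptions.

```python
import numpy as np

def okl_sketch(K, Y, lam=0.1, lr=0.01, iters=200):
    """Toy block coordinate descent for a simplified output-kernel-learning
    objective 0.5*||Y - K C L||^2 + 0.5*lam*tr(L C' K C).  Each block gets
    plain gradient steps (the paper solves subproblems with CG-type methods);
    L is kept PSD by eigenvalue clipping.  Names are illustrative."""
    n, m = Y.shape
    C = np.zeros((n, m))   # coefficient block
    L = np.eye(m)          # output-kernel block, initialized to identity
    for _ in range(iters):
        # block 1: update coefficients C for the current output kernel L
        R = K @ C @ L - Y                    # residual
        gC = K @ R @ L + lam * K @ C @ L
        C -= lr * gC
        # block 2: update the output kernel L for the current C
        R = K @ C @ L - Y
        gL = (K @ C).T @ R + 0.5 * lam * C.T @ K @ C
        L -= lr * 0.5 * (gL + gL.T)          # symmetrized gradient step
        w, V = np.linalg.eigh(L)             # project back onto the PSD cone
        L = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
    return C, L
```

A low-rank L, as sought by the paper, would additionally penalize or constrain the rank of the output kernel; the sketch omits that for brevity.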