Multi-Output Learning via Spectral Filtering
In this paper we study a class of regularized kernel methods for vector-valued learning which are based on filtering the spectrum of the kernel matrix. The considered methods include Tikhonov regularization as a special case, as well as interesting alternatives such as vector-valued extensions of L2 boosting. Computational properties are discussed for various examples of kernels for vector-valued functions, and the benefits of iterative techniques are illustrated. Generalizing previous results for the scalar case, we show finite sample bounds for the excess risk of the obtained estimator; in turn, these results allow us to prove consistency both for regression and multi-category classification. Finally, we present some promising results of the proposed algorithms on artificial and real data.
Distributed Kernel Regression: An Algorithm for Training Collaboratively
This paper addresses the problem of distributed learning under communication
constraints, motivated by distributed signal processing in wireless sensor
networks and data mining with distributed databases. After formalizing a
general model for distributed learning, an algorithm for collaboratively
training regularized kernel least-squares regression estimators is derived.
Noting that the algorithm can be viewed as an application of successive
orthogonal projection algorithms, its convergence properties are investigated
and the statistical behavior of the estimator is discussed in a simplified
theoretical setting.
Comment: To be presented at the 2006 IEEE Information Theory Workshop, Punta del Este, Uruguay, March 13-17, 2006.
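The projection viewpoint can be sketched as follows: write the centralized regularized least-squares estimate as the solution of (K + nλI)c = y, assign each node a block of rows, and let the nodes cyclically project the shared coefficient vector onto their local affine constraint sets. This is a minimal illustration of successive orthogonal projections under assumed names and parameter values, not the paper's exact algorithm.

```python
import numpy as np

def rbf(X1, X2, gamma=5.0):
    """Gaussian kernel matrix (gamma is an assumed bandwidth)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(60, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=60)

n, lam = 60, 1e-2
A = rbf(X, X) + n * lam * np.eye(n)        # normal-equation matrix K + n*lam*I
blocks = np.array_split(np.arange(n), 3)   # rows held by 3 "sensor" nodes

c = np.zeros(n)
for sweep in range(100):
    for idx in blocks:                     # each node projects in turn
        Ai, bi = A[idx], y[idx]
        r = bi - Ai @ c
        # orthogonal projection of c onto the node's set {x : Ai x = bi}
        c = c + Ai.T @ np.linalg.solve(Ai @ Ai.T, r)

c_star = np.linalg.solve(A, y)             # centralized solution, for reference
err = np.linalg.norm(c - c_star) / np.linalg.norm(c_star)
```

Because the linear system is consistent, the cyclic projections converge to the centralized estimate; each node only ever works with its own block of rows, which is the communication pattern the abstract motivates.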
A new kernel-based approach for overparameterized Hammerstein system identification
In this paper we propose a new identification scheme for Hammerstein systems,
which are dynamic systems consisting of a static nonlinearity and a linear
time-invariant dynamic system in cascade. We assume that the nonlinear function
can be described as a linear combination of basis functions. We reconstruct
the coefficients of the nonlinearity, together with the first samples of
the impulse response of the linear system, by estimating an overparameterized
vector which contains all the combinations of the unknown
variables. To avoid high variance in these estimates, we adopt a regularized
kernel-based approach and, in particular, we introduce a new kernel tailored
for Hammerstein system identification. We show that the resulting scheme
provides an estimate of the overparameterized vector that can be uniquely
decomposed as the combination of an impulse response and coefficients of
the static nonlinearity. We also show, through several numerical experiments,
that the proposed method compares very favorably with two standard methods for
Hammerstein system identification.
Comment: 17 pages, submitted to IEEE Conference on Decision and Control 201
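A minimal sketch of the overparameterization idea, using plain ridge regression in place of the paper's tailored kernel (all names, dimensions, and values below are illustrative): the products g_k·a_j of impulse-response samples and nonlinearity coefficients enter the model linearly, so they can be estimated as one vector; reshaping that estimate into a matrix and taking its best rank-1 factorization recovers g and a up to a common scale factor.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative Hammerstein system: static nonlinearity f(u) = a1*u + a2*u^2
# in cascade with an FIR linear block (first m impulse-response samples g).
a_true = np.array([1.0, 0.5])
g_true = np.array([1.0, 0.6, 0.3, 0.1])
m, p = len(g_true), len(a_true)

N = 400
u = rng.uniform(-1, 1, size=N)
phi = np.stack([u, u**2], axis=1)          # basis functions evaluated at u

# Regressor row at time t: [phi(u_t), phi(u_{t-1}), ..., phi(u_{t-m+1})],
# so y_t is linear in theta = kron(g, a), the overparameterized vector.
rows = [np.concatenate([phi[t - k] for k in range(m)])
        for t in range(m - 1, N)]
Phi = np.array(rows)
theta_true = np.kron(g_true, a_true)
y = Phi @ theta_true + 0.01 * rng.normal(size=len(Phi))

# Regularized (ridge) estimate of the overparameterized vector.
lam = 1e-6
theta_hat = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m * p), Phi.T @ y)

# Unique decomposition: reshape to m x p and take the best rank-1 factors.
Theta = theta_hat.reshape(m, p)
U, s, Vt = np.linalg.svd(Theta)
g_hat, a_hat = U[:, 0], s[0] * Vt[0]
if g_hat[0] < 0:                           # resolve the sign ambiguity
    g_hat, a_hat = -g_hat, -a_hat
```

The rank-1 structure is what makes the decomposition well defined: g and a are only identifiable up to a reciprocal scaling, but their outer product is unique, which is why the check below compares outer products rather than the factors themselves.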