Solving Support Vector Machines in Reproducing Kernel Banach Spaces with Positive Definite Functions
In this paper we solve support vector machines in reproducing kernel Banach spaces with reproducing kernels defined on nonsymmetric domains, instead of using the traditional methods in reproducing kernel Hilbert spaces. Using the orthogonality of semi-inner-products, we obtain explicit representations of the dual (normalized-duality-mapping) elements of support vector machine solutions. In addition, we introduce the reproduction property in a generalized native space via Fourier transform techniques, so that it becomes a reproducing kernel Banach space, which can even be embedded into Sobolev spaces, and whose reproducing kernel is set up by the related positive definite function. The representations of the optimal solutions of support vector machines (regularized empirical risks) in these reproducing kernel Banach spaces are formulated explicitly in terms of positive definite functions, and their finitely many coefficients can be computed by fixed point iteration. We also give some typical examples of reproducing kernel Banach spaces induced by Mat\'ern functions (Sobolev splines), so that their support vector machine solutions are as readily computable as those of the classical algorithms. Moreover, each of their reproducing bases includes information from multiple training data points. The concept of reproducing kernel Banach spaces offers a new numerical tool for solving support vector machines.
Comment: 26 pages
A rescaled method for RBF approximation
A new method to compute stable kernel-based interpolants
has been presented by the second and third authors. This rescaled interpolation method combines the
standard kernel interpolation with a properly defined rescaling operation, which
smooths the oscillations of the interpolant. Although promising, this procedure
lacks a systematic theoretical investigation.
Through our analysis, this novel method can be understood as standard
kernel interpolation by means of a properly rescaled kernel. This point of view
allows us to study its error and stability properties.
First, we prove that the method is an instance of Shepard's method
when certain weight functions are used. In particular, the method can reproduce
constant functions.
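The constant-reproduction property is characteristic of the Shepard form: the interpolant is a weighted average whose weights sum to one. A minimal inverse-distance sketch of Shepard's method (the paper's weight functions are kernel-based, not these):

```python
def shepard(x, nodes, values, p=2):
    """Shepard's method: inverse-distance-weighted average of the data.
    Because the normalized weights sum to one, any constant data set
    is reproduced exactly at every evaluation point."""
    weights = []
    for j, xj in enumerate(nodes):
        d = abs(x - xj)
        if d == 0.0:              # x coincides with a node: interpolate exactly
            return values[j]
        weights.append(1.0 / d ** p)
    total = sum(weights)
    return sum(w * f for w, f in zip(weights, values)) / total
```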
Second, it is possible to define a modified set of cardinal functions closely
related to those of the non-rescaled kernel. Through these functions, we
define a Lebesgue function for the rescaled interpolation process, and study its
maximum, the Lebesgue constant, in different settings.
Also, a preliminary theoretical result on the estimation of the interpolation
error is presented.
As an application, we couple our method with a partition of unity algorithm.
This setting seems to be the most promising, and we illustrate its behavior with
some numerical experiments.
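The rescaling operation described above divides the kernel interpolant of the data by the kernel interpolant of the constant function 1, which makes constant reproduction immediate. A minimal one-dimensional Gaussian-kernel sketch, not the authors' code (the partition of unity coupling is omitted):

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        piv = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[piv] = M[piv], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def rescaled_interpolant(nodes, values, kernel):
    """Rescaled kernel interpolation: the kernel interpolant of f
    divided by the kernel interpolant of the constant function 1."""
    K = [[kernel(xi, xj) for xj in nodes] for xi in nodes]
    cf = gauss_solve(K, list(values))          # coefficients for f
    c1 = gauss_solve(K, [1.0] * len(nodes))    # coefficients for 1
    def s(x):
        k = [kernel(x, xj) for xj in nodes]
        sf = sum(a * kj for a, kj in zip(cf, k))
        s1 = sum(a * kj for a, kj in zip(c1, k))
        return sf / s1
    return s

gauss = lambda x, y: math.exp(-(x - y) ** 2)
nodes = [0.0, 0.5, 1.0]
s = rescaled_interpolant(nodes, [1.0, 1.0, 1.0], gauss)
# constant data is reproduced away from the nodes as well, e.g. s(0.3) == 1
```

Since interpolating constant data gives the same coefficients as interpolating 1 (up to scaling), the quotient is exactly the constant everywhere, which is the reproduction property proved in the paper.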