11,836 research outputs found

    Solving Support Vector Machines in Reproducing Kernel Banach Spaces with Positive Definite Functions

    In this paper we solve support vector machines in reproducing kernel Banach spaces with reproducing kernels defined on nonsymmetric domains, instead of using the traditional methods in reproducing kernel Hilbert spaces. Using the orthogonality of semi-inner-products, we obtain explicit representations of the dual (normalized-duality-mapping) elements of support vector machine solutions. In addition, we introduce the reproduction property in a generalized native space by Fourier transform techniques, so that it becomes a reproducing kernel Banach space, which can even be embedded into Sobolev spaces, and its reproducing kernel is set up by the related positive definite function. The representations of the optimal solutions of support vector machines (regularized empirical risks) in these reproducing kernel Banach spaces are formulated explicitly in terms of positive definite functions, and their finitely many coefficients can be computed by fixed-point iteration. We also give some typical examples of reproducing kernel Banach spaces induced by Mat\'ern functions (Sobolev splines), so that their support vector machine solutions are as computable as those of the classical algorithms. Moreover, each of their reproducing bases includes information from multiple training data points. The concept of reproducing kernel Banach spaces offers us a new numerical tool for solving support vector machines. Comment: 26 pages
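The "regularized empirical risk" problem the abstract refers to can be written in a standard form (our notation; the loss L, the regularization parameter λ, and the use of the plain norm rather than a power of it are assumptions, not details taken from the paper):

```latex
\min_{f \in \mathcal{B}} \; \frac{1}{n} \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr) \;+\; \lambda \, \|f\|_{\mathcal{B}},
```

where \(\mathcal{B}\) is the reproducing kernel Banach space and \((x_i, y_i)\), \(i = 1, \dots, n\), are the training data.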

    Nearest points and delta convex functions in Banach spaces

    Given a closed set C in a Banach space (X, ∥·∥), a point x ∈ X is said to have a nearest point in C if there exists z ∈ C such that d_C(x) = ∥x − z∥, where d_C(x) is the distance of x from C. We briefly survey the problem of determining how large the set of points in X which have nearest points in C is. We then discuss the topic of delta-convex functions and how it is related to finding nearest points. Comment: To appear in Bull. Aust. Math. Soc.
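The definitions above are easy to illustrate in the simplest setting. The following sketch (our construction, not from the paper) computes d_C(x) and a nearest point for a *finite* set C in Euclidean R^d, where a minimizer always exists; for a general closed set in a Banach space, whether such a minimizer exists at all is exactly the question the survey studies.

```python
import numpy as np

def nearest_point(x, C):
    """Return (d_C(x), z) with z in C attaining the distance from x to C.

    C is a finite set given as an (m, d) array of points; the minimum of
    ||x - z|| over the rows z of C is always attained in this case.
    """
    dists = np.linalg.norm(C - x, axis=1)  # ||x - z|| for each row z of C
    i = int(np.argmin(dists))
    return dists[i], C[i]

C = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])
x = np.array([1.5, 0.0])
d, z = nearest_point(x, C)
print(d, z)  # d = 0.5, z = [2. 0.]
```

With an infinite-dimensional X or a non-strictly-convex norm, nearest points may fail to exist or to be unique, which is why the size of the set of points admitting them is a nontrivial question.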

    An Approximate Shapley-Folkman Theorem

    The Shapley-Folkman theorem shows that Minkowski averages of uniformly bounded sets tend to be convex when the number of terms in the sum becomes much larger than the ambient dimension. In optimization, Aubin and Ekeland [1976] show that this produces an a priori bound on the duality gap of separable nonconvex optimization problems involving finite sums. This bound is highly conservative and depends on unstable quantities; we relax it in several directions to show that nonconvexity can have a much milder impact on finite-sum minimization problems such as empirical risk minimization and multi-task classification. As a byproduct, we show a new version of Maurey's classical approximate Carath\'eodory lemma in which we sample a significant fraction of the coefficients, without replacement, as well as a result on sampling constraints using an approximate Helly theorem, both of independent interest. Comment: Added constraint sampling result, simplified sampling results, reformat, etc.
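The convexifying effect of Minkowski averaging can be seen already in one dimension. The sketch below (our illustration, not from the paper) averages K copies of the nonconvex set S = {0, 1}: the average is {0, 1/K, …, 1}, whose distance to its convex hull [0, 1] shrinks like 1/K, a toy analogue of the Shapley-Folkman phenomenon.

```python
import itertools

def minkowski_average(S, K):
    """Minkowski average (1/K) * (S + S + ... + S), K terms, S a finite subset of R."""
    sums = {sum(combo) / K for combo in itertools.product(S, repeat=K)}
    return sorted(sums)

S = [0.0, 1.0]
for K in (1, 2, 4, 8):
    A = minkowski_average(S, K)
    # Worst-case distance from a point of the hull [0, 1] to the average set
    # is half the largest spacing between consecutive points of A.
    gap = max(b - a for a, b in zip(A, A[1:])) / 2
    print(K, gap)  # gap halves each time: 0.5, 0.25, 0.125, 0.0625
```

In the optimization setting described above, the same effect is what drives the a priori duality-gap bound: with many separable terms, the averaged epigraphs are nearly convex even when each term is not.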