
    Margins, Kernels and Non-linear Smoothed Perceptrons

    Abstract We focus on the problem of finding a non-linear classification function that lies in a Reproducing Kernel Hilbert Space (RKHS), both from the primal point of view (finding a perfect separator when one exists) and from the dual point of view (giving a certificate of non-existence), with special focus on generalizations of two classical schemes: the Perceptron (primal) and Von Neumann (dual) algorithms. We cast our problem as one of maximizing the regularized normalized hard margin (ρ) in an RKHS and rephrase it in terms of a Mahalanobis dot-product/semi-norm associated with the kernel's (normalized and signed) Gram matrix. We derive an accelerated smoothed algorithm with a convergence rate of √(log n)/ρ given n separable points, which is strikingly similar to the classical kernelized Perceptron algorithm, whose rate is 1/ρ². When no such classifier exists, we prove a version of Gordan's separation theorem for RKHSs and give a reinterpretation of negative margins. This allows us to give guarantees for a primal-dual algorithm that halts in min{√n/|ρ|, √n/ε} iterations, returning either a perfect separator in the RKHS if the primal is feasible, or a dual ε-certificate of near-infeasibility.
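
    As a point of reference for the 1/ρ² rate quoted above, the following is a minimal sketch of the classical kernelized Perceptron that the abstract compares against (not the authors' smoothed algorithm); the function name, the RBF kernel, its bandwidth, and the toy data are illustrative assumptions.

    import numpy as np

    def kernelized_perceptron(K, y, max_epochs=1000):
        """Classical kernelized Perceptron (illustrative sketch, not the
        paper's smoothed algorithm). K is the n x n Gram matrix and y
        holds labels in {-1, +1}. Returns dual coefficients alpha
        defining the classifier f(x) = sum_j alpha_j * y_j * k(x_j, x).
        When the data are separable with normalized margin rho, the
        number of updates is O(1/rho^2), the rate quoted above."""
        n = K.shape[0]
        alpha = np.zeros(n)
        for _ in range(max_epochs):
            updated = False
            for i in range(n):
                # Mistake check: sign of f(x_i) disagrees with y_i.
                if y[i] * np.dot(alpha * y, K[:, i]) <= 0:
                    alpha[i] += 1.0  # dual Perceptron update
                    updated = True
            if not updated:   # a full pass with no mistakes means a
                break         # perfect separator in the RKHS was found
        return alpha

    # Toy usage: XOR-style data, not linearly separable in the input
    # space but separable in an RBF kernel's RKHS (kernel choice is an
    # assumption for illustration).
    X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
    y = np.array([1, 1, -1, -1])
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq_dists / 2.0)
    alpha = kernelized_perceptron(K, y)

    The update works entirely in the dual, which is what makes kernelization possible: the classifier is only ever evaluated through Gram-matrix entries, never through explicit feature vectors in the RKHS.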