33,328 research outputs found

    Regularized Regression Problem in hyper-RKHS for Learning Kernels

    This paper generalizes the two-stage kernel learning framework, illustrates its utility for kernel learning and out-of-sample extensions, and proves asymptotic convergence results for the introduced kernel learning model. Algorithmically, we extend target alignment by hyper-kernels in the two-stage kernel learning framework. The associated kernel learning task is formulated as a regression problem in a hyper-reproducing kernel Hilbert space (hyper-RKHS), i.e., learning on the space of kernels itself. To solve this problem, we present two regression models with bivariate forms in this space, namely kernel ridge regression (KRR) and support vector regression (SVR) in the hyper-RKHS. This provides significant model flexibility for kernel learning and strong performance in real-world applications. Specifically, our kernel learning framework is general: the learned underlying kernel can be positive definite or indefinite, which adapts to various requirements in kernel learning. Theoretically, we study the convergence behavior of these learning algorithms in the hyper-RKHS and derive the learning rates. Different from the traditional approximation analysis in RKHS, our analyses need to consider the non-trivial independence of pairwise samples and the characterisation of hyper-RKHS. To the best of our knowledge, this is the first work in learning theory to study the approximation performance of regularized regression in hyper-RKHS. (Comment: 25 pages, 3 figures)
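    As a rough illustration of regressing on the space of kernels itself, the sketch below runs kernel ridge regression in a hyper-RKHS over pairwise samples and then reuses the learned kernel downstream. The Gaussian-product hyper-kernel, the target-alignment labels y_i*y_j, and all parameter values are illustrative assumptions, not the paper's exact formulation; note the learned kernel need not be positive definite, consistent with the abstract.

    import numpy as np

    # Toy data: n samples, binary labels in {-1, +1}.
    rng = np.random.default_rng(0)
    n, d = 20, 2
    X = rng.normal(size=(n, d))
    y = np.sign(X[:, 0] + 0.3 * rng.normal(size=n))

    # Pairwise samples (x_i, x_j) with alignment targets t_ij = y_i * y_j.
    pairs = [(i, j) for i in range(n) for j in range(n)]
    T = np.array([y[i] * y[j] for i, j in pairs])

    def hyper_kernel(xi, xj, xk, xl, sigma=1.0):
        """Illustrative Gaussian-product hyper-kernel on two pairs of inputs."""
        return np.exp(-(np.sum((xi - xk) ** 2) + np.sum((xj - xl) ** 2)) / (2 * sigma ** 2))

    # Gram matrix of the hyper-kernel over all n^2 pairs (keep n small).
    m = len(pairs)
    H = np.empty((m, m))
    for a, (i, j) in enumerate(pairs):
        for b, (k, l) in enumerate(pairs):
            H[a, b] = hyper_kernel(X[i], X[j], X[k], X[l])

    # Stage 1: KRR in the hyper-RKHS -> coefficients alpha defining the learned kernel.
    lam = 1e-2
    alpha = np.linalg.solve(H + lam * m * np.eye(m), T)

    def learned_kernel(x, xp):
        """Learned kernel k(x, x') = sum_a alpha_a * hyper_kernel((x_i, x_j), (x, x'))."""
        return sum(alpha[a] * hyper_kernel(X[i], X[j], x, xp)
                   for a, (i, j) in enumerate(pairs))

    # Stage 2: plug the learned kernel into a downstream method, e.g. ordinary KRR on (X, y).
    K = np.array([[learned_kernel(xa, xb) for xb in X] for xa in X])
    beta = np.linalg.solve(K + 1e-3 * np.eye(n), y)
    print("training fit:", np.mean(np.sign(K @ beta) == y))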

    Fermions on Thick Branes in the Background of Sine-Gordon Kinks

    A class of thick branes in the background of sine-Gordon kinks with a scalar potential $V(\phi)=p(1+\cos\frac{2\phi}{q})$ was constructed by R. Koley and S. Kar [Classical Quantum Gravity \textbf{22}, 753 (2005)]. In this paper, in the background of the warped geometry, we investigate the localization of spin-half fermions on these branes in the presence of two types of scalar-fermion couplings: $\eta\bar{\Psi}\phi\Psi$ and $\eta\bar{\Psi}\sin\phi\,\Psi$. By presenting the mass-independent potentials in the corresponding Schr\"{o}dinger equations, we obtain the lowest Kaluza-Klein (KK) modes and a continuous gapless spectrum of KK states with $m^2>0$ for both types of couplings. For the Yukawa coupling $\eta\bar{\Psi}\phi\Psi$, the effective potential of the right chiral fermions for positive $q$ and $\eta$ is always positive, hence only the effective potential of the left chiral fermions can trap the corresponding zero mode. This is a well-known conclusion which has been discussed extensively in the literature. However, for the coupling $\eta\bar{\Psi}\sin\phi\,\Psi$, the effective potential of the right chiral fermions for positive $q$ and $\eta$ is no longer always positive. Although the value of the potential at the location of the brane is still positive, it has a series of wells and barriers on each side, which ensures that the right chiral fermion zero mode can be trapped. Thus we may draw the remarkable conclusion: for positive $\eta$ and $q$, the potentials of both the left and right chiral fermions can trap the corresponding zero modes under certain restrictions. (Comment: 22 pages, 21 figures, published version to appear in Phys. Rev.)
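    For orientation, the Schr\"{o}dinger-like form of the fermion KK equations referred to above is sketched below in a commonly used convention (conformally flat metric $ds^2=\mathrm{e}^{2A(z)}(\eta_{\mu\nu}dx^\mu dx^\nu+dz^2)$ and coupling $\eta\bar{\Psi}F(\phi)\Psi$); the paper's exact conventions and sign choices may differ.

    % Schrödinger-like equations for the chiral KK modes f_{L,R}(z), with
    % mass-independent effective potentials (standard thick-brane conventions assumed):
    \begin{equation}
      \bigl[-\partial_z^2 + V_{L,R}(z)\bigr]\, f_{L,R}(z) = m^2 f_{L,R}(z),
      \qquad
      V_{L,R}(z) = \bigl(\eta\, \mathrm{e}^{A(z)} F(\phi)\bigr)^2
                   \mp \partial_z\bigl(\eta\, \mathrm{e}^{A(z)} F(\phi)\bigr),
    \end{equation}
    % where F(\phi)=\phi for the Yukawa coupling and F(\phi)=\sin\phi for the second coupling.
    % Whether a chiral zero mode (m=0) is trapped depends on the shape of V_L or V_R,
    % which is why the wells and barriers of V_R matter for the sine coupling.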

    Extending twin support vector machine classifier for multi-category classification problems

    © 2013 IOS Press and the authors. All rights reserved. Twin support vector machine classifier (TWSVM), proposed by Jayadeva et al., was designed for binary classification problems. TWSVM not only overcomes the difficulty of handling exemplar imbalance in binary classification problems, but is also four times faster to train than the classical support vector machine. This paper proposes one-versus-all twin support vector machine classifiers (OVA-TWSVM) for multi-category classification problems by utilizing the strengths of TWSVM. OVA-TWSVM extends TWSVM to k-category classification problems by constructing k TWSVMs, where the ith TWSVM solves the Quadratic Programming Problems (QPPs) for the ith class and yields the ith nonparallel hyperplane corresponding to the ith class data, as sketched below. OVA-TWSVM uses the well-known one-versus-all (OVA) approach to construct the corresponding twin support vector machine classifier. We analyze the efficiency of OVA-TWSVM theoretically, and perform experiments to test its efficiency on both synthetic data sets and several benchmark data sets from the UCI machine learning repository. Both the theoretical analysis and the experimental results demonstrate that OVA-TWSVM can outperform the traditional OVA-SVM classifier. Further experimental comparisons with other multiclass classifiers show that comparable performance can be achieved. This work is supported in part by the Fundamental Research Funds for the Central Universities of GK201102007 in PR China, by the Natural Science Basic Research Plan in Shaanxi Province of China (Program No. 2010JM3004), by the Chinese Academy of Sciences under the Innovative Group Overseas Partnership Grant, and by the Natural Science Foundation of China Major International Joint Research Project (No. 71110107026).
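    As a rough sketch of the one-versus-all scheme, the code below fits k nonparallel hyperplanes, one per class, and assigns a test point to the class whose hyperplane it lies closest to. For brevity it solves a least-squares simplification of each twin-SVM problem instead of the dual QPPs described above, and all function names and parameter values are illustrative.

    import numpy as np

    def fit_ova_twsvm(X, y, c=1.0):
        """One-versus-all twin-SVM-style training (least-squares simplification).

        For each class i, find a hyperplane w_i^T x + b_i ~ 0 that is close to
        class-i samples and roughly at distance 1 from all other samples (the
        inequality constraints are replaced by a squared penalty for simplicity).
        """
        classes = np.unique(y)
        planes = []
        for cls in classes:
            A = np.hstack([X[y == cls], np.ones((np.sum(y == cls), 1))])  # class i, augmented
            B = np.hstack([X[y != cls], np.ones((np.sum(y != cls), 1))])  # the rest, augmented
            e = np.ones(B.shape[0])
            # Least-squares stationarity condition: (A^T A + c B^T B) u = -c B^T e
            u = np.linalg.solve(A.T @ A + c * B.T @ B + 1e-8 * np.eye(A.shape[1]),
                                -c * B.T @ e)
            planes.append(u)
        return classes, np.array(planes)

    def predict_ova_twsvm(X, classes, planes):
        """Assign each sample to the class whose hyperplane is nearest."""
        Xa = np.hstack([X, np.ones((X.shape[0], 1))])
        # Perpendicular distance |w^T x + b| / ||w|| to each class hyperplane.
        dists = np.abs(Xa @ planes.T) / np.linalg.norm(planes[:, :-1], axis=1)
        return classes[np.argmin(dists, axis=1)]

    # Toy usage on three Gaussian blobs.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 0.4, size=(30, 2)) for m in ([0, 0], [3, 0], [0, 3])])
    y = np.repeat([0, 1, 2], 30)
    classes, planes = fit_ova_twsvm(X, y)
    print("training accuracy:", np.mean(predict_ova_twsvm(X, classes, planes) == y))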