Support vector machines for classification of input vectors with different metrics

Abstract

In this paper, a generalization of support vector machines is explored in which the input vectors of each class are measured with a different ℓp norm. It is proved that the optimization problem for binary classification under the maximal margin principle with ℓp and ℓq norms depends only on the ℓp norm if 1 ≤ p ≤ q. Furthermore, in this approach the selection of a different bias in the classifier function is a consequence of the ℓq norm. Commentary on the most commonly used SVM approaches is also given, presenting them as particular cases.
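For orientation, a minimal sketch of the standard hard-margin formulation that such a generalization starts from, written with a generic ℓp norm on the weight vector; this is the classical problem statement, not the paper's exact formulation, and the role of the ℓq norm and of the bias term b may differ in the authors' setting:

    \min_{w,\, b} \; \|w\|_p
    \quad \text{subject to} \quad
    y_i \left( w^{\top} x_i + b \right) \ge 1, \qquad i = 1, \dots, n,

where (x_i, y_i) with y_i ∈ {−1, +1} are the training pairs. The usual Euclidean SVM is recovered for p = 2 (up to squaring the objective).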