Support Vector Machine Classifier via L0/1 Soft-Margin Loss
Support vector machine (SVM) has attracted great attention over the last two
decades owing to its extensive applications, and numerous optimization models
have been proposed for it. To distinguish our work from existing models, in
this paper we introduce a new model equipped with a 0/1 soft-margin loss
(dubbed L0/1-SVM), which well captures the nature of binary classification.
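The 0/1 soft-margin loss assigns a unit penalty whenever the margin y*f(x) falls below 1, whereas convex surrogates such as the hinge loss replace that jump with a continuous penalty. A minimal numerical sketch (the margin values are illustrative, not from the paper):

```python
import numpy as np

# Margins t_i = y_i * f(x_i); a margin below 1 violates the soft-margin condition.
margins = np.array([-1.5, -0.25, 0.5, 0.75, 2.0])

# 0/1 soft-margin loss: unit penalty iff 1 - t > 0 (discrete, non-convex).
l01 = (1.0 - margins > 0).astype(float)

# Hinge loss: a popular convex surrogate of the same soft-margin condition.
hinge = np.maximum(0.0, 1.0 - margins)

print("0/1 loss:", l01)
print("hinge   :", hinge)
```

Note that the hinge loss penalizes small positive margins less than the 0/1 loss does (e.g. 0.5 versus 1 at t = 0.5), which is one sense in which it is only a surrogate.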
Many of the existing convex/non-convex soft-margin losses can be viewed as
surrogates of the 0/1 soft-margin loss. Despite the discrete nature of the 0/1
loss, we manage to establish the existence of a global minimizer of the new
model, and we reveal the relationships among its minimizers and its KKT and
P-stationary points. These theoretical properties allow us to take
advantage of the alternating direction method of multipliers (ADMM). In
addition, the L0/1-support vector operator is introduced as a filter that
prevents outliers from becoming support vectors during training. Hence, the
method is
expected to be relatively robust. Finally, numerical experiments demonstrate
that, compared with other leading SVM methods, our proposed method achieves
better performance in terms of much shorter computational time and far fewer
support vectors. As the data size grows, its advantage becomes more evident.
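The authors' ADMM algorithm and its working-set details are not reproduced in this abstract; the following is only a toy sketch of how an ADMM splitting for a 0/1-loss objective of the form min_w 0.5*||w||^2 + C*sum_i 1(1 - y_i*x_i^T w > 0) might look. The fixed penalty parameter rho, folding the bias into the features, and the hard-thresholding proximal step are all illustrative assumptions here, not the paper's exact method:

```python
import numpy as np

def prox_l01(v, lam):
    """Elementwise proximal map of lam * 1(t > 0):
    minimize lam*1(t > 0) + (t - v)**2 / 2 over t.
    Keep v when v <= 0 or v**2 > 2*lam; otherwise snap to 0."""
    out = v.copy()
    out[(v > 0) & (v ** 2 <= 2.0 * lam)] = 0.0
    return out

def l01_svm_admm(X, y, C=1.0, rho=1.0, iters=200):
    """Toy ADMM sketch for min_w 0.5*||w||^2 + C * sum_i 1(1 - y_i x_i^T w > 0),
    using the splitting z = 1 - diag(y) X w (bias assumed folded into X).
    No convergence checks or adaptive penalty; illustrative only."""
    n, p = X.shape
    D = y[:, None] * X                    # rows are y_i * x_i
    A = np.eye(p) + rho * D.T @ D         # constant w-step system matrix
    w, z, u = np.zeros(p), np.zeros(n), np.zeros(n)
    for _ in range(iters):
        w = np.linalg.solve(A, rho * D.T @ (1.0 - z + u))  # w-step (ridge-like)
        z = prox_l01(1.0 - D @ w + u, C / rho)             # z-step (hard threshold)
        u = u + (1.0 - D @ w - z)                          # dual update
    return w

# Tiny separable demo (last column is a constant bias feature).
X = np.array([[2., 2., 1.], [3., 1., 1.], [2., 3., 1.],
              [-2., -2., 1.], [-3., -1., 1.], [-2., -3., 1.]])
y = np.array([1., 1., 1., -1., -1., -1.])
w = l01_svm_admm(X, y)
print("train accuracy:", np.mean(np.sign(X @ w) == y))
```

The hard-thresholding z-step is what lets such a scheme drop points with large violations (candidate outliers) instead of letting them dominate the solution as support vectors.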