The selection of model parameters plays an important role in the application of support vector classification (SVC). The commonly used method for selecting model parameters is k-fold cross-validation with grid search (CV), which is extremely time-consuming because it requires training a large number of SVC models. In this paper, a new method is proposed that trains SVC while simultaneously selecting the model parameters. Firstly, training SVC with model-parameter selection is formulated as a minimax optimization problem (MaxMin-L2-SVC-NCH), in which the inner minimization is the problem of finding the closest points between two normal convex hulls (L2-SVC-NCH) while the outer maximization is the problem of finding the optimal model parameters.
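For concreteness, a schematic form of the problem is shown below; it assumes a kernel $K_{\gamma}$ with parameter $\gamma$ and the standard L2-soft-margin kernel augmentation, so the notation is an illustrative reconstruction rather than the paper's exact formulation:

$$\max_{\gamma,\,C}\ \min_{\alpha \ge 0}\ \Big\| \sum_{i:\,y_i=+1} \alpha_i\,\tilde{\varphi}(x_i) \;-\; \sum_{j:\,y_j=-1} \alpha_j\,\tilde{\varphi}(x_j) \Big\|^2 \quad \text{s.t.}\quad \sum_{i:\,y_i=+1} \alpha_i = 1,\ \ \sum_{j:\,y_j=-1} \alpha_j = 1,$$

where $\tilde{\varphi}$ is the feature map of the augmented kernel $\tilde{K}(x_i,x_j) = K_{\gamma}(x_i,x_j) + \delta_{ij}/C$. The inner minimization over $\alpha$ is L2-SVC-NCH; the outer maximization over the model parameters replaces the grid search of CV.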
A lower time complexity can be expected for MaxMin-L2-SVC-NCH because CV is abandoned. A gradient-based algorithm is then proposed to solve MaxMin-L2-SVC-NCH, in which L2-SVC-NCH is solved by a projected gradient algorithm (PGA) while the maximization problem is solved by a gradient ascent algorithm with a dynamic learning rate.
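As an illustration of this alternating scheme, the sketch below pairs an inner PGA solve of L2-SVC-NCH with an outer gradient-ascent step on a Gaussian kernel parameter gamma, using a Danskin-style gradient at the inner optimum. The function names (maxmin_train, inner_pga, project_simplex, gaussian_gram), the fixed step sizes, the choice to tune gamma only with C held fixed, and the simple decaying step size standing in for the paper's dynamic learning rate are all illustrative assumptions, not the paper's algorithm:

import numpy as np

def project_simplex(v):
    # Euclidean projection of v onto the unit simplex (Duchi et al., 2008).
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    k = np.arange(1, v.size + 1)
    rho = np.nonzero(u * k > css - 1.0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def gaussian_gram(X, gamma):
    # K_gamma(x_i, x_j) = exp(-gamma * ||x_i - x_j||^2); D2 holds the squared distances.
    sq = np.sum(X ** 2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * D2), D2

def inner_pga(Q, y, alpha, lr=0.05, iters=500):
    # Projected gradient for L2-SVC-NCH: min_alpha alpha' Q alpha, with the
    # coefficients of each class constrained to lie on their own unit simplex.
    pos, neg = y > 0, y <= 0
    for _ in range(iters):
        alpha = alpha - lr * 2.0 * (Q @ alpha)    # gradient of alpha' Q alpha
        alpha[pos] = project_simplex(alpha[pos])  # project onto positive hull
        alpha[neg] = project_simplex(alpha[neg])  # project onto negative hull
    return alpha

def maxmin_train(X, y, gamma=1.0, C=10.0, outer_iters=30, eta=0.5, decay=0.9):
    # y is a numpy array in {-1, +1}; start alpha at the two class centroids.
    n = y.size
    alpha = np.where(y > 0, 1.0 / np.sum(y > 0), 1.0 / np.sum(y <= 0))
    for _ in range(outer_iters):
        K, D2 = gaussian_gram(X, gamma)
        Q = np.outer(y, y) * (K + np.eye(n) / C)  # signed augmented kernel
        alpha = inner_pga(Q, y, alpha)            # inner min: solve L2-SVC-NCH
        # Danskin-style gradient of the optimal value w.r.t. gamma,
        # using dK/dgamma = -D2 * K for the Gaussian kernel.
        grad = alpha @ ((np.outer(y, y) * (-D2 * K)) @ alpha)
        gamma = max(gamma + eta * grad, 1e-6)     # outer gradient ascent step
        eta *= decay                              # simple decaying step size
    return alpha, gamma

A call such as alpha, gamma = maxmin_train(X, y) then yields both the dual coefficients and a tuned kernel parameter from a single training run, which is the source of the claimed saving over the many retrainings of CV.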
To demonstrate the advantages of the PGA in solving L2-SVC-NCH, we compare the PGA with the well-known sequential minimal optimization (SMO) algorithm, after first providing an SMO algorithm and some KKT conditions for L2-SVC-NCH. The comparison reveals that the SMO algorithm is a special case of the PGA, so the PGA offers more flexibility. Comparative experiments between MaxMin-L2-SVC-NCH and classical parameter-selection models on public datasets show that MaxMin-L2-SVC-NCH greatly reduces the number of models to be trained while losing nothing in test accuracy to the classical models, indicating that MaxMin-L2-SVC-NCH performs better than the other models. We therefore strongly recommend MaxMin-L2-SVC-NCH as a preferred model for SVC tasks.