
    Large Margin Distribution Machine Recursive Feature Elimination

    We gratefully thank Dr Teng Zhang and Prof Zhi-Hua Zhou for providing the source code of “LDM” and for their kind technical assistance. This work is supported by the National Natural Science Foundation of China (Nos. 61472159, 61572227) and the Development Project of Jilin Province of China (Nos. 20160204022GX, 2017C033). This work is also partially supported by the 2015 Scottish Crucible Award funded by the Royal Society of Edinburgh and the 2016 PECE bursary provided by the Scottish Informatics & Computer Science Alliance (SICSA). Postprint
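
    The title refers to recursive feature elimination (RFE) wrapped around the Large Margin Distribution Machine (LDM). As a rough illustration of the generic RFE loop only, here is a minimal sketch in which a plain least-squares linear model stands in for LDM; the function name and toy data are illustrative assumptions, not the paper's method.

```python
import numpy as np

def rfe(X, y, n_keep):
    """Recursive feature elimination sketch: repeatedly fit a linear
    model and drop the feature with the smallest absolute weight
    until n_keep features remain. (Least squares stands in for LDM.)"""
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        # Fit a linear model on the currently active features.
        w, *_ = np.linalg.lstsq(X[:, active], y, rcond=None)
        # Eliminate the feature with the least informative weight.
        worst = int(np.argmin(np.abs(w)))
        active.pop(worst)
    return sorted(active)

# Toy data: only features 0 and 1 carry signal; feature 2 is noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=200)
selected = rfe(X, y, n_keep=2)  # expected to keep features 0 and 1
```

    In the paper's setting, each elimination round would rank features using the LDM's learned weights rather than a least-squares fit.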

    Statistical Mechanics of Soft Margin Classifiers

    We study the typical learning properties of the recently introduced Soft Margin Classifiers (SMCs), learning realizable and unrealizable tasks, with the tools of Statistical Mechanics. We derive analytically the behaviour of the learning curves in the regime of very large training sets. We obtain exponential and power laws for the decay of the generalization error towards the asymptotic value, depending on the task and on general characteristics of the distribution of stabilities of the patterns to be learned. The optimal learning curves of the SMCs, which give the minimal generalization error, are obtained by tuning the coefficient controlling the trade-off between the error and the regularization terms in the cost function. If the task is realizable by the SMC, the optimal performance is better than that of a hard margin Support Vector Machine and is very close to that of a Bayesian classifier.
    Comment: 26 pages, 12 figures, submitted to Physical Review
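
    The trade-off coefficient described in the abstract corresponds to the usual soft-margin cost of the form (1/2)||w||² + C·Σ max(0, 1 − yᵢ(w·xᵢ + b)), where C balances the error term against the regularization term. A minimal self-contained sketch, trained by subgradient descent on a toy realizable task (all names and hyperparameters here are illustrative, not the paper's):

```python
import numpy as np

def train_soft_margin(X, y, C=1.0, lr=0.01, epochs=200):
    """Subgradient descent on the soft-margin (hinge-loss) objective
    0.5*||w||^2 + C * sum(max(0, 1 - y*(w.x + b))).
    C controls the error/regularization trade-off."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1  # patterns inside or beyond the margin
        # Subgradient of the cost with respect to w and b.
        grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = -C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy realizable task: two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 0.5, (50, 2)), rng.normal(-2, 0.5, (50, 2))])
y = np.array([1] * 50 + [-1] * 50)
w, b = train_soft_margin(X, y, C=1.0)
acc = np.mean(np.sign(X @ w + b) == y)
```

    In the hard-margin limit (C → ∞) no margin violations are tolerated; the abstract's result is that an optimally tuned finite C gives a strictly better generalization error on realizable tasks than the hard-margin SVM.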