
Classification with support hyperplanes

Abstract

A new classification method is proposed, called Support Hyperplanes (SHs). To solve the binary classification task, SHs consider the set of all hyperplanes that do not make classification mistakes, referred to as semi-consistent hyperplanes. A test object is classified using the semi-consistent hyperplane that is farthest away from it. In this way, a good balance between goodness-of-fit and model complexity is achieved, where model complexity is proxied by the distance between a test object and a semi-consistent hyperplane. This idea of complexity resembles the one imputed in the width of the so-called margin between two classes, which arises in the context of Support Vector Machine learning. Class overlap can be handled via the introduction of kernels and/or slack variables. The performance of SHs against standard classifiers is promising on several widely used empirical data sets.

Keywords: kernel methods; large margin and instance-based classifiers
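For intuition, below is a minimal brute-force sketch of the linear SH decision rule on linearly separable data. The function name `sh_classify`, the random sampling of candidate hyperplane normals, and the toy data are illustrative assumptions; the paper formulates the rule as an optimization problem rather than a sampling procedure. The sketch only encodes the idea stated in the abstract: among all hyperplanes that make no mistakes on the training set, take the one farthest from the test object and read off the side on which the test object lies.

```python
import numpy as np

def sh_classify(X, y, x_test, n_candidates=20000, seed=None):
    """Classify x_test with the farthest semi-consistent hyperplane.

    X : (n, d) training inputs, y : (n,) labels in {-1, +1},
    x_test : (d,) test object.  Returns (predicted label, distance).

    Brute-force sketch: sample random unit normals w; for each, find the
    interval of intercepts b that keeps every training point on its correct
    side (semi-consistency), then take the b in that interval farthest from
    x_test.  Keep the overall farthest hyperplane.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos, neg = X[y == 1], X[y == -1]
    best_dist, best_side = -np.inf, 0
    for _ in range(n_candidates):
        w = rng.normal(size=d)
        w /= np.linalg.norm(w)
        # Semi-consistency: y_i (w.x_i + b) >= 0 for all i, which means
        # b >= -w.x_i for every positive point and b <= -w.x_i for every
        # negative point.
        b_lo = np.max(-(pos @ w)) if len(pos) else -np.inf
        b_hi = np.min(-(neg @ w)) if len(neg) else np.inf
        if b_lo > b_hi:
            continue  # no semi-consistent hyperplane with this normal
        proj = w @ x_test
        # With ||w|| = 1 the distance |w.x_test + b| is convex in b,
        # so its maximum over [b_lo, b_hi] is attained at an endpoint.
        for b in (b_lo, b_hi):
            if np.isfinite(b) and abs(proj + b) > best_dist:
                best_dist = abs(proj + b)
                best_side = np.sign(proj + b)
    return int(best_side), best_dist

if __name__ == "__main__":
    # Two well-separated Gaussian clusters as toy training data.
    gen = np.random.default_rng(0)
    X = np.vstack([gen.normal([2, 2], 0.5, (20, 2)),
                   gen.normal([-2, -2], 0.5, (20, 2))])
    y = np.array([1] * 20 + [-1] * 20)
    print(sh_classify(X, y, np.array([1.5, 1.0]), seed=1))    # expected: class +1
    print(sh_classify(X, y, np.array([-1.0, -1.5]), seed=1))  # expected: class -1
```

Note the instance-based flavour mentioned in the keywords: the farthest semi-consistent hyperplane is recomputed for each test object, so there is no single fitted decision boundary shared across test points.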
