Probabilistic Kernel Support Vector Machines
We propose a probabilistic enhancement of standard kernel Support Vector
Machines for binary classification, in order to address the case in which, along with the given data sets, a description of uncertainty (e.g., error bounds) may be available for each datum. In the present paper, we specifically consider
Gaussian distributions to model uncertainty. Thereby, our data consist of pairs (μ_i, Σ_i), i = 1, …, N, along with an indicator y_i ∈ {−1, +1} to declare membership in one of two categories for each pair.
These pairs may be viewed to represent the mean and covariance, respectively,
of random vectors taking values in a suitable linear space (typically
ℝ^n). Thus, our setting may also be viewed as a modification of
Support Vector Machines to classify distributions, albeit, at present, only
Gaussian ones. We outline the formalism that allows computing suitable
classifiers via a natural modification of the standard "kernel trick." The main
contribution of this work is to point out a suitable kernel function for
applying Support Vector techniques to the setting of uncertain data for which a
detailed uncertainty description is also available (herein, "Gaussian points").

Comment: 6 pages, 6 figures
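The abstract does not reproduce the proposed kernel itself, so as an illustrative stand-in the sketch below classifies Gaussian points with a precomputed kernel in scikit-learn, using the well-known probability product kernel between two Gaussians, whose closed form is the overlap integral k((m₁,S₁),(m₂,S₂)) = N(m₁; m₂, S₁+S₂). The data, means, and covariances are synthetic assumptions, not the paper's experiments.

```python
# Illustrative sketch: SVM over "Gaussian points" (mean, covariance) pairs.
# The paper's specific kernel is not given here; we substitute the
# probability product kernel between Gaussians as a plausible example.
import numpy as np
from sklearn.svm import SVC

def gaussian_product_kernel(g1, g2):
    """Closed-form overlap integral of two Gaussian densities:
    N(m1; m2, S1 + S2)."""
    m1, S1 = g1
    m2, S2 = g2
    S = S1 + S2
    d = len(m1)
    diff = m1 - m2
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(S))
    return np.exp(-0.5 * diff @ np.linalg.solve(S, diff)) / norm

def gram_matrix(points_a, points_b):
    """Pairwise kernel matrix between two lists of Gaussian points."""
    return np.array([[gaussian_product_kernel(ga, gb)
                      for gb in points_b] for ga in points_a])

rng = np.random.default_rng(0)
# Two synthetic classes of Gaussian points: means near (0,0) vs near (3,3),
# each datum carrying its own covariance as the uncertainty description.
data = [(rng.normal(c, 0.5, size=2), np.eye(2) * rng.uniform(0.1, 0.5))
        for c in (0.0, 3.0) for _ in range(20)]
labels = np.array([-1] * 20 + [+1] * 20)

K = gram_matrix(data, data)                      # precomputed Gram matrix
clf = SVC(kernel="precomputed").fit(K, labels)   # the "kernel trick" step
pred = clf.predict(K)
```

Passing `kernel="precomputed"` lets any positive-definite kernel on non-vector objects (here, distribution parameters) be used without changing the SVM machinery, which mirrors the modification of the kernel trick the abstract describes.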
A hybrid constructive algorithm incorporating teaching-learning based optimization for neural network training
In neural networks, simultaneous determination of the optimum structure and weights is a challenge. This paper proposes a combination of the teaching-learning based optimization (TLBO) algorithm and a constructive algorithm (CA) to cope with this challenge. In the literature, TLBO is used to choose proper weights, while CA is adopted to construct different structures in order to select the proper one. In this study, the basic TLBO algorithm, along with an improved version of it, is utilized for network weight selection. Meanwhile, as a constructive algorithm, a novel modification to multiple operations, using statistical tests (MOST), is applied and tested to choose the proper structure. The proposed combinatorial algorithms are applied to ten classification problems and two time-series prediction problems as benchmarks. The results are evaluated in terms of training and testing error, network complexity, and mean-square error. The experimental results illustrate that the proposed hybrid of the modified MOST constructive algorithm and the improved TLBO (MCO-ITLBO) algorithm outperforms the others; this has also been confirmed by Wilcoxon statistical tests. The proposed method achieves lower average error with less complexity in the network structure.
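The abstract's improved TLBO variant and the MOST constructive procedure are not spelled out, but the basic TLBO loop it builds on has a standard two-phase form. The sketch below shows that basic loop (teacher phase, then learner phase) minimizing a toy sphere objective; in the paper's setting the decision variables would instead be the network weights, and the objective a training error.

```python
# Minimal sketch of basic TLBO (teacher phase + learner phase), minimizing
# a toy objective. The paper's improved variant and the MOST structure
# selection are not reproduced here.
import numpy as np

def tlbo(objective, dim, pop_size=20, iters=100, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))   # population of learners
    fit = np.apply_along_axis(objective, 1, pop)
    for _ in range(iters):
        # Teacher phase: pull every learner toward the current best,
        # away from a scaled population mean.
        teacher = pop[fit.argmin()]
        mean = pop.mean(axis=0)
        tf = rng.integers(1, 3)                       # teaching factor: 1 or 2
        cand = np.clip(pop + rng.random((pop_size, dim)) * (teacher - tf * mean),
                       lo, hi)
        cfit = np.apply_along_axis(objective, 1, cand)
        better = cfit < fit                           # greedy acceptance
        pop[better], fit[better] = cand[better], cfit[better]
        # Learner phase: each learner interacts with a random partner and
        # steps toward the better of the two.
        for i in range(pop_size):
            j = rng.integers(pop_size)
            if j == i:
                continue
            step = pop[i] - pop[j] if fit[i] < fit[j] else pop[j] - pop[i]
            cand_i = np.clip(pop[i] + rng.random(dim) * step, lo, hi)
            cf = objective(cand_i)
            if cf < fit[i]:
                pop[i], fit[i] = cand_i, cf
    return pop[fit.argmin()], fit.min()

best, best_f = tlbo(lambda x: float(np.sum(x ** 2)), dim=5)
```

A notable property of TLBO, relevant to its use for weight selection here, is that it needs no algorithm-specific tuning parameters beyond population size and iteration count.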