6 research outputs found

    Intelligent Data Mining using Kernel Functions and Information Criteria

    Radial Basis Function (RBF) Neural Networks and Support Vector Machines (SVM) are two powerful kernel-based intelligent data mining techniques. The major problems with these methods are over-fitting and the existence of too many free parameters. How the parameters are selected directly affects the generalization performance (test error) of these models. Current practice in choosing the model parameters is an art rather than a science in this research area: often, some parameters are predetermined or chosen at random, while others are selected through repeated experiments that are time-consuming, costly, and computationally intensive. In this dissertation, we provide a two-stage analytical hybrid-training algorithm that builds a bridge among regression trees, the EM algorithm, and Radial Basis Function Neural Networks. The Information Complexity (ICOMP) criterion of Bozdogan, along with other information-based criteria, is introduced and applied to control model complexity and to decide the optimal number of kernel functions. In the first stage of the hybrid, a regression tree and the EM algorithm are used to determine the kernel function parameters. In the second stage, the weights (coefficients) are calculated and the information criteria are scored. Kernel Principal Component Analysis (KPCA) using the EM algorithm for feature selection and data preprocessing is also introduced and studied. Adaptive Support Vector Machines (ASVM) and some efficient algorithms are given to deal with massive data sets in support vector classification. The versatility and efficiency of the newly proposed approaches are studied on real data sets and via Monte Carlo simulation experiments.
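    The two-stage hybrid described in this abstract can be sketched in a few lines (a minimal illustration, not the dissertation's actual algorithm: it uses a plain regression tree for stage one and omits the EM refinement and ICOMP scoring; scikit-learn's DecisionTreeRegressor and the synthetic data are stand-ins):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

# Stage 1: a regression tree partitions the input space; each leaf
# supplies one RBF center (the leaf mean) and width (scaled leaf spread).
tree = DecisionTreeRegressor(max_leaf_nodes=8).fit(X, y)
leaves = tree.apply(X)
ids = np.unique(leaves)
centers = np.array([X[leaves == l].mean(axis=0) for l in ids])
widths = np.array([2.0 * X[leaves == l].std() + 1e-3 for l in ids])

# Stage 2: with the kernels fixed, the output weights enter linearly
# and are solvable in closed form by least squares.
def design(Z):
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * widths ** 2))

w, *_ = np.linalg.lstsq(design(X), y, rcond=None)
pred = design(X) @ w
```

    In the dissertation's full scheme, an information criterion such as ICOMP would be scored at this point for each candidate number of leaves, and the model with the best score kept.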

    Many regression algorithms, one unified model — A review

    Regression is the process of learning relationships between inputs and continuous outputs from example data, which enables predictions for novel inputs. The history of regression is closely related to the history of artificial neural networks since the seminal work of Rosenblatt (1958). The aims of this paper are to provide an overview of many regression algorithms, and to demonstrate how the function representations whose parameters they regress fall into two classes: a weighted sum of basis functions, or a mixture of linear models. Furthermore, we show that the former is a special case of the latter. Our ambition is thus to provide a deep understanding of the relationship between these algorithms, which, despite being derived from very different principles, use a function representation that can be captured within one unified model. Finally, step-by-step derivations of the algorithms from first principles and visualizations of their inner workings allow this article to be used as a tutorial for those new to regression.
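    The paper's central claim is that a weighted sum of basis functions, f(x) = Σ_b w_b φ_b(x), is a special case of a mixture of linear models, f(x) = Σ_b π_b(x) (A_b x + b_b). A toy numerical check of that reduction (assuming, for illustration, that the responsibilities π_b double as the basis activations): setting every slope A_b to zero and each offset b_b to the weight w_b recovers the basis-function form.

```python
import numpy as np

rng = np.random.default_rng(1)
B, n = 5, 3
x = rng.standard_normal(n)
centers = rng.standard_normal((B, n))

# Normalized Gaussian activations play the role of both the basis
# functions phi_b(x) and the mixture responsibilities pi_b(x).
act = np.exp(-((x - centers) ** 2).sum(axis=1))
phi = act / act.sum()

w = rng.standard_normal(B)   # basis-function weights
A = np.zeros((B, n))         # linear-model slopes, forced to zero
b = w.copy()                 # linear-model offsets set to the weights

f_basis = phi @ w            # weighted sum of basis functions
f_mix = phi @ (A @ x + b)    # mixture of linear models
```

    With nonzero slopes A_b, the mixture form is strictly more expressive, which is exactly the paper's hierarchy.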

    Combining Regression Trees and Radial Basis Function Networks

    We describe a method for nonparametric regression which combines regression trees with radial basis function networks. The method is similar to that of Kubat [1998], who was first to suggest such a combination, but has some significant improvements. We demonstrate the features of the new method, compare its performance with other methods on DELVE data sets, and apply it to a real-world problem involving the classification of soybean plants from digital images. This paper describes a novel method for nonparametric regression involving a combination of two older methods, namely regression trees and radial basis function (RBF) neural networks. By nonparametric regression (also known as supervised learning) we mean the problem of estimating the functional relationship between an input vector x ∈ R^n and an output scalar y, given only a set of noisy samples {(x_i, y_i)}, i = 1, …, p (the training set), without any knowledge of the form of the functional relationship.
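    The definition above, estimating f from noisy samples with no assumed functional form, is easy to make concrete with a Nadaraya-Watson kernel smoother, one of the simplest nonparametric regressors (a sketch with synthetic data; this is not the tree-RBF method of the paper, just an instance of the problem it defines):

```python
import numpy as np

rng = np.random.default_rng(2)
p = 300
x = rng.uniform(0, 1, p)
y = np.cos(2 * np.pi * x) + 0.1 * rng.standard_normal(p)  # noisy samples

def nw_estimate(x0, h=0.05):
    """Nadaraya-Watson: a kernel-weighted local average of the y_i,
    estimating f(x0) without any parametric model of f."""
    k = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return (k * y).sum() / k.sum()

est = nw_estimate(0.25)  # true value is cos(pi/2) = 0
```

    The bandwidth h controls the bias-variance trade-off, playing the same role that the number and width of the RBF kernels play in the tree-RBF combination.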
