488 research outputs found
Theoretical Interpretations and Applications of Radial Basis Function Networks
Medical applications have usually treated Radial Basis Function Networks simply as Artificial Neural Networks. However, RBFNs are Knowledge-Based Networks that can be interpreted in several ways: as Artificial Neural Networks, Regularization Networks, Support Vector Machines, Wavelet Networks, Fuzzy Controllers, Kernel Estimators, or Instance-Based Learners. A survey of these interpretations and of their corresponding learning algorithms is provided, as well as a brief survey of dynamic learning algorithms. RBFNs' interpretations can suggest applications that are particularly interesting in medical domains.
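The Artificial-Neural-Network reading of an RBFN described above can be sketched in a few lines: a hidden layer of Gaussian basis units followed by a linear output layer. The centers, width, and least-squares readout below are illustrative choices, not taken from the surveyed paper.

```python
import numpy as np

def rbf_design(x, centers, width):
    """Gaussian basis matrix: phi[i, j] = exp(-(x_i - c_j)^2 / (2 w^2))."""
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)

centers = np.linspace(0, 1, 10)              # hidden units (assumed layout)
Phi = rbf_design(x, centers, width=0.1)      # hidden-layer activations
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear output layer

y_hat = Phi @ w
print(round(float(np.mean((y - y_hat) ** 2)), 4))  # small training error
```

The same design matrix reappears in the other interpretations (e.g. as a kernel section in the Kernel Estimator view), which is what makes the multiple readings possible.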
Priors Stabilizers and Basis Functions: From Regularization to Radial, Tensor and Additive Splines
We had previously shown that regularization principles lead to approximation schemes, such as Radial Basis Functions, which are equivalent to networks with one layer of hidden units, called Regularization Networks. In this paper we show that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models, Breiman's hinge functions and some forms of Projection Pursuit Regression. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In the final part of the paper, we also show a relation between activation functions of the Gaussian and sigmoidal type.
Concentration inequalities of the cross-validation estimate for stable predictors
In this article, we derive concentration inequalities for the
cross-validation estimate of the generalization error for stable predictors in
the context of risk assessment. The notion of stability was first
introduced by \cite{DEWA79} and extended by \cite{KEA95}, \cite{BE01} and
\cite{KUNIY02} to characterize classes of predictors with infinite VC dimension.
In particular, this covers k-nearest neighbor rules, Bayesian algorithms
(\cite{KEA95}), boosting, etc. General loss functions and classes of predictors are
considered. We use the formalism introduced by \cite{DUD03} to cover a large
variety of cross-validation procedures, including leave-one-out
cross-validation, k-fold cross-validation, hold-out cross-validation (or
split sample), and leave-p-out cross-validation.
In particular, we give a simple rule on how to choose the cross-validation
procedure, depending on the stability of the class of predictors. In the special case of
uniform stability, an interesting consequence is that the number of elements in
the test set is not required to grow to infinity for the consistency of the
cross-validation procedure. In this special case, the particular interest of
leave-one-out cross-validation is emphasized.
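The cross-validation estimate that the abstract's inequalities concentrate around can be sketched concretely. Below is a hand-rolled k-fold estimate of the misclassification risk of a 1-nearest-neighbor rule (one of the stable predictors mentioned); the data, fold count, and distance metric are illustrative assumptions, not from the paper.

```python
import numpy as np

def one_nn_predict(X_train, y_train, X_test):
    """1-NN rule: label of the closest training point (Euclidean distance)."""
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[np.argmin(d, axis=1)]

def k_fold_risk(X, y, k, seed=0):
    """k-fold CV estimate of the risk: average test error over k folds."""
    idx = np.random.default_rng(seed).permutation(len(X))
    errs = []
    for fold in np.array_split(idx, k):
        mask = np.ones(len(X), dtype=bool)
        mask[fold] = False
        pred = one_nn_predict(X[mask], y[mask], X[fold])
        errs.append(np.mean(pred != y[fold]))
    return float(np.mean(errs))

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.repeat([0, 1], 50)
risk = k_fold_risk(X, y, k=5)
print(risk)  # low error on well-separated classes
```

Setting `k = len(X)` in this sketch recovers leave-one-out cross-validation, the case the abstract singles out under uniform stability.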
Statistical properties of the method of regularization with periodic Gaussian reproducing kernel
The method of regularization with the Gaussian reproducing kernel is popular
in the machine learning literature and successful in many practical
applications.
In this paper we consider the periodic version of the Gaussian kernel
regularization.
We show in the white noise model setting, that in function spaces of very
smooth functions, such as the infinite-order Sobolev space and the space of
analytic functions, the method under consideration is asymptotically minimax;
in finite-order Sobolev spaces, the method is rate optimal, and the efficiency
in terms of constant when compared with the minimax estimator is reasonably
high. The smoothing parameters in the periodic Gaussian regularization can be
chosen adaptively without loss of asymptotic efficiency. The results derived in
this paper give a partial explanation of the success of the
Gaussian reproducing kernel in practice. Simulations are carried out to study
the finite sample properties of the periodic Gaussian regularization.
Comment: Published by the Institute of Mathematical Statistics
(http://www.imstat.org) in the Annals of Statistics
(http://www.imstat.org/aos/) at http://dx.doi.org/10.1214/00905360400000045
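The "method of regularization" studied above is, in finite samples, kernel ridge regression. The sketch below uses a plain (non-periodic) Gaussian kernel purely for illustration; the paper analyzes a periodic variant, and the bandwidth and smoothing parameter here are generic assumed values.

```python
import numpy as np

def gaussian_kernel(a, b, width=0.2):
    """Gaussian reproducing kernel K(a, b) = exp(-(a - b)^2 / (2 w^2))."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * width ** 2))

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 60))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(60)

lam = 1e-3                      # smoothing (regularization) parameter
K = gaussian_kernel(x, x)
# Regularized solution: minimize ||y - f||^2 + lam * ||f||_K^2
alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)

y_hat = K @ alpha
print(round(float(np.mean((y - y_hat) ** 2)), 4))
```

The adaptive choice of the smoothing parameters discussed in the abstract corresponds to tuning `lam` (and the kernel width) from data rather than fixing them as above.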
- …