Artificial Neural Networks
Artificial neural networks (ANNs) constitute a class of flexible nonlinear models designed to mimic biological neural systems. In this entry, we introduce ANNs using familiar econometric terminology and provide an overview of the ANN modeling approach and its implementation methods.
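As a concrete illustration of the "flexible nonlinear model" view (a minimal sketch, not taken from the entry itself; all names and hyperparameters are illustrative), a single-hidden-layer ANN can be read as the nonlinear regression y = W2·tanh(W1·x + b1) + b2, fit here by full-batch gradient descent on a toy target:

```python
import numpy as np

# Toy nonlinear regression problem: y = sin(3x) + noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))                # regressor
y = np.sin(3 * X) + 0.1 * rng.normal(size=X.shape)   # nonlinear target

H = 16                                               # hidden units
W1 = rng.normal(scale=0.5, size=(1, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(2000):
    A = np.tanh(X @ W1 + b1)                         # hidden activations
    pred = A @ W2 + b2
    err = pred - y                                   # gradient of MSE/2 wrt pred
    gW2 = A.T @ err / len(X); gb2 = err.mean(0)
    dA = (err @ W2.T) * (1 - A**2)                   # backprop through tanh
    gW1 = X.T @ dA / len(X); gb1 = dA.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

In econometric terms, the hidden-layer weights play the role of estimated basis parameters and the output weights the role of regression coefficients; the fit improves as the number of hidden units grows.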
A Nonparametric Ensemble Binary Classifier and its Statistical Properties
In this work, we propose an ensemble of classification trees (CT) and
artificial neural networks (ANN). Several statistical properties, including
universal consistency and an upper bound on an important parameter of the
proposed classifier, are shown. Numerical evidence is also provided using
various real-life data sets to assess the performance of the model. Our
proposed nonparametric ensemble classifier does not suffer from the "curse of
dimensionality" and can be used in a wide variety of combined feature-selection
and classification problems. The proposed model performs considerably better
than many other state-of-the-art models used in similar situations.
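The general idea of combining a tree and an ANN for binary classification can be sketched as follows (a hypothetical illustration, not the paper's construction: here a depth-1 decision stump and a tiny one-hidden-layer network are fit separately and their class-probability estimates averaged):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy binary data: class 1 iff x0 + x1 > 0.
X = rng.normal(size=(300, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def stump_fit(X, y):
    # Exhaustive search over (feature, threshold) minimizing 0-1 error.
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            err = np.mean(((X[:, j] > t).astype(float)) != y)
            if best is None or err < best[0]:
                best = (err, j, t)
    return best[1], best[2]

def stump_proba(Xq, Xtr, ytr, j, t):
    # Soft leaf estimates: class-1 proportion in each leaf.
    mask = Xtr[:, j] > t
    p_right = ytr[mask].mean() if mask.any() else 0.5
    p_left = ytr[~mask].mean() if (~mask).any() else 0.5
    return np.where(Xq[:, j] > t, p_right, p_left)

def ann_fit(X, y, H=8, lr=0.5, steps=500):
    # One-hidden-layer net trained by gradient descent on cross-entropy.
    W1 = rng.normal(scale=0.5, size=(X.shape[1], H)); b1 = np.zeros(H)
    w2 = rng.normal(scale=0.5, size=H); b2 = 0.0
    for _ in range(steps):
        A = np.tanh(X @ W1 + b1)
        p = 1 / (1 + np.exp(-(A @ w2 + b2)))   # sigmoid output
        g = (p - y) / len(y)                   # dL/dlogit for cross-entropy
        dA = np.outer(g, w2) * (1 - A**2)
        W1 -= lr * (X.T @ dA); b1 -= lr * dA.sum(0)
        w2 -= lr * (A.T @ g); b2 -= lr * g.sum()
    return W1, b1, w2, b2

def ann_proba(Xq, W1, b1, w2, b2):
    return 1 / (1 + np.exp(-(np.tanh(Xq @ W1 + b1) @ w2 + b2)))

j, t = stump_fit(X, y)
params = ann_fit(X, y)
p_ens = 0.5 * (stump_proba(X, X, y, j, t) + ann_proba(X, *params))
acc = float(np.mean((p_ens > 0.5) == y))
```

Averaging probabilities rather than hard labels lets the confident learner dominate where the other is uncertain, which is one common motivation for tree-plus-network ensembles.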
Approximation paper, part 1
In this paper we discuss approximations between neural nets, fuzzy expert systems, fuzzy controllers, and continuous processes.
Elementary Derivative Tasks and Neural Net Multiscale Analysis of Tasks
Neural nets are known to be universal approximators. In particular, formal
neurons implementing wavelets have been shown to build nets able to approximate
any multidimensional task. Such very specialized formal neurons may, however,
be difficult to obtain biologically and/or industrially. In this paper we
relax the constraint of a strict "Fourier analysis" of tasks. Rather, we use
a finite number of more realistic formal neurons implementing elementary tasks
such as "window" or "Mexican hat" responses, with adjustable widths. This
is shown to provide a reasonably efficient, practical, and robust
multifrequency analysis. A training algorithm, optimizing the task with respect
to the widths of the responses, reveals two distinct training modes. The first
mode induces some of the formal neurons to become identical, hence promotes
"derivative tasks". The other mode keeps the formal neurons distinct.
Comment: LaTeX neurondlt.tex, 7 files, 6 figures, 9 pages [SPhT-T01/064],
submitted to Phys. Rev.
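The idea of fitting a task with a few "Mexican hat" units of adjustable width can be illustrated as follows (an assumed toy setup, not the paper's algorithm: centers are fixed, amplitudes are solved by linear least squares, and a shared width is "trained" by a coarse search):

```python
import numpy as np

# Toy 1-D task to approximate.
x = np.linspace(-3, 3, 400)
target = np.sin(2 * x) * np.exp(-x**2 / 4)

centers = np.linspace(-3, 3, 9)   # fixed unit centers

def mexican_hat(x, c, w):
    # Classic Mexican-hat response (1 - u^2) exp(-u^2/2), u = (x - c)/w.
    u = (x[:, None] - c[None, :]) / w
    return (1 - u**2) * np.exp(-u**2 / 2)

def fit_error(w):
    # Best amplitudes for this width, via linear least squares.
    Phi = mexican_hat(x, centers, w)
    a, *_ = np.linalg.lstsq(Phi, target, rcond=None)
    return float(np.mean((Phi @ a - target) ** 2))

# "Train" the shared width by scanning candidates and keeping the best.
widths = np.linspace(0.2, 2.0, 40)
errors = [fit_error(w) for w in widths]
best_w = widths[int(np.argmin(errors))]
```

Very narrow widths leave gaps between the units and fit poorly, while overly broad ones blur the oscillations, so an intermediate width wins; the paper's gradient-based training of individual widths refines this same trade-off per neuron.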
Universal Approximation Depth and Errors of Narrow Belief Networks with Discrete Units
We generalize recent theoretical work on the minimal number of layers of
narrow deep belief networks that can approximate any probability distribution
on the states of their visible units arbitrarily well. We relax the setting of
binary units (Sutskever and Hinton, 2008; Le Roux and Bengio, 2008, 2010;
Montúfar and Ay, 2011) to units with arbitrary finite state spaces, and the
vanishing approximation error to an arbitrary approximation error tolerance.
For example, we show that a q-ary deep belief network with sufficiently many
layers of bounded width can approximate any probability distribution on the
states of its visible units without exceeding a given Kullback-Leibler
divergence tolerance. Our analysis covers discrete restricted Boltzmann
machines and naïve Bayes models as special cases.
Comment: 19 pages, 5 figures, 1 table