Analysis of Neural Networks in Terms of Domain Functions
Despite their success story, artificial neural networks have one major disadvantage compared to other techniques: the inability to explain comprehensively how a trained neural network reaches its output; neural networks are not only (incorrectly) seen as a "magic tool" but possibly even more as a mysterious "black box." Although much research has already been done to "open the box," there is a notable gap in the published work on the analysis of neural networks. So far, mainly sensitivity analysis and rule extraction methods have been used to analyze neural networks. However, these can only be applied in a limited subset of the problem domains where neural network solutions are encountered. In this paper we propose a more widely applicable method which, for a given problem domain, involves identifying basic functions with which users in that domain are already familiar, and describing trained neural networks, or parts thereof, in terms of those basic functions. This provides a comprehensible description of the neural network's function and, depending on the chosen base functions, it may also provide insight into the neural network's inner "reasoning." It could further be used to optimize neural network systems. An analysis in terms of base functions may even make clear how to (re)construct a superior system using those base functions, thus using the neural network as a construction advisor.
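As a point of comparison, sensitivity analysis, one of the existing methods the abstract contrasts with, can be sketched in a few lines: perturb each input of a trained model and measure how strongly the output responds. The tiny two-input network below is a hypothetical stand-in for a trained network, not anything from the paper.

```python
import math

def net(x):
    # Hypothetical "trained" two-input network: one tanh hidden unit,
    # weights chosen purely for illustration.
    h = math.tanh(0.8 * x[0] - 0.3 * x[1])
    return 1.2 * h

def sensitivities(f, x, eps=1e-5):
    # Central finite differences: approximate d f / d x_i at the point x.
    out = []
    for i in range(len(x)):
        xp = list(x); xp[i] += eps
        xm = list(x); xm[i] -= eps
        out.append((f(xp) - f(xm)) / (2 * eps))
    return out

s = sensitivities(net, [0.5, 0.2])
# The larger |s[i]|, the more the output depends on input i at this point.
```

Note that this only yields local, per-point importance scores, which is one reason such methods fall short of a domain-level description of what the network computes.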
Theoretical Interpretations and Applications of Radial Basis Function Networks
Medical applications have usually treated Radial Basis Function Networks simply as Artificial Neural Networks. However, RBFNs are knowledge-based networks that can be interpreted in several ways: as Artificial Neural Networks, Regularization Networks, Support Vector Machines, Wavelet Networks, Fuzzy Controllers, Kernel Estimators, or Instance-Based Learners. A survey of these interpretations and of their corresponding learning algorithms is provided, along with a brief survey of dynamic learning algorithms. The interpretations of RBFNs can suggest applications that are particularly interesting in medical domains.
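A minimal sketch of an RBFN under the kernel-estimator reading mentioned above: Gaussian basis functions placed at fixed centers, with a linear readout fit by least squares. The centers, width, and target function here are illustrative choices, not anything prescribed by the survey.

```python
import numpy as np

def rbf_design(X, centers, width):
    # Phi[n, j] = exp(-||x_n - c_j||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

X = np.linspace(-3, 3, 50)[:, None]        # 1-D inputs
y = np.sin(X[:, 0])                        # target function to approximate
centers = np.linspace(-3, 3, 10)[:, None]  # 10 fixed Gaussian centers
Phi = rbf_design(X, centers, width=0.8)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # train the linear readout
pred = Phi @ w
```

The same design matrix admits the other readings the survey lists: fixing the centers at training points gives an instance-based learner, while penalizing ||w|| turns the fit into a regularization network.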
Boolean Dynamics with Random Couplings
This paper reviews a class of generic dissipative dynamical systems called N-K models. In these models, the dynamics of N elements, defined as Boolean variables, develop step by step, clocked by a discrete time variable. Each of the N Boolean elements at a given time is given a value which depends upon K elements in the previous time step.

We review the work of many authors on the behavior of the models, looking particularly at the structure and lengths of their cycles, the sizes of their basins of attraction, and the flow of information through the systems. In the limit of infinite N, there is a phase transition between a chaotic and an ordered phase, with a critical phase in between.

We argue that the behavior of this system depends significantly on the topology of the network connections. If the elements are placed upon a lattice with dimension d, the system shows correlations related to the standard percolation or directed percolation phase transition on such a lattice. On the other hand, a very different behavior is seen in the Kauffman net, in which all spins are equally likely to be coupled to a given spin. In this situation, coupling loops are mostly suppressed, and the behavior of the system is much more like that of a mean field theory.

We also describe possible applications of the models to, for example, genetic networks, cell differentiation, evolution, democracy in social systems, and neural networks.

Comment: 69 pages, 16 figures. Submitted to Springer Applied Mathematical Sciences Series
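The dynamics described above can be simulated directly. The sketch below builds a Kauffman-style N-K network, assuming the standard setup: each of the N Boolean elements is driven by K randomly chosen inputs through its own random truth table, updated synchronously, and since the state space is finite the trajectory must eventually enter a cycle. The parameter values are illustrative.

```python
import random

def make_nk(n, k, rng):
    # For each element: K random input indices and a random Boolean
    # function given as a truth table with 2^K entries.
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    # Synchronous update: every element reads its K inputs from the
    # previous time step and looks up its truth table.
    new = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]
        new.append(tables[i][idx])
    return tuple(new)

def cycle_length(n=12, k=2, seed=1):
    rng = random.Random(seed)
    inputs, tables = make_nk(n, k, rng)
    state = tuple(rng.randint(0, 1) for _ in range(n))
    seen = {}
    t = 0
    while state not in seen:          # finite state space => must cycle
        seen[state] = t
        state = step(state, inputs, tables)
        t += 1
    return t - seen[state]            # length of the attractor cycle

L = cycle_length()
```

Repeating this over many random realizations is how the cycle-length and basin-size statistics reviewed in the paper are measured numerically; K = 2 sits at the critical value for the Kauffman net.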
A characterization of the Edge of Criticality in Binary Echo State Networks
Echo State Networks (ESNs) are simplified recurrent neural network models composed of a reservoir and a linear, trainable readout layer. The reservoir is tunable by some hyper-parameters that control the network behaviour. ESNs are known to be effective in solving tasks when configured in a region of (hyper-)parameter space called the Edge of Criticality (EoC), where the system is maximally sensitive to perturbations and its behaviour is hence most strongly affected by them. In this paper, we propose binary ESNs, which are architecturally equivalent to standard ESNs but use binary activation functions and binary recurrent weights. For these networks, we derive a closed-form expression for the EoC in the autonomous case and perform simulations in order to assess their behavior in the case of noisy neurons and in the presence of a signal. We propose a theoretical explanation for the fact that the variance of the input plays a major role in characterizing the EoC.
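A minimal sketch of a binary reservoir update in the spirit of the binary ESNs described above: sign activations and ±1 recurrent weights driven by an input signal. The reservoir size, sparsity, input-weight scheme, and tie-breaking convention are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100                                    # reservoir size (illustrative)
W = rng.choice([-1.0, 1.0], size=(N, N))   # binary recurrent weights
W *= rng.random((N, N)) < 0.1              # sparse connectivity (10%)
w_in = rng.choice([-1.0, 1.0], size=N)     # binary input weights

def update(x, u):
    # Binary activation: sign of the pre-activation (ties broken to +1).
    pre = W @ x + w_in * u
    return np.where(pre >= 0, 1.0, -1.0)

x = np.ones(N)                             # initial reservoir state
for u in [0.5, -1.0, 0.25]:                # drive with a short input signal
    x = update(x, u)
```

In an EoC experiment one would run two copies of this reservoir from states differing in a single neuron and track the Hamming distance between their trajectories; the input variance enters through the `w_in * u` term, consistent with the role the abstract attributes to it.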
Cognitive networks: brains, internet, and civilizations
In this short essay, we discuss some basic features of cognitive activity at several different space-time scales: from neural networks in the brain to civilizations. One motivation for such a comparative study is its heuristic value. Attempts to better understand the functioning of the "wetware" involved in the cognitive activities of the central nervous system by comparing it with a computing device have a long tradition. We suggest that a comparison with the Internet might be more adequate. We briefly touch upon such subjects as encoding, compression, and the Saussurean trichotomy langue/langage/parole in various environments.

Comment: 16 pages