Spiking Neural P Systems: Stronger Normal Forms
Spiking neural P systems are computing devices recently introduced as a
bridge between spiking neural nets and membrane computing. Thanks to rapid research
in this field, there already exists a series of both theoretical and application studies.
In this paper we focus on normal forms of these systems that preserve their computational
power. We study combinations of existing normal forms, showing that certain
groups of them can be combined without loss of computational power, thus partially
answering open problems stated in the literature. We also extend some of the already known normal
forms for spiking neural P systems by considering determinism and a strong acceptance
condition. Normal forms can speed up development and simplify future proofs in this
area.
Discontinuities in recurrent neural networks
This paper studies the computational power of various discontinuous
real computational models that are based on the classical analog
recurrent neural network (ARNN). This ARNN consists of a finite number
of neurons; each neuron computes a polynomial net-function and a
sigmoid-like continuous activation function.
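As a rough sketch of the model described above, one update step of an ARNN can be written as follows. This assumes, for illustration only, the simplest degree-one (affine) polynomial net-function and the logistic sigmoid as the "sigmoid-like" activation; the names `arnn_step`, `W`, `V`, and `b` are hypothetical, not from the paper:

```python
import math

def sigmoid(z):
    """A sigmoid-like continuous activation function (logistic)."""
    return 1.0 / (1.0 + math.exp(-z))

def arnn_step(x, u, W, V, b):
    """One synchronous update of an analog recurrent neural network.

    Each neuron i applies a net-function (here affine, the degree-1
    polynomial case) to the current state x and external input u,
    then passes the result through the continuous activation."""
    return [sigmoid(sum(W[i][j] * xj for j, xj in enumerate(x))
                    + sum(V[i][k] * uk for k, uk in enumerate(u))
                    + b[i])
            for i in range(len(x))]

# Two neurons, one external input line (toy values).
x = [0.0, 0.0]
u = [1.0]
W = [[0.5, -1.0], [1.0, 0.5]]  # recurrent weights
V = [[2.0], [-2.0]]            # input weights
b = [0.0, 0.0]                 # biases
print(arnn_step(x, u, W, V, b))  # roughly [0.88, 0.12]
```

Higher-degree polynomial net-functions would replace the weighted sums with polynomial terms in the state variables; the discontinuous models studied in the paper modify the activation instead.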
The computational power and complexity of discrete feedforward neural networks
The number of binary functions that can be defined on a set of L vectors in R^N equals 2^L. For L > N the total number of threshold functions in N-dimensional space grows as 2^(N(N-1)), i.e. with an exponent polynomial in N, while the total number of Boolean functions definable on N binary inputs grows as 2^(2^N), with an exponent exponential in N; as N increases, the fraction of threshold functions relative to the total number of Boolean functions therefore goes to zero. This means that for the realization of the majority of tasks a neural network must possess at least two or three layers. Examples of such small computational problems are arithmetic functions like multiplication, division, addition and exponentiation, as well as comparison and sorting. This article analyses some aspects of two-layer and deeper threshold and Boolean circuits (feedforward neural nets) connected with their computational power and their node, edge and weight complexity.