Computational Capabilities of Analog and Evolving Neural Networks over Infinite Input Streams
Analog and evolving recurrent neural networks are super-Turing powerful. Here, we consider analog and evolving neural nets over infinite input streams and characterize the topological complexity of their ω-languages as a function of the specific analog or evolving weights they employ. As a consequence, two infinite hierarchies of classes of analog and evolving neural networks can be derived, based on the complexity of their underlying weights. These results constitute an optimal refinement of the super-Turing expressive power of analog and evolving neural networks, and they show that analog and evolving neural nets represent natural models for oracle-based infinite computation.
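To make the setting concrete, here is a minimal sketch (not taken from the paper) of a recurrent net with real-valued "analog" weights consuming a prefix of an infinite binary input stream. The weight matrices `W` and `U`, the saturated-linear activation, and the alternating input stream are all illustrative assumptions; the paper's formal model differs in detail.

```python
import numpy as np

def saturated_linear(x):
    """Saturated-linear activation commonly used in analog-net models."""
    return np.clip(x, 0.0, 1.0)

def run_net(W, U, stream, steps):
    """Iterate the state update x_{t+1} = sigma(W x_t + U u_t)
    over the first `steps` symbols of the input stream."""
    x = np.zeros(W.shape[0])
    for _ in range(steps):
        u = np.array([next(stream)], dtype=float)  # one input bit per step
        x = saturated_linear(W @ x + U @ u)
    return x

def bits():
    """An infinite input stream: 0, 1, 0, 1, ..."""
    t = 0
    while True:
        yield t % 2
        t += 1

rng = np.random.default_rng(0)
W = rng.uniform(-1, 1, (3, 3))   # analog (real-valued) recurrent weights
U = rng.uniform(-1, 1, (3, 1))   # input weights
state = run_net(W, U, bits(), steps=50)
print(state.shape)  # state of the 3 neurons after 50 input symbols
```

In the evolving-net variant, `W` would additionally change over time; the paper's hierarchies classify such nets by the complexity of these weight sequences.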
Superpositional Quantum Network Topologies
We introduce superposition-based quantum networks composed of (i) the classical perceptron model of multilayered, feedforward neural networks and (ii) the algebraic model of evolving reticular quantum structures described in quantum gravity. The main feature of this model is the move from particular neural topologies to a quantum metastructure that embodies many differing topological patterns. Using quantum parallelism, training can be performed on superpositions of different network topologies; as a result, not only the classical transition functions but also the topology itself becomes a subject of training. Crucially, particular neural networks with different topologies are quantum states of this metastructure. We consider high-dimensional dissipative quantum structures as candidates for an implementation of the model.
Comment: 10 pages, LaTeX2