A Broad Class of Discrete-Time Hypercomplex-Valued Hopfield Neural Networks
In this paper, we address the stability of a broad class of discrete-time
hypercomplex-valued Hopfield-type neural networks. To ensure that the neural
networks in this class always settle down at a stationary state, we
introduce novel hypercomplex number systems referred to as real-part
associative hypercomplex number systems. Real-part associative hypercomplex
number systems generalize the well-known Cayley-Dickson algebras and real
Clifford algebras and include the systems of real numbers, complex numbers,
dual numbers, hyperbolic numbers, quaternions, tessarines, and octonions as
particular instances. Apart from the novel hypercomplex number systems, we
introduce a family of hypercomplex-valued activation functions called
B-projection functions. Broadly speaking, a B-projection function projects
the activation potential onto the set of all possible states of a
hypercomplex-valued neuron. Using the theory
presented in this paper, we confirm the stability analysis of several
discrete-time hypercomplex-valued Hopfield-type neural networks from the
literature. Moreover, we introduce and provide the stability analysis of a
general class of Hopfield-type neural networks on Cayley-Dickson algebras.
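The projection idea described in the abstract can be illustrated in its simplest special case: a complex-valued multistate Hopfield network whose neuron states are the K-th roots of unity, with an activation that projects the activation potential onto the nearest allowed state. This is a minimal sketch under our own assumptions (the paper covers far more general hypercomplex number systems, and all names here are ours), not the paper's construction:

```python
import numpy as np

# Allowed neuron states: the K-th roots of unity (a multistate
# complex-valued neuron, one concrete instance of the general setting).
K = 8
STATES = np.exp(2j * np.pi * np.arange(K) / K)

def projection(z, states=STATES):
    # Project an activation potential z onto the nearest allowed state.
    return states[np.argmin(np.abs(states - z))]

def hopfield_step(W, x):
    # One synchronous update of the projection-activated Hopfield network.
    u = W @ x  # activation potentials
    return np.array([projection(ui) for ui in u])

# Hebbian storage of a single pattern p (entries drawn from STATES).
rng = np.random.default_rng(0)
n = 6
p = STATES[rng.integers(0, K, size=n)]
W = np.outer(p, p.conj()) / n

# Since W @ p = p and each entry of p is already an allowed state,
# p is a fixed point of hopfield_step, i.e. a stationary state.
```

With a single stored pattern, one update also repairs a corrupted entry, which is the associative-memory behavior these stability results guarantee.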
Neural Networks Architecture Evaluation in a Quantum Computer
In this work, we propose a quantum algorithm, named Quantum Neural Network
Architecture Evaluation (QNNAE), for evaluating neural network architectures. The
proposed algorithm is based on a quantum associative memory and the learning
algorithm for artificial neural networks. Unlike conventional algorithms for
evaluating neural network architectures, QNNAE does not depend on
initialization of weights. The proposed algorithm has a binary output,
returning 0 with probability proportional to the performance of the network,
and its computational cost is equal to that of training the neural network.
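The abstract gives no circuit details, so the quantum algorithm itself cannot be reproduced here; the following is only a classical simulation of the claimed output statistics, a binary outcome that is 0 with probability proportional to the network's performance (assumed here to be normalized to [0, 1]; all names are ours):

```python
import random

def qnnae_outcome(performance, rng=random):
    # Classical stand-in for the binary measurement: outcome 0 with
    # probability proportional to `performance` (assumed in [0, 1]).
    return 0 if rng.random() < performance else 1

# Repeated "runs" recover the performance as the frequency of outcome 0.
random.seed(42)
trials = [qnnae_outcome(0.8) for _ in range(10_000)]
zero_fraction = trials.count(0) / len(trials)  # close to 0.8
```

Because the output is a Bernoulli sample, estimating a network's quality to a given precision requires repeating the evaluation and averaging the outcomes.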