
    A Broad Class of Discrete-Time Hypercomplex-Valued Hopfield Neural Networks

    In this paper, we address the stability of a broad class of discrete-time hypercomplex-valued Hopfield-type neural networks. To ensure that the neural networks in this class always settle down at a stationary state, we introduce novel hypercomplex number systems referred to as real-part associative hypercomplex number systems. Real-part associative hypercomplex number systems generalize the well-known Cayley-Dickson algebras and real Clifford algebras, and include the real numbers, complex numbers, dual numbers, hyperbolic numbers, quaternions, tessarines, and octonions as particular instances. Apart from the novel hypercomplex number systems, we introduce a family of hypercomplex-valued activation functions called $\mathcal{B}$-projection functions. Broadly speaking, a $\mathcal{B}$-projection function projects the activation potential onto the set of all possible states of a hypercomplex-valued neuron. Using the theory presented in this paper, we confirm the stability analysis of several discrete-time hypercomplex-valued Hopfield-type neural networks from the literature. Moreover, we introduce and provide the stability analysis of a general class of Hopfield-type neural networks on Cayley-Dickson algebras.
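
    The paper's exact construction is not given in this abstract, but the idea of a projection-style activation is easy to illustrate. Below is a minimal sketch, assuming a complex-valued Hopfield network whose neuron states are restricted to the K-th roots of unity; the names (`b_projection`, `recall`, `STATES`) and the choice of state set are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

# Illustrative state set: the K-th roots of unity on the complex unit circle.
K = 8
STATES = np.exp(2j * np.pi * np.arange(K) / K)

def b_projection(potential):
    """Project each activation potential onto the nearest allowed state
    (in the spirit of a B-projection function; illustrative only)."""
    # Re(conj(s) * p) is largest for the state s closest in phase to p.
    idx = np.argmax(np.real(np.conj(STATES)[None, :] * potential[:, None]), axis=1)
    return STATES[idx]

def recall(W, x, n_iter=50):
    """Synchronous Hopfield-type updates until a fixed point (or n_iter)."""
    for _ in range(n_iter):
        x_new = b_projection(W @ x)
        if np.allclose(x_new, x):
            break
        x = x_new
    return x

# Hebbian-style storage of one complex pattern xi (illustrative only).
rng = np.random.default_rng(0)
xi = STATES[rng.integers(0, K, size=16)]
W = np.outer(xi, np.conj(xi)) / 16
np.fill_diagonal(W, 0)
print(np.allclose(recall(W, xi), xi))  # the stored pattern is a fixed point
```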

    Machine Learning and Neural Networks for Real-Time Scheduling

    This paper aims to serve as an efficient survey of the processes, problems, and methodologies surrounding the use of neural networks, specifically Hopfield-type networks, to solve hard real-time scheduling problems. Our primary goal is to demystify the field of neural network research, to describe how real-time scheduling problems may be approached with neural networks, and to give an introduction of sorts to this niche topic in a niche field. This survey is derived from four main papers, namely: “A Neurodynamic Approach for Real-Time Scheduling via Maximizing Piecewise Linear Utility”, “Scheduling Multiprocessor Job with Resource and Timing Constraints Using Neural Networks”, “Solving Real Time Scheduling Problems with Hopfield-type Neural Networks”, and “Neural Networks for Multiprocessor Real-Time Scheduling”.

    New Insights on Learning Rules for Hopfield Networks: Memory and Objective Function Minimisation

    Hopfield neural networks are a possible basis for modelling associative memory in living organisms. After summarising previous studies in the field, we take a new look at learning rules, exhibiting them as descent-type algorithms for various cost functions. We also propose several new cost functions suitable for learning. We discuss the role of biases (the external inputs) in the learning process in Hopfield networks. Furthermore, we apply Newton's method for learning memories, and experimentally compare the performance of various learning rules. Finally, to add to the debate on whether allowing connections of a neuron to itself enhances memory capacity, we numerically investigate the effects of self-coupling. Keywords: Hopfield networks, associative memory, content-addressable memory, learning rules, gradient descent, attractor networks. Comment: 8 pages, IEEE Xplore, 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow.
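
    The "learning rules as descent on a cost function" viewpoint can be made concrete with a standard example. The sketch below implements a perceptron-style rule, which is gradient descent on a simple margin cost; this is one well-known rule of the kind the abstract discusses, not necessarily one of the paper's new cost functions, and the function name `descent_learning` is our own.

```python
import numpy as np

def descent_learning(patterns, lr=0.1, epochs=100):
    """Train Hopfield weights and biases so each pattern is a stable state,
    by descending the cost sum_i max(0, -xi_i * h_i) per pattern."""
    P, N = patterns.shape
    W = np.zeros((N, N))
    b = np.zeros(N)                     # biases = external inputs
    for _ in range(epochs):
        for xi in patterns:
            h = W @ xi + b              # local fields
            wrong = (xi * h) <= 0       # neurons whose sign would flip
            # descent step: only mis-aligned neurons update their rows
            W += lr * np.outer(wrong * xi, xi)
            b += lr * wrong * xi
        np.fill_diagonal(W, 0)          # optionally forbid self-coupling
    return W, b

rng = np.random.default_rng(1)
patterns = rng.choice([-1.0, 1.0], size=(5, 32))
W, b = descent_learning(patterns)
# each pattern should now be a fixed point of sign(W x + b)
print(all(np.array_equal(np.sign(W @ p + b), p) for p in patterns))
```

    Zeroing the diagonal after each epoch ties into the abstract's self-coupling question: keeping it enforces no self-connections, while dropping that line allows them.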

    Parallel Hopfield Networks

    We introduce a novel type of neural network, termed the parallel Hopfield network, that can simultaneously effect the dynamics of many different, independent Hopfield networks in parallel in the same piece of neural hardware. Numerically, we find that under certain conditions each Hopfield subnetwork has a finite memory capacity approaching that of the equivalent isolated attractor network, while a simple signal-to-noise analysis sheds qualitative, and some quantitative, insight into the workings (and failures) of the system.
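
    The abstract does not spell out the parallel construction itself, but its capacity comparison is against "the equivalent isolated attractor network". As a hedged baseline sketch of that reference point, the snippet below measures the one-step recall rate of a single standard Hopfield network as the load alpha = P/N grows; `recall_rate` and all parameter values are our own illustrative choices.

```python
import numpy as np

def recall_rate(N, P, trials=20, seed=0):
    """Fraction of trials in which one synchronous update leaves a
    stored pattern unchanged, for P Hebbian patterns on N neurons."""
    rng = np.random.default_rng(seed)
    ok = 0
    for _ in range(trials):
        xi = rng.choice([-1.0, 1.0], size=(P, N))
        W = (xi.T @ xi) / N            # Hebbian couplings
        np.fill_diagonal(W, 0)
        ok += np.array_equal(np.sign(W @ xi[0]), xi[0])
    return ok / trials

# Recall degrades as the load approaches the classical capacity limit.
for P in (5, 10, 20, 40):
    print(f"alpha={P/200:.3f}  recall_rate={recall_rate(200, P):.2f}")
```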

    Asymptotic stability of impulsive high-order Hopfield-type neural networks

    In this paper, we discuss impulsive high-order Hopfield-type neural networks. Investigating their global asymptotic stability using the Lyapunov function method, we give sufficient conditions that guarantee the global asymptotic stability of such networks. These criteria can be used to analyse the dynamics of biological neural systems or to design globally stable artificial neural networks. Two numerical examples are given to illustrate the effectiveness of the proposed method.
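
    The abstract states that sufficient stability conditions exist but does not list them, so the following is purely illustrative: a small continuous-time Hopfield-type network, Euler-integrated, subjected to periodic impulsive state jumps. With weak (contractive) coupling, the trajectory still settles toward an equilibrium; all constants here are arbitrary assumptions.

```python
import numpy as np

N, dt, steps = 4, 0.01, 5000
rng = np.random.default_rng(2)
W = 0.2 * rng.standard_normal((N, N))   # weak coupling: contractive regime
I = rng.standard_normal(N)              # constant external inputs
x = rng.standard_normal(N)

for t in range(1, steps + 1):
    x = x + dt * (-x + W @ np.tanh(x) + I)   # Hopfield-type flow
    if t % 1000 == 0 and t < steps:          # impulsive jump x -> 0.5*x
        x = 0.5 * x

drift = np.linalg.norm(-x + W @ np.tanh(x) + I)
print(f"residual |dx/dt| at the end: {drift:.2e}")  # ~0 at equilibrium
```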

    Capacity for patterns and sequences in Kanerva's SDM as compared to other associative memory models

    The information capacity of Kanerva's Sparse Distributed Memory (SDM) and of Hopfield-type neural networks is investigated. Under the approximations used, it is shown that the total information stored in these systems is proportional to the number of connections in the network. The proportionality constant is the same for the SDM and Hopfield-type models, independent of the particular model or its order. The approximations are checked numerically. The same analysis can be used to show that the SDM can store sequences of spatiotemporal patterns, and that the addition of time-delayed connections allows the retrieval of context-dependent temporal patterns. A minor modification of the SDM can be used to store correlated patterns.

    Phase Diagram and Storage Capacity of Sequence Processing Neural Networks

    We solve the dynamics of Hopfield-type neural networks which store sequences of patterns, close to saturation. The asymmetry of the interaction matrix in such models leads to a violation of detailed balance, ruling out an equilibrium statistical mechanical analysis. Using generating functional methods, we derive exact closed equations for dynamical order parameters, viz. the sequence overlap and correlation and response functions, in the thermodynamic limit. We calculate the time-translation-invariant solutions of these equations, describing stationary limit cycles, which leads to a phase diagram. The effective retarded self-interaction usually appearing in symmetric models is here found to vanish, which causes a significantly enlarged storage capacity of $\alpha_c \sim 0.269$, compared to $\alpha_c \sim 0.139$ for Hopfield networks storing static patterns. Our results are tested against extensive computer simulations and excellent agreement is found. Comment: 17 pages LaTeX2e, 2 PostScript figures.
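
    A minimal sketch of the standard sequence-storage construction underlying such models (well known in this literature, though the abstract does not restate it): the asymmetric Hebbian matrix $W = \frac{1}{N}\sum_\mu \xi^{\mu+1} (\xi^\mu)^T$ makes synchronous (parallel) updates step through the stored sequence. The parameters below are illustrative and keep the network far below saturation.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 200, 5
xi = rng.choice([-1.0, 1.0], size=(P, N))

# Asymmetric couplings: pattern mu is mapped to pattern mu+1 (cyclically).
W = sum(np.outer(xi[(mu + 1) % P], xi[mu]) for mu in range(P)) / N

x = xi[0]
for step in range(1, 2 * P + 1):
    x = np.sign(W @ x)                 # synchronous (parallel) update
    overlap = x @ xi[step % P] / N     # sequence overlap, should stay near 1
    print(f"step {step}: overlap with xi^{step % P} = {overlap:.2f}")
```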