A Broad Class of Discrete-Time Hypercomplex-Valued Hopfield Neural Networks
In this paper, we address the stability of a broad class of discrete-time
hypercomplex-valued Hopfield-type neural networks. To ensure that the neural
networks belonging to this class always settle down at a stationary state, we
introduce novel hypercomplex number systems referred to as real-part
associative hypercomplex number systems. Real-part associative hypercomplex
number systems generalize the well-known Cayley-Dickson algebras and real
Clifford algebras and include the systems of real numbers, complex numbers,
dual numbers, hyperbolic numbers, quaternions, tessarines, and octonions as
particular instances. Apart from the novel hypercomplex number systems, we
introduce a family of hypercomplex-valued activation functions called
$\mathcal{B}$-projection functions. Broadly speaking, a
$\mathcal{B}$-projection function projects the activation potential onto the
set of all possible states of a hypercomplex-valued neuron. Using the theory
presented in this paper, we confirm the stability analysis of several
discrete-time hypercomplex-valued Hopfield-type neural networks from the
literature. Moreover, we introduce and provide the stability analysis of a
general class of Hopfield-type neural networks on Cayley-Dickson algebras.
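To make the projection idea concrete, here is a minimal Python sketch of one classical special case: a complex-valued multistate Hopfield network whose allowed states are the K-th roots of unity, with an activation that projects the potential onto the nearest state. It does not reproduce the paper's general real-part associative construction, and all parameters are invented.

```python
import numpy as np

def b_projection(z, states):
    # Pick the allowed state s (|s| = 1) closest to the activation
    # potential z, i.e. the one maximizing Re(conj(s) * z).
    return states[np.argmax(np.real(np.conj(states) * z))]

# Illustrative special case: complex-valued multistate neurons whose
# states are the K-th roots of unity (NOT the paper's general real-part
# associative construction; all parameters are invented).
K, n = 8, 3
states = np.exp(2j * np.pi * np.arange(K) / K)
rng = np.random.default_rng(0)
W = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
W = (W + W.conj().T) / 2          # Hermitian synaptic weight matrix
np.fill_diagonal(W, 0)
x = states[rng.integers(0, K, size=n)]
for _ in range(50):               # asynchronous neuron updates
    i = rng.integers(0, n)
    x[i] = b_projection(W[i] @ x, states)
print(x)
```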
Stochastic stability of uncertain Hopfield neural networks with discrete and distributed delays
This Letter is concerned with the global asymptotic stability analysis problem for a class of uncertain stochastic Hopfield neural networks with discrete and distributed time-delays. By utilizing a Lyapunov-Krasovskii functional, applying the well-known S-procedure, and conducting stochastic analysis, we show that the addressed neural networks are robustly, globally, asymptotically stable if a convex optimization problem is feasible. The stability criteria are then derived in terms of linear matrix inequalities (LMIs), which can be solved effectively by standard numerical packages. The main results are also extended to the multiple time-delay case. Two numerical examples are given to demonstrate the usefulness of the proposed global stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, and the Alexander von Humboldt Foundation of Germany.
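The LMI route can be illustrated with the standard delay-independent Lyapunov-Krasovskii criterion for a deterministic linear system with one discrete delay; this textbook condition is used here only as a hedged stand-in for the Letter's more elaborate stochastic LMIs, with invented matrices and cvxpy assumed as the SDP front end.

```python
import numpy as np
import cvxpy as cp

# Hypothetical linear system with one discrete delay (matrices invented):
#   dx/dt = A x(t) + Ad x(t - tau)
A  = np.array([[-4.0, 0.5], [0.3, -5.0]])
Ad = np.array([[ 0.8, 0.2], [0.1,  0.9]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# The Lyapunov-Krasovskii candidate V = x'Px + integral of x'Qx over the
# delay window yields the classical delay-independent LMI below.
M = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
             [Ad.T @ P,            -Q]])
prob = cp.Problem(cp.Minimize(0),
                  [P >> eps * np.eye(n), Q >> eps * np.eye(n),
                   M << -eps * np.eye(2 * n)])
prob.solve(solver=cp.SCS)
print("LMI feasible (stable for any constant delay):",
      prob.status == cp.OPTIMAL)
```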
Robust stability for stochastic Hopfield neural networks with time delays
In this paper, the asymptotic stability analysis problem is considered for a class of uncertain stochastic neural networks with time delays and parameter uncertainties. The delays are time-invariant, and the uncertainties are norm-bounded and enter all the network parameters. The aim of this paper is to establish easily verifiable conditions under which the delayed neural network is robustly asymptotically stable in the mean square for all admissible parameter uncertainties. By employing a Lyapunov-Krasovskii functional and conducting stochastic analysis, a linear matrix inequality (LMI) approach is developed to derive the stability criteria. The proposed criteria can be checked readily by using standard numerical packages, and no tuning of parameters is required. Examples are provided to demonstrate the effectiveness and applicability of the proposed criteria. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, and the Alexander von Humboldt Foundation of Germany.
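Mean-square stability of this kind of model can also be probed empirically. The sketch below simulates an invented two-neuron stochastic delayed network by the Euler-Maruyama method and averages ||x||^2 over sample paths; it illustrates the notion of mean-square stability, not the paper's analytic criteria.

```python
import numpy as np

# Invented two-neuron stochastic delayed network:
#   dx = [-C x(t) + W tanh(x(t - tau))] dt + sigma x(t) dB(t)
rng = np.random.default_rng(1)
n, tau, dt, T, paths = 2, 0.5, 1e-3, 10.0, 200
C = np.diag([3.0, 4.0])
W = np.array([[0.5, -0.4], [0.3, 0.6]])
sigma = 0.2
steps, lag = int(T / dt), int(tau / dt)

x = np.ones((paths, n))                    # constant initial segment
hist = np.ones((lag, paths, n))            # ring buffer of past states
msq = []
for k in range(steps):
    xd = hist[k % lag].copy()              # state from tau seconds ago
    hist[k % lag] = x                      # store current state
    dB = rng.normal(scale=np.sqrt(dt), size=(paths, n))
    x = x + (-x @ C.T + np.tanh(xd) @ W.T) * dt + sigma * x * dB
    msq.append(np.mean(np.sum(x ** 2, axis=1)))
print("E||x||^2: t=dt ->", msq[0], "  t=T ->", msq[-1])
```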
Numerical Implementation of Gradient Algorithms
A numerical method for the computational implementation of gradient dynamical systems is presented. The method builds on geometric integration, a family of numerical methods that aim to preserve the dynamical properties of the original ordinary differential equation under discretization. In particular, the proposed method belongs to the class of discrete gradient methods, which substitute the gradient of the continuous equation with a discrete gradient, leading to a map that possesses the same Lyapunov function as the dynamical system, thus preserving the qualitative properties regardless of the step size. In this work, we apply a discrete gradient method to the implementation of Hopfield neural networks. Contrary to most geometric integration methods, the proposed algorithm can be rewritten in explicit form, which considerably improves its performance and stability. Simulation results show that the preservation of the Lyapunov function leads to improved performance compared to the conventional discretization. Spanish Government project no. TIN2010-16556; Junta de Andalucía project no. P08-TIC-04026; Agencia Española de Cooperación Internacional para el Desarrollo project no. A2/038418/1.
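The discrete gradient idea is easy to demonstrate on a quadratic energy, where the midpoint discrete gradient turns the implicit update into an explicit linear solve and the Lyapunov function decreases for any step size. The sketch below is a generic illustration under that assumption, not the paper's Hopfield-specific scheme.

```python
import numpy as np

# Gradient flow dx/dt = -grad V(x) for the quadratic energy V = x'Qx/2.
# The midpoint discrete gradient  gbar(x, y) = Q (x + y) / 2  satisfies
# gbar(x, y).(y - x) = V(y) - V(x) exactly, so the update
#   y = x - h * gbar(x, y)
# dissipates V for ANY step size h, and rearranges into an explicit solve.
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
V = lambda x: 0.5 * x @ Q @ x

def discrete_gradient_step(x, h):
    n = len(x)
    # (I + h Q / 2) y = (I - h Q / 2) x
    return np.linalg.solve(np.eye(n) + 0.5 * h * Q,
                           (np.eye(n) - 0.5 * h * Q) @ x)

x, h = np.array([3.0, -2.0]), 10.0        # deliberately huge step size
for _ in range(5):
    x_new = discrete_gradient_step(x, h)
    print(f"V: {V(x):9.5f} -> {V(x_new):9.5f}")   # monotone decrease
    x = x_new
```

For comparison, a conventional explicit Euler step with the same h = 10 would diverge immediately; the discrete gradient step dissipates V no matter how large the step.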
An analog feedback associative memory
A method for the storage of analog vectors, i.e., vectors whose components are real-valued, is developed for the Hopfield continuous-time network. An important requirement is that each memory vector has to be an asymptotically stable (i.e., attractive) equilibrium of the network. Some of the limitations imposed by the continuous Hopfield model on the set of vectors that can be stored are pointed out. These limitations can be relieved by choosing a network containing hidden units as well as visible ones. An architecture consisting of several hidden layers and a visible layer, connected in a circular fashion, is considered. It is proved that the two-layer case is guaranteed to store any number of given analog vectors provided their number does not exceed one plus the number of neurons in the hidden layer. A learning algorithm that correctly adjusts the locations of the equilibria and guarantees their asymptotic stability is developed. Simulation results confirm the effectiveness of the approach.
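A hedged sketch of the storage requirement, without the paper's hidden layers or learning algorithm: choose the bias so that a desired analog vector is an equilibrium of the continuous Hopfield dynamics, and keep the weight matrix contractive so that the equilibrium is asymptotically stable. All parameters below are invented.

```python
import numpy as np

# Make a chosen analog vector v an equilibrium of the continuous Hopfield
# dynamics  dx/dt = -x + W tanh(x) + b  by solving for the bias b.
# With ||W|| < 1 the equilibrium is unique and asymptotically stable, so
# the network recalls v from any initial state. (Parameters invented.)
rng = np.random.default_rng(2)
n = 4
v = np.array([0.7, -0.3, 0.1, 0.9])              # analog memory to store
W = 0.25 * rng.normal(size=(n, n)) / np.sqrt(n)  # contractive weights
b = v - W @ np.tanh(v)                           # forces the equilibrium

x, dt = rng.normal(size=n), 0.01
for _ in range(5000):
    x += dt * (-x + W @ np.tanh(x) + b)          # forward Euler integration
print("recalled:", np.round(x, 3))
print("stored:  ", v)
```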
Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis
This Letter is concerned with the analysis problem of exponential stability for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov-Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be checked easily using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
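For intuition, a small simulation (invented parameters, not the Letter's LMI test) of a discrete-time network with a bounded time-varying delay, where ||A|| + ||W|| < 1 forces exponential convergence to the origin:

```python
import numpy as np

# Invented discrete-time network with a bounded time-varying delay:
#   x(k+1) = A x(k) + W tanh(x(k - tau(k))),  tau(k) in {1, 2, 3}.
# Here ||A|| + ||W|| < 1, which forces exponential decay of ||x(k)||.
rng = np.random.default_rng(3)
A = np.diag([0.5, 0.4])
W = np.array([[0.2, -0.1], [0.1, 0.2]])
tau_max = 3

xs = [rng.normal(size=2) for _ in range(tau_max + 1)]  # initial segment
for k in range(40):
    tau = rng.integers(1, tau_max + 1)        # time-varying delay
    xs.append(A @ xs[-1] + W @ np.tanh(xs[-1 - tau]))

print(["%.4f" % np.linalg.norm(x) for x in xs[::5]])   # geometric decay
```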
A delay-dependent LMI approach to dynamics analysis of discrete-time recurrent neural networks with time-varying delays
In this Letter, the analysis problem for the existence and stability of periodic solutions is investigated for a class of general discrete-time recurrent neural networks with time-varying delays. For the neural networks under study, a generalized activation function is considered, and the traditional assumptions on the boundedness, monotonicity, and differentiability of the activation functions are removed. By employing the latest free-weighting matrix method, an appropriate Lyapunov-Krasovskii functional is constructed and several sufficient conditions are established to ensure the existence, uniqueness, and global exponential stability of the periodic solution for the addressed neural network. The conditions depend on both the lower and upper bounds of the time-varying delays. Furthermore, the conditions are expressed in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two simulation examples are given to show the effectiveness and reduced conservatism of the proposed criteria. This work was supported in part by the National Natural Science Foundation of China under Grant 50608072, an International Joint Project sponsored by the Royal Society of the UK and the National Natural Science Foundation of China, and the Alexander von Humboldt Foundation of Germany.
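The periodic-solution claim can be visualized with a toy experiment rather than the free-weighting-matrix LMIs: drive a stable delayed DRNN with a periodic input and observe that trajectories from different initial states converge to the same periodic orbit. All parameters below are invented.

```python
import numpy as np

# Invented stable delayed DRNN driven by an omega-periodic input u(k):
# trajectories from different initial states converge to one periodic
# orbit, illustrating existence plus global exponential stability.
rng = np.random.default_rng(4)
A = np.diag([0.5, 0.4])
W = np.array([[0.2, -0.1], [0.1, 0.2]])
omega, tau = 6, 2                      # input period and fixed delay
u = lambda k: np.array([np.sin(2 * np.pi * k / omega), 0.3])

def run(x0, steps=600):
    xs = [x0] * (tau + 1)
    for k in range(steps):
        xs.append(A @ xs[-1] + W @ np.tanh(xs[-1 - tau]) + u(k))
    return xs

xa, xb = run(rng.normal(size=2)), run(rng.normal(size=2))
print("trajectory gap:   ", np.linalg.norm(xa[-1] - xb[-1]))
print("periodicity check:", np.linalg.norm(xa[-1] - xa[-1 - omega]))
```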