
    Recent Advances and Applications of Fractional-Order Neural Networks

    This paper focuses on the growth, development, and future of various forms of fractional-order neural networks. Advances in structure, learning algorithms, and methods are critically investigated and summarized, together with recent trends in the dynamics of these networks. The forms of fractional-order neural networks considered in this study are Hopfield, cellular, memristive, complex-valued, and quaternion-valued networks. Further, the applications of fractional-order neural networks in computational fields such as system identification, control, optimization, and stability analysis are critically analyzed and discussed.
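
    For context, a representative model in this family (a generic sketch, not taken from the paper itself; the coefficients c_i, a_{ij}, activations f_j, and inputs I_i are placeholders) is the Caputo fractional-order Hopfield-type network:

    \[
    {}^{C}\!D^{\alpha} x_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t)\bigr) + I_i, \qquad 0 < \alpha \le 1,
    \]

    where the Caputo derivative of order α ∈ (0, 1) is defined by

    \[
    {}^{C}\!D^{\alpha} x(t) = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{x'(s)}{(t-s)^{\alpha}}\,ds .
    \]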

    Stability analysis of discrete-time recurrent neural networks with stochastic delay

    This paper is concerned with the stability analysis of discrete-time recurrent neural networks (RNNs) whose time delays are random variables drawn from some probability distribution. By introducing the variation probability of the time delay, a common delayed discrete-time RNN system is transformed into one with stochastic parameters. Improved conditions for the mean-square stability of these systems are obtained by employing new Lyapunov functions, and novel techniques are used to achieve delay dependence. The merit of the proposed conditions lies in their reduced conservatism, which is made possible by considering not only the range of the time delays but also the variation probability distribution. A numerical example is provided to show the advantages of the proposed conditions. © 2009 IEEE.
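
    A common way to model such a probabilistic delay (a sketch in the spirit of the abstract, not the paper's exact system; the matrices A, B and the activation f are placeholders) is to split the delay range with a Bernoulli-distributed indicator:

    \[
    x(k+1) = A\,x(k) + B\,f\bigl(x(k-\tau(k))\bigr), \qquad
    \tau(k) = \delta(k)\,\tau_{1}(k) + \bigl(1-\delta(k)\bigr)\,\tau_{2}(k),
    \]

    where δ(k) ∈ {0, 1} is a Bernoulli variable with Pr{δ(k) = 1} = δ₀, τ₁(k) takes values in a small-delay interval and τ₂(k) in a large-delay interval; the resulting mean-square stability conditions then depend on δ₀ as well as on the delay bounds.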

    Fixed Points and Exponential Stability for Impulsive Time-Delays BAM Neural Networks via LMI Approach and Contraction Mapping Principle

    The fixed point technique is employed in the stability analysis of time-delay bidirectional associative memory (BAM) neural networks with impulses. By formulating a contraction mapping in a product space, a new LMI-based exponential stability criterion is derived. Fixed point methods have recently produced various good results that inspired this work, but those earlier criteria cannot be verified by a computer. In this paper, the LMI conditions of the obtained result can be checked with the MATLAB LMI toolbox, which meets the need for large-scale computation in real engineering. Moreover, a numerical example and a comparison table are presented to illustrate the effectiveness of the proposed methods.
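
    To illustrate what a computer-checkable LMI condition looks like, the following minimal Python/CVXPY sketch tests feasibility of a generic Lyapunov inequality A^T P + P A < 0 with P > 0 for an illustrative matrix A; it is not the paper's specific criterion, and all numbers are made up.

    import cvxpy as cp
    import numpy as np

    # Illustrative system matrix (not taken from the paper).
    A = np.array([[-2.0, 0.5],
                  [0.3, -1.5]])
    n = A.shape[0]

    # Decision variable: a symmetric matrix P.
    P = cp.Variable((n, n), symmetric=True)
    eps = 1e-6  # enforce strict inequalities numerically

    # Generic Lyapunov LMI: P > 0 and A^T P + P A < 0.
    constraints = [P >> eps * np.eye(n),
                   A.T @ P + P @ A << -eps * np.eye(n)]

    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve()
    print("LMI feasible:", problem.status == cp.OPTIMAL)

    A feasible P serves as a stability certificate; LMI-based criteria such as the one in the paper are checked in the same spirit, only with larger block matrices.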

    Finite-Time Stability of Fractional-Order BAM Neural Networks with Distributed Delay

    Based on the theory of fractional calculus, the generalized Gronwall inequality, and estimates of Mittag-Leffler functions, the finite-time stability of Caputo fractional-order BAM neural networks with distributed delay is investigated in this paper. An illustrative example is also given to demonstrate the effectiveness of the obtained result.
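
    For reference, the one-parameter Mittag-Leffler function that typically appears in such estimates (a standard definition, not specific to this paper) is

    \[
    E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)}, \qquad \alpha > 0,
    \]

    which reduces to e^z for α = 1 and usually enters the finite-time bounds through terms of the form E_α(L t^α) produced by the generalized Gronwall inequality.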

    Combined Convex Technique on Delay-Distribution-Dependent Stability for Delayed Neural Networks

    Using the Lyapunov-Krasovskii functional approach together with an improved delay-partitioning idea, a novel sufficient condition is derived that guarantees a class of delayed neural networks is asymptotically stable in the mean-square sense, where the probabilistic variable delay and both delay-variation limits are measurable. By combining the reciprocally convex technique with the standard convex technique, the criterion is presented in terms of LMIs, and its solvability depends on the sizes of both the time-delay range and its variations; it can become much less conservative than existing results by thinning the delay intervals. Finally, four numerical examples demonstrate that the proposed idea reduces conservatism more effectively than some earlier reported results.
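
    One commonly used form of the reciprocally convex combination bound referred to here (stated generically, not as the paper's exact lemma) is: for α ∈ (0, 1), positive definite matrices R₁, R₂, and any matrix S such that the block matrix below is positive semidefinite,

    \[
    \frac{1}{\alpha}\,\xi_{1}^{\top} R_{1} \xi_{1} + \frac{1}{1-\alpha}\,\xi_{2}^{\top} R_{2} \xi_{2}
    \;\ge\;
    \begin{bmatrix} \xi_{1} \\ \xi_{2} \end{bmatrix}^{\top}
    \begin{bmatrix} R_{1} & S \\ S^{\top} & R_{2} \end{bmatrix}
    \begin{bmatrix} \xi_{1} \\ \xi_{2} \end{bmatrix},
    \qquad
    \begin{bmatrix} R_{1} & S \\ S^{\top} & R_{2} \end{bmatrix} \ge 0,
    \]

    which bounds the delay-dependent terms of the partitioned Lyapunov-Krasovskii functional more tightly than treating the 1/α and 1/(1-α) factors separately.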

    ψ-type stability of reaction–diffusion neural networks with time-varying discrete delays and bounded distributed delays

    In this paper, the ψ-type stability and robust ψ-type stability of reaction–diffusion neural networks (RDNNs) with Dirichlet boundary conditions, time-varying discrete delays, and bounded distributed delays are investigated. First, we analyze the ψ-type stability and robust ψ-type stability of RDNNs with time-varying discrete delays by means of ψ-type functions combined with some inequality techniques, and put forward several ψ-type stability criteria for the considered networks. Additionally, models of RDNNs with bounded distributed delays are established, and some sufficient conditions guaranteeing ψ-type stability and robust ψ-type stability are given. Lastly, two examples are provided to confirm the effectiveness of the derived results.
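
    A representative network of the type described (a generic sketch with Dirichlet boundary conditions; the diffusion coefficients, weights, delays, and activations are placeholders) reads

    \[
    \frac{\partial u_i(t,x)}{\partial t} = d_i \Delta u_i(t,x) - a_i u_i(t,x)
    + \sum_{j=1}^{n} b_{ij} f_j\bigl(u_j(t,x)\bigr)
    + \sum_{j=1}^{n} c_{ij} f_j\bigl(u_j(t-\tau_j(t),x)\bigr)
    + \sum_{j=1}^{n} e_{ij} \int_{t-\sigma}^{t} f_j\bigl(u_j(s,x)\bigr)\,ds + I_i,
    \]

    with u_i(t,x) = 0 on the boundary of the spatial domain; ψ-type stability then asks for a decay estimate of the form ‖u(t,·)‖ ≤ M ψ(t)^{-λ} for a suitable ψ-type function ψ, covering exponential and polynomial decay as special cases.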

    An analysis of stability of a class of neutral-type neural networks with discrete time delays

    The problem of existence, uniqueness, and global asymptotic stability is considered for the class of neutral-type neural network models with discrete time delays. By employing a suitable Lyapunov functional and using the homeomorphism mapping theorem, we derive new delay-independent sufficient conditions for the existence, uniqueness, and global asymptotic stability of the equilibrium point for this class of neutral-type systems. The obtained conditions establish norm and matrix inequalities involving the network parameters of the neural system. The main advantage of the proposed results is that they can be expressed in terms of the network parameters only. Some comparative examples are given to compare our results with previous corresponding results and to demonstrate the effectiveness of the results presented.
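
    A standard neutral-type model of the class analyzed (again a generic sketch; the parameters and activation functions are only illustrative) is

    \[
    \dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t)\bigr)
    + \sum_{j=1}^{n} b_{ij} f_j\bigl(x_j(t-\tau_j)\bigr)
    + \sum_{j=1}^{n} e_{ij} \dot{x}_j(t-\zeta_j) + u_i,
    \]

    where the terms involving the delayed derivatives distinguish this class from ordinary delayed networks; delay-independent conditions of the kind described above constrain the norms of the matrices A = (a_{ij}), B = (b_{ij}), E = (e_{ij}) and the Lipschitz constants of the activations f_j.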