
    Recent Advances and Applications of Fractional-Order Neural Networks

    This paper focuses on the growth, development, and future of various forms of fractional-order neural networks. Multiple advances in structure, learning algorithms, and methods have been critically investigated and summarized, together with recent trends in the dynamics of these networks. The forms of fractional-order neural networks considered in this study are Hopfield, cellular, memristive, complex-valued, and quaternion-valued networks. Further, the applications of fractional-order neural networks in computational fields such as system identification, control, optimization, and stability analysis have been critically analyzed and discussed.
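    To make the class of models surveyed above concrete, the sketch below simulates a small fractional-order Hopfield-type network, D^alpha x = -x + W tanh(x) + I, using the Grünwald-Letnikov approximation of the fractional derivative. This is only a minimal illustration under assumed parameters; the weights W, input I, order alpha, and function names are illustrative and are not taken from the paper.

        import numpy as np

        def gl_weights(alpha, n):
            # Grünwald-Letnikov binomial weights w_j = (-1)^j * C(alpha, j),
            # computed with the standard recursion w_j = w_{j-1} * (1 - (alpha + 1)/j).
            w = np.empty(n + 1)
            w[0] = 1.0
            for j in range(1, n + 1):
                w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
            return w

        def simulate_fo_hopfield(alpha, W, I, x0, h=0.01, steps=2000):
            # Explicit scheme for D^alpha x = -x + W*tanh(x) + I:
            # x_k = h^alpha * f(x_{k-1}) - sum_{j=1..k} w_j * x_{k-j}
            n = len(x0)
            w = gl_weights(alpha, steps)
            X = np.zeros((steps + 1, n))
            X[0] = x0
            for k in range(1, steps + 1):
                f = -X[k - 1] + W @ np.tanh(X[k - 1]) + I
                memory = sum(w[j] * X[k - j] for j in range(1, k + 1))
                X[k] = h**alpha * f - memory
            return X

        # Illustrative two-neuron network (parameter values are assumptions, not from the survey)
        W = np.array([[2.0, -1.2], [1.8, 1.7]])
        I = np.zeros(2)
        traj = simulate_fo_hopfield(alpha=0.95, W=W, I=I, x0=np.array([0.1, -0.2]))
        print(traj[-1])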

    Global stability of Clifford-valued Takagi-Sugeno fuzzy neural networks with time-varying delays and impulses

    In this study, we consider the Takagi-Sugeno (T-S) fuzzy model to examine the global asymptotic stability of Clifford-valued neural networks with time-varying delays and impulses. In order to achieve the global asymptotic stability criteria, we design a general network model that includes quaternion-, complex-, and real-valued networks as special cases. First, we decompose the n-dimensional Clifford-valued neural network into a 2^m n-dimensional real-valued counterpart in order to deal with the noncommutativity of Clifford number multiplication. Then, we prove new global asymptotic stability criteria by constructing appropriate Lyapunov-Krasovskii functionals (LKFs) and employing Jensen's integral inequality together with the reciprocal convex combination method. All results are established in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to show the effectiveness of the achieved results.
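    The decomposition step above can be illustrated with the quaternion special case (m = 2, so an n-dimensional quaternion-valued network becomes a 2^m n = 4n-dimensional real-valued one): each quaternion-valued weight acts on a quaternion-valued state through a 4x4 real matrix. The sketch below only illustrates that real-valued representation with made-up values; it is not the paper's construction for general Clifford algebras.

        import numpy as np

        def quat_mul(q, p):
            # Hamilton product of quaternions stored as (real, i, j, k) components.
            a, b, c, d = q
            w, x, y, z = p
            return np.array([
                a * w - b * x - c * y - d * z,
                a * x + b * w + c * z - d * y,
                a * y - b * z + c * w + d * x,
                a * z + b * y - c * x + d * w,
            ])

        def left_matrix(q):
            # 4x4 real matrix L(q) with L(q) @ p == quat_mul(q, p):
            # quaternion-valued weights become real-valued blocks of this form.
            a, b, c, d = q
            return np.array([
                [a, -b, -c, -d],
                [b,  a, -d,  c],
                [c,  d,  a, -b],
                [d, -c,  b,  a],
            ])

        q = np.array([0.5, -1.0, 2.0, 0.3])   # an illustrative quaternion weight
        p = np.array([1.0, 0.2, -0.7, 1.5])   # an illustrative quaternion state
        assert np.allclose(left_matrix(q) @ p, quat_mul(q, p))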

    New Exponential Stability Conditions of Switched BAM Neural Networks with Delays

    The exponential stability problem is considered in this paper for discrete-time switched BAM neural networks with time delay. The average dwell time method is introduced, for the first time, to deal with the exponential stability analysis of such systems. By constructing a new switching-dependent Lyapunov-Krasovskii functional, new delay-dependent criteria that guarantee exponential stability are developed. A numerical example is provided to demonstrate the potential and effectiveness of the proposed criteria.
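    For reference, the average dwell time condition bounds how often the subsystem may switch: the number of switches on any interval [t, T) must satisfy N_sigma(t, T) <= N_0 + (T - t)/tau_a. The sketch below checks this condition for a given switching sequence; the switching instants, tau_a, and N_0 are illustrative values, and the discrete-time setting of the paper would count steps rather than continuous time.

        import numpy as np

        def satisfies_adt(switch_times, T, tau_a, N0=1.0):
            # Check N_sigma(t, T') <= N0 + (T' - t)/tau_a on every subinterval of [0, T],
            # where switch_times are the instants at which the active subsystem changes.
            times = np.asarray(switch_times, dtype=float)
            starts = np.concatenate(([0.0], times))
            ends = np.concatenate((times, [T]))
            for t in starts:
                for Tp in ends:
                    if Tp <= t:
                        continue
                    n_switches = np.count_nonzero((times > t) & (times <= Tp))
                    if n_switches > N0 + (Tp - t) / tau_a:
                        return False
            return True

        # Illustrative switching signals over [0, 10] with required average dwell time 2
        print(satisfies_adt([1.0, 3.5, 6.2, 9.0], T=10.0, tau_a=2.0))   # slow switching: True
        print(satisfies_adt([0.1, 0.2, 0.3, 0.4], T=10.0, tau_a=2.0))   # chattering: False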

    Event-Triggered State Estimation for a Class of Delayed Recurrent Neural Networks with Sampled-Data Information

    The paper investigates the state estimation problem for a class of recurrent neural networks with sampled-data information and time-varying delays. The main purpose is to estimate the neuron states from sampled output measurements. A novel event-triggered scheme is proposed that significantly reduces the information communication burden in the network: whether the current sampled data are transmitted is determined by the error between the current sampled data and the latest transmitted data. By using a delayed-input approach, the error dynamics are reformulated as a system with two different time-varying delays. Based on the Lyapunov-Krasovskii functional approach, a state estimator for the considered neural networks can be obtained by solving a set of linear matrix inequalities, which is easily done with standard numerical software. Finally, a numerical example is provided to show the effectiveness of the proposed event-triggered scheme.
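    The transmission rule described above can be written as a simple quadratic trigger: the current sample is sent only when its deviation from the latest transmitted sample is large relative to the sample itself. The sketch below uses the common form (y - y_last)^T Omega (y - y_last) > sigma * y^T Omega y; the weighting matrix Omega, threshold sigma, and the synthetic measurements are assumptions for illustration, not the paper's exact trigger or data.

        import numpy as np

        def event_triggered_stream(samples, Omega, sigma):
            # Decide, sample by sample, whether a sampled measurement is transmitted.
            # Transmit y_k when (y_k - y_last)^T Omega (y_k - y_last) > sigma * y_k^T Omega y_k,
            # where y_last is the most recently transmitted sample.
            transmitted = [(0, samples[0])]   # the first sample is always sent
            y_last = samples[0]
            for k, y in enumerate(samples[1:], start=1):
                e = y - y_last
                if e @ Omega @ e > sigma * (y @ Omega @ y):
                    y_last = y
                    transmitted.append((k, y))
            return transmitted

        # Illustrative sampled outputs of a two-output network
        rng = np.random.default_rng(0)
        samples = np.cumsum(0.1 * rng.standard_normal((50, 2)), axis=0)
        sent = event_triggered_stream(samples, Omega=np.eye(2), sigma=0.2)
        print(f"{len(sent)} of {len(samples)} samples transmitted")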