123 research outputs found

    Lag synchronization of switched neural networks via neural activation function and applications in image encryption

    This paper investigates the problem of global exponential lag synchronization for a class of switched neural networks with time-varying delays via the neural activation function, with applications in image encryption. The controller depends on the output of the system because, in the case of packed circuits, the inner state of the circuits is hard to measure; it is therefore critical to design the controller based on the neuron activation function. A comparison with existing results shows that this paper improves and generalizes those derived in the previous literature. Several examples are given to illustrate the effectiveness of the approach and its potential applications in image encryption.
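    For context, lag synchronization means that the slave state tracks a time-shifted master state. A standard formulation (stated generally; the paper's exact error definition may differ) is:

```latex
% Lag synchronization with transmission lag \sigma > 0:
% the slave trajectory y(t) converges to the delayed master trajectory x(t-\sigma).
\lim_{t \to \infty} \| y(t) - x(t - \sigma) \| = 0 .
% "Global exponential" lag synchronization additionally requires
% \| y(t) - x(t-\sigma) \| \le M e^{-\varepsilon t} for some M, \varepsilon > 0,
% for all initial conditions.
```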

    ψ-type stability of reaction–diffusion neural networks with time-varying discrete delays and bounded distributed delays

    In this paper, the ψ-type stability and robust ψ-type stability of reaction–diffusion neural networks (RDNNs) with Dirichlet boundary conditions, time-varying discrete delays, and bounded distributed delays are investigated. First, the ψ-type stability and robust ψ-type stability of RDNNs with time-varying discrete delays are analyzed by means of ψ-type functions combined with some inequality techniques, and several ψ-type stability criteria are put forward for the considered networks. Additionally, models of RDNNs with bounded distributed delays are established, and some sufficient conditions guaranteeing ψ-type stability and robust ψ-type stability are given. Lastly, two examples are provided to confirm the effectiveness of the derived results.
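    In the ψ-type stability literature, decay is measured against a ψ-type function rather than a fixed exponential. A common form of the definition (an assumption here; the paper's precise definition may differ) is:

```latex
% \psi-type stability: for some constants M > 0 and \varepsilon > 0,
\| u(t, x) \| \le M \, \psi(t)^{-\varepsilon}, \qquad t \ge t_0,
% where \psi is a nondecreasing "\psi-type" function with \psi(t) \to \infty.
% Special cases: \psi(t) = e^{t} recovers exponential stability,
% and \psi(t) = 1 + t recovers polynomial (power-rate) stability.
```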

    Exponential multistability of memristive Cohen-Grossberg neural networks with stochastic parameter perturbations

    © 2020 Elsevier Ltd. All rights reserved. This manuscript is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Licence (http://creativecommons.org/licenses/by-nc-nd/4.0/).
    Because instability is easily induced by parameter disturbances in network systems, this paper investigates the multistability of memristive Cohen-Grossberg neural networks (MCGNNs) under stochastic parameter perturbations. It is demonstrated that stable equilibrium points of MCGNNs can be flexibly located in the odd-sequence or even-sequence regions. Some sufficient conditions are derived to ensure the exponential multistability of MCGNNs under parameter perturbations. It is found that there exist at least (w+2)^l (or (w+1)^l) exponentially stable equilibrium points in the odd-sequence (or even-sequence) regions. Two numerical examples are given to verify the correctness and effectiveness of the obtained results. Peer reviewed.
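    The count (w+2)^l arises combinatorially: if each of the l neuron states can settle independently into w+2 (odd-sequence case) or w+1 (even-sequence case) stable sub-intervals, the network has that many stable equilibria overall. A minimal sketch of the count, with hypothetical values of w and l (not the paper's proof or model):

```python
def stable_equilibria(w: int, l: int, odd_sequence: bool = True) -> int:
    """Lower bound on the number of exponentially stable equilibria when each
    of the l neuron states has (w + 2) stable sub-intervals (odd-sequence
    regions) or (w + 1) (even-sequence regions)."""
    per_neuron = (w + 2) if odd_sequence else (w + 1)
    return per_neuron ** l

# Hypothetical example: w = 2, l = 3 neurons.
print(stable_equilibria(2, 3))                       # odd-sequence: 4**3 = 64
print(stable_equilibria(2, 3, odd_sequence=False))   # even-sequence: 3**3 = 27
```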

    Recent Advances and Applications of Fractional-Order Neural Networks

    This paper focuses on the growth, development, and future of various forms of fractional-order neural networks. Multiple advances in structure, learning algorithms, and methods are critically investigated and summarized, including recent trends in the dynamics of various fractional-order neural networks. The forms of fractional-order neural networks considered in this study are Hopfield, cellular, memristive, complex, and quaternion-valued networks. Further, the application of fractional-order neural networks in computational fields such as system identification, control, optimization, and stability analysis is critically analyzed and discussed.

    Delay-Dependent Dynamics of Switched Cohen-Grossberg Neural Networks with Mixed Delays

    This paper studies the dynamics of switched Cohen-Grossberg neural networks with mixed delays by using the Lyapunov functional method, the average dwell time (ADT) method, and the linear matrix inequality (LMI) technique. Some conditions on the uniform ultimate boundedness, the existence of an attractor, and the global exponential stability of the switched Cohen-Grossberg neural networks are developed. The results extend and complement some earlier publications.
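    The average dwell time constraint used in such switched-system analyses is commonly stated as follows (a standard formulation from the switched-systems literature; the paper's exact condition may differ):

```latex
% N_\sigma(t_0, t): number of switches on the interval (t_0, t);
% N_0: chatter bound; \tau_a: average dwell time.
N_\sigma(t_0, t) \le N_0 + \frac{t - t_0}{\tau_a}
% Stability is typically guaranteed when the switching is slow enough:
% \tau_a > \frac{\ln \mu}{\lambda},
% where \mu \ge 1 bounds the ratio between Lyapunov functions of different
% subsystems at switching instants and \lambda > 0 is the decay rate of
% each active subsystem.
```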