213 research outputs found
Robustness analysis of Cohen-Grossberg neural network with piecewise constant argument and stochastic disturbances
Robustness of neural networks has been a hot topic in recent years. This paper studies the robustness of the global exponential stability of Cohen-Grossberg neural networks with a piecewise constant argument and stochastic disturbances, and asks whether such networks can maintain global exponential stability under these two kinds of perturbation. Using stochastic analysis theory and inequality techniques, the admissible interval length of the piecewise constant argument and the upper bound of the noise intensity are derived by solving transcendental equations. Finally, several examples illustrate the efficacy of the findings.
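As a rough illustration of the model class studied here, the following is a minimal Euler-Maruyama sketch of a two-neuron Cohen-Grossberg network with piecewise constant argument γ(t) = τ⌊t/τ⌋ and multiplicative noise; all functions and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical 2-neuron Cohen-Grossberg network with a piecewise constant
# argument gamma(t) = tau * floor(t / tau) and a multiplicative stochastic
# disturbance, integrated by Euler-Maruyama.  Parameters are illustrative.
rng = np.random.default_rng(0)
tau, dt, T = 0.1, 0.001, 10.0              # argument interval, step, horizon
A = np.array([[0.2, -0.1], [-0.1, 0.3]])   # weights on the current state
D = np.array([[0.1, 0.05], [0.05, 0.1]])   # weights on the deviated argument
sigma = 0.05                               # noise intensity

def amp(x): return 1.0 + 0.1 * np.tanh(x)  # amplification a_i(x) > 0
def beh(x): return 2.0 * x                 # behaved function b_i(x)
def act(x): return np.tanh(x)              # activation f_j

x = np.array([1.0, -1.0])
hist = {0.0: x.copy()}                     # states sampled at the grid points
for k in range(int(T / dt)):
    t = k * dt
    tg = tau * np.floor(t / tau)           # piecewise constant argument
    xg = hist.setdefault(round(tg, 10), x.copy())
    drift = -amp(x) * (beh(x) - A @ act(x) - D @ act(xg))
    x = x + drift * dt + sigma * x * np.sqrt(dt) * rng.standard_normal(2)
print(np.linalg.norm(x))                   # trajectory contracts toward 0
```

With these small weights and a short argument interval the trajectory contracts toward the origin; in the paper's terms, τ and σ stay below the derived bounds.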
Exponential Lag Synchronization of Cohen-Grossberg Neural Networks with Discrete and Distributed Delays on Time Scales
In this article, we investigate exponential lag synchronization results for Cohen-Grossberg neural networks (C-GNNs) with discrete and distributed delays on an arbitrary time domain by applying feedback control. We formulate the problem using time scales theory so that the results apply to any uniform or non-uniform time domain. We also provide a comparison showing that the obtained results unify and generalize the existing ones. Our main tools are the unified matrix-measure theory and the Halanay inequality. In the last section, we provide two simulated examples for different time domains to show the effectiveness and generality of the obtained analytical results.
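As a toy illustration of lag synchronization under feedback control (on the continuous time scale ℝ only, with a scalar stand-in for the C-GNN model; the gain, delay, and coefficients are invented for the sketch):

```python
import numpy as np

# Lag synchronization sketch: follower y(t) tracks the delayed leader
# x(t - sigma) under linear feedback u = -k*(y(t) - x(t - sigma)).
# Scalar stand-in for the C-GNN model; all parameters are illustrative.
f = np.tanh
a, c, k = 1.0, 2.0, 5.0
dt, sigma_steps = 0.001, 500          # lag sigma = 0.5 time units
N = 20000

x = np.empty(N + 1); x[0] = 0.8       # leader trajectory, stored for the lag
y = 2.0                               # follower state
for n in range(N):
    x[n + 1] = x[n] + dt * (-a * x[n] + c * f(x[n]))       # leader
    target = x[n - sigma_steps] if n >= sigma_steps else x[0]
    y += dt * (-a * y + c * f(y) - k * (y - target))       # controlled follower
print(abs(y - x[N - sigma_steps]))    # lag-sync error |y(t) - x(t - sigma)|
```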
Global exponential convergence of delayed inertial Cohen–Grossberg neural networks
In this paper, the exponential convergence of delayed inertial Cohen–Grossberg neural networks (CGNNs) is studied. Two methods are adopted to handle the inertial term: one rewrites the system as two first-order differential equations via a variable substitution, while the other keeps the original order of the system (the non-reduced-order method). By constructing appropriate Lyapunov functions and using inequality techniques, sufficient conditions are obtained to ensure that the discussed model converges exponentially to a ball with a prespecified convergence rate. Finally, two simulation examples illustrate the validity of the theoretical results.
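The reduced-order substitution mentioned above can be sketched for a scalar inertial network: with y = x' + ξx, the second-order model becomes a pair of first-order equations. All coefficients below are illustrative, not from the paper.

```python
import numpy as np

# Reduced-order treatment of a scalar inertial neural network
#   x'' = -a*x' - b*x + c*f(x),  f = tanh   (illustrative parameters).
# Substituting y = x' + xi*x yields the first-order system
#   x' = y - xi*x
#   y' = (xi - a)*(y - xi*x) - b*x + c*f(x)
a, b, c, xi = 3.0, 2.0, 0.5, 1.0
f = np.tanh
dt, steps = 0.001, 20000

x, y = 1.5, 0.0                      # initial state, y(0) = x'(0) + xi*x(0)
for _ in range(steps):
    dx = y - xi * x
    dy = (xi - a) * (y - xi * x) - b * x + c * f(x)
    x, y = x + dt * dx, y + dt * dy
print(abs(x))                        # decays toward the equilibrium at 0
```

Since b exceeds the Lipschitz gain of c·f here, the origin is the unique equilibrium and the trajectory converges exponentially to it.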
Global attractive set of neural networks with neutral item
This paper investigates the global attractive set of neural networks with a neutral item. To better handle the neutral terms, different types of activation functions are considered. Based on matrix measures, inequality techniques, and Lyapunov theory, three new types of Lyapunov functions are designed to find the global attractive set of the system. A simulation example verifies the validity of the theoretical results. The result is quite inclusive: whether or not the system has an equilibrium, as long as the system is stable, its global attractive set can be found.
Global Mittag-Leffler stability of Caputo fractional-order fuzzy inertial neural networks with delay
This paper deals with the global Mittag-Leffler stability (GMLS) of Caputo fractional-order fuzzy inertial neural networks with time delay (CFOFINND). Based on Lyapunov stability theory and global fractional Halanay inequalities, the existence of a unique equilibrium point and the GMLS of CFOFINND are established. A numerical example illustrates the effectiveness of the results.
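A quick numerical way to see Mittag-Leffler-type decay is a Grünwald-Letnikov discretization of the scalar Caputo test equation D^α x = -λx, whose solution is E_α(-λ t^α). This is only a stand-in for the paper's fuzzy inertial model, with illustrative parameters.

```python
import numpy as np

# Explicit Grunwald-Letnikov scheme for the Caputo equation D^alpha x = -lam*x,
# x(0) = 1, whose exact solution x(t) = E_alpha(-lam * t^alpha) decays in the
# Mittag-Leffler sense.  Parameters are illustrative.
alpha, lam, h, N = 0.8, 1.0, 0.01, 2000
w = np.empty(N + 1); w[0] = 1.0
for j in range(1, N + 1):                     # GL binomial weights w_j
    w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)

x0 = 1.0
y = np.zeros(N + 1)                           # y = x - x0 (Caputo shift)
x = x0
for n in range(1, N + 1):
    hist = np.dot(w[1:n + 1], y[n - 1::-1])   # sum_{j=1}^{n} w_j * y_{n-j}
    y[n] = h**alpha * (-lam * x) - hist       # explicit GL step
    x = x0 + y[n]
print(x)   # small positive residual: algebraic (Mittag-Leffler) decay
```

Unlike exponential decay, the tail here shrinks only algebraically (roughly t^{-α}), which is the characteristic Mittag-Leffler behaviour the paper's stability notion captures.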
Fixed-Time Gradient Flows for Solving Constrained Optimization: A Unified Approach
Accelerated methods for solving optimization problems have always been an absorbing topic. Based on the fixed-time (FxT) stability of nonlinear dynamical systems, we provide a unified approach for designing FxT gradient flows (FxTGFs). First, a general class of nonlinear functions for designing FxTGFs is provided. A unified method for designing first-order FxTGFs is shown under the Polyak-Łojasiewicz inequality assumption, a condition weaker than strong convexity. When both bounded and vanishing disturbances are present in the gradient flow, a specific class of nonsmooth robust FxTGFs with disturbance rejection is presented. Under the strict convexity assumption, Newton-based FxTGFs are given and further extended to solve time-varying optimization. The proposed FxTGFs are also applied to equation-constrained optimization. Moreover, an FxT proximal gradient flow with a wide range of parameters is provided for solving nonsmooth composite optimization. To show the effectiveness of various FxTGFs, a static regret analysis for several typical FxTGFs is also provided in detail. Finally, the proposed FxTGFs are applied to two network problems, namely the network consensus problem and solving a system of linear equations, from the perspective of optimization. In particular, by choosing component-wise sign-preserving functions, these problems can be solved in a distributed way, which extends the existing results. The accelerated convergence and robustness of the proposed FxTGFs are validated in several numerical examples stemming from practical applications.
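A common two-term FxT gradient-flow form, ẋ = -(c1‖g‖^{α-1} + c2‖g‖^{β-1})g with g = ∇f and 0 < α < 1 < β, can be sketched by Euler discretization on a strongly convex quadratic (which satisfies the Polyak-Łojasiewicz inequality). The specific form and parameters below are illustrative assumptions, not necessarily those of the paper.

```python
import numpy as np

# Euler discretization of a two-term fixed-time gradient flow
#   xdot = -(c1*||g||^(alpha-1) + c2*||g||^(beta-1)) * g,   g = grad f,
# on the strongly convex quadratic f(x) = 0.5 x^T Q x (so PL holds).
# Problem data and gains are illustrative.
Q = np.array([[2.0, 0.3], [0.3, 1.0]])
grad = lambda x: Q @ x
c1, c2, alpha, beta = 1.0, 1.0, 0.5, 1.5
dt = 1e-3

x = np.array([5.0, -4.0])
for _ in range(20000):
    g = grad(x)
    n = np.linalg.norm(g)
    if n < 1e-12:                     # effectively at the minimizer
        break
    x = x - dt * (c1 * n**(alpha - 1) + c2 * n**(beta - 1)) * g
print(np.linalg.norm(x))              # near the unique minimizer x* = 0
```

Intuitively, the β-term dominates far from the minimizer and the α-term dominates near it, which is what yields a convergence-time bound independent of the initial condition in the continuous flow.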
Brain Computations and Connectivity [2nd edition]
This is an open access title available under the terms of a CC BY-NC-ND 4.0 International licence. It is free to read on the Oxford Academic platform and offered as a free PDF download from OUP and selected open access locations.
Brain Computations and Connectivity is about how the brain works. In order to understand this, it is essential to know what is computed by different brain systems; and how the computations are performed.
The aim of this book is to elucidate what is computed in different brain systems; and to describe current biologically plausible computational approaches and models of how each of these brain systems computes.
Understanding the brain in this way has enormous potential for understanding ourselves better in health and in disease. Potential applications of this understanding are to the treatment of the brain in disease; and to artificial intelligence which will benefit from knowledge of how the brain performs many of its extraordinarily impressive functions.
This book is pioneering in taking this approach to brain function: to consider what is computed by many of our brain systems, and how it is computed. It updates, with much new evidence including the connectivity of the human brain, the earlier book: Rolls (2021) Brain Computations: What and How, Oxford University Press.
Brain Computations and Connectivity will be of interest to all scientists interested in brain function and how the brain works, whether they are from neuroscience, or from medical sciences including neurology and psychiatry, or from the area of computational science including machine learning and artificial intelligence, or from areas such as theoretical physics
Nonlinear Systems
Nonlinear systems are a challenging subject for theoretical modeling, technical analysis, and numerical simulation in physics and mathematics, as well as in many other fields, since highly correlated nonlinear phenomena, evolving over a wide range of time scales and length scales, govern the underlying systems and processes in their spatiotemporal evolution. Indeed, available data, be they physical, biological, or financial, and technologically complex systems, including stochastic systems such as mechanical or electronic devices, can be treated within the same conceptual approach, both analytically and through computer simulation, using effective nonlinear dynamics methods. The aim of this Special Issue of Open Mathematics is to highlight papers on the dynamics, control, optimization, and applications of nonlinear systems. This has recently become an increasingly popular subject, with impressive growth in applications in engineering, economics, biology, and medicine, and the Issue can be considered a veritable contribution to the literature. Original papers relating to the objectives presented above are especially welcome. Potential topics include, but are not limited to: Stability analysis of discrete and continuous dynamical systems; Nonlinear dynamics in biological complex systems; Stability and stabilization of stochastic systems; Mathematical models in statistics and probability; Synchronization of oscillators and chaotic systems; Optimization methods for complex systems; Reliability modeling and system optimization; Computation and control over networked systems.
Saturated and Asymmetric Saturated Impulsive Control Synchronization of Coupled Delayed Inertial Neural Networks with Time-Varying Delays
This paper employs saturated and asymmetrically saturated impulsive control to examine the synchronization of inertial neural networks (INNs) with time-varying delay and coupling delays. Mixed delays, namely transmission delay and coupling delay, are treated in the theoretical discussion. The addressed INNs are transformed into first-order differential equations via a variable transformation, and adequate conditions for the exponential synchronization of the model are then derived by replacing the saturation nonlinearity with a dead-zone function. In addition, an asymmetric saturated impulsive control approach is given to achieve exponential synchronization of the addressed INNs in the leader-following synchronization pattern. Finally, simulation results validate the theoretical findings.
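A minimal sketch of saturated impulsive synchronization for a scalar leader-follower pair (an invented stand-in for the INN model; the gains, saturation bound, and impulse schedule are assumptions):

```python
import numpy as np

# Saturated impulsive synchronization sketch for a scalar leader-follower
# pair.  Between impulse instants the error evolves freely; at each t_k the
# impulse sat(-k*e) is applied, with the saturation bound clipping large
# corrections.  All parameters are illustrative.
sat = lambda u, umax: np.clip(u, -umax, umax)
f = np.tanh
a, c, k, umax = 0.5, 1.2, 0.9, 2.0
dt, imp_every = 0.001, 100            # impulses every 0.1 time units

x, y = 0.3, 2.5                       # leader and follower states
for step in range(1, 20001):
    x += dt * (-a * x + c * f(x))     # leader dynamics
    y += dt * (-a * y + c * f(y))     # follower, same uncontrolled dynamics
    if step % imp_every == 0:         # impulsive control acting on the error
        e = y - x
        y = x + e + sat(-k * e, umax) # e(t_k+) = e(t_k) + sat(-k * e(t_k))
print(abs(y - x))                     # synchronization error shrinks to ~0
```

When the impulse is unsaturated, each jump contracts the error by the factor |1 - k|; the dead-zone decomposition used in the paper handles exactly the regime where the clip in `sat` is active.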