
    Novel Lagrange sense exponential stability criteria for time-delayed stochastic Cohen–Grossberg neural networks with Markovian jump parameters: A graph-theoretic approach

    This paper concerns exponential stability in the Lagrange sense for a class of stochastic Cohen–Grossberg neural networks (SCGNNs) with Markovian jump and mixed time-delay effects. A systematic approach to constructing a global Lyapunov function for SCGNNs with mixed time delays and Markovian jumping is provided by combining the Lyapunov method with results from graph theory. Moreover, by using some inequality techniques in Lyapunov-type and coefficient-type theorems, we obtain two kinds of sufficient conditions that ensure global exponential stability (GES) in the Lagrange sense for the addressed SCGNNs. Finally, some examples with numerical simulations are given to demonstrate the effectiveness of the acquired results.
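    For readers unfamiliar with the notion, stability in the Lagrange sense differs from Lyapunov stability in that trajectories converge to a bounded attractive set rather than to an equilibrium. A standard formalization reads as follows (the paper's exact moment order and constants may differ):

```latex
% Global exponential stability in the Lagrange sense (standard form):
% there exist constants M > 0 and \lambda > 0 such that, for every
% initial state x_0, some C(x_0) \ge 0 satisfies
\mathbb{E}\,\|x(t; x_0)\|^2 \;\le\; M + C(x_0)\, e^{-\lambda t},
\qquad t \ge 0,
% so every trajectory converges exponentially into the globally
% exponentially attractive set \{x : \mathbb{E}\,\|x\|^2 \le M\}.
```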

    pth moment exponential stability of stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays

    In this paper, stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays are investigated. By constructing a Lyapunov function and applying the Itô differential formula, some sufficient conditions for the pth moment exponential stability of such networks are established. An example is given to illustrate the feasibility of the main theoretical findings. The paper ends with a brief conclusion.
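    For reference, pth moment exponential stability is commonly defined as follows (a standard definition; the paper's own constants and norm conventions apply):

```latex
% pth moment exponential stability of the trivial solution:
% there exist M \ge 1 and \lambda > 0 such that, for any initial data \xi,
\mathbb{E}\,\|x(t;\xi)\|^p \;\le\; M\,\mathbb{E}\,\|\xi\|^p\, e^{-\lambda t},
\qquad t \ge 0,
% equivalently,
% \limsup_{t \to \infty} \tfrac{1}{t}\log \mathbb{E}\,\|x(t)\|^p \le -\lambda.
```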

    Nonlinear Systems

    This Special Issue of Open Mathematics is devoted to nonlinear systems, a challenging subject for theoretical modeling, technical analysis, and numerical simulation in physics and mathematics, as well as in many other fields, since highly correlated nonlinear phenomena, evolving over a large range of time scales and length scales, control the underlying systems and processes in their spatiotemporal evolution. Indeed, available data, whether physical, biological, or financial, and technologically complex and stochastic systems, such as mechanical or electronic devices, can be treated within the same conceptual approach, both analytically and through computer simulation, using effective methods of nonlinear dynamics. The aim of this Special Issue is to highlight papers on the dynamics, control, optimization, and applications of nonlinear systems. This subject has recently grown increasingly popular, with impressive growth in applications in engineering, economics, biology, and medicine, and contributions in this area make a genuine addition to the literature. Original papers relating to the objective presented above are especially welcome. Potential topics include, but are not limited to: stability analysis of discrete and continuous dynamical systems; nonlinear dynamics in biological complex systems; stability and stabilization of stochastic systems; mathematical models in statistics and probability; synchronization of oscillators and chaotic systems; optimization methods for complex systems; reliability modeling and system optimization; computation and control over networked systems.

    Harnessing Neural Dynamics as a Computational Resource

    Researchers study nervous systems at levels of scale spanning several orders of magnitude, both in terms of time and space. While some parts of the brain are well understood at specific levels of description, there are few overarching theories that systematically bridge low-level mechanism and high-level function. The Neural Engineering Framework (NEF) is an attempt at providing such a theory. The NEF enables researchers to systematically map dynamical systems—corresponding to some hypothesised brain function—onto biologically constrained spiking neural networks. In this thesis, we present several extensions to the NEF that broaden both the range of neural resources that can be harnessed for spatiotemporal computation and the range of available biological constraints. Specifically, we suggest a method for harnessing the dynamics inherent in passive dendritic trees for computation, allowing us to construct single-layer spiking neural networks that, for some functions, achieve substantially lower errors than larger multi-layer networks. Furthermore, we suggest “temporal tuning” as a unifying approach to harnessing temporal resources for computation through time. This allows modellers to directly constrain networks to temporal tuning observed in nature, in ways not previously well-supported by the NEF. We then explore specific examples of neurally plausible dynamics using these techniques. In particular, we propose a new “information erasure” technique for constructing LTI systems generating temporal bases. Such LTI systems can be used to establish an optimal basis for spatiotemporal computation. We demonstrate how this captures “time cells” that have been observed throughout the brain. We also demonstrate the viability of our extensions by constructing an adaptive filter model of the cerebellum that successfully reproduces key features of eyeblink conditioning observed in neurobiological experiments.
Outside the cognitive sciences, our work can help exploit resources available on existing neuromorphic computers, and inform future neuromorphic hardware design. In machine learning, our spatiotemporal NEF populations map cleanly onto the Legendre Memory Unit (LMU), a promising artificial neural network architecture for stream-to-stream processing that outperforms competing approaches. We find that one of our LTI systems derived through “information erasure” may serve as a computationally less expensive alternative to the LTI system commonly used in the LMU.
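    The LTI system commonly used in the LMU is the Legendre Delay Network state space, whose (A, B) matrices have a closed form published in the LMU literature. A minimal sketch of constructing them (function name and dimensions are illustrative, not from the thesis):

```python
import numpy as np

def ldn_matrices(d, theta):
    """Construct the d-dimensional Legendre Delay Network state-space
    matrices (A, B) for a delay window of length theta, in the standard
    form used in the Legendre Memory Unit literature:
        theta * dx/dt = A x + B u
    """
    i = np.arange(d)[:, None]   # row index, shape (d, 1)
    j = np.arange(d)[None, :]   # column index, shape (1, d)
    # a_ij = (2i+1)/theta * (-1 if i < j else (-1)**(i-j+1))
    A = np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) * (2 * i + 1) / theta
    # b_i = (2i+1)/theta * (-1)**i
    B = ((2 * np.arange(d) + 1) * (-1.0) ** np.arange(d) / theta)[:, None]
    return A, B

A, B = ldn_matrices(4, theta=1.0)
```

Integrating this system against an input stream u(t) yields state coefficients that reconstruct a sliding window of u over the past theta seconds in the shifted Legendre basis, which is what the LMU's recurrent layer exploits.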

    Dynamics of Macrosystems; Proceedings of a Workshop, September 3-7, 1984

    There is an increasing awareness of the important and pervasive role that instability and random, chaotic motion play in the dynamics of macrosystems. Further research in the field should aim at providing useful tools, and therefore the motivation should come from important questions arising in specific macrosystems. Such systems include biochemical networks, genetic mechanisms, biological communities, neural networks, cognitive processes and economic structures. This list may seem heterogeneous, but there are similarities between evolution in the different fields. It is not surprising that mathematical methods devised in one field can also be used to describe the dynamics of another. IIASA is attempting to make progress in this direction. With this aim in view this workshop was held at Laxenburg over the period 3-7 September 1984. These Proceedings cover a broad canvas, ranging from specific biological and economic problems to general aspects of dynamical systems and evolutionary theory.