168 research outputs found

    Global asymptotic stability of nonautonomous Cohen-Grossberg neural network models with infinite delays

    Get PDF
    For a general Cohen-Grossberg neural network model with potentially unbounded time-varying coefficients and infinite distributed delays, we give sufficient conditions for its global asymptotic stability. The model studied is general enough to include, as subclasses, most of the well-known neural network models, such as Cohen-Grossberg, Hopfield, and bidirectional associative memory networks. Contrary to the usual approach in the literature, the proofs do not use Lyapunov functionals. As an illustration, the results are applied to several concrete models studied in the literature; the comparison shows that our results give new global stability criteria for several neural network models and improve some earlier publications. The second author's research was supported by the Research Centre of Mathematics of the University of Minho with Portuguese funds from the "Fundação para a Ciência e a Tecnologia", through the project PEstOE/MAT/UI0013/2014. The authors thank the referee for valuable comments.
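    For orientation, models in this class are usually written in the following general form (a sketch of the standard Cohen-Grossberg formulation with distributed delays; the paper's precise hypotheses on the coefficients and kernels are not reproduced here):

```latex
% Standard general form of a Cohen-Grossberg network with infinite
% distributed delays (the exact assumptions are those of the paper).
x_i'(t) = -a_i\bigl(x_i(t)\bigr)\left[\, b_i\bigl(t, x_i(t)\bigr)
  - \sum_{j=1}^{n} \int_{0}^{\infty} f_{ij}\bigl(x_j(t-s)\bigr)\, d\eta_{ij}(t,s) \right],
\qquad i = 1,\dots,n,
```

    where the $a_i$ are amplification functions, the $b_i$ are the (possibly unbounded, time-varying) behaved functions, the $f_{ij}$ are activation functions, and the $\eta_{ij}$ are delay kernels; Hopfield and BAM models arise for particular choices of these functions.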

    Global exponential stability of nonautonomous neural network models with continuous distributed delays

    Get PDF
    For a family of non-autonomous differential equations with distributed delays, we give sufficient conditions for the global exponential stability of an equilibrium point. This family includes most of the delayed Hopfield-type neural network models with time-varying coefficients and distributed delays; for these models, we establish sufficient conditions for their global exponential stability. The existence and global exponential stability of a periodic solution are also addressed. A comparison of results shows that these results are general, new, and add something to some earlier publications. Fundação para a Ciência e a Tecnologia (FCT).
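    As a concrete illustration of what such a model looks like numerically, the sketch below integrates a small Hopfield-type network with a truncated exponential distributed-delay kernel by the Euler method. Everything here (network size, kernel, weights, step size) is an illustrative assumption, not the paper's setting:

```python
# Minimal sketch (not the paper's algorithm): Euler simulation of a
# Hopfield-type network with a truncated exponential distributed-delay kernel.
import numpy as np

n, dt, T_kernel, T_end = 3, 0.01, 5.0, 50.0
lam = 1.0                                  # kernel rate: k(s) = lam * exp(-lam*s)
d = np.array([1.0, 1.2, 0.9])              # decay (leakage) rates d_i
A = 0.3 * np.random.default_rng(0).standard_normal((n, n))  # weights a_ij
I = np.array([0.1, -0.2, 0.05])            # external inputs I_i

m = int(T_kernel / dt)                     # kernel truncation length in steps
s = dt * np.arange(m)
k = lam * np.exp(-lam * s) * dt            # discretized kernel weights

hist = np.zeros((m, n))                    # history of activation values f(x)
x = np.array([0.5, -0.5, 0.2])             # initial state
for step in range(int(T_end / dt)):
    hist = np.roll(hist, 1, axis=0)
    hist[0] = np.tanh(x)                   # newest activation at index 0
    delayed = k @ hist                     # distributed-delay term per neuron j
    x = x + dt * (-d * x + A @ delayed + I)

print("state near t =", T_end, ":", x)     # converges for suitably small weights
```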

    Existence and stability of a periodic solution of a general difference equation with applications to neural networks with a delay in the leakage terms

    Full text link
    In this paper, a new global exponential stability criterion is obtained for a general multidimensional delay difference equation using induction arguments. In the case that the difference equation is periodic, we prove the existence of a periodic solution by constructing a type of Poincaré map. The results are used to obtain stability criteria for a general discrete-time neural network model with a delay in the leakage terms. As particular cases, we obtain new stability criteria for neural network models recently studied in the literature, in particular for low-order and high-order Hopfield and bidirectional associative memory (BAM) models. Comment: 20 pages, 3 figures.
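    To fix ideas, a discrete-time Hopfield-type model with a delay in the leakage term can be iterated directly. The sketch below uses a model of the common form x_i(t+1) = a_i x_i(t-σ) + Σ_j b_ij f(x_j(t-τ)) + I_i, with coefficients and delays that are illustrative assumptions, not the paper's:

```python
# Minimal sketch (illustrative, not the paper's model): iteration of a
# discrete-time Hopfield-type network with a delay sigma in the leakage term,
#   x(t+1) = a * x(t - sigma) + B f(x(t - tau)) + I.
import numpy as np

n, sigma, tau = 2, 2, 3                    # sizes and delays are assumptions
a = np.array([0.4, 0.5])                   # leakage coefficients, |a_i| < 1
B = np.array([[0.1, -0.2], [0.15, 0.1]])   # connection weights b_ij
I = np.array([0.3, -0.1])                  # external inputs

h = max(sigma, tau)                        # history depth needed
X = [np.zeros(n)] * (h + 1)                # initial history x(-h), ..., x(0)
for t in range(200):
    x_next = a * X[-1 - sigma] + B @ np.tanh(X[-1 - tau]) + I
    X.append(x_next)

print("x(200) =", X[-1])                   # settles when a and B are small enough
```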

    Existence and Global Exponential Stability of Periodic Solution to Cohen-Grossberg BAM Neural Networks with Time-Varying Delays

    Get PDF
    We first investigate the existence of a periodic solution in general Cohen-Grossberg BAM neural networks with multiple time-varying delays by means of degree theory. Then, using this existence result and constructing a Lyapunov functional, we discuss the global exponential stability of the periodic solution for the above neural networks. Our result on global exponential stability of the periodic solution differs from the existing results: the monotonicity inequality conditions on the behaved functions in the works of Xia (2010) and Chen and Cao (2007) are removed, and the boundedness assumption in the works of Zhang et al. (2011) and Li et al. (2009) is also removed. We only require that the behaved functions satisfy sign conditions and that the activation functions are globally Lipschitz continuous.
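    For reference, the two hypotheses that remain are typically stated as follows (standard formulations; the paper's exact inequalities may differ):

```latex
% Typical statements of the two retained hypotheses (standard forms;
% the precise versions are those in the paper).
\begin{align*}
  &\text{sign condition:}   && x\, b_i(x) > 0 \quad \text{for all } x \neq 0,\\
  &\text{global Lipschitz:} && |f_j(u) - f_j(v)| \le L_j\, |u - v| \quad \text{for all } u, v \in \mathbb{R}.
\end{align*}
```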

    Contrastive learning and neural oscillations

    Get PDF
    The concept of Contrastive Learning (CL) is developed as a family of possible learning algorithms for neural networks. CL is an extension of Deterministic Boltzmann Machines to more general dynamical systems. During learning, the network oscillates between two phases: one with a teacher signal and one without. The weights are updated using a learning rule that corresponds to gradient descent on a contrast function that measures the discrepancy between the free network and the network with a teacher signal. The CL approach provides a general unified framework for developing new learning algorithms. It also shows that many different types of clamping and teacher signals are possible. Several examples are given, and an analysis of the landscape of the contrast function is proposed, with some relevant predictions for the CL curves. An approach that may be suitable for collective analog implementations is described. Simulation results and possible extensions are briefly discussed, together with a new conjecture regarding the function of certain oscillations in the brain. In the appendix, we also examine two extensions of contrastive learning to time-dependent trajectories.
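    The two-phase update can be made concrete with a small mean-field (deterministic Boltzmann) network: settle once with only the inputs clamped, once with inputs and targets clamped, and move the weights toward the teacher-phase correlations. The sketch below is one minimal instance under these assumptions, not the paper's exact algorithm:

```python
# Minimal sketch of the two-phase contrastive update for a symmetric-weight
# mean-field (deterministic Boltzmann) network. Network size, settling
# scheme, and learning rate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid, n_out = 2, 4, 1
N = n_in + n_hid + n_out
W = 0.1 * rng.standard_normal((N, N))
W = (W + W.T) / 2                          # symmetric weights, zero diagonal
np.fill_diagonal(W, 0.0)

def settle(clamp_idx, clamp_val, steps=50):
    """Relax unit states s = sigmoid(W s) with some units clamped."""
    s = np.full(N, 0.5)
    s[clamp_idx] = clamp_val
    for _ in range(steps):
        s = 1.0 / (1.0 + np.exp(-W @ s))
        s[clamp_idx] = clamp_val           # re-impose the clamped values
    return s

eta = 0.5
x, y = np.array([1.0, 0.0]), np.array([1.0])   # a single input/target pair
inp = np.arange(n_in)
out = np.arange(n_in + n_hid, N)
for epoch in range(100):
    s_free = settle(inp, x)                          # free phase: inputs only
    s_teach = settle(np.r_[inp, out], np.r_[x, y])   # teacher phase: inputs + targets
    dW = eta * (np.outer(s_teach, s_teach) - np.outer(s_free, s_free))
    dW = (dW + dW.T) / 2                   # keep weights symmetric
    np.fill_diagonal(dW, 0.0)
    W += dW                                # gradient step on the contrast function

print("output after learning:", settle(inp, x)[out])
```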

    Integration of continuous-time dynamics in a spiking neural network simulator

    Full text link
    Contemporary modeling approaches to the dynamics of neural networks consider two main classes of models: biologically grounded spiking neurons and functionally inspired rate-based units. The unified simulation framework presented here supports the combination of the two for multi-scale modeling approaches, the quantitative validation of mean-field approaches by spiking network simulations, and an increase in reliability through use of the same simulation code and the same network model specifications for both model classes. While most efficient spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. We further demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
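    The buffering idea behind delayed continuous interactions can be illustrated in a few lines. The toy below (not the simulator's actual code) stores each unit's rate for a fixed delay and lets the network read the delayed values at every step, analogous to how event buffers work in spiking codes:

```python
# Toy illustration of delayed rate-based interactions via a ring buffer.
# All parameters are assumptions; this is not the simulator's implementation.
import numpy as np

n, dt, delay, T = 10, 0.1, 1.0, 100.0
d_steps = int(delay / dt)                  # delay expressed in update steps
rng = np.random.default_rng(2)
W = rng.standard_normal((n, n)) / np.sqrt(n)
tau = 1.0                                  # rate time constant

buf = np.zeros((d_steps, n))               # ring buffer of past rates
head = 0
r = 0.1 * rng.standard_normal(n)           # rate state
for step in range(int(T / dt)):
    r_delayed = buf[head]                  # rates from `delay` seconds ago
    buf[head] = r                          # overwrite slot with current rates
    head = (head + 1) % d_steps
    r += dt * (-r + np.tanh(W @ r_delayed)) / tau

print("rate vector at t =", T, ":", np.round(r, 3))
```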

    Real-Time Anisotropic Diffusion using Space-Variant Vision

    Full text link
    Many computer and robot vision applications require multi-scale image analysis. Classically, this has been accomplished through the use of a linear scale-space, which is constructed by convolution of visual input with Gaussian kernels of varying size (scale). This has been shown to be equivalent to the solution of a linear diffusion equation on an infinite domain, as the Gaussian is the Green's function of such a system (Koenderink, 1984). Recently, much work has been focused on the use of a variable conductance function resulting in anisotropic diffusion described by a nonlinear partial differential equation (PDE). The use of anisotropic diffusion with a conductance coefficient which is a decreasing function of the gradient magnitude has been shown to enhance edges, while decreasing some types of noise (Perona and Malik, 1987). Unfortunately, the solution of the anisotropic diffusion equation requires the numerical integration of a nonlinear PDE, which is a costly process when carried out on a fixed mesh such as a typical image. In this paper we show that the complex log transformation, variants of which are universally used in mammalian retino-cortical systems, allows the nonlinear diffusion equation to be integrated at exponentially enhanced rates due to the non-uniform mesh spacing inherent in the log domain. The enhanced integration rates, coupled with the intrinsic compression of the complex log transformation, yield a speed increase of between two and three orders of magnitude, providing a means of performing real-time image enhancement using anisotropic diffusion. Office of Naval Research (N00014-95-I-0409).
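    The baseline scheme being accelerated is standard Perona-Malik diffusion on a fixed mesh. The sketch below shows that baseline only; the paper's contribution, running such a scheme on a space-variant (complex-log) mesh, is not reproduced here, and all parameter values are assumptions:

```python
# Minimal sketch of Perona-Malik anisotropic diffusion on a uniform grid.
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, lam=0.2):
    """Explicit scheme: u += lam * sum_d g(|grad_d u|) * grad_d u."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)       # edge-stopping conductance
    for _ in range(n_iter):
        # one-sided differences toward the four neighbours
        # (np.roll wraps at the borders, i.e. periodic boundary; fine for a sketch)
        dN = np.roll(u, -1, axis=0) - u
        dS = np.roll(u,  1, axis=0) - u
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u,  1, axis=1) - u
        u += lam * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
    return u

noisy = np.clip(np.eye(64) + 0.1 * np.random.default_rng(3).standard_normal((64, 64)), 0, 1)
smoothed = perona_malik(noisy)             # noise is smoothed, the edge survives
```

    Because the conductance g decreases with the gradient magnitude, diffusion is suppressed across strong edges and proceeds freely in flat regions, which is the edge-preserving behaviour the abstract describes.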

    Dynamical Analysis of DTNN with Impulsive Effect

    Get PDF
    We present a dynamical analysis of discrete-time delayed neural networks with impulsive effect. Under impulsive effect, we derive new criteria for the invariance and attractivity of discrete-time neural networks by using a decomposition approach and delay difference inequalities. Our results improve or extend existing ones.
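    A minimal instance of such a system, with an assumed linear impulse map applied at fixed instants (the model and parameters below are illustrative, not the paper's), can be iterated as follows:

```python
# Minimal sketch: a discrete-time delayed network whose state is reset by an
# impulse map at fixed instants,
#   x(t+1) = a * x(t) + B f(x(t - tau)) + I,
#   x(t+)  = (1 + gamma) * x(t)   at impulse times (a simple linear impulse).
import numpy as np

n, tau, period, gamma = 2, 2, 25, -0.5     # impulse every `period` steps
a = 0.6
B = np.array([[0.2, -0.1], [0.1, 0.2]])
I = np.array([0.1, 0.1])

X = [np.ones(n)] * (tau + 1)               # constant initial history
for t in range(1, 301):
    x_next = a * X[-1] + B @ np.tanh(X[-1 - tau]) + I
    if t % period == 0:
        x_next = (1 + gamma) * x_next      # impulsive jump shrinks the state
    X.append(x_next)

print("x(300) =", X[-1])                   # here the impulses aid attractivity
```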

    Systems of differential equations and neural networks with delays and impulses

    Get PDF
    Department of Mathematics & Statistics, College of Science, Sultan Qaboos University, Muscat, Sultanate of Oman, and IMI-BAS (Institute of Mathematics and Informatics, Bulgarian Academy of Sciences), 16 June 2014: awarding of the scientific degree "Doctor of Sciences" to Valery Covachev in the scientific specialty 01.01.13, Mathematical Modelling and Application of Mathematics. [Covachev Valery Hristov; Ковачев Валерий Христов]