
    On the almost sure running maxima of solutions of affine stochastic functional differential equations

    This paper studies the large fluctuations of solutions of scalar and finite-dimensional affine stochastic functional differential equations with finite memory, as well as related nonlinear equations. We find conditions under which the exact almost sure growth rate of the running maximum of each component of the system can be determined, both for affine and nonlinear equations. The proofs exploit the fact that an exponentially decaying fundamental solution of the underlying deterministic equation is sufficient to ensure that the solution of the affine equation converges to a stationary Gaussian process.
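    As a rough illustration of the quantity studied above, the sketch below simulates a scalar affine stochastic delay equation dX(t) = (aX(t) + bX(t - tau)) dt + sigma dW(t) by Euler-Maruyama and tracks the running maximum of the path; the coefficients, delay, and the sqrt(2 * variance * log t) comparison are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): simulate a scalar affine
# stochastic delay equation dX(t) = (a*X(t) + b*X(t - tau)) dt + sigma*dW(t)
# by Euler-Maruyama and track the running maximum of the path.
a, b, sigma, tau = -1.0, 0.3, 0.5, 1.0      # assumed coefficients, stable regime
dt, T = 1e-3, 100.0
n_lag, n_steps = int(tau / dt), int(T / dt)

rng = np.random.default_rng(0)
x = np.zeros(n_steps + n_lag + 1)           # zero initial history on [-tau, 0]
run_max = np.empty(n_steps)

for k in range(n_steps):
    i = k + n_lag + 1
    drift = a * x[i - 1] + b * x[i - 1 - n_lag]
    x[i] = x[i - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    run_max[k] = x[i] if k == 0 else max(run_max[k - 1], x[i])

# If the solution converges to a stationary Gaussian process, the running
# maximum is expected to grow roughly on the order of sqrt(2 * variance * log t).
t_end = n_steps * dt
print("running max at T:      ", run_max[-1])
print("sqrt(2 * var * log T): ", np.sqrt(2.0 * x[n_lag:].var() * np.log(t_end)))
```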

    Dynamical Behaviors of Stochastic Hopfield Neural Networks with Both Time-Varying and Continuously Distributed Delays

    This paper investigates the dynamical behaviors of stochastic Hopfield neural networks with both time-varying and continuously distributed delays. By employing Lyapunov functional theory and linear matrix inequalities, some novel criteria on asymptotic stability, ultimate boundedness, and weak attractors are derived. Finally, an example is given to illustrate the correctness and effectiveness of our theoretical results.
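    Criteria in this line of work are typically phrased as linear matrix inequalities. The sketch below checks a much simpler, generic delay-independent Lyapunov-Krasovskii LMI for a linearized deterministic surrogate dx/dt = Ax(t) + Bx(t - tau(t)) using cvxpy; the matrices A and B and the LMI form are textbook-style assumptions, not the conditions derived in the paper.

```python
import numpy as np
import cvxpy as cp

# Hedged sketch: a generic delay-independent Lyapunov-Krasovskii LMI for a
# linearized deterministic surrogate dx/dt = A x(t) + B x(t - tau(t)).
# This is NOT the paper's stochastic criterion; A and B are made-up values.
n = 2
A = np.array([[-3.0, 0.2], [0.1, -2.5]])   # assumed self-feedback part
B = np.array([[0.4, -0.3], [0.2, 0.5]])    # assumed delayed connection weights

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
M = cp.Variable((2 * n, 2 * n), symmetric=True)   # symmetric container for the LMI
eps = 1e-6

lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ B],
               [B.T @ P, -Q]])
constraints = [M == lmi,                    # bind the block LMI to a symmetric variable
               P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               M << -eps * np.eye(2 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI feasible (a stability certificate exists):", prob.status == cp.OPTIMAL)
```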

    A Computational Investigation of Neural Dynamics and Network Structure

    With the overall goal of illuminating the relationship between neural dynamics and neural network structure, this thesis presents a) a computer model of a network infrastructure capable of global broadcast and competition, and b) a study of various convergence properties of spike-timing dependent plasticity (STDP) in a recurrent neural network. The first part of the thesis explores the parameter space of a possible Global Neuronal Workspace (GNW) realised in a novel computational network model using stochastic connectivity. The structure of this model is analysed in light of the characteristic dynamics of a GNW: broadcast, reverberation, and competition. It is found that, even with careful consideration of the balance between excitation and inhibition, the structural choices do not allow agreement with the GNW dynamics, and the implications of this are addressed. An additional level of competition – access competition – is added, discussed, and found to be more conducive to winner-takes-all competition. The second part of the thesis investigates the formation of synaptic structure due to neural and synaptic dynamics. From previous theoretical and modelling work, it is predicted that homogeneous stimulation in a recurrent neural network with STDP will create a self-stabilising equilibrium amongst synaptic weights, while heterogeneous stimulation will induce structured synaptic changes. A new factor in modulating the synaptic weight equilibrium is suggested by the experimental evidence presented: anti-correlation due to inhibitory neurons. It is observed that the synaptic equilibrium creates competition amongst synapses, and those specifically stimulated during heterogeneous stimulation win out. Further investigation is carried out in order to assess the effect that more complex STDP rules would have on synaptic dynamics, varying parameters of a trace STDP model. There is little qualitative effect on synaptic dynamics under low-frequency (< 25 Hz) conditions, justifying the use of simple STDP until further experimental or theoretical evidence suggests otherwise.
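    For the STDP part, the sketch below implements a minimal trace-based (pair) STDP rule of the kind referred to above, with exponentially decaying pre- and post-synaptic traces driven by homogeneous Poisson-like spiking; the network size, time constants, learning rates, and weight bounds are arbitrary illustrative choices, not the thesis's model.

```python
import numpy as np

# Minimal trace-based (pair) STDP sketch: each neuron keeps an exponentially
# decaying trace of its recent spikes; a postsynaptic spike potentiates by the
# presynaptic trace, a presynaptic spike depresses by the postsynaptic trace.
# Parameters are illustrative, not taken from the thesis.
rng = np.random.default_rng(1)
n, dt, steps = 100, 1e-3, 10_000
tau_pre, tau_post = 20e-3, 20e-3
a_plus, a_minus = 0.01, 0.012                  # slight depression bias
w = rng.uniform(0.0, 0.5, size=(n, n))         # w[i, j]: synapse j -> i
x_pre = np.zeros(n)                            # presynaptic traces
x_post = np.zeros(n)                           # postsynaptic traces
rate = 10.0                                    # Hz, homogeneous Poisson drive

for _ in range(steps):
    spikes = rng.random(n) < rate * dt         # stand-in for network spiking
    x_pre += -dt / tau_pre * x_pre
    x_post += -dt / tau_post * x_post
    x_pre[spikes] += 1.0
    x_post[spikes] += 1.0
    # potentiation: postsynaptic spike (rows), weighted by presynaptic traces
    w[spikes, :] += a_plus * x_pre[np.newaxis, :]
    # depression: presynaptic spike (columns), weighted by postsynaptic traces
    w[:, spikes] -= a_minus * x_post[:, np.newaxis]
    np.clip(w, 0.0, 1.0, out=w)

print("mean weight after homogeneous stimulation:", w.mean())
```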

    New Stability Criterion for Takagi-Sugeno Fuzzy Cohen-Grossberg Neural Networks with Probabilistic Time-Varying Delays

    A new global asymptotic stability criterion for Takagi-Sugeno fuzzy Cohen-Grossberg neural networks with probabilistic time-varying delays is derived, in which the diffusion term can play its role. Because the boundedness conditions on the amplification functions are removed, the main result is novel to some extent. There is a further methodological novelty: the Lyapunov-Krasovskii functional is a positive definite form of p powers, which differs from those in the existing literature. Moreover, a numerical example illustrates the effectiveness of the proposed methods.
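    To make the model class concrete, the sketch below integrates a small Takagi-Sugeno fuzzy blend of two Cohen-Grossberg systems with a constant delay and an unbounded amplification function a(x) = 1 + x^2, echoing the removal of boundedness conditions on the amplifications; the rules, memberships, and parameters are invented for illustration and are unrelated to the paper's criterion.

```python
import numpy as np

# Illustrative T-S fuzzy Cohen-Grossberg simulation (not the paper's model or
# criterion): dx/dt = -a(x) * [b(x) - sum_r h_r(x) * W_r @ f(x(t - tau))],
# with an *unbounded* amplification a(x) = 1 + x**2 and two fuzzy rules.
n, dt, steps, tau = 2, 1e-3, 20_000, 0.5
lag = int(tau / dt)
W = [np.array([[0.3, -0.2], [0.1, 0.2]]),      # rule-1 connection weights (assumed)
     np.array([[0.2, 0.1], [-0.1, 0.3]])]      # rule-2 connection weights (assumed)
f = np.tanh                                     # activation
amp = lambda x: 1.0 + x ** 2                    # unbounded amplification a(x)
beh = lambda x: 2.0 * x                         # behaved (rate) function b(x)

hist = np.zeros((steps + lag + 1, n))
hist[: lag + 1] = np.array([1.5, -1.0])         # constant initial history on [-tau, 0]

for k in range(lag + 1, steps + lag + 1):
    x, x_d = hist[k - 1], hist[k - 1 - lag]
    h1 = 1.0 / (1.0 + np.exp(-x[0]))            # normalized fuzzy memberships
    h = np.array([h1, 1.0 - h1])
    coupling = h[0] * W[0] @ f(x_d) + h[1] * W[1] @ f(x_d)
    hist[k] = x + dt * (-amp(x) * (beh(x) - coupling))

print("state after", steps * dt, "s:", hist[-1])
```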

    Mean Square Exponential Stability of Stochastic Cohen-Grossberg Neural Networks with Unbounded Distributed Delays

    This paper addresses the issue of mean square exponential stability of stochastic Cohen-Grossberg neural networks (SCGNN) whose state variables are described by stochastic nonlinear integrodifferential equations. With the help of Lyapunov functions, stochastic analysis, and inequality techniques, some novel sufficient conditions for mean square exponential stability of SCGNN are given. Furthermore, we also establish some sufficient conditions for checking the exponential stability of Cohen-Grossberg neural networks with unbounded distributed delays.
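    As a numerical companion, the sketch below Monte-Carlo estimates the mean square E|X(t)|^2 for a scalar stochastic Cohen-Grossberg-type equation whose unbounded distributed delay uses an exponentially fading kernel, handled with the linear chain trick; the coefficients and kernel are assumptions for illustration, not the paper's sufficient conditions.

```python
import numpy as np

# Hedged sketch: Monte Carlo estimate of E|X(t)|^2 for a scalar stochastic
# Cohen-Grossberg-type equation with an exponentially fading distributed delay,
# y(t) = integral_0^inf lam * exp(-lam*s) * f(x(t-s)) ds, realized through the
# auxiliary ODE dy/dt = lam * (f(x) - y) ("linear chain trick").
# All coefficients are illustrative, not the paper's conditions.
rng = np.random.default_rng(2)
dt, T, n_paths = 1e-3, 10.0, 2000
steps = int(T / dt)
lam, b, w, sigma = 2.0, 3.0, 0.8, 0.3           # assumed parameters
f = np.tanh

x = np.full(n_paths, 1.0)                       # initial state of each sample path
y = f(x)                                        # distributed-delay term at t = 0
ms = np.empty(steps)

for k in range(steps):
    amp = 1.0 + 0.5 * np.abs(x)                 # amplification function
    drift = -amp * (b * x - w * y)
    x = x + drift * dt + sigma * x * np.sqrt(dt) * rng.standard_normal(n_paths)
    y = y + lam * (f(x) - y) * dt
    ms[k] = np.mean(x ** 2)

# A roughly linear decay of log E|X(t)|^2 indicates mean square exponential stability.
t = np.arange(1, steps + 1) * dt
print("estimated mean square decay exponent:", np.polyfit(t, np.log(ms), 1)[0])
```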

    Pth Moment Exponential Stability of Stochastic Neural Networks with Time-Varying and Distributed Delays under Nonlinear Impulsive Perturbations

    This paper investigates the problem of pth moment exponential stability for a class of stochastic neural networks with time-varying delays and distributed delays under nonlinear impulsive perturbations. By means of Lyapunov functionals, stochastic analysis, and differential inequality techniques, criteria for pth moment exponential stability of this model are derived. The results of this paper are completely new and complement and improve some of the previously known results (Stamova and Ilarionov (2010), Zhang et al. (2005), Li (2010), Ahmed and Stamova (2008), Huang et al. (2008), Huang et al. (2008), and Stamova (2009)). An example is employed to illustrate the feasibility of our results.
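    As a rough numerical companion, the sketch below estimates the pth moment E|X(t)|^p by Monte Carlo for a scalar stochastic delay equation with impulsive state scaling at fixed times; the equation, impulse rule, and parameters are illustrative stand-ins rather than the class treated in the paper.

```python
import numpy as np

# Hedged sketch: Monte Carlo estimate of the p-th moment E|X(t)|^p for a scalar
# stochastic delay equation dX = (-a*X(t) + b*X(t - tau)) dt + sigma*X(t) dW(t)
# with impulses X(t_k+) = rho * X(t_k) at t_k = k*h. Parameters are assumed,
# not the paper's model; a roughly linear decay of log E|X(t)|^p suggests
# p-th moment exponential stability despite the impulsive perturbations.
rng = np.random.default_rng(3)
p = 4
a, b, sigma, tau, rho, h = 2.0, 0.5, 0.3, 0.2, 1.2, 0.5
dt, T, n_paths = 1e-3, 10.0, 1000
steps, lag, imp_every = int(T / dt), int(tau / dt), int(h / dt)

hist = np.ones((steps + lag + 1, n_paths))      # rows: time grid, cols: sample paths
moment = np.empty(steps)

for k in range(steps):
    i = k + lag + 1
    x, x_d = hist[i - 1], hist[i - 1 - lag]
    drift = -a * x + b * x_d
    x_new = x + drift * dt + sigma * x * np.sqrt(dt) * rng.standard_normal(n_paths)
    if (k + 1) % imp_every == 0:                # impulsive perturbation at t_k
        x_new = rho * x_new
    hist[i] = x_new
    moment[k] = np.mean(np.abs(x_new) ** p)

t = np.arange(1, steps + 1) * dt
print("estimated p-th moment decay exponent:", np.polyfit(t, np.log(moment), 1)[0])
```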