
    Multi-almost periodicity and invariant basins of general neural networks under almost periodic stimuli

    In this paper, we investigate the convergence dynamics of 2^N almost periodic encoded patterns in general neural networks (GNNs) subjected to external almost periodic stimuli, including almost periodic delays. Invariant regions are established for the existence of 2^N almost periodic encoded patterns under two classes of activation functions. By employing the property of the ℳ-cone and an inequality technique, attracting basins are estimated and criteria are derived for the networks to converge exponentially toward the 2^N almost periodic encoded patterns. The obtained results are new; they extend and generalize corresponding results in the previous literature.
    Comment: 28 pages, 4 figures

    Global exponential synchronization of quaternion-valued memristive neural networks with time delays

    This paper extends memristive neural networks (MNNs) to the quaternion field: a new class of networks, quaternion-valued memristive neural networks (QVMNNs), is established, and the drive-response global synchronization problem for this type of network is investigated. Two cases are considered: one with the conventional differential inclusion assumption, the other without. Criteria for global synchronization in both cases are obtained by appropriately choosing Lyapunov functionals and applying inequality techniques. Finally, simulation examples are presented to demonstrate the correctness of the derived results
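    The quaternion setting rests on the non-commutative Hamilton product, which is why commutative real- or complex-valued stability arguments cannot be reused directly. A minimal illustrative sketch (not code from the paper, which concerns the network dynamics rather than the algebra itself):

    ```python
    # Hedged illustration (not from the paper): the Hamilton product, the
    # non-commutative multiplication underlying quaternion-valued networks.
    # A quaternion is represented as a tuple (w, x, y, z) = w + x*i + y*j + z*k.
    def hamilton(p, q):
        w1, x1, y1, z1 = p
        w2, x2, y2, z2 = q
        return (w1*w2 - x1*x2 - y1*y2 - z1*z2,   # real part
                w1*x2 + x1*w2 + y1*z2 - z1*y2,   # i component
                w1*y2 - x1*z2 + y1*w2 + z1*x2,   # j component
                w1*z2 + x1*y2 - y1*x2 + z1*w2)   # k component
    ```

    Because i·j = k while j·i = −k, the order of factors matters throughout the synchronization analysis.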

    Nonlinear dynamics of full-range CNNs with time-varying delays and variable coefficients

    In this article, the dynamical behaviours of full-range cellular neural networks (FRCNNs) with variable coefficients and time-varying delays are considered. First, an improved model of the FRCNNs is proposed, and the existence and uniqueness of the solution are studied by means of differential inclusions and set-valued analysis. Second, by using the Hardy inequality, matrix analysis, and the Lyapunov functional method, criteria for global exponential stability (GES) are derived. Finally, examples are provided to verify the correctness of the theoretical results

    Finite-time stabilization for fractional-order inertial neural networks with time varying delays

    This paper deals with the finite-time stabilization of fractional-order inertial neural networks with time-varying delays (FOINNs). First, through a properly chosen variable substitution, the system is transformed into a first-order fractional differential equation. Second, by constructing Lyapunov functionals and using analytical techniques, together with new control schemes (including delay-dependent and delay-free controllers), novel and effective criteria are established for the finite-time stabilization of the addressed system. Finally, two examples illustrate the effectiveness and feasibility of the obtained results
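    The order-reduction step can be sketched as follows; this is the standard substitution for inertial systems, and the paper's exact choice of variables and coefficients may differ.

    ```latex
    % Illustrative reduced-order substitution (the paper's exact
    % transformation may differ). Given the inertial dynamics
    %   D^{2\alpha} x_i(t) = -a_i D^{\alpha} x_i(t) - b_i x_i(t) + \dots,
    % introduce y_i(t) = D^{\alpha} x_i(t) + \xi_i x_i(t) with \xi_i > 0, so that
    \begin{aligned}
      D^{\alpha} x_i(t) &= -\xi_i\, x_i(t) + y_i(t),\\
      D^{\alpha} y_i(t) &= -(a_i - \xi_i)\, y_i(t)
                          - \bigl(b_i - \xi_i (a_i - \xi_i)\bigr)\, x_i(t) + \dots
    \end{aligned}
    ```

    The result is a first-order system in (x_i, y_i), which is the form amenable to Lyapunov-functional arguments.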

    Contrastive learning and neural oscillations

    The concept of Contrastive Learning (CL) is developed as a family of possible learning algorithms for neural networks. CL is an extension of Deterministic Boltzmann Machines to more general dynamical systems. During learning, the network oscillates between two phases. One phase has a teacher signal and one phase has no teacher signal. The weights are updated using a learning rule that corresponds to gradient descent on a contrast function that measures the discrepancy between the free network and the network with a teacher signal. The CL approach provides a general unified framework for developing new learning algorithms. It also shows that many different types of clamping and teacher signals are possible. Several examples are given and an analysis of the landscape of the contrast function is proposed with some relevant predictions for the CL curves. An approach that may be suitable for collective analog implementations is described. Simulation results and possible extensions are briefly discussed together with a new conjecture regarding the function of certain oscillations in the brain. In the appendix, we also examine two extensions of contrastive learning to time-dependent trajectories
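    The two-phase weight update described above can be sketched concretely. The following is a hedged illustration in the style of (deterministic) Boltzmann-machine learning, not the paper's exact algorithm: the network is settled twice, once with the teacher signal clamped and once free-running, and the weights move toward the clamped-phase correlations and away from the free-phase ones.

    ```python
    import numpy as np

    # Hedged sketch of a contrastive weight update (illustrative; the paper
    # develops a whole family of such rules).  s_clamped and s_free are the
    # unit-state vectors reached in the teacher phase and the free phase.
    def contrastive_update(W, s_clamped, s_free, lr=0.1):
        """One gradient-descent step on a contrast function that measures the
        discrepancy between the clamped and free phases."""
        return W + lr * (np.outer(s_clamped, s_clamped)
                         - np.outer(s_free, s_free))
    ```

    When the free phase already reproduces the teacher signal, the two outer products cancel and learning stops, which is the fixed-point condition behind the contrast function.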

    Pulse shape and voltage-dependent synchronization in spiking neuron networks

    Pulse-coupled spiking neural networks are a powerful tool to gain mechanistic insights into how neurons self-organize to produce coherent collective behavior. These networks use simple spiking neuron models, such as the θ-neuron or the quadratic integrate-and-fire (QIF) neuron, that replicate the essential features of real neural dynamics. Interactions between neurons are modeled with infinitely narrow pulses, or spikes, rather than the more complex dynamics of real synapses. To make these networks biologically more plausible, it has been proposed that they must also account for the finite width of the pulses, which can have a significant impact on the network dynamics. However, the derivation and interpretation of these pulses are contradictory, and the impact of the pulse shape on the network dynamics is largely unexplored. Here, I take a comprehensive approach to pulse-coupling in networks of QIF and θ-neurons. I argue that narrow pulses activate voltage-dependent synaptic conductances and show how to implement them in QIF neurons such that their effect can last through the phase after the spike. Using an exact low-dimensional description for networks of globally coupled spiking neurons, I prove for instantaneous interactions that collective oscillations emerge due to an effective coupling through the mean voltage. I analyze the impact of the pulse shape by means of a family of smooth pulse functions with arbitrary finite width and symmetric or asymmetric shapes. For symmetric pulses, the resulting voltage-coupling is largely ineffective at synchronizing neurons, but pulses that are slightly skewed toward the phase after the spike readily generate collective oscillations. The results unveil a voltage-dependent spike synchronization mechanism in neural networks, which is facilitated by pulses of finite width and complementary to traditional synaptic transmission.
    Comment: 38 pages, 11 figures
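    The QIF model at the core of this work is simple enough to sketch directly. The following hedged illustration (not the paper's code; parameters are arbitrary) integrates a single QIF neuron, dV/dt = V² + I, with a spike-and-reset rule, which is the uncoupled building block of the networks studied above.

    ```python
    # Hedged sketch: forward-Euler integration of a single quadratic
    # integrate-and-fire (QIF) neuron, dV/dt = V^2 + I, with reset.
    # The input current I and the peak/reset values are illustrative.
    def simulate_qif(I=1.0, dt=1e-4, T=10.0, v_peak=100.0, v_reset=-100.0):
        v, spikes = 0.0, []
        for step in range(int(T / dt)):
            v += dt * (v * v + I)      # QIF membrane dynamics
            if v >= v_peak:            # spike: record time, reset voltage
                spikes.append(step * dt)
                v = v_reset
        return spikes
    ```

    For constant I > 0 the neuron fires periodically; in the limit v_peak → ∞, v_reset → −∞ the interspike interval approaches π/√I, the well-known QIF firing period. Coupling enters by adding the (pulse-shaped) network input to I.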

    Computation and Learning in High Dimensions (hybrid meeting)

    The most challenging problems in science often involve the learning and accurate computation of high-dimensional functions. High dimensionality is a typical feature of a multitude of problems in various areas of science. The so-called curse of dimensionality typically negates the use of traditional numerical techniques for the solution of high-dimensional problems. Instead, novel theoretical and computational approaches need to be developed to make them tractable and to capture fine resolutions and relevant features. Paradoxically, increasing computational power may even serve to heighten this demand, since the wealth of new computational data itself becomes a major obstruction. Extracting essential information from complex problem-inherent structures and developing rigorous models to quantify the quality of information in a high-dimensional setting pose challenging tasks from both theoretical and numerical perspectives. This has led to the emergence of several new computational methodologies, accounting for the fact that well-understood methods based on spatial localization and mesh refinement are, in their original form, no longer viable. Common to these approaches is the nonlinearity of the solution method. For certain problem classes, these methods have drastically advanced the frontiers of computability. The most visible of these new methods is deep learning. Although the use of deep neural networks has been extremely successful in certain application areas, their mathematical understanding is far from complete. This workshop aimed to deepen the understanding of the underlying mathematical concepts that drive this new evolution of computational methods and to promote the exchange of ideas emerging in various disciplines about how to treat multiscale and high-dimensional problems
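    The curse of dimensionality mentioned above can be made concrete with a back-of-the-envelope count (an illustration, not taken from the report): a tensor-product grid with n points per axis requires n^d points in d dimensions, so mesh-based methods become infeasible long before d reaches the dimensions typical of learning problems.

    ```python
    # Illustration (not from the report): tensor-product grids grow
    # exponentially with the dimension d -- the "curse of dimensionality".
    def grid_points(n_per_axis, d):
        """Number of nodes in a d-dimensional grid with n points per axis."""
        return n_per_axis ** d

    # 10 points per axis: 100 nodes in 2-D, but 10^20 nodes in 20-D --
    # far beyond any feasible storage or computation.
    ```

    This exponential growth is precisely why the nonlinear methods discussed at the workshop replace uniform discretization with adaptive, structure-exploiting representations.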