
    Multi-almost periodicity and invariant basins of general neural networks under almost periodic stimuli

    In this paper, we investigate the convergence dynamics of 2^N almost periodic encoded patterns of general neural networks (GNNs) subjected to external almost periodic stimuli, including almost periodic delays. Invariant regions are established for the existence of 2^N almost periodic encoded patterns under two classes of activation functions. By employing the property of the \mathscr{M}-cone together with inequality techniques, attracting basins are estimated and criteria are derived under which the networks converge exponentially toward the 2^N almost periodic encoded patterns. The obtained results are new; they extend and generalize corresponding results in the previous literature. Comment: 28 pages, 4 figures

    The Power of Linear Recurrent Neural Networks

    Recurrent neural networks are a powerful means of coping with time series. We show how a type of linearly activated recurrent neural network, which we call predictive neural networks, can approximate any time-dependent function f(t) given by a number of function values. The approximation can be learned effectively by simply solving a linear equation system; no backpropagation or similar methods are needed. Furthermore, the network size can be reduced by keeping only the most relevant components. Thus, in contrast to other approaches, ours learns not only the network weights but also the network architecture. The networks have interesting properties: they settle into elliptical trajectories in the long run and allow the prediction of further values as well as compact representations of functions. We demonstrate this in several experiments, among them multiple superimposed oscillators (MSO), robotic soccer, and stock-price prediction. Predictive neural networks outperform the previous state of the art on the MSO task with a minimal number of units. Comment: 22 pages, 14 figures and tables, revised implementation
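The central claim of this abstract, that a linear recurrence can be learned by solving a linear equation system rather than by backpropagation, can be sketched roughly as follows. This is a minimal illustration under an assumed delay-embedding state, not the paper's exact construction; all function names and parameters here are our own.

```python
import numpy as np

def fit_linear_recurrence(values, dim=4):
    """Fit a matrix M so that the next delay-embedded state is
    approximately M @ current state, via one least-squares solve."""
    # Delay-embed the scalar series into overlapping state vectors.
    states = np.array([values[i:i + dim] for i in range(len(values) - dim + 1)])
    X, Y = states[:-1], states[1:]
    # Solve X @ W ~= Y in a single linear least-squares step (no backprop).
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W.T  # so that next_state ~= M @ state

def predict(M, seed, steps):
    """Iterate the learned linear recurrence to extrapolate the series."""
    state = np.array(seed, dtype=float)
    out = []
    for _ in range(steps):
        state = M @ state
        out.append(state[-1])  # last window entry is the new prediction
    return out

# A pure sinusoid satisfies a linear recurrence exactly, so the
# extrapolation error stays tiny.
t = np.arange(40)
series = np.sin(0.3 * t)
M = fit_linear_recurrence(series, dim=4)
preds = predict(M, series[-4:], 5)
print(preds)
```

Fitting reduces to a single `lstsq` call because the model is linear in its weights; pruning to the "most relevant components" mentioned in the abstract would correspond to a low-rank truncation of the fitted matrix.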

    Almost periodic solutions of retarded SICNNs with functional response on piecewise constant argument

    We consider a new model of shunting inhibitory cellular neural networks: retarded functional differential equations with piecewise constant argument. The existence and exponential stability of almost periodic solutions are investigated, and an illustrative example is provided. Comment: 24 pages, 1 figure

    Vibrational control of chaos in artificial neural networks

    Neural networks with chaotic baseline behavior are interesting both for their biological relevance and for their engineering applicability. On the engineering side, the literature still lacks a robust study of the interrelationship between particular chaotic baseline network dynamics and 'online' or 'driving' inputs. We ask: for a particular neural network with chaotic baseline behavior, what periodic inputs of minimal magnitude have a stabilizing effect on the network dynamics? A genetic algorithm is developed for this task. A systematic comparison of different genetic operators is carried out, in which each operator combination is ranked by the optimality of the solutions it finds. The algorithm reaches acceptable results and finds input sequences whose largest elements are on the order of 10^3. Lastly, the complexity of the fitness landscape is illustrated by brute-force sampling of period-2 inputs and plotting a fitness map of their stabilizing effect on the network
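The search procedure described above can be sketched with a toy analogue: a mutation-plus-selection genetic algorithm looking for small period-2 additive inputs that drive a chaotic map onto a periodic orbit. As an assumption for illustration, we use the logistic map in place of the paper's neural network; the fitness function and all parameters below are our own, not the authors'.

```python
import numpy as np

rng = np.random.default_rng(0)
R = 3.9  # logistic-map growth rate in the chaotic regime (stand-in for the network)

def fitness(u, burn=200, horizon=100, penalty=0.01):
    """Lower is better: deviation from period-2 behavior in the driven
    orbit, plus a penalty favoring inputs of minimal magnitude."""
    x = 0.5
    tail = []
    for t in range(burn + horizon):
        x = float(np.clip(R * x * (1 - x) + u[t % 2], 0.0, 1.0))
        if t >= burn:
            tail.append(x)
    tail = np.array(tail)
    period2_error = np.mean((tail[2:] - tail[:-2]) ** 2)
    return period2_error + penalty * np.abs(u).sum()

def evolve(pop_size=40, gens=60, sigma=0.05):
    """Keep the best quarter each generation; refill with mutated copies."""
    pop = rng.normal(0.0, 0.1, size=(pop_size, 2))  # period-2 input pairs (u0, u1)
    for _ in range(gens):
        scores = np.array([fitness(u) for u in pop])
        elite = pop[np.argsort(scores)[: pop_size // 4]]
        kids = elite[rng.integers(len(elite), size=pop_size - len(elite))]
        pop = np.vstack([elite, kids + rng.normal(0.0, sigma, kids.shape)])
    scores = np.array([fitness(u) for u in pop])
    return pop[scores.argmin()], float(scores.min())

best_u, best_score = evolve()
print("best period-2 input:", best_u, "fitness:", best_score)
```

The magnitude penalty is what encodes the paper's "inputs of minimal magnitude" requirement; comparing different genetic operators would amount to swapping out the mutation and selection steps in `evolve` and ranking the resulting best scores.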

    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous-time computation. These theories allow us to understand both the hardness of questions related to continuous-time dynamical systems and the computational power of continuous-time analog models. We survey the existing models, summarize the results, and point to relevant references in the literature

    Perspectives on Multi-Level Dynamics

    As physics did in previous centuries, economics, sociology, and neuroscience currently share the dream of extracting generic laws of nature by reducing the description of phenomena to a minimal set of variables and parameters, linked together by causal evolution equations whose structure may reveal hidden principles. This requires a huge reduction of dimensionality (the number of degrees of freedom) and a change in the level of description. Beyond the mere necessity of developing accurate techniques to achieve this reduction, there is the question of the correspondence between the initial system and the reduced one. In this paper, we offer a perspective on a common framework for discussing and understanding multi-level systems that exhibit structure at various spatial and temporal levels. We propose a common foundation and illustrate it with examples from different fields. We also point out the difficulties of constructing such a general setting, and its limitations