A delay-dependent LMI approach to dynamics analysis of discrete-time recurrent neural networks with time-varying delays
This is the postprint version of the article; the official published version can be obtained from the link below. Copyright 2007 Elsevier Ltd. In this Letter, the analysis problem for the existence and stability of periodic solutions is investigated for a class of general discrete-time recurrent neural networks with time-varying delays. For the neural networks under study, a generalized activation function is considered, and the traditional assumptions on the boundedness, monotonicity, and differentiability of the activation functions are removed. By employing the free-weighting matrix method, an appropriate Lyapunov–Krasovskii functional is constructed and several sufficient conditions are established to ensure the existence, uniqueness, and global exponential stability of the periodic solution for the addressed neural network. The conditions depend on both the lower and upper bounds of the time-varying delays. Furthermore, the conditions are expressed in terms of linear matrix inequalities (LMIs), which can be checked numerically using the LMI toolbox in MATLAB. Two simulation examples are given to show the effectiveness and reduced conservatism of the proposed criteria. This work was supported in part by the National Natural Science Foundation of China under Grant 50608072, an International Joint Project sponsored by the Royal Society of the UK and the National Natural Science Foundation of China, and the Alexander von Humboldt Foundation of Germany.
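The paper's criteria are delay-dependent LMIs, which require a semidefinite solver. As a much-simplified illustration of the underlying idea (not the paper's method), a delay-free discrete-time linear system x_{k+1} = A x_k is exponentially stable exactly when the spectral radius of A is below 1, which a short pure-Python script can estimate by power iteration; the matrix A below is a made-up example:

```python
# Simplified stability check for x_{k+1} = A x_k (illustrative only:
# the paper's delay-dependent LMI conditions are far more general).

def mat_vec(A, x):
    """Multiply matrix A (list of rows) by vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def norm(x):
    return sum(xi * xi for xi in x) ** 0.5

def spectral_radius(A, iters=200):
    """Estimate the dominant eigenvalue magnitude by power iteration."""
    x = [1.0] * len(A)
    for _ in range(iters):
        y = mat_vec(A, x)
        n = norm(y)
        x = [yi / n for yi in y]
    return norm(mat_vec(A, x))

# Example matrix with real eigenvalues 0.573 and 0.227 (both inside
# the unit circle), so the system is exponentially stable.
A = [[0.5, 0.2],
     [0.1, 0.3]]
rho = spectral_radius(A)
print(rho < 1.0)  # True: exponentially stable
```

In the delayed, time-varying setting of the paper this spectral test no longer suffices, which is why Lyapunov–Krasovskii functionals and LMI feasibility checks are used instead.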
Multi-almost periodicity and invariant basins of general neural networks under almost periodic stimuli
In this paper, we investigate the convergence dynamics of almost periodic
encoded patterns of general neural networks (GNNs) subjected to external almost
periodic stimuli, including almost periodic delays. Invariant regions are
established for the existence of almost periodic encoded patterns under
two classes of activation functions. By employing a cone property and an
inequality technique, attracting basins are estimated
and some criteria are derived for the networks to converge exponentially toward
almost periodic encoded patterns. The obtained results are new; they
extend and generalize the corresponding results in the existing
literature. Comment: 28 pages, 4 figures
The Power of Linear Recurrent Neural Networks
Recurrent neural networks are a powerful means to cope with time series. We
show how a type of linearly activated recurrent neural networks, which we call
predictive neural networks, can approximate any time-dependent function f(t)
given by a number of function values. The approximation can effectively be
learned by simply solving a linear equation system; no backpropagation or
similar methods are needed. Furthermore, the network size can be reduced by
taking only the most relevant components. Thus, in contrast to other approaches,
ours not only learns the network weights but also the network architecture. The networks
have interesting properties: they settle into elliptical trajectories in the long
run and allow the prediction of further values and compact representations of
functions. We demonstrate this by several experiments, among them multiple
superimposed oscillators (MSO), robotic soccer, and predicting stock prices.
Predictive neural networks outperform the previous state-of-the-art for the MSO
task with a minimal number of units. Comment: 22 pages, 14 figures and tables, revised implementation
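The abstract's central claim is that the approximation can be learned by solving a linear equation system rather than by backpropagation. A minimal sketch of that idea, assuming a toy scalar order-2 recurrence f(t+1) = a·f(t) + b·f(t-1) in place of the full predictive network, fits the coefficients by least squares via the 2×2 normal equations:

```python
import math

# Toy stand-in for the paper's approach: learn a linear recurrence by
# solving a small linear system. A pure cosine satisfies
# f(t+1) = 2*cos(w)*f(t) - f(t-1) exactly, so the fit should recover
# a = 2*cos(w) and b = -1.

def fit_recurrence(xs):
    """Least-squares fit of (a, b) in x[t+1] = a*x[t] + b*x[t-1],
    solved via the 2x2 normal equations (Cramer's rule)."""
    s11 = s12 = s22 = r1 = r2 = 0.0
    for t in range(1, len(xs) - 1):
        u, v, y = xs[t], xs[t - 1], xs[t + 1]
        s11 += u * u; s12 += u * v; s22 += v * v
        r1 += u * y; r2 += v * y
    det = s11 * s22 - s12 * s12
    a = (r1 * s22 - r2 * s12) / det
    b = (s11 * r2 - s12 * r1) / det
    return a, b

w = 0.3
xs = [math.cos(w * t) for t in range(100)]
a, b = fit_recurrence(xs)
print(round(a, 4), round(b, 4))  # 1.9107 -1.0  (= 2*cos(0.3), -1)
```

Once fitted, the recurrence can be iterated forward from the last two observed values to predict future samples, which mirrors how the paper's predictive networks extrapolate a time series.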
Almost periodic solutions of retarded SICNNs with functional response on piecewise constant argument
We consider a new model for shunting inhibitory cellular neural networks (SICNNs):
retarded functional differential equations with piecewise constant argument.
The existence and exponential stability of almost periodic solutions are
investigated. An illustrative example is provided. Comment: 24 pages, 1 figure
Vibrational control of chaos in artificial neural networks
Neural networks with chaotic baseline behavior are interesting for both their biological relevance and their engineering applicability. In the engineering case, the literature still lacks a robust study of the interrelationship between particular chaotic baseline network dynamics and 'online' or 'driving' inputs. We ask the question: for a particular neural network with chaotic baseline behavior, what periodic inputs of minimal magnitude have a stabilizing effect on the network dynamics? A genetic algorithm is developed for the task. A systematic comparison of different genetic operators is carried out, where each operator combination is ranked by the optimality of the solutions found. The algorithm reaches acceptable results and finds input sequences with largest elements on the order of 10^3. Lastly, an illustration of the complexity of the fitness space is produced by brute-force sampling of period-2 inputs and plotting a fitness map of their stabilizing effect on the network.
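The search described above can be sketched with a very reduced evolutionary loop. The following is a minimal, mutation-and-elitism-only sketch, and everything in it is an assumption for illustration: the chaotic system is a 1-D logistic map (not the paper's neural network), and the fitness combines trajectory variance with a penalty on input magnitude, standing in for the paper's notion of a minimal-magnitude stabilizing periodic input:

```python
import random

# Toy sketch of the paper's search: an evolutionary algorithm looks for
# a small-magnitude period-2 input (u0, u1) that damps the chaotic
# logistic map x_{k+1} = r*x_k*(1 - x_k) + u_k. Map, fitness, and GA
# settings are illustrative assumptions, not the paper's setup.

random.seed(0)
R = 3.9  # chaotic regime of the logistic map

def fitness(u):
    """Trajectory variance plus a magnitude penalty (lower is better)."""
    x, xs = 0.5, []
    for k in range(300):
        x = R * x * (1 - x) + u[k % 2]
        x = min(max(x, 0.0), 1.0)   # keep the state in [0, 1]
        if k >= 100:                # skip the transient
            xs.append(x)
    mean = sum(xs) / len(xs)
    var = sum((v - mean) ** 2 for v in xs) / len(xs)
    return var + 0.1 * (abs(u[0]) + abs(u[1]))

def mutate(u):
    return tuple(v + random.gauss(0, 0.02) for v in u)

pop = [(random.uniform(-0.2, 0.2), random.uniform(-0.2, 0.2))
       for _ in range(30)]
for gen in range(40):
    pop.sort(key=fitness)
    elite = pop[:10]                # elitism: keep the best candidates
    pop = elite + [mutate(random.choice(elite)) for _ in range(20)]

best = min(pop, key=fitness)
print("best input:", tuple(round(v, 3) for v in best),
      "fitness:", round(fitness(best), 4))
```

The paper goes further by comparing different genetic operators (crossover variants, selection schemes) and by brute-force mapping the fitness landscape of period-2 inputs; this sketch shows only the basic evaluate-select-mutate loop.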
A Survey on Continuous Time Computations
We provide an overview of theories of continuous time computation. These
theories allow us to understand both the hardness of questions related to
continuous time dynamical systems and the computational power of continuous
time analog models. We survey the existing models, summarize results, and
point to relevant references in the literature.
Perspectives on Multi-Level Dynamics
As physics did in previous centuries, economics, sociology, and neuroscience
currently share the dream of extracting generic laws of nature by
reducing the description of phenomena to a minimal set of variables and
parameters, linked together by causal equations of evolution whose structure
may reveal hidden principles. This requires a huge reduction of dimensionality
(number of degrees of freedom) and a change in the level of description. Beyond
the mere necessity of developing accurate techniques affording this reduction,
there is the question of the correspondence between the initial system and the
reduced one. In this paper, we offer a perspective towards a common framework
for discussing and understanding multi-level systems exhibiting structures at
various spatial and temporal levels. We propose a common foundation and
illustrate it with examples from different fields. We also point out the
difficulties in constructing such a general setting and its limitations.