1,373 research outputs found

    Almost periodic solutions of retarded SICNNs with functional response on piecewise constant argument

    We consider a new model for shunting inhibitory cellular neural networks (SICNNs): retarded functional differential equations with piecewise constant argument. The existence and exponential stability of almost periodic solutions are investigated. An illustrative example is provided. Comment: 24 pages, 1 figure
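    A minimal sketch of this class of model, assuming a 2-cell SICNN with the piecewise constant argument gamma(t) = floor(t) and illustrative parameters (the paper's generalized-type argument and coefficient values are not reproduced here):

```python
import math

def simulate_sicnn(T=20.0, dt=0.01):
    """Euler sketch of a 2-cell shunting inhibitory CNN,
        x_i'(t) = -a_i x_i(t) - [sum_j C_ij f(x_j(gamma(t)))] x_i(t) + L_i,
    with the piecewise constant argument gamma(t) = floor(t).
    All parameter values are illustrative, not taken from the paper."""
    a = [1.0, 1.2]                     # passive decay rates
    C = [[0.0, 0.1], [0.1, 0.0]]       # shunting inhibition strengths
    L = [0.5, 0.4]                     # external inputs
    f = math.tanh                      # activation function
    x = [0.0, 0.0]
    x_floor = list(x)                  # x evaluated at gamma(t) = floor(t)
    k_cur = 0
    for step in range(int(T / dt)):
        t = step * dt
        k = int(t + 1e-9)              # floor(t) for t >= 0
        if k != k_cur:                 # crossed an integer: refresh x(gamma(t))
            x_floor, k_cur = list(x), k
        dx = [-a[i] * x[i]
              - sum(C[i][j] * f(x_floor[j]) for j in range(2)) * x[i]
              + L[i] for i in range(2)]
        x = [x[i] + dt * dx[i] for i in range(2)]
    return x

state = simulate_sicnn()
```

    With constant coefficients the trajectory settles near an equilibrium set by the inputs L_i; the almost periodic behavior studied in the paper arises for time-varying coefficients, which this sketch omits.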

    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous-time computation. These theories allow us to understand both the hardness of questions related to continuous-time dynamical systems and the computational power of continuous-time analog models. We survey the existing models, summarize their results, and point to relevant references in the literature.

    Existence and Exponential Stability of Solutions for Stochastic Cellular Neural Networks with Piecewise Constant Argument

    By using the concept of differential equations with piecewise constant argument of generalized type, a model of stochastic cellular neural networks with piecewise constant argument is developed. Sufficient conditions are obtained for the existence and uniqueness of the equilibrium point of the addressed neural networks. The pth-moment exponential stability is investigated by means of Lyapunov functionals, stochastic analysis, and inequality techniques. The results in this paper improve and generalize some of the previous ones. An example with numerical simulations is given to illustrate our results.
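    The stochastic variant can be sketched with an Euler-Maruyama step; the single unit, parameter values, and multiplicative-noise form below are illustrative assumptions, not the paper's model:

```python
import math
import random

def simulate_scnn(T=10.0, dt=0.01, sigma=0.1, seed=0):
    """Euler-Maruyama sketch of one stochastic cellular-network unit,
        dx = [-a x + b f(x(gamma(t))) + I] dt + sigma x dW,
    with the piecewise constant argument gamma(t) = floor(t).
    Parameters are illustrative, not taken from the paper."""
    random.seed(seed)
    a, b, I = 1.5, 0.5, 0.3            # decay, feedback gain, external input
    f = math.tanh
    x, x_floor, k_cur = 0.0, 0.0, 0
    sqdt = math.sqrt(dt)
    for step in range(int(T / dt)):
        k = int(step * dt + 1e-9)      # floor(t)
        if k != k_cur:                 # crossed an integer: refresh x(gamma(t))
            x_floor, k_cur = x, k
        drift = -a * x + b * f(x_floor) + I
        x += drift * dt + sigma * x * sqdt * random.gauss(0.0, 1.0)
    return x

x_final = simulate_scnn()
```

    This produces a single sample path hovering near the deterministic equilibrium; the paper's pth-moment stability statements concern ensembles of such paths.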

    Collective stability of networks of winner-take-all circuits

    The neocortex has a remarkably uniform neuronal organization, suggesting that common principles of processing are employed throughout its extent. In particular, the patterns of connectivity observed in the superficial layers of the visual cortex are consistent with the recurrent excitation and inhibitory feedback required for cooperative-competitive circuits such as the soft winner-take-all (WTA). WTA circuits offer interesting computational properties such as selective amplification, signal restoration, and decision making. However, these properties depend on the signal gain derived from positive feedback, so there is a critical trade-off between providing feedback strong enough to support these sophisticated computations and maintaining overall circuit stability. We consider the question of how to reason about stability in very large distributed networks of such circuits. We approach this problem by approximating the regular cortical architecture as many interconnected cooperative-competitive modules. We demonstrate that, by properly understanding the behavior of this small computational module, one can reason about the stability and convergence of very large networks composed of these modules. We obtain parameter ranges in which the WTA circuit operates in a high-gain regime, is stable, and can be aggregated arbitrarily to form large stable networks. We use nonlinear Contraction Theory to establish conditions for stability in the fully nonlinear case and verify these solutions using numerical simulations. The derived bounds allow modes of operation in which the WTA network is multi-stable and exhibits state-dependent persistent activities. Our approach is sufficiently general to reason systematically about the stability of any network, biological or technological, composed of networks of small modules that express competition through shared inhibition. Comment: 7 figures
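    A minimal rate-model sketch of one such soft WTA module, in which excitatory units compete through a single shared inhibitory unit; the gains below are illustrative assumptions, and the paper's contraction-theoretic parameter bounds are not reproduced:

```python
def soft_wta(inputs, T=30.0, dt=0.01):
    """Euler sketch of a soft winner-take-all rate model: excitatory units
    with self-excitation alpha compete through one shared inhibitory unit
    with gain beta. Parameter values are illustrative assumptions."""
    alpha, beta = 1.2, 2.0             # self-excitation and inhibition gains
    relu = lambda v: v if v > 0.0 else 0.0
    x = [0.0] * len(inputs)            # excitatory firing rates
    z = 0.0                            # shared inhibitory rate
    for _ in range(int(T / dt)):
        total = sum(x)                 # drive to the shared inhibitory unit
        x = [xi + dt * (-xi + relu(alpha * xi - beta * z + Ii))
             for xi, Ii in zip(x, inputs)]
        z += dt * (-z + total)
    return x

rates = soft_wta([1.0, 1.5, 0.8])      # unit 1 receives the strongest input
```

    With these inputs the middle unit wins and the others are suppressed toward zero, illustrating the selective amplification described above; the paper's question is how far such modules can be interconnected while preserving stability.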

    Stationary bumps in a piecewise smooth neural field model with synaptic depression

    We analyze the existence and stability of stationary pulses, or bumps, in a one-dimensional piecewise smooth neural field model with synaptic depression. The continuum dynamics is described in terms of a nonlocal integrodifferential equation, in which the integral kernel represents the spatial distribution of synaptic weights between populations of neurons whose mean firing rate is taken to be a Heaviside function of local activity. Synaptic depression dynamically reduces the strength of synaptic weights in response to increases in activity. We show that in the case of a Mexican hat weight distribution, there exists a stable bump for sufficiently weak synaptic depression. However, as synaptic depression becomes stronger, the bump becomes unstable with respect to perturbations that shift the boundary of the bump, leading to the formation of a traveling pulse. The local stability of a bump is determined by the spectrum of a piecewise linear operator that keeps track of the sign of perturbations of the bump boundary. This results in a number of differences from previous studies of neural field models with Heaviside firing rate functions, where any discontinuities appear inside convolutions, so that the resulting dynamical system is smooth. We also extend our results to the case of radially symmetric bumps in two-dimensional neural field models.
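    The bump scenario can be illustrated numerically; the kernel shape, threshold, and depression parameters below are assumptions chosen so that depression is weak and a stationary bump forms, as in the stable regime of the abstract:

```python
import numpy as np

def neural_field_bump(T=10.0, dt=0.05, L=10.0, N=201,
                      theta=0.2, beta=0.05, tau_q=20.0):
    """Euler sketch of a 1D neural field with synaptic depression,
        u_t = -u + w * (q . H(u - theta)),
        tau_q q_t = 1 - q - beta q H(u - theta),
    where w is a Mexican-hat kernel (local excitation, broader inhibition)
    and H is the Heaviside firing rate. All parameters are illustrative."""
    x = np.linspace(-L, L, N)
    dx = x[1] - x[0]
    d = x[:, None] - x[None, :]
    # Mexican hat kernel, with quadrature weight dx folded into the matrix
    W = (np.exp(-d**2) - 0.5 * np.exp(-d**2 / 4)) * dx
    u = np.exp(-x**2)                  # initial superthreshold activity hump
    q = np.ones(N)                     # depression variable (1 = undepressed)
    for _ in range(int(T / dt)):
        H = (u > theta).astype(float)
        u = u + dt * (-u + W @ (q * H))
        q = q + (dt / tau_q) * (1.0 - q - beta * q * H)
    return x, u

x, u = neural_field_bump()
bump_width = float((u > 0.2).sum()) * (x[1] - x[0])   # superthreshold width
```

    With this weak beta the superthreshold region settles to a localized stationary bump; increasing beta destabilizes the bump toward a traveling pulse, as the abstract describes.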

    How single neuron properties shape chaotic dynamics and signal transmission in random neural networks

    While most models of randomly connected networks assume nodes with simple dynamics, nodes in realistic highly connected networks, such as neurons in the brain, exhibit intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of nodes (such as single neurons) and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate units shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single units. For the case of two-dimensional rate units with strong adaptation, we find that the network exhibits a state of "resonant chaos", characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos, and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated units, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single nodes, such as synaptic filtering, refractoriness, or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic network models of the brain or, more generally, of networks of other physical or man-made complex dynamical units.
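    As a baseline for the multi-dimensional units studied here, the classic random network of simple one-dimensional rate units (which chaotic rate-network theory starts from, and which this work generalizes) can be sketched as follows; N, g, and the seed are illustrative choices:

```python
import numpy as np

def random_rate_network(N=200, g=2.0, T=50.0, dt=0.1, seed=1):
    """Euler sketch of the classic random rate network
        x_i' = -x_i + sum_j J_ij tanh(x_j),   J_ij ~ N(0, g^2 / N),
    which is chaotic for gain g > 1. This is the simple one-dimensional-unit
    baseline; N, g, and the seed are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))   # random coupling matrix
    x = rng.normal(0.0, 0.5, size=N)                   # random initial state
    trace = []
    for _ in range(int(T / dt)):
        x = x + dt * (-x + J @ np.tanh(x))
        trace.append(float(x[0]))                      # record one unit
    return np.array(trace)

trace = random_rate_network()
```

    For g < 1 the same network decays to the zero fixed point, while for g > 1 single-unit activity fluctuates chaotically; the paper replaces these one-dimensional units with multi-dimensional (e.g. adaptive) ones, producing the "resonant chaos" described above.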