19 research outputs found

    Almost periodic solutions of retarded SICNNs with functional response on piecewise constant argument

    Get PDF
    We consider a new model of shunting inhibitory cellular neural networks (SICNNs): retarded functional differential equations with piecewise constant argument. The existence and exponential stability of almost periodic solutions are investigated, and an illustrative example is provided. Comment: 24 pages, 1 figure
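    The SICNN family mentioned in this abstract has a standard cellular form that can be sketched numerically. The following is a minimal, illustrative simulation (all parameters and the forward-Euler scheme are assumptions for illustration, not taken from the paper): each cell decays at rate a_ij, receives external input L_ij, and is shunted by the activity f(x_kl) of cells in its r-neighborhood.

```python
import numpy as np

# Illustrative SICNN dynamics (parameters assumed, not from the paper):
#   dx_ij/dt = -a_ij * x_ij - [sum over N_r(i,j) of C * f(x_kl)] * x_ij + L_ij
# integrated with a simple forward-Euler step.

def simulate_sicnn(a, C, L, x0, f=np.tanh, r=1, dt=0.01, steps=1000):
    """Simulate an m-by-n SICNN grid; C holds inhibition weights, r the neighborhood radius."""
    x = x0.copy()
    m, n = x.shape
    for _ in range(steps):
        inhib = np.zeros_like(x)
        for i in range(m):
            for j in range(n):
                # shunting term: weighted activity over the r-neighborhood of cell (i, j)
                nb = x[max(i - r, 0):i + r + 1, max(j - r, 0):j + r + 1]
                inhib[i, j] = np.sum(C[i, j] * f(nb))
        x = x + dt * (-a * x - inhib * x + L)
    return x

# Example: a 3x3 grid with uniform decay, inhibition, and external input.
x_final = simulate_sicnn(a=np.ones((3, 3)), C=np.full((3, 3), 0.1),
                         L=np.ones((3, 3)), x0=np.zeros((3, 3)))
```

    With constant positive inputs the trajectories settle toward a bounded equilibrium, which is the kind of long-run behavior (here constant rather than almost periodic) that the stability analysis in the paper concerns.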

    Stability analysis for periodic solutions of fuzzy shunting inhibitory CNNs with delays

    Get PDF
    https://advancesindifferenceequations.springeropen.com/articles/10.1186/s13662-019-2321-z
    We consider fuzzy shunting inhibitory cellular neural networks (FSICNNs) with time-varying coefficients and constant delays. By virtue of the continuation theorem of coincidence degree theory and the Cauchy–Schwarz inequality, we prove the existence of periodic solutions for FSICNNs. Furthermore, by employing a suitable Lyapunov functional, we establish sufficient criteria that ensure global exponential stability of the periodic solutions. Numerical simulations that support the theoretical discussion are depicted.

    Mean almost periodicity and moment exponential stability of discrete-time stochastic shunting inhibitory cellular neural networks with time delays

    Get PDF
    By using the semi-discrete method for differential equations, a new discrete analogue of stochastic shunting inhibitory cellular neural networks (SICNNs) is formulated, which characterizes continuous-time stochastic SICNNs more accurately than the Euler scheme. First, the existence of a 2nd-mean almost periodic sequence solution of the discrete-time stochastic SICNNs is investigated with the help of the Minkowski inequality, the Hölder inequality, and Krasnoselskii's fixed point theorem. Second, the moment global exponential stability of the discrete-time stochastic SICNNs is studied using analytical techniques and proof by contradiction. Finally, two examples are given to demonstrate that the results are feasible. Through numerical simulations, we discuss the effect of stochastic perturbation on the almost periodicity and global exponential stability of the discrete-time stochastic SICNNs.
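    The accuracy advantage of semi-discretization over the Euler scheme, which this abstract claims for the network model, is easiest to see on a scalar linear test problem. The sketch below (test problem and parameters are assumptions for illustration) steps dx/dt = -a*x + u both ways: the semi-discrete step uses the exact exponential decay of the linear part over each interval, while forward Euler replaces it with 1 - a*h.

```python
import math

# Test problem: dx/dt = -a*x + u with constant input u, step size h.
#   semi-discrete: x[n+1] = e^{-a h} x[n] + (1 - e^{-a h}) / a * u   (exact for the linear part)
#   forward Euler: x[n+1] = (1 - a h) x[n] + h u                     (first-order accurate)
a, u, h, n = 2.0, 1.0, 0.1, 5
x_semi = x_euler = 0.0
for _ in range(n):
    decay = math.exp(-a * h)
    x_semi = decay * x_semi + (1 - decay) / a * u   # semi-discrete step
    x_euler = (1 - a * h) * x_euler + h * u         # forward-Euler step

# Analytic solution at t = n*h, starting from x(0) = 0.
x_true = (u / a) * (1 - math.exp(-a * h * n))
```

    On this test problem the semi-discrete iterate reproduces the analytic value to machine precision, while the Euler iterate carries an O(h) error; this is the sense in which the discrete analogue gives "a more accurate characterization" of the continuous system.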

    Analysis of global asymptotic stability and pseudo almost periodic solution of a class of chaotic neural networks

    Get PDF
    In this paper, we give sufficient conditions ensuring the existence and uniqueness of a pseudo almost periodic solution for a class of delayed chaotic neural networks. We further study the global asymptotic stability (GAS) of the considered model and establish a set of GAS criteria by constructing a new Lyapunov functional.

    Flexible Image Recognition Software Toolbox (First)

    Get PDF
    The deep convolutional neural network is a relatively new concept in the neural network field, and research to improve network performance is ongoing. These networks are used for recognizing patterns in data: they provide shift invariance and automatic extraction of local features through local receptive fields, and improved generalization through weight sharing. The main purpose of this thesis is to create the Flexible Image Recognition Software Toolbox (FIRST), a software package that allows users to build custom deep networks and also includes ready-made versions of popular deep networks, such as LeCun's LeNet-1, LeNet-5, and LeNet-7. The package is created for designing, training, and simulating deep networks, with the goal of reducing the time users need to implement any particular network. To design the package, a general modular framework is introduced in which simulation and gradient calculations are derived. This modularity and generality give users the flexibility to easily design specific complex or deep networks. FIRST includes several training algorithms, such as Resilient Backpropagation, Scaled Conjugate Gradient, and Steepest Descent. The thesis also describes the usage of the FIRST software and the design of the functions used in it, provides information on creating custom networks, and includes two sample training sessions that demonstrate how to use the software: one example is phoneme recognition in 1D speech data, the second is handwritten digit recognition in 2D images. Electrical Engineering
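    Of the training algorithms the abstract lists, Resilient Backpropagation (Rprop) has a compact, self-contained update rule. The sketch below is an illustrative implementation of that rule on a toy quadratic (the function, hyperparameters, and helper names are assumptions, not FIRST's actual API): each weight keeps its own step size, which grows while the gradient sign is stable and shrinks when it flips.

```python
import numpy as np

# Illustrative Rprop update rule (hyperparameters are the commonly used defaults):
# per-weight step sizes adapt from gradient-sign agreement, not gradient magnitude.

def rprop(grad_fn, w, steps=100, delta0=0.1, eta_plus=1.2, eta_minus=0.5,
          delta_max=50.0, delta_min=1e-6):
    delta = np.full_like(w, delta0)      # per-weight step sizes
    g_prev = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        same = g * g_prev > 0            # gradient kept its sign: grow the step
        flip = g * g_prev < 0            # gradient flipped: we overshot, shrink the step
        delta[same] = np.minimum(delta[same] * eta_plus, delta_max)
        delta[flip] = np.maximum(delta[flip] * eta_minus, delta_min)
        g = np.where(flip, 0.0, g)       # skip the weight update right after a sign change
        w = w - np.sign(g) * delta
        g_prev = g
    return w

# Toy problem: minimize f(w) = sum(w**2); gradient is 2w, minimum at the origin.
w_opt = rprop(lambda w: 2 * w, np.array([3.0, -4.0]))
```

    Because only the gradient's sign is used, Rprop is insensitive to the scale of the error surface, which is one reason toolboxes like this one offer it alongside steepest descent.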

    26th Annual Computational Neuroscience Meeting (CNS*2017): Part 3 - Meeting Abstracts - Antwerp, Belgium. 15–20 July 2017

    Get PDF
    This work was produced as part of the activities of the FAPESP Research, Dissemination and Innovation Center for Neuromathematics (grant 2013/07699-0, S. Paulo Research Foundation). NLK is supported by a FAPESP postdoctoral fellowship (grant 2016/03855-5). ACR is partially supported by a CNPq fellowship (grant 306251/2014-0).

    Visual Cortical Traveling Waves: From Spontaneous Spiking Populations to Stimulus-Evoked Models of Short-Term Prediction

    Get PDF
    Thanks to recent advances in neurotechnology, waves of activity sweeping across entire cortical regions are now routinely observed. Moreover, these waves have been found to influence neural responses as well as perception, and the responses themselves are structured as traveling waves. How exactly do these waves arise? Do they confer any computational advantages? Traveling waves represent an opportunity for an expanded theory of neural computation, in which dynamic local network activity may complement the moment-to-moment variability of our sensory experience. This thesis aims to help uncover the origin and role of traveling waves in the visual cortex through three works. In Work 1, by simulating a network of conductance-based spiking neurons with realistically large network size and synaptic density, distance-dependent horizontal axonal time delays were found to be important for the widespread emergence of spontaneous traveling waves consistent with those observed in vivo. Furthermore, these waves were found to be a dynamic mechanism of gain modulation that may explain the in vivo finding that they modulate perception. In Work 2, the Kuramoto oscillator model was formulated in the complex domain to study a network possessing distance-dependent time delays. As in Work 1, these delays produced traveling waves, and the eigenspectrum of the complex-valued delayed matrix, containing a delay operator, provided an analytical explanation of them. In Work 3, the model from Work 2 was adapted into a recurrent neural network for the task of forecasting the frames of videos, asking how such a biologically constrained model may be useful in visual computation. We found that the wave activity emergent in this network was helpful: it was tightly linked with high forecast performance, and shuffle controls abolished the waves and the performance simultaneously.
    Altogether, these works shed light on the possible origins and uses of traveling waves in the visual cortex. In particular, time delays profoundly shape the spatiotemporal dynamics into traveling waves. This was confirmed numerically (Work 1) and analytically (Work 2). In Work 3, these waves were found to aid in the dynamic computation of visual forecasting.
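    The delay-coupled Kuramoto model that Work 2 builds on can be sketched directly. The following is a minimal, illustrative simulation (network size, coupling strength, conduction speed, and the ring geometry are all assumptions for illustration, not the thesis's actual configuration) in which each oscillator sees its neighbors' phases delayed in proportion to their distance on a ring.

```python
import numpy as np

# Illustrative delay-coupled Kuramoto network on a ring (all parameters assumed):
#   dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j(t - tau_ij) - theta_i(t))
# where tau_ij grows with the ring distance between oscillators i and j.

rng = np.random.default_rng(0)
N, K, dt, steps = 32, 1.0, 0.01, 2000
speed = 5.0                                   # assumed conduction speed (units per time)
pos = np.arange(N)
dist = np.minimum(np.abs(pos[:, None] - pos[None, :]),
                  N - np.abs(pos[:, None] - pos[None, :]))   # ring distance
delay_steps = np.rint(dist / speed / dt).astype(int)         # tau_ij in integer steps

omega = rng.normal(1.0, 0.01, N)              # near-identical natural frequencies
max_d = delay_steps.max()
hist = np.tile(rng.uniform(0, 2 * np.pi, N), (max_d + 1, 1))  # phase history buffer

for t in range(steps):
    theta = hist[-1]
    # delayed[i, j] = theta_j at time t - tau_ij, via fancy indexing into the buffer
    delayed = hist[-1 - delay_steps, np.arange(N)[None, :]]
    dtheta = omega + (K / N) * np.sin(delayed - theta[:, None]).sum(axis=1)
    hist = np.vstack([hist[1:], theta + dt * dtheta])

# Kuramoto order parameter |r| in [0, 1] measures global phase synchrony; with
# distance-dependent delays, phase-lagged (wave-like) patterns can emerge instead
# of uniform synchrony.
r = np.abs(np.exp(1j * hist[-1]).mean())
```

    Without delays this network would simply synchronize; the history buffer is what lets phase offsets organize by distance, which is the wave-generating mechanism the thesis analyzes through the eigenspectrum of the delayed coupling matrix.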

    Brain-Inspired Computing

    Get PDF
    This open access book constitutes revised selected papers from the 4th International Workshop on Brain-Inspired Computing, BrainComp 2019, held in Cetraro, Italy, in July 2019. The 11 papers presented in this volume were carefully reviewed and selected for inclusion in this book. They deal with research on brain atlasing, multi-scale models and simulation, HPC and data infrastructures for neuroscience, as well as artificial and natural neural architectures.

    A Decade of Neural Networks: Practical Applications and Prospects

    Get PDF
    The Jet Propulsion Laboratory Neural Network Workshop, sponsored by NASA and DOD, brings together sponsoring agencies, active researchers, and the user community to formulate a vision for the next decade of neural network research and application prospects. While the speed and computing power of microprocessors continue to grow at an ever-increasing pace, the demand to deal intelligently and adaptively with the complex, fuzzy, and often ill-defined world around us remains largely unaddressed. Powerful, highly parallel computing paradigms such as neural networks promise to have a major impact in addressing these needs. Papers in the workshop proceedings highlight the benefits of neural networks in real-world applications compared to conventional computing techniques. Topics include fault diagnosis, pattern recognition, and multiparameter optimization.