634 research outputs found
Lyapunov-based synchronization of two coupled chaotic Hindmarsh-Rose neurons
This paper addresses the synchronization of coupled chaotic Hindmarsh-Rose neurons. A sufficient condition for self-synchronization is first obtained using the Lyapunov method. To achieve synchronization between two coupled Hindmarsh-Rose neurons when the self-synchronization condition is not satisfied, a Lyapunov-based nonlinear control law is proposed and its asymptotic stability is proved. To verify the effectiveness of the proposed method, numerical simulations are performed.
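The paper's control law is not reproduced here; as a rough illustration of the setting it studies, the sketch below integrates two identical Hindmarsh-Rose neurons with bidirectional electrical (diffusive) coupling on the membrane variable and checks that the mismatch between them shrinks. The parameter values (I = 3, r = 0.006, s = 4, x0 = -1.6), the coupling strength k = 1, and the forward-Euler step are standard-but-assumed choices, not taken from the paper.

```python
def hr_step(state, x_other, k, dt, I=3.0, r=0.006, s=4.0, x0=-1.6):
    """One forward-Euler step of a Hindmarsh-Rose neuron with
    diffusive (electrical) coupling on the membrane variable x."""
    x, y, z = state
    dx = y + 3.0 * x**2 - x**3 - z + I + k * (x_other - x)
    dy = 1.0 - 5.0 * x**2 - y
    dz = r * (s * (x - x0) - z)
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def simulate(k=1.0, dt=0.005, steps=200_000):
    """Integrate two mutually coupled identical HR neurons started from
    different states; return the initial and final |x1 - x2| mismatch."""
    n1, n2 = (0.1, 0.2, 0.3), (-1.0, 0.0, 0.2)
    err0 = abs(n1[0] - n2[0])
    for _ in range(steps):
        # simultaneous update: both steps use the pre-update states
        n1, n2 = hr_step(n1, n2[0], k, dt), hr_step(n2, n1[0], k, dt)
    return err0, abs(n1[0] - n2[0])

e0, e1 = simulate()
assert e1 < e0  # the coupled pair converges toward the synchronous state
```

For identical neurons, diffusive coupling above a critical strength makes the synchronization manifold attracting, which is the self-synchronization regime the abstract refers to; the nonlinear controller in the paper handles the case where this condition fails.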
Synchronization in Networks of Hindmarsh-Rose Neurons
Synchronization is deemed to play an important role in information processing in many neuronal systems. In this work, using a well-known technique due to Pecora and Carroll, we investigate the existence of a synchronous state and the bifurcation diagram of a network of synaptically coupled neurons described by the Hindmarsh-Rose model. Through the analysis of the bifurcation diagram, the different dynamics of the possible synchronous states are evidenced. Furthermore, the influence of the topology on the synchronization properties of the network is shown through an example.
Efficient synchronization of structurally adaptive coupled Hindmarsh-Rose neurons
The use of spikes to carry information between brain areas implies complete or partial synchronization of the neurons involved. The degree of synchronization reached by two coupled systems and the energy cost of maintaining their synchronized behaviour are highly dependent on the nature of the systems. For non-identical systems, the maintenance of a synchronized regime is energetically a costly process. In this work, we study conditions under which two non-identical electrically coupled neurons can reach an efficient regime of synchronization at low energy cost. We show that the energy consumption required to keep the synchronized regime can be spontaneously reduced if the receiving neuron has adaptive mechanisms able to bring its biological parameters closer in value to the corresponding ones in the sending neuron.
Do brain networks evolve by maximizing their information flow capacity?
We propose a working hypothesis, supported by numerical simulations, that brain networks evolve based on the principle of the maximization of their internal information flow capacity. We find that the synchronous behaviour and information flow capacity of the evolved networks reproduce well the behaviours observed in the brain dynamical networks of Caenorhabditis elegans and humans, modelled as networks of Hindmarsh-Rose neurons with graphs given by these brain networks. We make a strong case for our hypothesis by showing that the neural networks with the closest graph distance to the brain networks of Caenorhabditis elegans and humans are the Hindmarsh-Rose neural networks evolved with coupling strengths that maximize information flow capacity. Surprisingly, we find that global neural synchronization levels decrease during brain evolution, reflecting an underlying globally non-Hebbian evolution process, which is driven by non-Hebbian learning behaviours for some of the clusters during evolution and by Hebbian learning rules for clusters where neurons increase their synchronization.
Energy efficiency of information transmission by electrically coupled neurons
The generation of spikes by neurons is energetically a costly process. This paper studies the consumption of energy and the information entropy in the signalling activity of a model neuron, both when it is assumed isolated and when it is coupled to another neuron by an electrical synapse. The neuron has been modelled by a four-dimensional Hindmarsh-Rose type kinetic model for which an energy function has been deduced. For the isolated neuron, values of energy consumption and information entropy at different signalling regimes have been computed. For two neurons coupled by a gap junction, we have analyzed the roles of the membrane and the synapse in the contribution of the energy that is required for their organized signalling. Computational results are provided for cases of identical and nonidentical neurons coupled by unidirectional and bidirectional gap junctions. One relevant result is that there are values of the coupling strength at which the organized signalling of two neurons induced by the gap junction takes place at relatively low values of energy consumption and the ratio of mutual information to energy consumption is relatively high. Therefore, communicating at these coupling values could be energetically the most efficient option.
Chaotic image encryption using Hopfield and Hindmarsh-Rose neurons implemented on FPGA
Chaotic systems implemented by artificial neural networks are good candidates for data encryption. Accordingly, this paper introduces the cryptographic application of the Hopfield and the Hindmarsh-Rose neurons. The contribution is focused on finding suitable coefficient values of the neurons to generate robust random binary sequences that can be used in image encryption. This task is performed by evaluating the bifurcation diagrams, from which one chooses appropriate coefficient values of the mathematical models that produce high positive Lyapunov exponent and Kaplan-Yorke dimension values, which are computed using TISEAN. The randomness of both the Hopfield and the Hindmarsh-Rose neurons is evaluated from chaotic time series data by performing National Institute of Standards and Technology (NIST) tests. The implementation of both neurons is done using field-programmable gate arrays, whose architectures are used to develop an encryption system for RGB images. The success of the encryption system is confirmed by performing correlation, histogram, variance, entropy, and Number of Pixel Change Rate (NPCR) tests.
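The paper's FPGA pipeline is not reproduced here; the sketch below only illustrates the general idea behind such chaos-based encryption: a Hindmarsh-Rose trajectory is turned into a binary keystream, which is then XORed with the data. The bit-extraction rule (parity of the scaled membrane variable), the transient skip, and the sampling stride are illustrative assumptions, not the paper's design; a real system would validate the sequence with NIST tests, as the paper does.

```python
def hr_keystream(nbytes, x=0.1, y=0.2, z=0.3, dt=0.01, skip=5000, stride=7):
    """Derive a byte keystream from a chaotic Hindmarsh-Rose trajectory.

    Bit rule (an illustrative choice): parity of int(|x| * 1e6),
    sampled every `stride` steps after discarding a transient.
    """
    def step(x, y, z, I=3.0, r=0.006, s=4.0, x0=-1.6):
        dx = y + 3.0 * x**2 - x**3 - z + I
        dy = 1.0 - 5.0 * x**2 - y
        dz = r * (s * (x - x0) - z)
        return x + dt * dx, y + dt * dy, z + dt * dz

    for _ in range(skip):               # discard the initial transient
        x, y, z = step(x, y, z)
    out = bytearray()
    for _ in range(nbytes):
        byte = 0
        for _ in range(8):
            for _ in range(stride):     # space out successive samples
                x, y, z = step(x, y, z)
            byte = (byte << 1) | (int(abs(x) * 1e6) & 1)
        out.append(byte)
    return bytes(out)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR stream cipher: the same call encrypts and decrypts."""
    return bytes(d ^ k for d, k in zip(data, key))

plain = b"Hindmarsh-Rose chaotic stream demo"
ks = hr_keystream(len(plain))
cipher = xor_cipher(plain, ks)
assert xor_cipher(cipher, ks) == plain  # XOR is its own inverse
```

Because the keystream is fully determined by the initial state and coefficients, those values act as the secret key, which is why the paper's search for coefficients with high Lyapunov exponent and Kaplan-Yorke dimension matters: weakly chaotic settings yield predictable sequences.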
- …