
Synchronization and information transmission in networks

Abstract

The amount of information produced by a network can be measured by the mutual information rate. This measure, the Kolmogorov-Sinai entropy and the synchronization interval are all expressed in terms of the transversal Lyapunov exponents, so the three concepts are closely related. We prove that the stronger the synchronization, the larger the rate at which information is exchanged between nodes in the network. In fact, as the coupling parameter increases, the mutual information rate grows to a maximum over the synchronization interval and then decreases, while the Kolmogorov-Sinai entropy decreases to a minimum over the synchronization interval and then increases. We present numerical simulations for two different ways of coupling two maps, a complete network and a lattice, which confirm our theoretical results.
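To illustrate the role of the transversal Lyapunov exponent, the sketch below estimates it for two symmetrically (diffusively) coupled logistic maps and locates the synchronization interval as the range of coupling where that exponent is negative. This is only a minimal illustration under assumed choices (logistic map with r = 4, symmetric diffusive coupling, and the helper names f, df and transversal_lyapunov); it is not necessarily the exact coupling scheme or network used in the paper.

```python
import numpy as np

def f(x, r=4.0):
    """Logistic map f(x) = r x (1 - x); r = 4 gives a fully chaotic map."""
    return r * x * (1.0 - x)

def df(x, r=4.0):
    """Derivative of the logistic map."""
    return r * (1.0 - 2.0 * x)

def transversal_lyapunov(eps, n_iter=100_000, n_transient=1_000, r=4.0):
    """
    Estimate the transversal Lyapunov exponent of two symmetrically coupled maps,
        x_{n+1} = (1 - eps) f(x_n) + eps f(y_n),
        y_{n+1} = (1 - eps) f(y_n) + eps f(x_n).
    On the synchronization manifold x = y this exponent equals
    lambda_perp = lambda_f + ln|1 - 2*eps|, estimated here by averaging
    ln|(1 - 2*eps) f'(x_n)| along a trajectory of the single map.
    """
    x = 0.4
    for _ in range(n_transient):        # discard the transient
        x = f(x, r)
    acc = 0.0
    for _ in range(n_iter):
        acc += np.log(abs((1.0 - 2.0 * eps) * df(x, r)))
        x = f(x, r)
    return acc / n_iter

if __name__ == "__main__":
    # Synchronization occurs where lambda_perp < 0; for r = 4 (lambda_f = ln 2)
    # this is the coupling interval 0.25 < eps < 0.75.
    for eps in (0.05, 0.3, 0.6, 0.9):
        print(f"eps = {eps:.2f}  lambda_perp = {transversal_lyapunov(eps):+.3f}")
```

In this toy setting the sign change of the transversal exponent marks the boundaries of the synchronization interval, which is the quantity the abstract relates to the mutual information rate and the Kolmogorov-Sinai entropy.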
