
    Robust chaos generation by a perceptron

    The properties of time series generated by a perceptron with monotonic and non-monotonic transfer functions, where the next input vector is determined from past output values, are examined. Analysis of the parameter space reveals the main finding: a perceptron with a monotonic transfer function can produce only fragile chaos, whereas a non-monotonic function can generate robust chaos as well. For non-monotonic functions, the dimension of the attractor can be controlled monotonically by tuning a natural parameter of the model.
    Comment: 7 pages, 5 figures (reduced quality), accepted for publication in Europhysics Letters
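
    As a concrete illustration of the feedback construction described above, here is a minimal sketch of a perceptron whose past outputs form the next input window. The window length, the random weights, and the particular monotonic (tanh) and non-monotonic (sine) transfer functions are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

N = 20                               # input window length (assumption)
rng = np.random.default_rng(0)
w = rng.normal(size=N)
w /= np.linalg.norm(w)

def monotonic(h):
    return np.tanh(2.0 * h)          # monotonic transfer function

def non_monotonic(h):
    return np.sin(2.0 * h)           # a simple non-monotonic choice

def generate(transfer, steps=1000):
    x = rng.normal(size=N)           # initial input window
    series = []
    for _ in range(steps):
        s = transfer(w @ x)          # next output from the current window
        series.append(s)
        x = np.roll(x, 1)
        x[0] = s                     # past outputs become the new input
    return np.array(series)

series_mono = generate(monotonic)        # expected: fragile chaos at best
series_nonmono = generate(non_monotonic) # expected: robust chaos possible
```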

    Learning and generation of long-range correlated sequences

    We study the ability of a fully connected asymmetric network to learn and to generate long-range, power-law correlated sequences. The focus is on the ability of neural networks to extract statistical features from a sequence. We demonstrate that the average power-law behavior is learnable: the sequence generated by the trained network obeys the same statistical behavior. The interplay between a correlated weight matrix and the sequence generated by such a network is explored. A weight matrix with a power-law correlation function along the vertical direction gives rise to a sequence with similar statistical behavior.
    Comment: 5 pages, 3 figures, accepted for publication in Physical Review
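
    A standard way to produce the power-law correlations the abstract refers to is Fourier filtering. The sketch below synthesizes a 1/f^beta correlated signal, which could serve as a weight column or as a training sequence; the network architecture itself is not reproduced here, and beta and the length are illustrative choices.

```python
import numpy as np

def powerlaw_sequence(n, beta=0.8, seed=0):
    """Zero-mean, unit-variance signal with power spectrum ~ f^{-beta}."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = 1.0                       # avoid division by zero at DC
    amplitude = freqs ** (-beta / 2.0)   # amplitude ~ f^{-beta/2}
    phases = rng.uniform(0, 2 * np.pi, size=freqs.size)
    spectrum = amplitude * np.exp(1j * phases)
    spectrum[0] = 0.0                    # zero-mean signal
    signal = np.fft.irfft(spectrum, n)
    return (signal - signal.mean()) / signal.std()

seq = powerlaw_sequence(4096)
```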

    The Entropy of a Binary Hidden Markov Process

    The entropy of a binary symmetric hidden Markov process is calculated as an expansion in the noise parameter epsilon. We map the problem onto a one-dimensional Ising model in a large field of random signs and calculate the expansion coefficients up to second order in epsilon. Using a conjecture, we extend the calculation to 11th order and discuss the convergence of the resulting series.
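
    For comparison with such a small-epsilon expansion, the entropy rate can also be estimated numerically. The sketch below samples a binary symmetric hidden Markov process (hidden flip probability p, observation noise epsilon) and approximates the entropy rate by a conditional block entropy; p, epsilon, and the block length are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def hmm_sample(n, p=0.3, eps=0.05, seed=0):
    """Binary symmetric HMM: hidden chain flips w.p. p, output flips w.p. eps."""
    rng = np.random.default_rng(seed)
    state = int(rng.integers(2))
    out = np.empty(n, dtype=int)
    for t in range(n):
        if rng.random() < p:
            state ^= 1                          # Markov flip
        out[t] = state ^ int(rng.random() < eps)  # symmetric observation noise
    return out

def block_entropy(seq, k):
    counts = Counter(tuple(seq[i:i + k]) for i in range(len(seq) - k + 1))
    probs = np.array(list(counts.values()), dtype=float)
    probs /= probs.sum()
    return -(probs * np.log2(probs)).sum()

seq = hmm_sample(200_000)
# Entropy rate approximated by the conditional block entropy H(k) - H(k-1).
h = block_entropy(seq, 8) - block_entropy(seq, 7)
print(f"estimated entropy rate ~ {h:.4f} bits/symbol")
```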

    Learning and predicting time series by neural networks

    Artificial neural networks trained on a time series are expected to achieve two abilities: first, to predict the series many time steps ahead, and second, to learn the rule that produced the series. It is shown that prediction and learning are not necessarily related to each other: chaotic sequences can be learned but not predicted, while quasiperiodic sequences can be well predicted but not learned.
    Comment: 5 pages
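
    The contrast between the two sequence classes can be reproduced with a very simple predictor. The sketch below uses a least-squares autoregressive model as a stand-in for the trained network (an assumption; the paper's model differs) and compares multi-step forecasts on a quasiperiodic signal and on the fully chaotic logistic map.

```python
import numpy as np

def ar_fit(seq, order):
    X = np.array([seq[i:i + order] for i in range(len(seq) - order)])
    y = seq[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def ar_forecast(seq, coef, steps):
    window = list(seq[-len(coef):])
    preds = []
    for _ in range(steps):
        nxt = np.dot(coef, window)       # iterate the one-step predictor
        preds.append(nxt)
        window = window[1:] + [nxt]
    return np.array(preds)

t = np.arange(2000)
quasi = np.sin(0.3 * t) + np.sin(np.sqrt(2) * 0.3 * t)    # quasiperiodic
chaos = np.empty(2000); chaos[0] = 0.4
for i in range(1999):
    chaos[i + 1] = 4 * chaos[i] * (1 - chaos[i])          # logistic map

for name, s in [("quasiperiodic", quasi), ("chaotic", chaos)]:
    coef = ar_fit(s[:1500], order=10)
    pred = ar_forecast(s[:1500], coef, steps=100)
    err = np.sqrt(np.mean((pred - s[1500:1600]) ** 2))
    print(f"{name}: 100-step RMS forecast error = {err:.3f}")
```

    The quasiperiodic signal satisfies an exact linear recurrence, so even this linear predictor forecasts it far ahead, while the chaotic forecast degrades within a few steps.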

    Secure exchange of information by synchronization of neural networks

    A connection between the theory of neural networks and cryptography is presented. A new phenomenon, the synchronization of neural networks, leads to a new method for exchanging secret messages. Numerical simulations show that two artificial networks trained by the Hebbian learning rule on their mutual outputs develop an antiparallel state of their synaptic weights. The synchronized weights are used to construct an ephemeral key-exchange protocol for the secure transmission of secret data. It is shown that an opponent who knows the protocol and all details of any transmission of the data has no chance to decrypt the secret message, since tracking the weights is a hard problem compared to synchronization. The complexity of generating the secure channel is linear in the size of the network.
    Comment: 11 pages, 5 figures
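
    A minimal sketch of mutual learning between two perceptrons, loosely following the abstract: each party applies a Hebbian update driven by its partner's publicly exchanged output bit until the weight vectors lock into an antiparallel state. The weight bound L, the exact update rule, and the stopping test are assumptions, not the paper's published protocol.

```python
import numpy as np

N, L = 100, 3                          # network size and weight bound (assumed)
rng = np.random.default_rng(1)
wA = rng.integers(-L, L + 1, N)
wB = rng.integers(-L, L + 1, N)

for step in range(1_000_000):
    x = rng.choice([-1, 1], N)                 # public random input vector
    tauA = int(np.sign(wA @ x)) or 1           # exchanged output bits
    tauB = int(np.sign(wB @ x)) or 1
    # Each net moves against its partner's output, which drives the
    # bounded weight vectors toward an antiparallel configuration.
    wA = np.clip(wA - tauB * x, -L, L)
    wB = np.clip(wB - tauA * x, -L, L)
    if np.array_equal(wA, -wB):                # antiparallel: shared secret
        print(f"synchronized after {step + 1} steps")
        break
```

    Once wA = -wB, both parties hold the same weight state up to sign and can derive an ephemeral key from it, while an observer of the public bits must track the full weight dynamics.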

    Generation of unpredictable time series by a Neural Network

    A perceptron that learns the opposite of its own output is used to generate a time series. We analyse properties of the weight vector and of the generated sequence, such as the cycle length and the probability distribution of generated sequences. A remarkable suppression of the autocorrelation function is explained, and connections to the Bernasconi model are discussed. If a continuous transfer function is used, the system displays chaotic and intermittent behaviour, with the product of the learning rate and the amplification as a control parameter.
    Comment: 11 pages, 14 figures; slightly expanded and clarified, mistakes corrected; accepted for publication in PR
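
    A minimal sketch of the anti-Hebbian generator described above: the perceptron emits a bit, is trained on the opposite of that bit, and the bit is fed back into the input window. The window size and learning rate are illustrative; replacing np.sign with a continuous transfer function would give the chaotic regime mentioned in the abstract.

```python
import numpy as np

N, eta = 25, 1.0 / 25                # window size and learning rate (assumed)
rng = np.random.default_rng(2)
w = rng.normal(size=N)
x = rng.choice([-1.0, 1.0], N)

bits = []
for _ in range(10_000):
    sigma = float(np.sign(w @ x)) or 1.0
    w = w - eta * sigma * x          # anti-Hebbian: learn the opposite output
    x = np.roll(x, 1)
    x[0] = sigma                     # feed the bit back into the window
    bits.append(sigma)

bits = np.array(bits)
# Autocorrelations of the generated sequence (suppressed at small lags).
ac = [np.mean(bits[:-k] * bits[k:]) for k in range(1, 6)]
print(ac)
```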

    Pulses of chaos synchronization in coupled map chains with delayed transmission

    Pulses of synchronization in chaotic coupled map lattices are discussed in the context of the transmission of information. Synchronization and desynchronization propagate along the chain with different velocities, which are calculated analytically from the spectrum of convective Lyapunov exponents. Since the front of synchronization travels more slowly than the front of desynchronization, the maximal chain length over which information can be transmitted by modulating the first unit of the chain is bounded.
    Comment: 4 pages, 6 figures, updated version as published in PR
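
    To watch such fronts numerically, one can run two replicas of a unidirectionally coupled logistic-map chain that share the same first unit and track the distance between them along the chain. The map, coupling strength, delay, and sizes below are illustrative choices, not the paper's parameters.

```python
import numpy as np

def f(x):
    return 4.0 * x * (1.0 - x)        # fully chaotic logistic map

M, eps, delay, T = 30, 0.7, 2, 2000   # chain length, coupling, delay, steps
rng = np.random.default_rng(3)
a = rng.random((T, M))                # a[t, i]: unit i of chain A at time t
b = rng.random((T, M))                # replica with different initial state
b[:, 0] = a[:, 0]                     # common first unit drives both chains

for t in range(delay, T - 1):
    a[t + 1, 0] = f(a[t, 0])                      # free-running first unit
    b[t + 1, 0] = a[t + 1, 0]
    # Each downstream unit couples to its left neighbour with a delay.
    a[t + 1, 1:] = (1 - eps) * f(a[t, 1:]) + eps * f(a[t - delay, :-1])
    b[t + 1, 1:] = (1 - eps) * f(b[t, 1:]) + eps * f(b[t - delay, :-1])

dist = np.abs(a - b)   # dist[t, i]: the synchronization front moving along i
```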

    Public channel cryptography by synchronization of neural networks and chaotic maps

    Two different kinds of synchronization have been applied to cryptography: synchronization of chaotic maps by one common external signal, and synchronization of neural networks by mutual learning. By combining these two mechanisms, with the external signal to the chaotic maps synchronized by the networks, we construct a hybrid network which allows the secure generation of secret encryption keys over a public channel. The security against the attacks recently proposed by Shamir et al. is increased by the chaotic synchronization.
    Comment: 4 pages
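
    The chaotic half of the hybrid scheme rests on a simple effect: two chaotic maps driven by the same external signal converge to the same orbit, so partners whose networks agree on the drive also share the chaotic state. The sketch below shows this with a driven logistic map; the map, drive statistics, and coupling strength are illustrative assumptions.

```python
import numpy as np

def step(x, s, eps=0.6):
    # Logistic map mixed with an external drive signal s.
    return (1 - eps) * 4.0 * x * (1 - x) + eps * s

rng = np.random.default_rng(4)
xA, xB = 0.21, 0.87                   # different secret initial conditions
for _ in range(200):
    s = rng.random()                  # common external signal (the drive)
    xA, xB = step(xA, s), step(xB, s)

print(f"|xA - xB| = {abs(xA - xB):.2e}")   # near zero: maps synchronized
```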