    Learning and generation of long-range correlated sequences

    We study the ability of a fully connected asymmetric network to learn and to generate long-range, power-law correlated sequences. The focus is on the ability of neural networks to extract statistical features from a sequence. We demonstrate that the average power-law behavior is learnable: the sequence generated by the trained network obeys the same statistics. The interplay between a correlated weight matrix and the sequence generated by such a network is explored. A weight matrix with a power-law correlation function along the vertical direction gives rise to a sequence with similar statistical behavior.
    Comment: 5 pages, 3 figures, accepted for publication in Physical Review
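
    The power-law claim lends itself to a quick numerical probe. Below is a minimal sketch, not the paper's exact network: a single-layer sequence generator whose weight profile is given long-range correlations by standard Fourier filtering, after which the autocorrelation of the generated bit sequence is checked. All parameters are illustrative.

```python
import numpy as np

# Minimal sketch (not the paper's exact network): a sequence generator
# s(t+1) = sign(w . window) whose weight profile w is given long-range
# correlations by standard Fourier filtering. N, beta, T are illustrative.
rng = np.random.default_rng(0)
N = 256        # window length / number of weights
beta = 0.8     # controls the power-law decay of the weight correlations

# Fourier filtering: power-law spectral amplitudes with random phases
freqs = np.fft.rfftfreq(N)
freqs[0] = freqs[1]                      # avoid division by zero at f = 0
phases = rng.uniform(0, 2 * np.pi, len(freqs))
w = np.fft.irfft(freqs ** (-beta / 2) * np.exp(1j * phases), n=N)

# Generate a binary sequence from the sliding window of past outputs
T = 10_000
s = list(rng.choice([-1.0, 1.0], size=N))
for _ in range(T):
    window = np.array(s[-N:][::-1])      # newest bit first
    s.append(1.0 if w @ window >= 0 else -1.0)

seq = np.array(s[N:])
# Empirical autocorrelation at a few lags: slow decay indicates
# long-range structure in the generated sequence
for lag in (1, 10, 100):
    print(f"C({lag}) = {np.mean(seq[:-lag] * seq[lag:]):+.3f}")
```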

    Robust chaos generation by a perceptron

    The properties of time series generated by a perceptron with a monotonic or non-monotonic transfer function, where the next input vector is determined from past output values, are examined. Analysis of the parameter space reveals the following main finding: a perceptron with a monotonic function can produce only fragile chaos, whereas a non-monotonic function can generate robust chaos as well. For non-monotonic functions, the dimension of the attractor can be controlled monotonically by tuning a natural parameter in the model.
    Comment: 7 pages, 5 figures (reduced quality), accepted for publication in Europhysics Letters
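
    A rough way to see the monotonic/non-monotonic contrast is to estimate the largest Lyapunov exponent of a delayed-feedback perceptron for both kinds of transfer function. The sketch below is an illustrative setup rather than the paper's model: tanh stands in for the monotonic case, sin for the non-monotonic one, and the gain parameter is arbitrary.

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): a perceptron whose
# input vector is the sliding window of its own past outputs, probed with
# a crude Benettin-style largest-Lyapunov-exponent estimate.
rng = np.random.default_rng(1)
N = 16
w = rng.normal(size=N)
w /= np.linalg.norm(w)
gain = 3.0                   # amplification of the local field (illustrative)

def step(x, transfer):
    # next input window: newest output in front, oldest value dropped
    return np.concatenate(([transfer(gain * (w @ x))], x[:-1]))

def lyapunov(transfer, steps=20_000, d0=1e-8):
    # evolve two nearby trajectories, renormalizing their separation
    # back to d0 after every step and averaging the log expansion rate
    x = rng.normal(size=N)
    y = x.copy()
    y[0] += d0
    acc = 0.0
    for _ in range(steps):
        x, y = step(x, transfer), step(y, transfer)
        d = np.linalg.norm(y - x)
        if d == 0.0:
            return float("-inf")         # trajectories merged: strongly stable
        acc += np.log(d / d0)
        y = x + (y - x) * (d0 / d)
    return acc / steps

for name, f in (("tanh (monotonic)", np.tanh), ("sin (non-monotonic)", np.sin)):
    print(f"{name}: lambda ~ {lyapunov(f):+.3f}")
```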

    The Entropy of a Binary Hidden Markov Process

    The entropy of a binary symmetric Hidden Markov Process is calculated as an expansion in the noise parameter epsilon. We map the problem onto a one-dimensional Ising model in a large field of random signs and calculate the expansion coefficients up to second order in epsilon. Using a conjecture, we extend the calculation to 11th order and discuss the convergence of the resulting series.
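
    The series expansion is not reproduced here, but the quantity being expanded is straightforward to estimate numerically. The sketch below uses the standard forward recursion to compute the entropy rate -(1/n) log2 P(y_1...y_n) of a binary symmetric hidden Markov process; the flip probability p and the noise epsilon are illustrative.

```python
import numpy as np

# Numeric sketch (not the paper's expansion): estimate the entropy rate of
# a binary symmetric hidden Markov process via the forward recursion,
# H = -lim (1/n) log2 P(y_1..y_n). Parameters p and eps are illustrative.
rng = np.random.default_rng(2)
p, eps = 0.2, 0.05           # hidden flip probability, observation noise
T = np.array([[1 - p, p], [p, 1 - p]])          # hidden transitions
E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # noisy emissions

n = 100_000
x = np.cumsum(rng.random(n) < p) % 2     # hidden binary Markov chain
y = x ^ (rng.random(n) < eps)            # observations through a BSC

alpha = np.array([0.5, 0.5])             # stationary prior over states
loglik = 0.0
for t in range(n):
    alpha = (alpha @ T) * E[:, y[t]]     # predict, then weight by emission
    z = alpha.sum()
    loglik += np.log2(z)
    alpha /= z                           # normalize to avoid underflow
print(f"entropy rate ~ {-loglik / n:.4f} bits/symbol")
```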

    Secure exchange of information by synchronization of neural networks

    A connection between the theory of neural networks and cryptography is presented. A new phenomenon, the synchronization of neural networks, leads to a new method for the exchange of secret messages. Numerical simulations show that two artificial networks trained by a Hebbian learning rule on their mutual outputs develop an antiparallel state of their synaptic weights. The synchronized weights are used to construct an ephemeral key-exchange protocol for the secure transmission of secret data. It is shown that an opponent who knows the protocol and all details of any transmission of the data has no chance to decrypt the secret message, since tracking the weights is a hard problem compared to synchronization. The complexity of generating the secure channel is linear in the size of the network.
    Comment: 11 pages, 5 figures
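
    A stripped-down version of the synchronization effect can be simulated directly. The sketch below is far simpler than the paper's protocol: two perceptrons with bounded integer weights receive a common random input and, whenever their outputs disagree, each takes a Hebbian step toward the other's output. The size N, the weight bound L, and the step cutoff are illustrative.

```python
import numpy as np

# Stripped-down sketch of mutual synchronization (much simpler than the
# paper's full protocol): bounded integer weights, common random input,
# Hebbian step toward the partner's output whenever the outputs disagree.
rng = np.random.default_rng(3)
N, L = 100, 3
wA = rng.integers(-L, L + 1, size=N)
wB = rng.integers(-L, L + 1, size=N)

for step in range(1, 1_000_000):
    x = rng.choice([-1, 1], size=N)           # public common input
    sA = 1 if wA @ x >= 0 else -1
    sB = 1 if wB @ x >= 0 else -1
    if sA != sB:
        wA = np.clip(wA + sB * x, -L, L)      # learn the partner's output
        wB = np.clip(wB + sA * x, -L, L)
    if np.array_equal(wA, -wB):               # antiparallel: synchronized
        print(f"antiparallel state reached after {step} exchanged outputs")
        break
```

    Once the weights are exactly antiparallel, every subsequent update preserves that state, which is what makes the synchronized weights usable as an ephemeral key.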

    Statistical mechanical aspects of joint source-channel coding

    An MN-Gallager code over Galois fields $q$, based on the Dynamical Block Posterior probabilities (DBP) for messages with a given set of autocorrelations, is presented with the following main results: (a) for a binary symmetric channel, the threshold $f_c$ is extrapolated for infinite messages using the scaling relation for the median convergence time, $t_{med} \propto 1/(f_c - f)$; (b) a degradation in the threshold is observed as the correlations are enhanced; (c) for a given set of autocorrelations, the performance is enhanced as $q$ is increased; (d) the efficiency of the DBP joint source-channel coding is slightly better than the standard gzip compression method; (e) for a given entropy, the performance of the DBP algorithm is a function of the decay of the correlation function over large distances.
    Comment: 6 pages
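
    Point (d) can be illustrated without the MN-Gallager machinery: compress a long correlated binary message with gzip and compare the achieved rate to the entropy rate of the Markov source that produced it. The source parameters below are illustrative.

```python
import gzip
import numpy as np

# Small numeric illustration of point (d) (without the MN-Gallager code
# itself): gzip's rate on a correlated binary message versus the entropy
# rate of the two-state Markov source. p and n are illustrative.
rng = np.random.default_rng(4)
p, n = 0.1, 1_000_000                        # low p -> strong correlations
bits = np.cumsum(rng.random(n) < p) % 2      # binary Markov chain

packed = np.packbits(bits.astype(np.uint8)).tobytes()
gzip_rate = 8 * len(gzip.compress(packed, 9)) / n
entropy_rate = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
print(f"gzip: {gzip_rate:.3f} bits/bit  vs  entropy: {entropy_rate:.3f} bits/bit")
```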

    Learning and predicting time series by neural networks

    Artificial neural networks that are trained on a time series are supposed to achieve two abilities: firstly, to predict the series many time steps ahead, and secondly, to learn the rule which produced the series. It is shown that prediction and learning are not necessarily related to each other. Chaotic sequences can be learned but not predicted, while quasiperiodic sequences can be well predicted but not learned.
    Comment: 5 pages
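
    The prediction half of this claim is easy to reproduce in miniature. In the sketch below, a linear delay-window predictor stands in for the neural network (an assumption, not the paper's architecture); it is fitted to a quasiperiodic and to a chaotic sequence and then iterated many steps ahead.

```python
import numpy as np

# Illustrative sketch: a linear delay-window predictor (standing in for a
# neural network) is fitted to a quasiperiodic and to a chaotic sequence,
# then iterated 100 steps ahead to compare long-horizon prediction error.
rng = np.random.default_rng(5)

def delay_fit(seq, N=20):
    # least-squares map from a window of N past values to the next value
    X = np.array([seq[t:t + N] for t in range(len(seq) - N)])
    w, *_ = np.linalg.lstsq(X, seq[N:], rcond=None)
    return w

def iterate_ahead(w, window, steps):
    # feed predictions back into the window to forecast many steps ahead
    window = list(window)
    out = []
    for _ in range(steps):
        nxt = np.dot(w, window[-len(w):])
        out.append(nxt)
        window.append(nxt)
    return np.array(out)

t = np.arange(3000, dtype=float)
quasi = np.cos(t) + np.cos(np.sqrt(2) * t)   # quasiperiodic sequence
chaotic = np.empty(3000)
chaotic[0] = 0.3
for i in range(1, 3000):                     # fully chaotic logistic map
    chaotic[i] = 4 * chaotic[i - 1] * (1 - chaotic[i - 1])

for name, seq in (("quasiperiodic", quasi), ("chaotic", chaotic)):
    w = delay_fit(seq[:2000])
    pred = iterate_ahead(w, seq[1980:2000], 100)
    err = np.sqrt(np.mean((pred - seq[2000:2100]) ** 2))
    print(f"{name}: 100-step RMS error = {err:.3f}")
```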

    Training a perceptron by a bit sequence: Storage capacity

    A perceptron is trained by a random bit sequence. In comparison to the corresponding classification problem, the storage capacity decreases to $\alpha_c = 1.70 \pm 0.02$ due to correlations between input and output bits. The numerical results are supported by a signal-to-noise analysis of Hebbian weights.
    Comment: LaTeX, 13 pages incl. 4 figures and 1 table
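
    The setup is simple enough to probe numerically: patterns are sliding windows of one random bit string, labels are the next bit, and perceptron learning checks whether a separating weight vector still exists. The sketch below is a rough probe, not the paper's analysis; the finite epoch budget makes "learnable" approximate near the capacity.

```python
import numpy as np

# Rough numeric probe of the setup (not the paper's analysis): patterns
# are sliding windows of one random bit string, labels the next bit; the
# Rosenblatt rule tests for linear separability. max_epochs is a cutoff,
# so sets that are separable with a tiny margin may be misclassified.
rng = np.random.default_rng(6)

def learnable(N, alpha, max_epochs=500):
    P = int(alpha * N)
    bits = rng.choice([-1, 1], size=P + N)
    X = np.array([bits[mu:mu + N] for mu in range(P)])
    y = bits[N:N + P]                    # bit following each window
    w = np.zeros(N)
    for _ in range(max_epochs):
        errors = 0
        for mu in range(P):
            if y[mu] * (w @ X[mu]) <= 0:
                w += y[mu] * X[mu] / N   # Rosenblatt perceptron update
                errors += 1
        if errors == 0:
            return True
    return False

N = 50
for alpha in (1.0, 1.5, 2.0):
    rate = np.mean([learnable(N, alpha) for _ in range(10)])
    print(f"alpha = {alpha}: fraction learnable = {rate:.1f}")
```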

    Generation of unpredictable time series by a Neural Network

    A perceptron that learns the opposite of its own output is used to generate a time series. We analyse properties of the weight vector and the generated sequence, such as the cycle length and the probability distribution of the generated sequences. A remarkable suppression of the autocorrelation function is explained, and connections to the Bernasconi model are discussed. If a continuous transfer function is used, the system displays chaotic and intermittent behaviour, with the product of the learning rate and the amplification as a control parameter.
    Comment: 11 pages, 14 figures; slightly expanded and clarified, mistakes corrected; accepted for publication in PR
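
    The generator itself fits in a few lines. The sketch below implements the anti-learning rule (train on the opposite of the emitted bit) and measures the autocorrelation of the resulting sequence; the window size and learning rate are illustrative, and the continuous-transfer-function regime is not reproduced.

```python
import numpy as np

# Minimal sketch of the anti-learning generator (the Bernasconi-model
# connection and the continuous case are not reproduced): a perceptron
# emits a bit from the window of its previous bits, then takes a Hebbian
# step toward the *opposite* of what it just emitted.
rng = np.random.default_rng(7)
N = 32
eta = 1.0 / N                # learning rate (illustrative)
w = rng.normal(size=N)
bits = list(rng.choice([-1.0, 1.0], size=N))

for _ in range(5000):
    x = np.array(bits[-N:][::-1])        # newest bit first
    s = 1.0 if w @ x >= 0 else -1.0
    w -= eta * s * x                     # learn the opposite output
    bits.append(s)

seq = np.array(bits[N:])
# The generated sequence should show suppressed autocorrelations
for lag in (1, 2, 5, 10):
    print(f"C({lag}) = {np.mean(seq[:-lag] * seq[lag:]):+.3f}")
```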