Learning and generation of long-range correlated sequences
We study the ability of a fully connected asymmetric network to learn and to
generate long-range, power-law correlated sequences. The focus is set
on the ability of neural networks to extract statistical features from a
sequence. We demonstrate that the average power-law behavior is learnable,
namely, the sequence generated by the trained network obeys the same
statistical behavior. The interplay between a correlated weight matrix and the
sequence generated by such a network is explored. A weight matrix with a
power-law correlation function along the vertical direction gives rise to a
sequence with a similar statistical behavior.
Comment: 5 pages, 3 figures, accepted for publication in Physical Review
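As a rough illustration of the setup above, a sequence generator driven by its own past outputs can be sketched as follows; the window size, random Gaussian weights, and sign activation are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

# Hypothetical sketch: the next sequence element is produced from the previous
# N outputs by a fixed weight vector (window size N and weights are assumed).
rng = np.random.default_rng(0)
N = 50
w = rng.normal(size=N)
s = list(rng.choice([-1.0, 1.0], size=N))  # random initial window
for _ in range(5000):
    h = float(np.dot(w, s[-N:]))
    s.append(1.0 if h >= 0 else -1.0)
seq = np.array(s)

def autocorr(x, lag):
    """Empirical autocorrelation at a given lag (for checking power-law decay)."""
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))
```

Plotting `autocorr(seq, lag)` against the lag on log-log axes would reveal whether the generated sequence shows the power-law decay discussed in the abstract.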
Optimum Asymptotic Multiuser Efficiency of Pseudo-Orthogonal Randomly Spread CDMA
A -user pseudo-orthogonal (PO) randomly spread CDMA system, equivalent to
transmission over a subset of single-user Gaussian channels, is
introduced. The high signal-to-noise ratio performance of the PO-CDMA is
analyzed by rigorously deriving its asymptotic multiuser efficiency (AME) in
the large system limit. Interestingly, the -optimized PO-CDMA transceiver
scheme yields an AME which is practically equal to 1 for system loads smaller
than 0.1 and lower bounded by 1/4 for increasing loads. As opposed to the
vanishing efficiency of linear multiuser detectors, the derived efficiency is
comparable to the ultimate CDMA efficiency achieved for the intractable optimal
multiuser detector.
Comment: WIC 27th Symposium on Information Theory in the Benelux, 200
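For orientation, the asymptotic multiuser efficiency (AME) referenced above is, in the standard Verdú sense, the limiting SNR penalty a detector suffers relative to a single-user channel as the noise vanishes; with notation assumed here (not taken from the abstract), it can be written as:

```latex
% Effective energy e_k(\sigma): the energy a single-user system would need
% to match the detector's bit-error rate P_k(\sigma) for user k with
% amplitude A_k:
P_k(\sigma) = Q\!\left(\frac{\sqrt{e_k(\sigma)}}{\sigma}\right),
\qquad
\eta_k = \lim_{\sigma \to 0} \frac{e_k(\sigma)}{A_k^{2}} .
```

An AME of 1 thus means the detector asymptotically loses nothing to multiuser interference, which is the sense in which the PO-CDMA scheme above is "practically equal to 1" at low loads.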
Neural Cryptography
Two neural networks which are trained on their mutual output bits show a
novel phenomenon: the networks synchronize to a state with identical
time-dependent weights. It is shown how synchronization by mutual learning can be
applied to cryptography: secret key exchange over a public channel.
Comment: 9th International Conference on Neural Information Processing,
Singapore, Nov. 200
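A minimal sketch of mutual learning between two tree parity machines, the standard construction in this line of work, is given below; the parameters (K hidden units, N inputs each, weight bound L) and the Hebbian update rule are assumptions for illustration, not taken from the abstract:

```python
import numpy as np

# Two tree parity machines (TPMs) train on each other's output bit over a
# public channel; on agreement, each applies a Hebbian update. Parameters
# K, N, L and the update rule are illustrative assumptions.
K, N, L = 3, 10, 3
rng = np.random.default_rng(1)

def tpm_output(w, x):
    """Hidden signs sigma_k = sign(w_k . x_k); network output tau = prod sigma_k."""
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = 1
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    """Update only hidden units that agree with the output; clip weights to [-L, L]."""
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, size=(K, N)).astype(float)
wB = rng.integers(-L, L + 1, size=(K, N)).astype(float)
for _ in range(20000):
    x = rng.choice([-1.0, 1.0], size=(K, N))  # public random input
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:  # learn only when the output bits agree
        hebbian_update(wA, x, sA, tA)
        hebbian_update(wB, x, sB, tB)
    if np.array_equal(wA, wB):
        break
```

Once the weight matrices coincide they stay identical under the same public inputs, so the synchronized weights can serve as a shared secret key.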
Robust chaos generation by a perceptron
The properties of time series generated by a perceptron with monotonic and
non-monotonic transfer functions, where the next input vector is determined from
past output values, are examined. Analysis of the parameter space reveals the
following main finding: a perceptron with a monotonic transfer function can
produce only fragile chaos, whereas a non-monotonic function can generate
robust chaos as well. For non-monotonic functions, the dimension of the attractor can be
controlled monotonically by tuning a natural parameter in the model.
Comment: 7 pages, 5 figures (reduced quality), accepted for publication in
Europhysics Letters
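The feedback loop described above, where the perceptron's next input vector is built from its own past outputs, can be sketched as follows; the weights, the gain, and the particular choice of tanh (monotonic) versus sin (non-monotonic) transfer functions are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

# A perceptron-driven time series: the next value is the perceptron's output
# on the vector of the N most recent values. N, the weights, the gain beta,
# and the transfer functions are assumptions for illustration.
N = 8
rng = np.random.default_rng(2)
w = rng.normal(size=N)
w /= np.linalg.norm(w)

def generate(transfer, steps=2000, beta=4.0):
    x = list(rng.normal(size=N) * 0.1)  # small random initial window
    for _ in range(steps):
        x.append(transfer(beta * np.dot(w, x[-N:])))
    return np.array(x[N:])

series_mono = generate(np.tanh)    # monotonic transfer function
series_nonmono = generate(np.sin)  # non-monotonic transfer function
```

Comparing the two series under small perturbations of `w` would illustrate the fragile-versus-robust distinction: robust chaos persists over an interval of parameter values rather than only at isolated points.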
Interacting neural networks and cryptography
Two neural networks which are trained on their mutual output bits are
analysed using methods of statistical physics. The exact solution of the
dynamics of the two weight vectors shows a novel phenomenon: the networks
synchronize to a state with identical time-dependent weights. Extending the
models to multilayer networks with discrete weights, it is shown how
synchronization by mutual learning can be applied to secret key exchange over a
public channel.
Comment: Invited talk for the meeting of the German Physical Society