
    Learning and generation of long-range correlated sequences

    We study the capability of a fully connected asymmetric network to learn and to generate long-range, power-law correlated sequences. The focus is on the ability of neural networks to extract statistical features from a sequence. We demonstrate that the average power-law behavior is learnable: the sequence generated by the trained network obeys the same statistical behavior. The interplay between a correlated weight matrix and the sequence generated by such a network is explored. A weight matrix with a power-law correlation function along the vertical direction gives rise to a sequence with a similar statistical behavior.
    Comment: 5 pages, 3 figures, accepted for publication in Physical Review
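
    As a rough illustration of the target statistics (not of the network itself), the sketch below generates a power-law (1/f^beta) correlated sequence by Fourier filtering and reads the spectral exponent back off the result; the length n and the exponent beta are arbitrary illustrative values.

        import numpy as np

        rng = np.random.default_rng(0)
        n, beta = 2 ** 14, 0.8                   # sequence length and target spectral exponent (illustrative)
        white = rng.standard_normal(n)
        f = np.fft.rfftfreq(n)
        filt = np.zeros_like(f)
        filt[1:] = f[1:] ** (-beta / 2)          # shape the spectrum to ~ 1/f^beta
        seq = np.fft.irfft(np.fft.rfft(white) * filt, n)
        # read the exponent back off the generated sequence
        psd = np.abs(np.fft.rfft(seq)) ** 2
        slope, _ = np.polyfit(np.log(f[1:]), np.log(psd[1:]), 1)
        print(f"target exponent {-beta:.2f}, measured {slope:.2f}")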

    Secure and linear cryptosystems using error-correcting codes

    A public-key cryptosystem, digital signature, and authentication procedures based on a Gallager-type parity-check error-correcting code are presented. The complexity of the encryption and decryption processes scales linearly with the size of the plaintext Alice sends to Bob. The public key is pre-corrupted by Bob, whereas private noise added by Alice to a given fraction of the ciphertext of each encrypted plaintext serves to strengthen the security of the channel and is the cornerstone of the digital signature and authentication schemes. Various scenarios are discussed, including the possible actions of the opponent Oscar as an eavesdropper or as a disruptor.
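
    The exact Gallager-code construction is not reproduced here; the following toy sketch only shows the shape of the linear encryption step with Alice's added noise, using a random sparse binary matrix as a stand-in for the public code and arbitrary sizes and noise level. Decryption, which relies on Bob's private parity-check structure and an iterative LDPC-style decoder, is omitted.

        import numpy as np

        rng = np.random.default_rng(1)
        k, n = 500, 1000                               # toy plaintext / ciphertext lengths
        G = (rng.random((k, n)) < 0.01).astype(int)    # sparse public matrix standing in for the code
        plaintext = rng.integers(0, 2, k)
        codeword = plaintext @ G % 2                   # sparse linear map: cost grows linearly with the plaintext
        noise = (rng.random(n) < 0.05).astype(int)     # Alice's private noise on a small fraction of bits
        ciphertext = (codeword + noise) % 2
        print("ciphertext prefix:", ciphertext[:20])
        # decoding would use Bob's private parity-check structure to strip the noise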

    Secure exchange of information by synchronization of neural networks

    A connection between the theory of neural networks and cryptography is presented. A new phenomenon, namely the synchronization of neural networks, leads to a new method of exchanging secret messages. Numerical simulations show that two artificial networks trained by the Hebbian learning rule on their mutual outputs develop an antiparallel state of their synaptic weights. The synchronized weights are used to construct an ephemeral key-exchange protocol for the secure transmission of secret data. It is shown that an opponent who knows the protocol and all details of any transmission of the data has no chance to decrypt the secret message, since tracking the weights is a hard problem compared to synchronization. The complexity of generating the secure channel is linear in the size of the network.
    Comment: 11 pages, 5 figures
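
    The abstract does not spell out the architecture; the sketch below uses the tree-parity-machine formulation that this line of work is known for, with illustrative sizes K, N, L. Both parties apply the same Hebbian update only when their public outputs agree, until the weights coincide and can serve as the shared key. It is a minimal sketch, not the exact protocol of the paper.

        import numpy as np

        K, N, L = 3, 100, 3                      # hidden units, inputs per unit, weight bound (illustrative)
        rng = np.random.default_rng(2)

        def tpm_output(w, x):
            sigma = np.sign(np.einsum('kn,kn->k', w, x))
            sigma[sigma == 0] = -1               # break ties deterministically
            return sigma, int(np.prod(sigma))

        wA = rng.integers(-L, L + 1, (K, N))
        wB = rng.integers(-L, L + 1, (K, N))
        steps = 0
        while not np.array_equal(wA, wB):
            x = rng.choice([-1, 1], (K, N))      # public random input, identical for both parties
            sA, tauA = tpm_output(wA, x)
            sB, tauB = tpm_output(wB, x)
            if tauA == tauB:                     # Hebbian update only when the public outputs agree
                for w, s in ((wA, sA), (wB, sB)):
                    mask = s == tauA
                    w[mask] = np.clip(w[mask] + tauA * x[mask], -L, L)
            steps += 1
        print(f"synchronized after {steps} exchanged inputs; key length {K * N} weights")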

    Learning and predicting time series by neural networks

    Artificial neural networks that are trained on a time series are supposed to achieve two abilities: firstly, to predict the series many time steps ahead, and secondly, to learn the rule that has produced the series. It is shown that prediction and learning are not necessarily related to each other: chaotic sequences can be learned but not predicted, while quasiperiodic sequences can be well predicted but not learned.
    Comment: 5 pages
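
    A crude illustration of the prediction half of this contrast (not the learning half, and with a simple linear predictor in place of a neural network): a least-squares autoregressive fit predicts a quasiperiodic series almost perfectly one step ahead but does much worse on a chaotic logistic-map series. Window size and series lengths are arbitrary.

        import numpy as np

        def ar_error(series, window=20):
            # one-step-ahead error of a least-squares linear autoregressive predictor
            X = np.array([series[i:i + window] for i in range(len(series) - window - 1)])
            y = series[window:-1]
            coeff, *_ = np.linalg.lstsq(X, y, rcond=None)
            return np.mean((X @ coeff - y) ** 2)

        t = np.arange(5000)
        quasi = np.sin(0.3 * t) + np.sin(0.3 * np.sqrt(2) * t)   # quasiperiodic: two incommensurate frequencies
        chaos = np.empty(5000)
        chaos[0] = 0.4
        for i in range(4999):
            chaos[i + 1] = 4 * chaos[i] * (1 - chaos[i])         # fully chaotic logistic map
        print("quasiperiodic one-step error:", ar_error(quasi))
        print("chaotic one-step error:      ", ar_error(chaos))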

    Generation of unpredictable time series by a Neural Network

    A perceptron that learns the opposite of its own output is used to generate a time series. We analyse properties of the weight vector and of the generated sequence, such as the cycle length and the probability distribution of generated sequences. A remarkable suppression of the autocorrelation function is explained, and connections to the Bernasconi model are discussed. If a continuous transfer function is used, the system displays chaotic and intermittent behaviour, with the product of the learning rate and the amplification as a control parameter.
    Comment: 11 pages, 14 figures; slightly expanded and clarified, mistakes corrected; accepted for publication in PR
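
    A minimal sketch of such a generator, under the assumption that the perceptron acts on a sliding window of its own last N output bits and applies an anti-Hebbian update (learning the opposite of its output); N, eta, and the run length are illustrative. The last line prints the low-lag autocorrelation of the generated bits.

        import numpy as np

        N, steps, eta = 50, 10000, 1.0
        rng = np.random.default_rng(4)
        w = rng.standard_normal(N)
        window = rng.choice([-1.0, 1.0], N)          # initial input window
        bits = np.empty(steps)
        for t in range(steps):
            S = 1.0 if w @ window >= 0 else -1.0     # perceptron output = next bit of the series
            w -= (eta / N) * S * window              # learn the opposite of the own output (anti-Hebbian)
            window = np.roll(window, 1)
            window[0] = S                            # feed the new bit back into the window
            bits[t] = S
        bits -= bits.mean()
        ac = np.correlate(bits, bits, mode='full')[steps - 1:steps + 10] / (bits @ bits)
        print(np.round(ac, 3))                       # autocorrelation at lags 0 to 10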

    Training a perceptron by a bit sequence: Storage capacity

    A perceptron is trained on a random bit sequence. In comparison to the corresponding classification problem, the storage capacity decreases to α_c = 1.70 ± 0.02 due to correlations between input and output bits. The numerical results are supported by a signal-to-noise analysis of the Hebbian weights.
    Comment: LaTeX, 13 pages incl. 4 figures and 1 table
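
    The quoted capacity refers to optimal perceptron learning; the sketch below only illustrates the Hebbian setting mentioned at the end, with sliding windows of a random bit sequence as inputs and the next bit as the target. N and alpha are illustrative values, not the ones from the paper.

        import numpy as np

        N, alpha = 200, 1.0
        P = int(alpha * N)                          # number of input/output pairs drawn from the sequence
        rng = np.random.default_rng(5)
        bits = rng.choice([-1, 1], P + N)
        X = np.array([bits[mu:mu + N] for mu in range(P)], dtype=float)   # overlapping windows of the sequence
        y = bits[N:N + P].astype(float)                                   # the bit following each window
        w = X.T @ y / N                                                   # Hebbian weights
        train_error = np.mean(np.sign(X @ w) != y)
        print(f"alpha = P/N = {alpha}: fraction of misclassified training bits = {train_error:.3f}")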

    Do rBST-Free and Organic Milk Stigmatize Conventionally Produced Milk?

    Producers are continually seeking to differentiate their products in the marketplace. A common approach is labeling in which differences in production methods are marketed. Yet positive labeling for the new product has the potential to stigmatize the conventionally produced product by highlighting perceived problems with it. The net economic result can be negative for producers: the conventional product that dominates the market is stigmatized by the new product that has little market share, and consumers decrease their willingness to pay for the conventional product. This experimental research identifies this stigma effect in the case of milk, where the presentation of rBST-Free milk reduces consumers' willingness to purchase conventional milk.

    Genetic attack on neural cryptography

    Different scaling properties for the complexity of bidirectional synchronization and unidirectional learning are essential for the security of neural cryptography. Incrementing the synaptic depth of the networks increases the synchronization time only polynomially, but the success of the geometric attack is reduced exponentially, and the attack clearly fails in the limit of infinite synaptic depth. The geometric attack is improved by adding a genetic algorithm, which selects the fittest neural networks. The probability of a successful genetic attack is calculated for different model parameters using numerical simulations. The results show that scaling laws observed in the case of other attacks hold for the improved algorithm, too. The number of networks needed for an effective attack grows exponentially with increasing synaptic depth. In addition, finite-size effects caused by Hebbian and anti-Hebbian learning are analyzed. These learning rules converge to the random-walk rule if the synaptic depth is small compared to the square root of the system size.
    Comment: 8 pages, 12 figures; section 5 amended, typos corrected
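
    The genetic attack evolves a population of attacking networks and keeps the fittest; the sketch below shows only the single-attacker geometric step it builds on, in the tree-parity-machine conventions of the key-exchange sketch above: when the attacker's output disagrees with the public one, the least confident hidden unit is flipped before the usual Hebbian update. All names and shapes are illustrative.

        import numpy as np

        def geometric_step(wE, x, tau_public, L):
            # wE: (K, N) integer attacker weights, x: (K, N) public +/-1 inputs,
            # tau_public: the output on which A and B agreed, L: weight bound
            h = np.einsum('kn,kn->k', wE, x)           # local fields of the hidden units
            sigma = np.where(h >= 0, 1, -1)
            if int(np.prod(sigma)) != tau_public:
                k = int(np.argmin(np.abs(h)))          # least confident hidden unit
                sigma[k] = -sigma[k]                   # flip it so the attacker's output matches
            mask = sigma == tau_public                 # then imitate the parties' Hebbian update
            wE[mask] = np.clip(wE[mask] + tau_public * x[mask], -L, L)
            return wE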

    Synchronization of unidirectional time delay chaotic networks and the greatest common divisor

    We present the interplay between the synchronization of unidirectionally coupled chaotic nodes with heterogeneous delays and the greatest common divisor (GCD) of the loops composing the oriented graph. In the weak-chaos region and for GCD = 1 the network is in chaotic zero-lag synchronization, whereas for GCD = m > 1 synchronization of m sublattices emerges. Complete synchronization can be achieved when all chaotic nodes are influenced by an identical set of delays, and in particular in the limiting case of homogeneous delays. The results are supported by simulations of chaotic systems, by self-consistent and mixing arguments, and by analytical solutions of Bernoulli maps.
    Comment: 7 pages, 5 figures
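
    A small sketch of the graph-theoretic half of the statement, assuming networkx is available: it collects the directed loops of an oriented graph and takes the GCD of their lengths to predict the number of zero-lag sublattices. The heterogeneous delays and the chaotic dynamics themselves are not simulated, and the example graph is arbitrary.

        import math
        import networkx as nx

        # a directed ring of four nodes plus a back edge, giving loops of length 4 and 2
        g = nx.DiGraph([(0, 1), (1, 2), (2, 3), (3, 0), (2, 1)])
        loop_lengths = [len(c) for c in nx.simple_cycles(g)]
        gcd = math.gcd(*loop_lengths)
        print("loop lengths:", loop_lengths)
        print("GCD =", gcd, "->", "zero-lag synchronization" if gcd == 1 else f"{gcd} sublattices")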