
    Secure exchange of information by synchronization of neural networks

    A connection between the theory of neural networks and cryptography is presented. A new phenomenon, namely the synchronization of neural networks, leads to a new method of exchanging secret messages. Numerical simulations show that two artificial networks trained by a Hebbian learning rule on their mutual outputs develop an antiparallel state of their synaptic weights. The synchronized weights are used to construct an ephemeral key-exchange protocol for the secure transmission of secret data. It is shown that an opponent who knows the protocol and all details of any transmission of the data has no chance to decrypt the secret message, since tracking the weights is a hard problem compared to synchronization. The complexity of generating the secure channel is linear in the size of the network. Comment: 11 pages, 5 figures
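
    The construction described here and in the related abstracts below became known as neural key exchange with tree parity machines. The following is a minimal Python sketch of the commonly described Hebbian variant, in which the two parties end up with identical weights; the paper itself reports an antiparallel final state, and the parameters K, N, L and the raw-bytes key derivation are illustrative choices rather than the paper's.

        import numpy as np

        K, N, L = 3, 100, 3                      # hidden units, inputs per unit, weight bound
        rng = np.random.default_rng(0)

        def tpm_output(w, x):
            """Hidden-unit signs and parity output of a tree parity machine."""
            sigma = np.sign(np.sum(w * x, axis=1))
            sigma[sigma == 0] = -1               # break ties
            return sigma, int(np.prod(sigma))

        def hebbian_update(w, x, sigma, tau):
            """Update only hidden units that agree with the common output; keep weights in [-L, L]."""
            for k in range(K):
                if sigma[k] == tau:
                    w[k] = np.clip(w[k] + sigma[k] * x[k], -L, L)

        wA = rng.integers(-L, L + 1, size=(K, N))
        wB = rng.integers(-L, L + 1, size=(K, N))

        steps = 0
        while not np.array_equal(wA, wB):
            x = rng.choice([-1, 1], size=(K, N)) # public random input
            sA, tA = tpm_output(wA, x)
            sB, tB = tpm_output(wB, x)
            if tA == tB:                         # only the output bits are exchanged
                hebbian_update(wA, x, sA, tA)
                hebbian_update(wB, x, sB, tB)
            steps += 1

        key = wA.astype(np.int8).tobytes()       # identical on both sides after synchronization
        print(f"synchronized after {steps} exchanged inputs, shared secret of {len(key)} bytes")

    In practice the synchronized weights would be hashed into a key of fixed length; they are exported directly here only to keep the sketch short.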

    California College Promise Program Characteristics and Perceptions from the Field

    This report describes College Promise programs in California, including the number of such programs, their features, and the general perceptions held by practitioners, leaders, and policymakers.

    Cryptography based on neural networks - analytical results

    The mutual learning process between two parity feed-forward networks with discrete and continuous weights is studied analytically, and we find that in the case of discrete weights the number of steps required to achieve full synchronization between the two networks is finite. The synchronization process is shown to be non-self-averaging, and the analytical solution is based on random auxiliary variables. The learning time of an attacker that is trying to imitate one of the networks is examined analytically and is found to be much longer than the synchronization time. The analytical results are found to be in agreement with simulations.
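
    A toy way to see why bounded discrete weights can reach full synchronization after finitely many steps is the random-walk caricature below (an illustrative simplification assumed here, not the paper's calculation with random auxiliary variables): pairs of corresponding weights receive common bounded kicks, so their distance never grows and can only shrink at the reflecting boundaries until it reaches zero.

        import numpy as np

        L, N, trials = 3, 100, 200                   # weight bound, weights per network, runs
        rng = np.random.default_rng(1)

        sync_times = []
        for _ in range(trials):
            a = rng.integers(-L, L + 1, size=N)      # discrete weights of the two partners
            b = rng.integers(-L, L + 1, size=N)
            t = 0
            while np.any(a != b):
                kick = rng.choice([-1, 1], size=N)   # common move applied by both partners
                a = np.clip(a + kick, -L, L)         # clipping at the bounds can only shrink |a - b|
                b = np.clip(b + kick, -L, L)
                t += 1
            sync_times.append(t)

        print(f"sync time over {trials} runs: mean {np.mean(sync_times):.0f}, std {np.std(sync_times):.0f}")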

    Nonlocal mechanism for cluster synchronization in neural circuits

    The interplay between the topology of cortical circuits and synchronized activity modes in distinct cortical areas is a key enigma in neuroscience. We present a new nonlocal mechanism governing the periodic activity mode: the greatest common divisor (GCD) of the network loops. For a stimulus to one node, the network splits into GCD clusters in which cluster neurons are in zero-lag synchronization. For complex external stimuli, the number of clusters can be any common divisor. The synchronized mode and the transients to synchronization pinpoint the type of external stimuli. The findings, supported by an information-mixing argument and simulations of Hodgkin-Huxley population dynamic networks with unidirectional connectivity and synaptic noise, call for reexamining the sources of correlated activity in cortex and shorter information-processing time scales. Comment: 8 pages, 6 figures
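
    The GCD prediction itself is easy to experiment with. Below is a small sketch (using networkx, with a made-up toy circuit rather than any network from the paper) that collects the lengths of the directed loops and takes their greatest common divisor, which according to the abstract sets the number of zero-lag clusters for a single-node stimulus; for complex stimuli any common divisor of the loop lengths is possible.

        from functools import reduce
        from math import gcd

        import networkx as nx

        # Toy directed circuit with two loops, of lengths 4 and 6 (hypothetical topology).
        edges = [(0, 1), (1, 2), (2, 3), (3, 0),             # loop 0-1-2-3, length 4
                 (1, 4), (4, 5), (5, 6), (6, 7), (7, 0)]     # loop 0-1-4-5-6-7, length 6
        G = nx.DiGraph(edges)

        loop_lengths = sorted(len(cycle) for cycle in nx.simple_cycles(G))
        n_clusters = reduce(gcd, loop_lengths)

        print("directed loop lengths:", loop_lengths)
        print("GCD = predicted number of zero-lag clusters:", n_clusters)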

    Mutual learning in a tree parity machine and its application to cryptography

    Mutual learning of a pair of tree parity machines with continuous and discrete weight vectors is studied analytically. The analysis is based on a mapping procedure that maps the mutual learning in tree parity machines onto mutual learning in noisy perceptrons. The stationary solution of the mutual learning in the case of continuous tree parity machines depends on the learning rate, and a phase transition from partial to full synchronization is observed. In the discrete case the learning process is based on a finite increment, and a fully synchronized state is achieved in a finite number of steps. The synchronization of discrete parity machines is introduced in order to construct an ephemeral key-exchange protocol. The dynamic learning of a third tree parity machine (an attacker) that tries to imitate one of the two machines while the two still update their weight vectors is also analyzed. In particular, the synchronization times of the naive attacker and of the flipping attacker recently introduced in [1] are examined. All analytical results are found to be in good agreement with simulation results.
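
    To make the partner-versus-attacker comparison concrete, the sketch below pits two synchronizing tree parity machines against a passive eavesdropper. The attacker model assumed here is one common reading of the naive attack: the eavesdropper applies the partners' update rule only in rounds where its own output already matches the public output bit; the flipping attacker of [1] is not implemented, and the parameter values are again illustrative.

        import numpy as np

        K, N, L, MAX_STEPS = 3, 50, 3, 100_000
        rng = np.random.default_rng(2)

        def output(w, x):
            s = np.sign(np.sum(w * x, axis=1))
            s[s == 0] = -1
            return s, int(np.prod(s))

        def update(w, x, s, tau):
            for k in range(K):
                if s[k] == tau:
                    w[k] = np.clip(w[k] + s[k] * x[k], -L, L)

        wA, wB, wE = (rng.integers(-L, L + 1, size=(K, N)) for _ in range(3))

        t_partners = t_attacker = None
        for t in range(1, MAX_STEPS + 1):
            x = rng.choice([-1, 1], size=(K, N))
            sA, tA = output(wA, x)
            sB, tB = output(wB, x)
            sE, tE = output(wE, x)
            if tA == tB:                          # partners keep updating even after they agree
                update(wA, x, sA, tA)
                update(wB, x, sB, tB)
                if tE == tA:                      # naive attacker: learns only when it already agrees
                    update(wE, x, sE, tE)
            if t_partners is None and np.array_equal(wA, wB):
                t_partners = t
            if t_attacker is None and np.array_equal(wE, wA):
                t_attacker = t
            if t_partners is not None and t_attacker is not None:
                break

        attacker_report = t_attacker if t_attacker is not None else f"not within {MAX_STEPS} steps"
        print(f"partners synchronized at step {t_partners}, attacker: {attacker_report}")

    In typical runs the partners agree long before the attacker does, which is the gap the abstract quantifies analytically.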

    On time's arrow in Ehrenfest models with reversible deterministic dynamics

    We introduce a deterministic, time-reversible version of the Ehrenfest urn model. The distribution of first-passage times from equilibrium to non-equilibrium states and vice versa is calculated. We find that the average times for the transition to non-equilibrium always scale exponentially with the system size, whereas the time scale for relaxation to equilibrium depends on the microscopic dynamics. To illustrate this, we also look at deterministic and stochastic versions of the Ehrenfest model with a distribution of microscopic relaxation times. Comment: 6 pages, 7 figures, RevTeX
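
    For contrast with the deterministic variant introduced in the paper, the sketch below simulates the textbook stochastic Ehrenfest urn (an assumption of this illustration, not the authors' model) and measures the first-passage time from the even split to a fixed fractional imbalance, which grows steeply with the system size, in line with the exponential scaling noted in the abstract.

        import numpy as np

        rng = np.random.default_rng(0)

        def mean_first_passage(N, imbalance=0.25, trials=10):
            """Average steps from the even split until 75% of the balls sit in one designated urn."""
            target = int((0.5 + imbalance) * N)
            times = []
            for _ in range(trials):
                k, t = N // 2, 0                  # k = number of balls in urn A
                while k < target:
                    # a uniformly chosen ball switches urns: it came from A with probability k/N
                    k += -1 if rng.random() < k / N else 1
                    t += 1
                times.append(t)
            return np.mean(times)

        for N in (16, 32, 48):
            print(f"N = {N:2d}: mean first-passage time to 75% imbalance ~ {mean_first_passage(N):,.0f} steps")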

    The most creative organization in the world? The BBC, 'creativity' and managerial style

    The managerial styles of two BBC directors-general, John Birt and Greg Dyke, have often been contrasted but not so far analysed from the perspective of their different views of 'creative management'. This article first addresses the orthodox reading of 'Birtism'; second, it locates Dyke's 'creative' turn in the wider context of fashionable neo-management theory and UK government creative industries policy; third, it details Dyke's drive to change the BBC's culture; and finally, it concludes with some reflections on the uncertainties inherent in managing a creative organisation.

    Training a perceptron in a discrete weight space

    On-line and batch learning of a perceptron in a discrete weight space, where each weight can take $2L+1$ different values, are examined analytically and numerically. The learning algorithm is based on training a continuous perceptron and predicting with the clipped weights. The learning is described by a new set of order parameters, composed of the overlaps between the teacher and the continuous/clipped students. Different scenarios are examined, among them on-line learning with discrete/continuous transfer functions and off-line Hebb learning. The generalization error of the clipped weights decays asymptotically as $\exp(-K\alpha^2)$ / $\exp(-e^{|\lambda|\alpha})$ in the case of on-line learning with binary/continuous activation functions, respectively, where $\alpha$ is the number of examples divided by $N$, the size of the input vector, and $K$ is a positive constant that decays linearly with $1/L$. For finite $N$ and $L$, perfect agreement between the discrete student and the teacher is obtained for $\alpha \propto \sqrt{L \ln(NL)}$. A crossover to the generalization error $\propto 1/\alpha$, characteristic of continuous weights with binary output, is obtained for synaptic depth $L > O(\sqrt{N})$. Comment: 10 pages, 5 figures, submitted to PR
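
    A naive version of the clipped-perceptron idea is easy to simulate. In the sketch below a teacher with weights drawn from {-L, ..., L} labels random +/-1 inputs, a continuous student is trained with an on-line Hebb rule, and predictions are also made with the clipped (rescaled and rounded) student; the rescaling by the RMS of a uniform discrete prior is an assumption of this sketch rather than the paper's procedure, so the quoted decay rates should not be read off from it.

        import numpy as np

        rng = np.random.default_rng(0)
        N, L = 100, 2
        teacher = rng.integers(-L, L + 1, size=N)          # discrete teacher weights

        def gen_error(w, n_test=4000):
            """Monte Carlo generalization error against the teacher on fresh random +/-1 inputs."""
            x = rng.choice([-1, 1], size=(n_test, N))
            return np.mean(np.sign(x @ w) != np.sign(x @ teacher))

        J = np.zeros(N)                                     # continuous student
        target_rms = np.sqrt(L * (L + 1) / 3.0)             # RMS of weights uniform on {-L, ..., L}

        for p in range(1, 20001):
            x = rng.choice([-1, 1], size=N)
            y = np.sign(x @ teacher)
            J += y * x / np.sqrt(N)                         # on-line Hebb step
            if p % 4000 == 0:
                scale = target_rms / (np.sqrt(np.mean(J**2)) + 1e-12)
                J_clipped = np.clip(np.round(scale * J), -L, L)
                print(f"alpha = {p / N:5.0f}: continuous error {gen_error(J):.3f}, "
                      f"clipped error {gen_error(J_clipped):.3f}")

    With these settings the clipped student eventually overtakes the continuous one, in the spirit of the faster decay quoted above.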

    Interplay of composition, structure, magnetism, and superconductivity in SmFeAs$_{1-x}$P$_x$O$_{1-y}$

    Polycrystalline samples and single crystals of SmFeAs$_{1-x}$P$_x$O$_{1-y}$ were synthesized and grown employing different synthesis methods and annealing conditions. Depending on the phosphorus and oxygen content, the samples are either magnetic or superconducting. In the fully oxygenated compounds the main impact of phosphorus substitution is to suppress the Néel temperature $T_N$ of the spin density wave (SDW) state, and to strongly reduce the local magnetic field in the SDW state, as deduced from muon spin rotation measurements. On the other hand, the superconducting state is observed in the oxygen-deficient samples only after heat treatment under high pressure. Oxygen deficiency as a result of synthesis at high pressure brings the Sm-O layer closer to the superconducting As/P-Fe-As/P block and provides additional electron transfer. Interestingly, the structural modifications in response to this variation of the electron count are significantly different when phosphorus partly substitutes for arsenic. Point-contact spectra are well described with two superconducting gaps. Magnetic and resistance measurements on single crystals indicate an in-plane magnetic penetration depth of 200 nm and an anisotropy of the upper critical field slope of 4-5. PACS number(s): 74.70.Xa, 74.62.Bf, 74.25.-q, 81.20.-n. Comment: 36 pages, 13 figures, 2 tables