Complex bit sequences generated by a perceptron that learns the opposite of its own prediction are studied, and the success rate of a student perceptron trained on this sequence is calculated. A system of interacting perceptrons with a directed flow of information is solved analytically. A symmetry-breaking phase transition is found with increasing learning rate. A system of interacting perceptrons can develop a good strategy for the problem of adaptive competition known as the minority game.

Simple models of neural networks describe a wide variety of phenomena in neurobiology and information theory. Neural networks are systems of elements interacting by adaptive couplings which are trained on a set of examples. After training they function as content-addressable associative memories, as classifiers, or as prediction algorithms. Using methods of statistical physics, many of these phenomena have been elucidated analytically for infinitely large single neural networks. Many phenomena in biology, social science and computer science may be modelled by systems of interacting networks.
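The bit generator described in the abstract can be sketched as follows: a perceptron predicts the next bit of the sequence from a window of recent bits, the opposite of its prediction is appended to the sequence, and the weights are moved toward that opposite bit. This is a minimal illustration under assumed parameters (window size, learning rate, and the Hebbian-style update are choices for the sketch, not values taken from the paper).

```python
import numpy as np

def anti_predictable_sequence(n_inputs=20, lr=1.0, steps=200, seed=0):
    """Generate a bit sequence from a perceptron that learns the
    opposite of its own prediction (sketch, assumed parameters)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(n_inputs)          # perceptron weights
    x = rng.choice([-1.0, 1.0], size=n_inputs) # window of recent bits
    bits = []
    for _ in range(steps):
        pred = np.sign(w @ x) or 1.0   # perceptron's prediction (+-1)
        bit = -pred                    # the sequence takes the opposite bit
        w += (lr / n_inputs) * bit * x # move weights toward the emitted bit
        x = np.roll(x, 1)              # shift the input window
        x[0] = bit
        bits.append(int(bit))
    return bits

sequence = anti_predictable_sequence()
```

A student perceptron trained on `sequence` would then be scored by how often its own prediction of the next bit is correct, which is the success rate referred to above.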