
    A Neural Network model with Bidirectional Whitening

    We present a new model and algorithm that performs efficient natural gradient descent for multilayer perceptrons. Natural gradient descent was originally proposed from the viewpoint of information geometry, and it performs steepest-descent updates on manifolds in a Riemannian space. In particular, we extend the approach taken by the "Whitened neural networks" model: we apply the whitening process not only in the feed-forward direction, as in the original model, but also in the back-propagation phase. The efficacy of this "Bidirectional whitened neural networks" model is demonstrated on a handwritten character recognition task (the MNIST data).
    Comment: 16 pages
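
    A minimal numerical sketch of the idea (not the authors' implementation): whiten a layer's input activations, as in the original whitened neural networks, and additionally whiten the backpropagated error signals. All array names and sizes below are invented for illustration.

```python
import numpy as np

def whitening_matrix(a, eps=1e-8):
    """ZCA-style whitening transform estimated from the rows of a (samples x features)."""
    cov = np.cov(a, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    return vecs @ np.diag(1.0 / np.sqrt(vals + eps)) @ vecs.T

rng = np.random.default_rng(0)

# Feed-forward whitening, as in the original whitened neural networks:
# decorrelate the activations entering a layer.
h = rng.normal(size=(256, 32)) @ rng.normal(size=(32, 32))  # correlated activations
h_white = (h - h.mean(axis=0)) @ whitening_matrix(h)

# Backward whitening, the paper's extension: apply the same transform to the
# error signals flowing through the layer during back-propagation.
delta = rng.normal(size=(256, 32)) @ rng.normal(size=(32, 32))
delta_white = (delta - delta.mean(axis=0)) @ whitening_matrix(delta)

# Both whitened matrices now have (approximately) identity covariance.
print(np.allclose(np.cov(h_white, rowvar=False), np.eye(32), atol=1e-2))
```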

    Bootstrap for neural model selection

    Bootstrap techniques (also called resampling computation techniques) have introduced new advances in modeling and model evaluation. Using resampling methods to construct a series of new samples based on the original data set makes it possible to estimate the stability of the parameters. Properties such as convergence and asymptotic normality can be checked for any particular observed data set. In most cases, the statistics computed on the generated data sets give a good idea of the confidence regions of the estimates. In this paper, we discuss the contribution of such methods to model selection in the case of feedforward neural networks. The method is described and compared with the leave-one-out resampling method. The effectiveness of the bootstrap method, versus the leave-one-out method, is demonstrated through a number of examples.
    Comment: Following the ESANN 200 conference
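
    As a rough illustration of the bootstrap procedure described above (not the paper's exact protocol), the sketch below refits a small feedforward network on resampled versions of a toy data set and scores each refit on the out-of-bag points; the data, network size, and number of resamples are arbitrary choices for the example.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)

def bootstrap_errors(X, y, n_boot=50):
    """Refit on samples drawn with replacement; score each refit on its out-of-bag points."""
    errors, n = [], len(X)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)       # resample with replacement
        oob = np.setdiff1d(np.arange(n), idx)  # points left out of this resample
        if len(oob) == 0:
            continue
        net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        net.fit(X[idx], y[idx])
        errors.append(np.mean((net.predict(X[oob]) - y[oob]) ** 2))
    return np.array(errors)

# The spread of the bootstrap errors indicates the stability of the fitted model.
errs = bootstrap_errors(X, y)
print(f"bootstrap generalization error: {errs.mean():.3f} +/- {errs.std():.3f}")
```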

    Error correcting code using tree-like multilayer perceptron

    An error correcting code using a tree-like multilayer perceptron is proposed. An original message $\mathbf{s}^0$ is encoded into a codeword $\mathbf{y}_0$ using a tree-like committee machine (committee tree) or a tree-like parity machine (parity tree). Based on these architectures, several schemes featuring monotonic or non-monotonic units are introduced. The codeword $\mathbf{y}_0$ is then transmitted via a binary asymmetric channel (BAC), where it is corrupted by noise. The analytical performance of these schemes is investigated using the replica method of statistical mechanics. Under some specific conditions, some of the proposed schemes are shown to saturate the Shannon bound in the infinite codeword length limit. The influence of the monotonicity of the units on the performance is also discussed.
    Comment: 23 pages, 3 figures; content has been extended and revised
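
    A rough sketch of how a parity tree can map a message to a codeword, under the convention common in perceptron-based codes that the message plays the role of the weight vector and the input vectors are shared randomness known to sender and receiver; the sizes and this exact convention are assumptions for illustration, not taken verbatim from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 15, 3, 48                      # message length, tree branches, codeword length
s0 = rng.choice([-1, 1], size=N)         # original message s^0 (+/-1 bits)
X = rng.choice([-1, 1], size=(M, N))     # shared random inputs, one row per codeword bit

def parity_tree(s, X, K):
    """Each codeword bit is the product of K branch perceptron outputs,
    with each branch seeing a disjoint block of the message/weights
    (odd block size keeps every dot product nonzero)."""
    blocks_s = np.split(s, K)
    blocks_x = np.split(X, K, axis=1)
    branch_out = [np.sign(xb @ sb) for sb, xb in zip(blocks_s, blocks_x)]
    return np.prod(branch_out, axis=0)

y0 = parity_tree(s0, X, K)               # codeword to transmit over the noisy channel
print(y0[:10])
```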

    Phase Transitions of Neural Networks

    The cooperative behaviour of interacting neurons and synapses is studied using models and methods from statistical physics. The competition between training error and entropy may lead to discontinuous properties of the neural network. This is demonstrated for a few examples: perceptron, associative memory, learning from examples, generalization, multilayer networks, structure recognition, Bayesian estimation, on-line training, noise estimation, and time series generation.
    Comment: Plenary talk for the MINERVA workshop on mesoscopics, fractals and neural networks, Eilat, March 1997; Postscript file
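
    As a toy illustration of how two competing terms can make an order parameter jump discontinuously (a first-order transition), the snippet below minimizes a Landau-type free energy over a grid; the polynomial form is invented for the example and is not the free energy analyzed in the talk.

```python
import numpy as np

# Toy free energy with competing quadratic/quartic/sextic terms; as the
# control parameter t drops below 0.5, the global minimum jumps from
# q* = 0 to q* > 1 instead of moving continuously.
q = np.linspace(0.0, 1.5, 2001)
for t in (0.6, 0.5, 0.4, 0.3):
    f = t * q**2 - q**4 + 0.5 * q**6
    print(f"t = {t}: optimal order parameter q* = {q[np.argmin(f)]:.2f}")
```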

    Replica Symmetry Breaking and the Kuhn-Tucker Cavity Method in simple and multilayer Perceptrons

    Within the Kuhn-Tucker cavity method introduced in a former paper, we study optimal stability learning for situations where, in the replica formalism, the replica symmetry may be broken, namely (i) the case of a simple perceptron above the critical loading, and (ii) the case of two-layer AND-perceptrons, if one learns with maximal stability. We find that the deviation of our cavity solution from the replica-symmetric one in these cases is a clear indication of the necessity of replica symmetry breaking. In any case, the cavity solution tends to underestimate the storage capabilities of the networks.
    Comment: 32 pages, LaTeX source with 9 .eps files enclosed; accepted by J. Phys. I (France)
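
    For readers unfamiliar with optimal stability learning, the sketch below runs a MinOver-style update (in the spirit of Krauth and Mezard) that numerically maximizes the minimal stability of a simple perceptron; the paper itself treats this problem analytically via the cavity method, and the pattern counts below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
P, N = 30, 20                            # patterns and input dimension (loading alpha = 1.5)
X = rng.normal(size=(P, N))              # random patterns
y = rng.choice([-1, 1], size=P)          # random target labels

# MinOver-style learning: always reinforce the currently least stable pattern,
# which drives the weights toward the maximal-stability solution.
w = np.zeros(N)
for _ in range(5000):
    stabilities = y * (X @ w)
    mu = np.argmin(stabilities)          # least stable pattern
    w += y[mu] * X[mu] / N

kappa = np.min(y * (X @ w)) / np.linalg.norm(w)
print(f"stability margin kappa = {kappa:.3f}")
```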