
    Functional Optimisation of Online Algorithms in Multilayer Neural Networks

    We study the online dynamics of learning in fully connected soft committee machines in the student-teacher scenario. The locally optimal modulation function, which determines the learning algorithm, is obtained from a variational argument in such a manner as to maximise the average generalisation error decay per example. Simulation results for the resulting algorithm are presented for a few cases. The symmetric phase plateaux are found to be vastly reduced in comparison to those found when online backpropagation algorithms are used. A discussion of the implementation of these ideas as practical algorithms is given.
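    As a rough illustration of the setup this abstract describes, the sketch below runs plain online backpropagation (the baseline the paper improves on) in a soft committee machine under the student-teacher scenario. All dimensions, rates and names are illustrative assumptions; the paper's variationally optimal modulation function, which would replace the backpropagation term `delta` below, is not reproduced here.

```python
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)
N, K, M, eta = 100, 2, 2, 1.0                  # input dim, student/teacher units, rate

g  = lambda h: erf(h / np.sqrt(2))             # soft-committee activation
dg = lambda h: np.sqrt(2 / np.pi) * np.exp(-h ** 2 / 2)

B = rng.standard_normal((M, N)) / np.sqrt(N)   # fixed teacher, rows of norm ~1
W = rng.standard_normal((K, N)) / np.sqrt(N)   # student, random initial overlap

for _ in range(200_000):
    xi = rng.standard_normal(N)                # fresh i.i.d. example: online setting
    h = W @ xi                                 # student local fields
    err = g(B @ xi).sum() - g(h).sum()         # teacher output minus student output
    delta = err * dg(h)                        # backprop modulation; the paper derives
                                               # an optimal replacement for this term
    W += (eta / N) * np.outer(delta, xi)       # online gradient step

# Monte Carlo estimate of the generalisation error after training
xs = rng.standard_normal((10_000, N))
eg = 0.5 * np.mean((g(xs @ B.T).sum(axis=1) - g(xs @ W.T).sum(axis=1)) ** 2)
print(f"estimated generalisation error: {eg:.4f}")
```

    The symmetric phase plateau the abstract mentions shows up in such runs as a long stretch where the generalisation error barely moves because the student's hidden units have not yet specialised to distinct teacher units.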

    Coherent 100G Nonlinear Compensation with Single-Step Digital Backpropagation

    Enhanced-SSFM digital backpropagation (DBP) is experimentally demonstrated and compared to conventional DBP. A 112 Gb/s PM-QPSK signal is transmitted over a 3200 km dispersion-unmanaged link. The intradyne coherent receiver includes single-step digital backpropagation based on the enhanced-SSFM algorithm; in comparison, conventional DBP requires twenty steps to achieve the same performance. An analysis of the computational complexity and structure of the two algorithms reveals that the overall complexity and power consumption of DBP are reduced by a factor of 16 with respect to a conventional implementation, while the computation time is reduced by a factor of 20. As a result, the proposed algorithm enables a practical and effective implementation of DBP in real-time optical receivers, with only a moderate increase of the computational complexity, power consumption, and latency with respect to a simple feed-forward equalizer for dispersion compensation.
    Comment: This work has been presented at Optical Networks Design & Modeling (ONDM) 2015, Pisa, Italy, May 11-14, 2015
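    For context, here is a minimal sketch of conventional split-step digital backpropagation, the baseline against which the single-step enhanced-SSFM receiver is compared: the received field is propagated through a virtual fiber with inverted dispersion and Kerr nonlinearity. Parameter values, units and sign conventions are assumptions, loss handling over the multi-span link is simplified, and the enhanced-SSFM nonlinear step itself is not reproduced.

```python
import numpy as np

def dbp(rx, fs, length_km, n_steps, beta2=-21.7e-24, gamma=1.3, alpha_db=0.2):
    """Conventional split-step DBP sketch (assumed parameters).

    rx: received complex baseband samples; fs: sampling rate (Hz);
    beta2 in s^2/km, gamma in 1/(W km), alpha_db in dB/km.
    """
    w = 2 * np.pi * np.fft.fftfreq(rx.size, d=1 / fs)    # angular frequencies (rad/s)
    alpha = alpha_db * np.log(10) / 10                   # dB/km -> 1/km
    dz = length_km / n_steps                             # back-propagation step size
    half_d = np.exp(1j * 0.5 * beta2 * w ** 2 * dz / 2)  # inverse dispersion, half step
    l_eff = (1 - np.exp(-alpha * dz)) / alpha            # effective nonlinear length
    a = rx.astype(complex)
    for _ in range(n_steps):                             # symmetric split-step loop
        a = np.fft.ifft(np.fft.fft(a) * half_d)          # linear half step
        a = a * np.exp(-1j * gamma * l_eff * np.abs(a) ** 2)  # inverse Kerr phase
        a = np.fft.ifft(np.fft.fft(a) * half_d)          # linear half step
    return a
```

    Calling this with n_steps=1 plays the role of the single-step configuration discussed in the abstract, whereas a conventional receiver would use many more steps (twenty in the reported experiment) to reach comparable accuracy.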

    Comparative performance of some popular ANN algorithms on benchmark and function approximation problems

    We report an inter-comparison of some popular algorithms within the artificial neural network domain (viz., local search algorithms, global search algorithms, higher-order algorithms and hybrid algorithms), applying them to standard benchmark problems such as the IRIS data, XOR/N-bit parity and the Two Spiral problem. Apart from giving a brief description of these algorithms, the results obtained for the above benchmark problems are presented in the paper. The results suggest that while the Levenberg-Marquardt algorithm yields the lowest RMS error for the N-bit parity and Two Spiral problems, the Higher Order Neurons algorithm gives the best results for the IRIS data problem. The best results for the XOR problem are obtained with the Neuro Fuzzy algorithm. The above algorithms were also applied to several regression problems such as cos(x) and a few special functions like the Gamma function, the complementary error function and the upper-tail cumulative $\chi^2$-distribution function. The results of these regression problems indicate that, among all the ANN algorithms used in the present study, the Levenberg-Marquardt algorithm yields the best results. Keeping in view the highly non-linear behaviour and the wide dynamic range of these functions, it is suggested that they can also be considered as standard benchmark problems for function approximation using artificial neural networks.
    Comment: 18 pages, 5 figures. Accepted in Pramana - Journal of Physics
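    As an illustration of the kind of function-approximation benchmark described above, the sketch below fits a small one-hidden-layer network to cos(x) with the Levenberg-Marquardt algorithm, here via scipy.optimize.least_squares with method="lm". The network size, data range and initialisation are assumptions, not the paper's settings.

```python
import numpy as np
from scipy.optimize import least_squares

H = 8                                            # hidden units (illustrative choice)

def net(p, x):
    """One-hidden-layer tanh network: f(x) = sum_j v_j * tanh(w_j * x + b_j)."""
    w, b, v = np.split(p, 3)                     # p holds H weights, biases, outputs
    return np.tanh(np.outer(x, w) + b) @ v

x_train = np.linspace(-np.pi, np.pi, 200)        # regression target: cos(x)
y_train = np.cos(x_train)

def residuals(p):
    return net(p, x_train) - y_train

rng = np.random.default_rng(0)
p0 = 0.5 * rng.standard_normal(3 * H)            # random initial parameters
sol = least_squares(residuals, p0, method="lm")  # Levenberg-Marquardt solver
rms = np.sqrt(np.mean(sol.fun ** 2))             # RMS error, the paper's metric
print(f"RMS error on cos(x): {rms:.2e}")
```

    Levenberg-Marquardt suits this setting because the training objective is a sum of squared residuals, exactly the form the algorithm exploits; the same residual-based setup extends to the special functions named in the abstract.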