
    Parallelizing Feed-Forward Artificial Neural Networks on Transputers

    This thesis is about parallelizing the training phase of a feed-forward artificial neural network. More specifically, we develop and analyze a number of parallelizations of the widely used neural-net learning algorithm called back-propagation. We describe two different strategies for parallelizing the back-propagation algorithm. A number of parallelizations employing these strategies have been implemented on a system of 48 transputers, permitting us to evaluate and analyze their performance based on the results of actual runs.
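    One common strategy for parallelizing back-propagation is training-set (data) parallelism: each processor computes gradients on its own shard of the training batch, and the shard gradients are summed before the weight update. The sketch below illustrates this idea on a toy linear model; all names are illustrative assumptions, not taken from the thesis, and real transputer implementations would exchange gradients over communication links rather than in shared memory.

    ```python
    # Illustrative sketch of training-set parallelism for gradient-based learning.
    # Each simulated "worker" handles a shard of the batch; shard gradients are
    # summed, which reproduces the full-batch gradient exactly.

    def gradient(w, xs, ys):
        """Gradient of squared error for a linear model y = w*x over (xs, ys)."""
        return sum(2 * (w * x - y) * x for x, y in zip(xs, ys))

    def parallel_gradient(w, xs, ys, n_workers):
        """Partition the batch across n_workers shards and sum shard gradients."""
        shards = [(xs[i::n_workers], ys[i::n_workers]) for i in range(n_workers)]
        return sum(gradient(w, sx, sy) for sx, sy in shards)

    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.0, 4.0, 6.0, 8.0]
    w = 0.5
    ```

    Because gradients are additive over training examples, the parallel result matches the serial full-batch gradient, which is what makes this decomposition attractive on message-passing hardware such as transputer networks.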