Mathematical modeling of hidden layer architecture in artificial neural networks

Abstract

The performance of a multilayer artificial neural network depends heavily on the architecture of its hidden layers. Therefore, modeling the hidden layer architecture has become a research challenge. At present, most models of hidden layer architecture are confined to neural networks with a single hidden layer. However, this approach may not be the most appropriate solution for a given task. In this research we present an approach to model hidden layer architectures with an arbitrary number of layers and neurons. We also present an approach to trim the hidden layer architecture during the training cycle while still meeting a pre-defined error rate. The experiments show that the new theory can train artificial neural networks in less training time, using a simpler architecture that maintains the same error rate as backpropagation.
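
To illustrate the general idea of trimming hidden units during training while a pre-defined error rate is maintained, the following is a minimal sketch only, not the authors' method: it uses simple magnitude-based pruning of a one-hidden-layer network on toy XOR data, and the network size, learning rate, and error threshold are illustrative assumptions.

```python
# Sketch: greedily remove the hidden unit with the smallest outgoing weight
# magnitude, retrain briefly, and keep the cut only if the pre-defined error
# rate still holds. All hyperparameters here are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    H = sigmoid(X @ W1 + b1)   # hidden activations
    Y = sigmoid(H @ W2 + b2)   # output activations
    return H, Y

def error_rate(X, t, W1, b1, W2, b2):
    _, Y = forward(X, W1, b1, W2, b2)
    return np.mean((Y > 0.5).astype(float) != t)

def train(X, t, W1, b1, W2, b2, lr=1.0, epochs=5000):
    # Plain full-batch backpropagation with a squared-error loss.
    for _ in range(epochs):
        H, Y = forward(X, W1, b1, W2, b2)
        dY = (Y - t) * Y * (1 - Y)          # output-layer delta
        dH = (dY @ W2.T) * H * (1 - H)      # hidden-layer delta
        W2 -= lr * H.T @ dY;  b2 -= lr * dY.sum(axis=0)
        W1 -= lr * X.T @ dH;  b1 -= lr * dH.sum(axis=0)
    return W1, b1, W2, b2

# Toy data: XOR needs only 2 hidden units, so extra units are prunable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden, target_error = 8, 0.0              # assumed starting size and target
W1 = rng.normal(size=(2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_hidden, 1)); b2 = np.zeros(1)
W1, b1, W2, b2 = train(X, t, W1, b1, W2, b2)

while W1.shape[1] > 1:
    k = np.argmin(np.abs(W2).sum(axis=1))    # weakest hidden unit
    cand = (np.delete(W1, k, axis=1), np.delete(b1, k),
            np.delete(W2, k, axis=0), b2.copy())
    cand = train(X, t, *cand, epochs=1000)   # short retraining after the cut
    if error_rate(X, t, *cand) <= target_error:
        W1, b1, W2, b2 = cand                # accept the smaller architecture
    else:
        break                                # error target violated; stop

print("hidden units kept:", W1.shape[1],
      "error rate:", error_rate(X, t, W1, b1, W2, b2))
```

The design choice shown here, pruning only when the error target is still met, mirrors the abstract's goal of reaching a simpler architecture without sacrificing the error rate; the actual model in the paper covers an arbitrary number of hidden layers, which this sketch does not.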
