
    A BENCHMARK FOR THE RATE OF CONVERGENCE IN NEURAL NETWORK CLASSIFICATION ALGORITHMS

    The purpose of this paper is to present a new benchmark for comparing the rate of convergence of neural network classification algorithms. The benchmark produces datasets with controllable complexity that can be used as input to a classification algorithm. The dataset generator uses random numbers and linear normalization to generate the data. In the case of a one-layer perceptron, the generated datasets are sensitive to the weight or bias of the perceptron. A MATLAB-implemented algorithm analyzed the sample datasets and the benchmark results. The results show that the convergence time varies with selected specifications of the generated dataset. This benchmark and the generated datasets can be used by researchers working on neural network algorithms who need a straightforward and flexible dataset for examining and evaluating the efficiency of neural network classification algorithms.
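    The abstract describes generating random, linearly normalized datasets whose labels depend on a reference perceptron's weight and bias, and then measuring how long a perceptron takes to converge on them. The following is a minimal Python sketch of that general idea, not the authors' actual benchmark; every function name, parameter, and value here is an assumption for illustration only.

    ```python
    # Illustrative sketch (assumed, not the paper's implementation): build a random,
    # linearly normalized binary dataset from a chosen weight and bias, then count
    # the epochs a one-layer perceptron needs to converge on it.
    import numpy as np

    def generate_dataset(n_samples=500, n_features=2, weight=None, bias=0.0, seed=0):
        """Random features, linearly normalized to [0, 1]; labels from a reference perceptron."""
        rng = np.random.default_rng(seed)
        X = rng.uniform(-10.0, 10.0, size=(n_samples, n_features))
        # Linear (min-max) normalization of each feature to the [0, 1] range.
        X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
        if weight is None:
            weight = np.ones(n_features)
        # Labels depend on the chosen weight and bias, so dataset difficulty can be
        # controlled by moving the separating hyperplane.
        y = np.where(X @ weight + bias > 0, 1, -1)
        return X, y

    def perceptron_epochs_to_converge(X, y, lr=0.1, max_epochs=1000):
        """Train a one-layer perceptron; return the epoch count once a full pass makes no updates."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for epoch in range(1, max_epochs + 1):
            updates = 0
            for xi, yi in zip(X, y):
                if yi * (xi @ w + b) <= 0:      # misclassified sample
                    w += lr * yi * xi
                    b += lr * yi
                    updates += 1
            if updates == 0:                    # converged: no errors in this pass
                return epoch
        return max_epochs

    X, y = generate_dataset(weight=np.array([1.0, -2.0]), bias=0.3)
    print("epochs to converge:", perceptron_epochs_to_converge(X, y))
    ```

    Changing the weight, bias, or sample count alters the generated dataset and, in turn, the measured convergence time, which is the kind of sensitivity the abstract reports.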