
    Adaptive Moment Estimation To Minimize Square Error In Backpropagation Algorithm

    The back-propagation algorithm for training artificial neural networks has well-known weaknesses: slow convergence of the gradient-descent error function, long training times, over-fitting, and a tendency to get stuck in local optima. Back-propagation minimizes the error at each iteration. This paper investigates and evaluates the performance of Adaptive Moment Estimation (ADAM) in minimizing the squared error of the back-propagation gradient-descent algorithm. ADAM can accelerate training, adapt to changes in the system, and optimize many parameters at low computational cost. The results of the study indicate that adaptive moment estimation can minimize the squared error at the output of the neural network.
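    The standard ADAM update the abstract refers to can be sketched as follows. This is a minimal illustration on a one-dimensional squared-error objective, not the paper's experiment; the learning rate, decay rates beta1/beta2, and the toy target value 3.0 are assumptions chosen for demonstration.

    ```python
    import numpy as np

    def adam_step(theta, grad, m, v, t, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
        """One ADAM update: exponential moving averages of the gradient (m)
        and squared gradient (v), with bias correction at step t (t >= 1)."""
        m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
        v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
        m_hat = m / (1 - beta1 ** t)                # bias-corrected moments
        v_hat = v / (1 - beta2 ** t)
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v

    # Toy squared-error objective: f(theta) = (theta - 3)^2, gradient 2*(theta - 3)
    theta, m, v = 0.0, 0.0, 0.0
    for t in range(1, 5001):
        grad = 2.0 * (theta - 3.0)
        theta, m, v = adam_step(theta, grad, m, v, t)

    print(theta)  # converges toward the minimizer at 3.0
    ```

    Because the step size is normalized by the second-moment estimate, each parameter effectively gets its own adaptive learning rate, which is what lets ADAM handle many parameters with little per-step tuning.
    
    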