
    Direct Feedback Alignment with Sparse Connections for Local Learning

    Recent advances in deep neural networks (DNNs) owe their success to training algorithms that use backpropagation and gradient descent. Backpropagation, while highly effective on von Neumann architectures, becomes inefficient when scaling to large networks. Commonly referred to as the weight transport problem, each neuron's dependence on the weights and errors located deeper in the network requires exhaustive data movement, which presents a key obstacle to enhancing the performance and energy efficiency of machine-learning hardware. In this work, we propose a bio-plausible alternative to backpropagation, drawing from advances in feedback alignment algorithms, in which the error computation at a single synapse reduces to the product of three scalar values. Using a sparse feedback matrix, we show that a neuron needs only a fraction of the information previously used by feedback alignment algorithms. Consequently, memory and compute can be partitioned and distributed in whichever way produces the most efficient forward pass, so long as a single error can be delivered to each neuron. Our results show orders-of-magnitude improvement in data movement and a 2× improvement in multiply-and-accumulate operations over backpropagation. Like previous work, we observe that any variant of feedback alignment suffers significant losses in classification accuracy on deep convolutional neural networks. By transferring trained convolutional layers and training the fully connected layers using direct feedback alignment, we demonstrate that direct feedback alignment can obtain results competitive with backpropagation. Furthermore, we observe that using an extremely sparse feedback matrix, rather than a dense one, results in a small accuracy drop while yielding hardware advantages. All code and results are available at https://github.com/bcrafton/ssdfa. Comment: 15 pages, 8 figures
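    A minimal NumPy sketch of the idea described above: direct feedback alignment with a fixed, sparse random feedback matrix replacing the transposed forward weights in the backward pass. Layer sizes, the sparsity level, and the learning rate are illustrative placeholders, and this is not the authors' implementation (their code is at the linked repository).

    ```python
    # Sketch: direct feedback alignment (DFA) with a sparse fixed feedback matrix.
    import numpy as np

    rng = np.random.default_rng(0)

    def sparse_feedback(rows, cols, density=0.01):
        """Fixed random feedback matrix with most entries zeroed out."""
        B = rng.standard_normal((rows, cols))
        mask = rng.random((rows, cols)) < density   # keep ~1% of connections
        return B * mask

    n_in, n_hidden, n_out = 784, 256, 10            # illustrative sizes
    params = {
        "W1": rng.standard_normal((n_hidden, n_in)) * 0.01,
        "W2": rng.standard_normal((n_out, n_hidden)) * 0.01,
        # DFA replaces W2.T in the backward pass with a fixed random matrix.
        "B1": sparse_feedback(n_hidden, n_out),
    }

    def dfa_step(p, x, y, lr=1e-3):
        """One training step: forward pass, then DFA updates in place."""
        z1 = p["W1"] @ x
        h1 = np.maximum(z1, 0.0)                    # ReLU
        y_hat = p["W2"] @ h1
        e = y_hat - y                               # output error

        # Each hidden neuron receives the error through the sparse matrix B1
        # rather than through the transposed forward weights of backprop.
        d1 = (p["B1"] @ e) * (z1 > 0)

        p["W2"] -= lr * np.outer(e, h1)
        p["W1"] -= lr * np.outer(d1, x)

    # One step on a random example (stand-in for real data).
    x = rng.standard_normal(n_in)
    y = np.eye(n_out)[3]                            # one-hot target
    dfa_step(params, x, y)
    ```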

    Nonparametric Weight Initialization of Neural Networks via Integral Representation

    A new initialization method for hidden parameters in a neural network is proposed. Derived from the integral representation of the neural network, a nonparametric probability distribution of hidden parameters is introduced. In this proposal, hidden parameters are initialized by samples drawn from this distribution, and output parameters are fitted by ordinary linear regression. Numerical experiments show that backpropagation with the proposed initialization converges faster than with uniformly random initialization. It is also shown that the proposed method achieves sufficient accuracy by itself, without backpropagation, in some cases. Comment: For ICLR2014, revised into 9 pages; revised into 12 pages (with supplements)
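    A sketch of the overall recipe only: sample hidden-layer parameters from some distribution, then fit the output weights by ordinary least squares. The plain Gaussian sampler below is a placeholder, not the paper's data-dependent distribution derived from the integral representation.

    ```python
    # Sketch: sample hidden parameters, then fit output weights by least squares.
    import numpy as np

    rng = np.random.default_rng(0)

    def init_and_fit(X, y, n_hidden=200):
        n_features = X.shape[1]
        # Placeholder sampler for hidden weights and biases (not the paper's).
        W = rng.standard_normal((n_hidden, n_features))
        b = rng.standard_normal(n_hidden)
        H = np.tanh(X @ W.T + b)                    # hidden activations
        # Output parameters by ordinary linear regression.
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return W, b, beta

    def predict(X, W, b, beta):
        return np.tanh(X @ W.T + b) @ beta

    # Toy usage: fit y = sin(x) without any backpropagation.
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0])
    W, b, beta = init_and_fit(X, y)
    print("train MSE:", np.mean((predict(X, W, b, beta) - y) ** 2))
    ```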

    The Comparison of ReliefF and C4.5 for Feature Selection on Heart Disease Classification Using Backpropagation

    One of the datasets used to predict heart disease is the UCI dataset. Unfortunately, the dataset contains missing data, which dramatically affects the performance of the backpropagation classification method. One technique used to handle missing data is feature selection. This study compares the ReliefF and C4.5 algorithms for feature selection to handle missing data. The features selected by each algorithm are then used for heart disease classification with backpropagation, and the results are measured by accuracy, precision, and recall. ReliefF with backpropagation achieves an accuracy of 82.653%, a precision of 82.7%, and a recall of 82.7%. C4.5 with backpropagation achieves an accuracy of 80.61%, a precision of 80.4%, and a recall of 80.6%. Based on these results, it can be concluded that ReliefF yields better backpropagation performance than C4.5. Although C4.5 scores below ReliefF, its results are still quite satisfactory, since accuracy, precision, and recall all exceed 80%. This shows that both ReliefF and C4.5 can select relevant features from the UCI heart disease patient dataset.
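    A rough sketch of such a pipeline, assuming scikit-learn: a simplified Relief-style feature scorer (single nearest hit/miss, whereas full ReliefF uses k nearest neighbours of each class) followed by an MLP classifier trained with backpropagation. The synthetic data, the number of selected features, and the network size are placeholders, not the study's setup.

    ```python
    # Sketch: Relief-style feature selection, then backpropagation (MLP) classification.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    def relief_scores(X, y):
        """Simplified Relief scoring using one nearest hit and one nearest miss."""
        n, d = X.shape
        scores = np.zeros(d)
        for i in range(n):
            dist = np.abs(X - X[i]).sum(axis=1)
            dist[i] = np.inf                          # exclude the instance itself
            same = (y == y[i])
            hit = np.argmin(np.where(same, dist, np.inf))
            miss = np.argmin(np.where(~same, dist, np.inf))
            scores += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / n
        return scores

    # Synthetic stand-in for the preprocessed UCI heart-disease data.
    X, y = make_classification(n_samples=300, n_features=13, n_informative=6,
                               random_state=0)
    keep = np.argsort(relief_scores(X, y))[-8:]       # top-8 features (placeholder)
    X_tr, X_te, y_tr, y_te = train_test_split(X[:, keep], y, test_size=0.2,
                                              random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                        random_state=0).fit(X_tr, y_tr)
    print("accuracy:", clf.score(X_te, y_te))
    ```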

    Comparison of the Binary Logistic Regression Method and the Backpropagation Method in Determining the Best Model for Classifying Family Planning (Keluarga Berencana) Program Users

    Indonesia, one of the most densely populated countries in the world, has a high birth rate. One of the measures used by the government to curb population growth is the Family Planning (Keluarga Berencana, KB) program. In practice, however, not everyone of productive age joins the program. The methods used are binary logistic regression and backpropagation. The predictor variables studied are the husband's age, the wife's age, the age of the youngest child, the number of children, the husband's education, the wife's education, the husband's occupation, the wife's occupation, and the family's level of prosperity. The aim of the research is to compare the classification accuracy of binary logistic regression and backpropagation. The binary logistic regression results show that the variables affecting KB-user status are the age of the youngest child and the wife's education, with a classification accuracy of 66.98%, while the classification accuracy of backpropagation is 67.30%. The conclusion is that backpropagation outperforms binary logistic regression for classifying KB-user status in Semarang from March 2013 to January 2014.
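    The comparison itself can be sketched as follows, assuming scikit-learn: fit binary logistic regression and a backpropagation network (MLP) on the same nine predictors and compare held-out classification accuracy. Synthetic data stands in for the survey data, which is not available here.

    ```python
    # Sketch: compare binary logistic regression with a backpropagation (MLP) classifier.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Nine predictor variables, as in the study; data itself is synthetic.
    X, y = make_classification(n_samples=500, n_features=9, n_informative=4,
                               random_state=1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

    logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                        random_state=1).fit(X_tr, y_tr)

    print("logistic regression accuracy:", logit.score(X_te, y_te))
    print("backpropagation (MLP) accuracy:", mlp.score(X_te, y_te))
    ```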

    PERFORMANCE ENHANCEMENT OF BACKPROPAGATION ALGORITHM USING MOMENTUM AND LEARNING RATE WITH A CASE STUDY ON FINGERPRINT RECOGNITION

    Artificial Neural Network (ANN) is a branch of artificial intelligence that has been used in various applications such as pattern recognition. The advantage of an ANN as a system is its ability to imitate human thought in computational intelligence tasks such as pattern recognition. ANNs are useful for predictive modelling, error detection, and control systems built with artificial intelligence approaches and computational design. Three methods are commonly used in ANN training: the heuristic rule, the delta-delta rule, and the delta-bar-delta rule. The delta-bar-delta rule used by the backpropagation method is the best algorithm for fitting the inputs to the network [5]. By applying a learning rate [3] in the backpropagation algorithm, the learning process becomes more stable and finds the optimal delta (step size) faster by reducing the error towards the optimal solution. Shao and Zheng [4] applied momentum in the backpropagation algorithm; their results show that the error sequence decreases monotonically during training and that the algorithm is weakly convergent, with the gradient of the error sequence converging to zero as training iterations proceed. The fingerprint is a biometric identity measure based on pattern recognition that is important for accurate personal identification. Fingerprints are unchangeable over time and differ from one person to another. Conventional biometric fingerprint technology is sometimes inaccurate because the fingerprint position shifts on the scanner. This disadvantage can be minimized using an ANN with the backpropagation algorithm. Fingerprint recognition using standard backpropagation shows an average accuracy of 66.91% and an average training time of 225 seconds. The accuracy increases when momentum and a gradually adjusted learning rate are added to the backpropagation algorithm: an average accuracy of 80.9% can be achieved using the combination of momentum and learning rate, with an average training time of 144 seconds. Keywords: neural networks, fingerprint patterns, backpropagation, momentum, learning rate
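    For clarity, this is the weight-update rule the abstract refers to: gradient descent on the network weights with a learning rate and a momentum term. The sketch below shows the generic rule only; the lr and momentum values are illustrative, not those used in the study.

    ```python
    # Sketch: one backpropagation weight update with momentum and a learning rate.
    import numpy as np

    def momentum_update(w, grad, velocity, lr=0.1, momentum=0.9):
        """velocity accumulates a decaying sum of past gradients, which smooths
        the updates and speeds convergence compared with plain gradient descent."""
        velocity = momentum * velocity - lr * grad
        return w + velocity, velocity

    # Example: one step on a single weight matrix with a random gradient.
    w = np.zeros((4, 3))
    v = np.zeros_like(w)
    grad = np.random.default_rng(0).standard_normal((4, 3))
    w, v = momentum_update(w, grad, v)
    ```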

    Kajian Adaptive Neuro-Fuzzy Inference System (ANFIS) Dalam Memprediksi Penerimaan Mahasiswa Baru Pada Universitas Buana Perjuangan Karawang

    Abstract - The process of admitting new students is an annual routine activity at a university. This activity is the starting point of the search for prospective new students who meet the criteria expected by the institution. One of the institutions that holds new student admissions every year is Buana Perjuangan University, Karawang. Several studies on predicting new-student numbers have been conducted by other researchers, but the results have not been very satisfying, particularly with respect to accuracy and error. This study examines ANFIS as a solution to that accuracy problem, using two ANFIS learning techniques, namely Backpropagation and Hybrid. The application of the Adaptive Neuro-Fuzzy Inference System (ANFIS) model to predicting new students at Buana Perjuangan University, Karawang was successful. Based on the training results, the Backpropagation technique has an error rate of 0.0394 and the Hybrid technique has an error rate of 0.0662. Based on the prediction accuracy measures, the Backpropagation technique has a Mean Absolute Deviation (MAD) of 4.8 and a Mean Absolute Percentage Error (MAPE) of 0.156364623, while the Hybrid technique has a MAD of 0.5 and a MAPE of 0.09516671. It can therefore be concluded that the Hybrid technique is more accurate than the Backpropagation technique in predicting the number of new students at Buana Perjuangan University, Karawang. Keywords: ANFIS, Backpropagation, Hybrid, Prediction
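    For reference, the two error measures quoted above can be computed as follows. This is a generic sketch with illustrative numbers, not the study's data; note that MAPE is sometimes reported as a percentage rather than as a fraction.

    ```python
    # Sketch: Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE).
    import numpy as np

    def mad(actual, predicted):
        return np.mean(np.abs(actual - predicted))

    def mape(actual, predicted):
        return np.mean(np.abs((actual - predicted) / actual))  # as a fraction

    # Illustrative enrolment counts, not the study's data.
    actual = np.array([520.0, 610.0, 700.0])
    predicted = np.array([515.0, 612.0, 699.0])
    print("MAD :", mad(actual, predicted))
    print("MAPE:", mape(actual, predicted))
    ```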