    Pattern Classification using Artificial Neural Networks

    Classification is a data mining (machine learning) technique used to predict group membership for data instances. Pattern classification involves building a function that maps the input feature space to an output space of two or more classes. Neural networks (NN) are an effective tool in the field of pattern classification, using training and testing data to build a model. However, the success of the network is highly dependent on the performance of the training process and hence on the training algorithm. Many training algorithms have been proposed to improve the performance of neural networks. In this project, we make a comparative study of training a feedforward neural network with three algorithms: the Backpropagation Algorithm, the Modified Backpropagation Algorithm, and the Optical Backpropagation Algorithm. These algorithms differ only in their error functions. We train the neural networks using these algorithms on 75 instances from the Iris dataset (taken from the UCI repository and then normalised), 25 from each class. The total number of epochs required to reach the target degree of accuracy is referred to as the convergence rate. The basic criteria of the comparison are the convergence rate and the classification accuracy. To check the efficiency of the three training algorithms, graphs of Mean Square Error (MSE) versus the number of epochs are plotted. The training process continues until the MSE falls to 0.01. The effects of the momentum and learning rate on the performance of the algorithms are also observed. The comparison is then extended to compare the multilayer feedforward network with a probabilistic neural network.
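
    As an orientation for the setup described above, a minimal sketch of standard backpropagation with momentum, trained on 25 normalised Iris instances per class until the MSE reaches 0.01, could look like the following. The hidden-layer size, learning rate, momentum value, and the use of scikit-learn's Iris loader are illustrative assumptions rather than the project's actual settings; the modified and optical variants, which change only the error function, are not shown.

        # Minimal sketch: plain backpropagation with momentum on a 4-8-3 sigmoid network.
        # Hyperparameters and the scikit-learn Iris loader are assumptions for illustration.
        import numpy as np
        from sklearn.datasets import load_iris

        rng = np.random.default_rng(0)
        X, y = load_iris(return_X_y=True)
        X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))     # min-max normalisation
        idx = np.concatenate([np.where(y == c)[0][:25] for c in range(3)])
        X, y = X[idx], y[idx]                                         # 75 instances, 25 per class
        T = np.eye(3)[y]                                              # one-hot targets

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
        W2 = rng.normal(scale=0.5, size=(8, 3)); b2 = np.zeros(3)
        vW1 = np.zeros_like(W1); vW2 = np.zeros_like(W2)
        lr, momentum = 0.1, 0.9

        for epoch in range(20000):
            H = sigmoid(X @ W1 + b1)                                  # forward pass
            O = sigmoid(H @ W2 + b2)
            mse = np.mean((T - O) ** 2)
            if mse <= 0.01:                                           # stopping criterion from the abstract
                break
            dO = (O - T) * O * (1 - O)                                # backpropagated error terms
            dH = (dO @ W2.T) * H * (1 - H)
            vW2 = momentum * vW2 - lr * H.T @ dO / len(X)             # momentum-smoothed updates
            vW1 = momentum * vW1 - lr * X.T @ dH / len(X)
            W2 += vW2; W1 += vW1
            b2 -= lr * dO.mean(axis=0); b1 -= lr * dH.mean(axis=0)

        print(f"stopped after {epoch} epochs, MSE = {mse:.4f}")
        print("training accuracy:", np.mean(O.argmax(axis=1) == y))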

    Analog hardware for delta-backpropagation neural networks

    This is a fully parallel analog backpropagation learning processor. It comprises a plurality of programmable resistive memory elements serving as synapse connections, whose values can be weighted during learning, together with buffer amplifiers, summing circuits, and sample-and-hold circuits arranged in a plurality of neuron layers, in accordance with delta-backpropagation algorithms modified so as to control weight changes due to circuit drift.
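
    Purely as a software analogue, and not a description of the patented circuit itself, the "controlled weight change" idea can be read as a delta-rule update whose per-weight step is bounded. The toy linear unit, learning rate, and clipping threshold below are assumptions for illustration only.

        # Illustrative sketch: a delta-rule update with each weight change clipped to a
        # maximum magnitude, loosely mimicking control of weight changes (e.g. under drift).
        # All values here are assumed; this is not the hardware described in the abstract.
        import numpy as np

        def delta_rule_step(W, x, target, lr=0.05, max_step=0.01):
            """One clipped delta-rule update for a linear unit y = W @ x."""
            y = W @ x
            error = target - y
            dW = lr * np.outer(error, x)             # ordinary delta-rule step
            dW = np.clip(dW, -max_step, max_step)    # bound each individual weight change
            return W + dW

        W = np.zeros((1, 3))
        x = np.array([0.2, 0.5, 0.1])
        for _ in range(500):
            W = delta_rule_step(W, x, target=np.array([1.0]))
        print(W @ x)                                 # output approaches the target of 1.0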

    A Circuit-Based Neural Network with Hybrid Learning of Backpropagation and Random Weight Change Algorithms.

    A hybrid learning method combining software-based backpropagation (BP) learning and hardware-based random weight change (RWC) learning is proposed for the development of circuit-based neural networks. Backpropagation is known as one of the most efficient learning algorithms; its weak point is that its hardware implementation is extremely difficult. The RWC algorithm, which is very easy to implement in hardware circuits, takes too many iterations to learn. The proposed learning algorithm is a hybrid of the two. The main learning is first performed with a software version of the BP algorithm, and the learned weights are then transplanted onto a hardware version of the neural circuit. At the time of weight transplantation, a significant amount of output error occurs due to the characteristic differences between the software and the hardware. In the proposed method, this error is reduced via complementary learning with the RWC algorithm, which is implemented in simple hardware. The usefulness of the proposed hybrid learning system is verified via simulations on several classical learning problems.
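
    A rough software sketch of the two-stage idea follows: backpropagation learning in "software", a noisy copy of the weights standing in for the software-to-hardware transplantation mismatch, and a simplified accept-if-better random weight change loop that reduces the residual error. The XOR task, the additive-noise mismatch model, and this particular RWC variant are assumptions for illustration, not the authors' setup.

        # Hybrid sketch: BP learning, simulated weight transplantation, then RWC refinement.
        # Task, noise model and the accept-if-better RWC variant are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(1)
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
        T = np.array([[0], [1], [1], [0]], float)                     # toy XOR task

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        def mse(params):
            W1, b1, W2, b2 = params
            H = sigmoid(X @ W1 + b1)
            O = sigmoid(H @ W2 + b2)
            return np.mean((O - T) ** 2)

        # Stage 1: software backpropagation learning.
        W1 = rng.normal(scale=1.0, size=(2, 8)); b1 = np.zeros(8)
        W2 = rng.normal(scale=1.0, size=(8, 1)); b2 = np.zeros(1)
        lr = 1.0
        for _ in range(20000):
            H = sigmoid(X @ W1 + b1); O = sigmoid(H @ W2 + b2)
            dO = (O - T) * O * (1 - O)
            dH = (dO @ W2.T) * H * (1 - H)
            W2 -= lr * H.T @ dO; b2 -= lr * dO.sum(axis=0)
            W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

        # "Transplantation": hardware mismatch modelled as additive noise on every weight.
        hw = [p + rng.normal(scale=0.1, size=p.shape) for p in (W1, b1, W2, b2)]
        print("error after transplantation:", mse(hw))

        # Stage 2: complementary RWC learning on the transplanted weights.
        err = mse(hw)
        for _ in range(5000):
            trial = [p + rng.normal(scale=0.01, size=p.shape) for p in hw]   # random change
            e = mse(trial)
            if e < err:                              # keep the change only if the error drops
                hw, err = trial, e
        print("error after RWC refinement:", err)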

    Modeling Financial Time Series with Artificial Neural Networks

    Financial time series convey the decisions and actions of a population of human actors over time. Econometric and regressive models have been developed in the past decades for analyzing these time series. More recently, biologically inspired artificial neural network models have been shown to overcome some of the main challenges of traditional techniques by better exploiting the non-linear, non-stationary, and oscillatory nature of noisy, chaotic human interactions. This review paper explores the options, benefits, and weaknesses of the various forms of artificial neural networks as compared with regression techniques in the field of financial time series analysis. This work was supported by CELEST, a National Science Foundation Science of Learning Center (SBE-0354378), and the SyNAPSE program of the Defense Advanced Research Projects Agency (HR001109-03-0001).

    Supervised Learning in Multilayer Spiking Neural Networks

    The current article introduces a supervised learning algorithm for multilayer spiking neural networks. The algorithm presented here overcomes some limitations of existing learning algorithms: it can be applied to neurons firing multiple spikes, and it can in principle be applied to any linearisable neuron model. The algorithm is applied successfully to various benchmarks, such as the XOR problem and the Iris dataset, as well as to complex classification problems. The simulations also show the flexibility of this supervised learning algorithm, which permits different encodings of the spike timing patterns, including precise spike train encoding. Comment: 38 pages, 4 figures.
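
    As an illustration of one possible spike-timing encoding (not necessarily the one used in the paper), the sketch below converts normalised real-valued features, such as the Iris attributes, into single spike times using latency encoding, where larger values fire earlier. The coding window and the use of scikit-learn's Iris loader are assumptions.

        # Illustrative latency (time-to-first-spike) encoding of real-valued features.
        # The 10 ms window and the scikit-learn Iris loader are assumed for the example.
        import numpy as np
        from sklearn.datasets import load_iris

        X, y = load_iris(return_X_y=True)
        X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))     # features scaled to [0, 1]

        def latency_encode(x, t_window=10.0):
            """Map each feature to a spike time in ms: larger values spike earlier."""
            return (1.0 - x) * t_window

        spike_times = latency_encode(X)        # shape (150, 4): one spike per input neuron
        print(spike_times[0])                  # spike times (ms) for the first Iris sample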