
    A low variance error boosting algorithm

    This paper introduces a robust variant of AdaBoost, cw-AdaBoost, that uses weight perturbation to reduce variance error. It is particularly effective on data sets, such as microarray data, that have large numbers of features and small numbers of instances. The algorithm is compared with AdaBoost, Arcing, and MultiBoost on twelve gene expression datasets using 10-fold cross-validation, and it consistently achieves higher classification accuracy across all of these datasets. In contrast to other AdaBoost variants, the algorithm is not susceptible to problems when a zero-error base classifier is encountered.
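    The abstract does not spell out the perturbation scheme, but the idea can be illustrated with a minimal sketch: standard discrete AdaBoost whose instance-weight distribution is jittered with multiplicative noise each round (the noise parameter and the Gaussian jitter are assumptions, not the paper's actual cw-AdaBoost rule), plus a floor on the weighted error so that a zero-error base classifier does not break the vote-weight computation.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        def perturbed_adaboost(X, y, rounds=50, noise=0.1, rng=None):
            # y must take values in {-1, +1}
            X, y = np.asarray(X), np.asarray(y)
            rng = np.random.default_rng(rng)
            n = len(y)
            w = np.full(n, 1.0 / n)              # instance-weight distribution
            learners, alphas = [], []
            for _ in range(rounds):
                # Hypothetical weight perturbation: jitter the distribution
                # each round to decorrelate the base classifiers.
                pw = w * np.exp(noise * rng.standard_normal(n))
                pw /= pw.sum()
                stump = DecisionTreeClassifier(max_depth=1)
                stump.fit(X, y, sample_weight=pw)
                pred = stump.predict(X)
                err = pw[pred != y].sum()
                if err >= 0.5:
                    break
                # Floor the error so a zero-error base classifier does not
                # yield an infinite vote weight (the failure mode the
                # abstract alludes to).
                err = max(err, 1e-10)
                alpha = 0.5 * np.log((1 - err) / err)
                w *= np.exp(-alpha * y * pred)   # standard AdaBoost reweighting
                w /= w.sum()
                learners.append(stump)
                alphas.append(alpha)
            return learners, np.array(alphas)

        def predict(learners, alphas, X):
            votes = sum(a * h.predict(X) for h, a in zip(learners, alphas))
            return np.sign(votes)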

    Analog hardware for delta-backpropagation neural networks

    This is a fully parallel analog backpropagation learning processor. It comprises a plurality of programmable resistive memory elements serving as synapse connections, whose values can be weighted during learning, together with buffer amplifiers, summing circuits, and sample-and-hold circuits arranged in a plurality of neuron layers. The layers operate in accordance with a delta-backpropagation algorithm modified to control weight changes due to circuit drift.
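    As a software analogue of the circuit described above, the sketch below runs one delta-backpropagation step for a two-layer sigmoid network and clamps the magnitude of each weight change; the clamp (max_dw) is a hypothetical stand-in for the patent's drift-control modification, not its actual mechanism.

        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        def delta_backprop_step(W1, W2, x, target, lr=0.05, max_dw=0.01):
            """One delta-backpropagation step with the per-step weight
            change clamped (max_dw is a hypothetical limit standing in
            for the circuit's drift-control modification)."""
            h = sigmoid(W1 @ x)                  # hidden-layer activations
            y = sigmoid(W2 @ h)                  # output-layer activations
            d2 = (target - y) * y * (1 - y)      # output delta
            d1 = (W2.T @ d2) * h * (1 - h)       # back-propagated hidden delta
            dW2 = np.clip(lr * np.outer(d2, h), -max_dw, max_dw)
            dW1 = np.clip(lr * np.outer(d1, x), -max_dw, max_dw)
            return W1 + dW1, W2 + dW2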

    Mean Field Bayes Backpropagation: scalable training of multilayer neural networks with binary weights

    Significant success has been reported recently using deep neural networks for classification. Such large networks can be computationally intensive, even after training is over. Implementing these trained networks in hardware chips with a limited precision of synaptic weights may improve their speed and energy efficiency by several orders of magnitude, thus enabling their integration into small and low-power electronic devices. With this motivation, we develop a computationally efficient learning algorithm for multilayer neural networks with binary weights, assuming all the hidden neurons have a fan-out of one. This algorithm, derived within a Bayesian probabilistic online setting, is shown to work well for both synthetic and real-world problems, performing comparably to algorithms with real-valued weights, while retaining computational tractability.
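    The paper's mean-field Bayesian message passing is not reproduced here, but the core idea, keeping a continuous parameter per binary weight during training and deploying only its sign, can be sketched for a single unit. The parameterisation P(w = +1) = sigmoid(2*theta), the tanh output, and the squared-error objective are all assumptions made for this illustration.

        import numpy as np

        def train_step(theta, x, target, lr=0.1):
            """One online mean-field update for a single linear unit with
            binary weights w in {-1, +1}. theta parameterises P(w = +1);
            the mean weight is m = tanh(theta). A simplified stand-in for
            the paper's Bayesian message passing, not its derivation."""
            m = np.tanh(theta)                   # expected binary weight
            y = np.tanh(m @ x)                   # mean-field output
            err = target - y
            # gradient of 0.5 * err**2 w.r.t. theta, via the chain rule
            # through y = tanh(m @ x) and m = tanh(theta)
            grad = err * (1 - y ** 2) * x * (1 - m ** 2)
            return theta + lr * grad

        def deploy(theta):
            """Hardware-ready weights: one bit per synapse."""
            return np.where(theta >= 0, 1, -1)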

    An on-line training radial basis function neural network for optimum operation of the UPFC

    The concept of Flexible A.C. Transmission Systems (FACTS) technology was developed to enhance the performance of electric power networks (in both the steady state and the transient state) and to make better use of existing power transmission facilities. Continuous improvement in the power ratings and switching performance of power electronic devices, together with advances in circuit design and control techniques, is making the FACTS concept and its devices more commercially attractive. The Unified Power Flow Controller (UPFC) is one of the main FACTS devices and has wide implications for power transmission and distribution systems. The purpose of this paper is to explore the use of a Radial Basis Function Neural Network (RBFNN) to control the operation of the UPFC in order to improve its dynamic performance. The performance of the proposed controller compares favourably with the conventional PI and the off-line trained controllers. The simple structure of the proposed controller reduces the computational requirements and underlines its suitability for on-line operation. Real-time implementation of the controller is achieved using a dSPACE DS1103 control and data acquisition board. Simulation and experimental results are presented to demonstrate the robustness of the proposed controller against changes in the transmission system operating conditions.
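    A generic on-line RBFNN of the kind the paper applies to the UPFC can be sketched as Gaussian basis functions with an LMS-style update of the linear output weights; the centre placement, width, and learning gain below are placeholders, not the paper's tuned controller.

        import numpy as np

        class OnlineRBFN:
            """Gaussian radial basis function network with LMS-style
            on-line weight updates (a generic sketch; the paper's centre
            placement and learning gains are not reproduced here)."""

            def __init__(self, centers, width=1.0, lr=0.05):
                self.centers = np.asarray(centers)    # (m, d) basis centres
                self.width = width
                self.w = np.zeros(len(self.centers))  # linear output weights
                self.lr = lr

            def _phi(self, x):
                # Gaussian activation of each basis function
                d2 = np.sum((self.centers - x) ** 2, axis=1)
                return np.exp(-d2 / (2 * self.width ** 2))

            def predict(self, x):
                return self.w @ self._phi(x)

            def update(self, x, target):
                """One on-line step: adjust only the linear output weights
                toward the measured target (LMS / delta rule), which keeps
                the per-sample cost low enough for real-time operation."""
                phi = self._phi(x)
                self.w += self.lr * (target - self.predict(x)) * phi

    Restricting on-line adaptation to the linear output layer is what keeps the computational burden small; the basis centres and widths stay fixed, so each control cycle costs only one forward pass and one vector update.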