56,586 research outputs found
Contrastive Hebbian Learning with Random Feedback Weights
Neural networks are commonly trained to make predictions through learning algorithms. Contrastive Hebbian learning, which is a powerful rule inspired by gradient backpropagation, is based on Hebb's rule and the contrastive divergence algorithm. It operates in two phases: the forward (or free) phase, where the data are fed to the network, and a backward (or clamped) phase, where the target signals are clamped to the output layer of the network and the feedback signals are transformed through the transposed synaptic weight matrices. This implies symmetry at the synaptic level, for which there is no evidence in the brain. In this work, we propose a new variant of the algorithm, called random contrastive Hebbian learning, which does not rely on any synaptic weight symmetry. Instead, it uses random matrices to transform the feedback signals during the clamped phase, and the neural dynamics are described by first-order non-linear differential equations. The algorithm is experimentally verified by solving a Boolean logic task, classification tasks (handwritten digits and letters), and an autoencoding task. This article also shows how the parameters affect learning, especially the random matrices. We use pseudospectra analysis to investigate further how the random matrices impact the learning process. Finally, we discuss the biological plausibility of the proposed algorithm, and how it can give rise to better computational models for learning.
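The core idea described above can be illustrated with a minimal sketch: a single weight update in which the free and clamped phases are contrasted, and the feedback in the clamped phase is routed through a fixed random matrix `B` instead of the transposed forward weights. All sizes, the gain `gamma`, and the learning rate `eta` are hypothetical choices for illustration; the paper's actual model uses first-order non-linear neural dynamics rather than this static approximation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny network: 2 inputs -> 4 hidden -> 1 output (hypothetical sizes).
W1 = rng.normal(0.0, 0.5, (4, 2))
W2 = rng.normal(0.0, 0.5, (1, 4))
# Fixed random feedback matrix replacing W2.T -- the key departure
# from standard contrastive Hebbian learning.
B = rng.normal(0.0, 0.5, (4, 1))

gamma = 0.1   # feedback gain (assumed)
eta = 0.05    # learning rate (assumed)

def free_phase(x):
    """Forward (free) phase: data drive the network, no target."""
    h = np.tanh(W1 @ x)
    y = np.tanh(W2 @ h)
    return h, y

def clamped_phase(x, t):
    """Clamped phase: target fixed at the output, fed back through random B."""
    h = np.tanh(W1 @ x + gamma * (B @ t))
    return h, t

x = np.array([1.0, 0.0])  # one Boolean-task input pattern
t = np.array([1.0])       # its clamped target

h_free, y_free = free_phase(x)
h_cl, y_cl = clamped_phase(x, t)

# Contrastive Hebbian update: clamped correlations minus free correlations.
W2 += eta * (np.outer(y_cl, h_cl) - np.outer(y_free, h_free))
W1 += eta * (np.outer(h_cl, x) - np.outer(h_free, x))
```

Repeating such updates over all patterns of a Boolean task is how the rule would be trained in practice; only the random matrix `B`, never `W2.T`, carries the error information backward.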
Customer profiling using classification approach for bank telemarketing
Telemarketing is a type of direct marketing in which a salesperson contacts customers to sell products or services over the phone. The database of prospective customers comes from a direct marketing database. It is important for the company to predict the set of customers with the highest probability of accepting an offer, based on their personal characteristics or behaviour during shopping. Recently, companies have started to resort to data mining approaches for customer profiling. This project focuses on helping banks increase the accuracy of their customer profiling through classification, as well as identifying a group of customers who have a high probability of subscribing to a long-term deposit. In the experiments, three classification algorithms are used: Naïve Bayes, Random Forest, and Decision Tree. The experiments measured accuracy, precision, and recall, and showed that classification is useful for predicting customer profiles and increasing telemarketing sales.
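The evaluation metrics named in the abstract can be computed directly from a confusion matrix. Below is a minimal sketch with invented labels (1 = subscribed to a long-term deposit, 0 = did not); the actual experiments would derive `y_pred` from a trained Naïve Bayes, Random Forest, or Decision Tree classifier.

```python
# Hypothetical ground truth and classifier predictions.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Confusion-matrix counts.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

accuracy = (tp + tn) / len(y_true)   # fraction of correct predictions
precision = tp / (tp + fp)           # how many predicted subscribers really subscribe
recall = tp / (tp + fn)              # how many real subscribers were found
```

High precision matters most here: calling customers who will not subscribe wastes telemarketing effort.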
Corporation robots
Nowadays, various robots are built to perform many tasks, and having multiple robots work together on a single task is becoming important. One of the key elements of multi-robot cooperation is that a robot must be able to follow another robot. This project is mainly concerned with the design and construction of robots that can follow a line. It focuses on building leader and follower line-following robots. Both robots follow the line and carry a load. A single robot is limited in its load-handling capacity: it cannot carry loads that are too heavy or too long. An easier way to overcome this limitation is to have a group of mobile robots working together to accomplish an aim that no single robot can achieve alone.
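The line-following behaviour each robot needs can be reduced to a small decision table over its line sensors. The sketch below assumes a hypothetical two-sensor layout (one IR sensor on each side of the line) and invented command names; a real build would map these commands to motor speeds.

```python
def steer(left_on_line: bool, right_on_line: bool) -> str:
    """Map two line-sensor readings to a steering command (hypothetical API)."""
    if left_on_line and right_on_line:
        return "forward"      # centered on the line
    if left_on_line:
        return "turn_left"    # line drifting to the left: steer left to re-center
    if right_on_line:
        return "turn_right"   # line drifting to the right: steer right
    return "search"           # line lost: rotate until a sensor sees it again
```

The follower robot would run the same logic but track a marker on the leader instead of (or in addition to) the line, so the pair stays in formation while sharing the load.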
An ultra-fast method for gain and noise prediction of Raman amplifiers
A machine learning method for predicting the gain and noise spectra of Raman amplifiers is presented: it guarantees high accuracy (RMSE < 0.4 dB) and low computational complexity, making it suitable for real-time implementation in future optical network controllers.
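The accuracy criterion quoted above (RMSE < 0.4 dB) can be checked with a few lines of code. The gain values below are invented for illustration; in the paper they would be measured and model-predicted Raman gain spectra sampled across wavelength.

```python
import math

# Hypothetical gain spectra in dB at a few wavelength points.
measured  = [10.2, 11.1, 11.8, 12.0, 11.5]
predicted = [10.0, 11.3, 11.7, 12.2, 11.4]

# Root-mean-square error between prediction and measurement, in dB.
rmse = math.sqrt(
    sum((m - p) ** 2 for m, p in zip(measured, predicted)) / len(measured)
)
# The paper's accuracy criterion is rmse < 0.4 dB.
```

Because the metric is in dB, an RMSE below 0.4 dB means the predicted gain is within roughly 10% in linear power terms at every point, which is what makes the model usable by a real-time network controller.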
Harnessing machine learning for fiber-induced nonlinearity mitigation in long-haul coherent optical OFDM
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

Coherent optical orthogonal frequency division multiplexing (CO-OFDM) has attracted considerable interest in optical fiber communications due to its simplified digital signal processing (DSP) units, high spectral efficiency, flexibility, and tolerance to linear impairments. However, CO-OFDM's high peak-to-average power ratio makes it highly vulnerable to fiber-induced non-linearities. DSP-based machine learning has been considered a promising approach for fiber non-linearity compensation without sacrificing computational complexity. In this paper, we review the existing machine learning approaches for CO-OFDM in a common framework and survey progress in this area, with a focus on practical aspects and comparison with benchmark DSP solutions.

Peer reviewed.
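The peak-to-average power ratio (PAPR) problem mentioned in the abstract comes from many subcarriers adding coherently. A minimal sketch, using an inverse DFT over identical QPSK symbols (the worst case, where all subcarriers align in phase):

```python
import cmath
import math

# Hypothetical OFDM symbol: N subcarriers, all carrying the same QPSK symbol.
N = 16
symbols = [cmath.exp(1j * math.pi / 4)] * N

# Time-domain samples via an inverse DFT (what the OFDM transmitter computes).
time_samples = [
    sum(symbols[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
    for n in range(N)
]

# PAPR in dB: peak instantaneous power over average power.
powers = [abs(s) ** 2 for s in time_samples]
papr_db = 10 * math.log10(max(powers) / (sum(powers) / len(powers)))
```

With all subcarriers in phase, the energy collapses into a single time sample, so PAPR reaches its maximum of 10·log10(N) ≈ 12 dB for N = 16. Such peaks drive the fiber into its non-linear regime, which is what the machine learning compensators reviewed in the paper try to undo.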