From features to speaker vectors by means of restricted Boltzmann machine adaptation
Restricted Boltzmann Machines (RBMs) have shown success in different stages of speaker recognition systems. In this paper, we propose a novel framework to produce a vector-based representation for each speaker, which will be referred to as an RBM-vector. This new approach maps the speaker spectral features to a single fixed-dimensional vector carrying speaker-specific information. In this work, a global model, referred to as the Universal RBM (URBM), is trained taking advantage of the unsupervised learning capabilities of RBMs. This URBM is then adapted to the data of each speaker in the development, enrolment and evaluation datasets. The connection weights of the adapted RBMs are concatenated and subjected to a whitening with dimension reduction stage to build the speaker vectors. The evaluation is performed on the core test condition of the NIST SRE 2006 database, and it is shown that RBM-vectors achieve a 15% relative improvement in terms of EER compared to i-vectors using cosine scoring. Score fusion with i-vectors attains more than a 24% relative improvement. This fusion result is of particular interest because both vectors are produced in an unsupervised fashion and can be used instead of the i-vector/PLDA approach when no data labels are available. Results obtained with the RBM-vector/PLDA framework are comparable to those of i-vector/PLDA, and their score fusion achieves a 14% relative improvement over i-vector/PLDA.
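The adapt-then-vectorize pipeline described in the abstract can be sketched roughly as follows. This is a minimal toy illustration, not the paper's implementation: the data, layer sizes, the single CD-1 adaptation step, and the PCA-whitening stage are all assumptions made for the sketch (the paper operates on spectral features from the NIST SRE data, which are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_hid, n_speakers = 20, 8, 12   # toy sizes, not the paper's

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, v0, lr=0.05):
    """One mean-field CD-1 step adapting weights W to a batch of frames v0."""
    h0 = sigmoid(v0 @ W)                 # hidden activations given data
    v1 = sigmoid(h0 @ W.T)               # reconstruction of the visible layer
    h1 = sigmoid(v1 @ W)                 # hidden activations given reconstruction
    grad = (v0.T @ h0 - v1.T @ h1) / len(v0)
    return W + lr * grad

# Stand-in for a "Universal RBM" pre-trained without labels
W_urbm = rng.normal(scale=0.1, size=(n_vis, n_hid))

# Adapt the URBM to each speaker's frames, then flatten the adapted weights
raw_vectors = []
for _ in range(n_speakers):
    frames = rng.random((50, n_vis))     # stand-in for speaker spectral features
    W_spk = cd1_update(W_urbm, frames)
    raw_vectors.append(W_spk.ravel())
X = np.stack(raw_vectors)                # (n_speakers, n_vis * n_hid)

# Whitening with dimension reduction (PCA-whitening assumed here)
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5                                    # assumed target dimensionality
rbm_vectors = Xc @ Vt[:k].T / S[:k] * np.sqrt(n_speakers - 1)
print(rbm_vectors.shape)                 # (12, 5)
```

After whitening, the resulting speaker vectors have identity covariance, which is what makes them directly usable with cosine scoring or PLDA.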
Training Restricted Boltzmann Machines on Word Observations
The restricted Boltzmann machine (RBM) is a flexible tool for modeling
complex data; however, there have been significant computational difficulties in
using RBMs to model high-dimensional multinomial observations. In natural
language processing applications, words are naturally modeled by K-ary discrete
distributions, where K is determined by the vocabulary size and can easily be
in the hundreds of thousands. The conventional approach to training RBMs on
word observations is limited because it requires sampling the states of K-way
softmax visible units during block Gibbs updates, an operation that takes time
linear in K. In this work, we address this issue by employing a more general
class of Markov chain Monte Carlo operators on the visible units, yielding
updates with computational complexity independent of K. We demonstrate the
success of our approach by training RBMs on hundreds of millions of word
n-grams using larger vocabularies than previously feasible and using the
learned features to improve performance on chunking and sentiment
classification tasks, achieving state-of-the-art results on the latter.
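A minimal sketch of the core idea, assuming a Metropolis-Hastings operator with a fixed proposal distribution (one way to make the per-update cost independent of K; the exact operator and proposal in the paper may differ). The acceptance ratio only needs the unnormalized scores of the current and proposed words, so no O(K) softmax normalization is required:

```python
import numpy as np

rng = np.random.default_rng(1)
K, n_hid = 1000, 16                    # vocabulary size, hidden units (toy sizes)

# Softmax visible unit: unnormalized log-probability of word k given hidden
# state h is score(k) = b[k] + W[k] @ h
W = rng.normal(scale=0.1, size=(K, n_hid))
b = rng.normal(scale=0.1, size=K)
h = (rng.random(n_hid) < 0.5).astype(float)

# Fixed proposal q (e.g. a unigram distribution); with a precomputed alias
# table each draw costs O(1), independent of K
q = rng.dirichlet(np.ones(K))

def mh_step(k_cur):
    """One Metropolis-Hastings update of the visible word index.

    Only two unnormalized scores are evaluated, so the cost of an update
    does not grow with K (exact softmax sampling would be O(K))."""
    k_prop = rng.choice(K, p=q)        # stand-in for an O(1) alias-table draw
    log_ratio = ((b[k_prop] + W[k_prop] @ h) - (b[k_cur] + W[k_cur] @ h)
                 + np.log(q[k_cur]) - np.log(q[k_prop]))
    if np.log(rng.random()) < log_ratio:
        return k_prop
    return k_cur

k = 0
for _ in range(100):                   # a few MH sweeps in place of exact sampling
    k = mh_step(k)
print(0 <= k < K)                      # True
```

Running a handful of these MH steps per visible unit inside block Gibbs replaces the exact K-way softmax draw, which is what allows vocabularies in the hundreds of thousands.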
Energy Disaggregation for Real-Time Building Flexibility Detection
Energy is a limited resource which has to be managed wisely, taking into
account both supply-demand matching and capacity constraints in the
distribution grid. One aspect of the smart energy management at the building
level is the problem of real-time detection of the available flexible demand.
In this paper, we propose the use of energy disaggregation techniques
to perform this task. Firstly, we investigate the use of existing
classification methods to perform energy disaggregation. A comparison is
performed between four classifiers, namely Naive Bayes, k-Nearest Neighbors,
Support Vector Machine and AdaBoost. Secondly, we propose the use of a
Restricted Boltzmann Machine to automatically perform feature extraction. The extracted
features are then used as inputs to the four classifiers and consequently shown
to improve their accuracy. The efficiency of our approach is demonstrated on a
real database consisting of detailed appliance-level measurements with high
temporal resolution, which has been used for energy disaggregation in previous
studies, namely the REDD. The results show robustness and good generalization
capabilities to newly presented buildings with at least 96% accuracy.
Comment: To appear in IEEE PES General Meeting, 2016, Boston, US
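As a rough illustration of the RBM-as-feature-extractor idea, one can chain scikit-learn's BernoulliRBM with one of the four classifiers compared above (k-NN here). The data below are a synthetic stand-in: the REDD measurements, the binarization scheme, and all hyperparameters are assumptions for the sketch, not the paper's setup.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Synthetic stand-in for appliance-level readings scaled to [0, 1]
n, d = 400, 30
X = rng.random((n, d))
y = (X[:, :5].sum(axis=1) > 2.5).astype(int)   # hypothetical on/off label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Unsupervised RBM feature extraction feeding a supervised classifier
model = Pipeline([
    ("rbm", BernoulliRBM(n_components=16, learning_rate=0.05,
                         n_iter=20, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
])
model.fit(X_tr, y_tr)
acc = model.score(X_te, y_te)
print(0.0 <= acc <= 1.0)                        # True
```

The same pipeline shape applies to any of the four classifiers: the RBM is fit without labels, and its hidden-unit activations replace the raw measurements as classifier inputs.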
Discriminative conditional restricted Boltzmann machine for discrete choice and latent variable modelling
Conventional methods of estimating latent behaviour generally rely on
attitudinal questions, which are subjective and may not always be available.
We hypothesize that latent variables can instead be estimated with undirected
graphical models, for instance non-parametric artificial neural
networks. In this study, we explore the use of
generative non-parametric modelling methods to estimate latent variables from
prior choice distribution without the conventional use of measurement
indicators. A restricted Boltzmann machine is used to represent latent
behaviour factors by analyzing the relationship information between the
observed choices and explanatory variables. The algorithm is adapted for latent
behaviour analysis in a discrete choice scenario, and we use a graphical
approach to evaluate and interpret the semantic meaning of the estimated parameter vector
values. We illustrate our methodology on a financial instrument choice dataset
and perform statistical analysis on parameter sensitivity and stability.
Through non-parametric statistical tests, our findings show that machine
learning methods can extract useful information about the behaviour of latent
constructs, and that these constructs have a strong and significant influence
on the choice process. Furthermore, our modelling framework shows robustness
to input variability through sampling and validation.
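A toy sketch of the basic ingredient: an RBM trained with CD-1 on a joint visible layer of one-hot choices and binarized explanatory variables, whose weight columns can then be inspected as latent factors. The dataset, sizes, and training schedule are invented for illustration and are not the paper's financial-instrument data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_alt, n_expl, n_hid = 4, 6, 3    # alternatives, explanatory vars, latent factors

# Hypothetical binary dataset: one-hot choice plus binarized explanatory vars
n = 500
choices = rng.integers(0, n_alt, size=n)
V = np.zeros((n, n_alt + n_expl))
V[np.arange(n), choices] = 1.0
V[:, n_alt:] = (rng.random((n, n_expl)) < 0.5)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# CD-1 training of an RBM over the joint (choice, explanatory) layer
W = rng.normal(scale=0.01, size=(n_alt + n_expl, n_hid))
for _ in range(50):
    h0 = sigmoid(V @ W)               # hidden activations given data
    v1 = sigmoid(h0 @ W.T)            # mean-field reconstruction
    h1 = sigmoid(v1 @ W)
    W += 0.05 * (V.T @ h0 - v1.T @ h1) / n

# Each hidden unit's weight column links choice alternatives to explanatory
# variables; inspecting these columns is the "graphical" reading of a factor
latent_factor = W[:, 0]
print(latent_factor.shape)            # (10,)
```

The first `n_alt` entries of each column tie a latent factor to the choice alternatives and the remaining entries to the explanatory variables, which is what enables the parameter-vector interpretation the abstract describes.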