98 research outputs found
Anomaly Detection on Graph Time Series
In this paper, we use a variational recurrent neural network to investigate
the anomaly detection problem on graph time series. Temporal correlation is
modeled by combining a recurrent neural network (RNN) with variational
inference (VI), while spatial information is captured by a graph
convolutional network. To incorporate external factors, we use a feature
extractor to augment the transition of the latent variables, allowing the
model to learn the influence of those factors. Because the objective is an
accumulated evidence lower bound (ELBO), the model extends naturally to an
online method. An experimental study on traffic flow data demonstrates the
detection capability of the proposed method.
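As a rough illustration of the mechanism described above, here is a minimal
sketch of one time step of such a model: a one-layer graph convolution for
the spatial part, a GRU for the temporal part, and a per-step ELBO that can
be accumulated online. All names, dimensions, and design choices are
assumptions for illustration, not the authors' implementation.

# Hedged sketch of one variational-RNN step on a graph; illustrative only.
import torch
from torch import nn
from torch.distributions import Normal, kl_divergence

class GraphVRNNStep(nn.Module):
    def __init__(self, x_dim, h_dim, z_dim):
        super().__init__()
        self.gcn = nn.Linear(x_dim, h_dim)              # W of A @ X @ W
        self.rnn = nn.GRUCell(h_dim + z_dim, h_dim)     # temporal transition
        self.prior = nn.Linear(h_dim, 2 * z_dim)        # p(z_t | h_{t-1})
        self.post = nn.Linear(2 * h_dim, 2 * z_dim)     # q(z_t | x_t, h_{t-1})
        self.dec = nn.Linear(h_dim + z_dim, 2 * x_dim)  # p(x_t | z_t, h_{t-1})

    def forward(self, x_t, adj, h):
        # Spatial encoding: one-layer graph convolution, relu(A @ X @ W).
        s = torch.relu(self.gcn(adj @ x_t))
        p_mu, p_logvar = self.prior(h).chunk(2, dim=-1)
        q_mu, q_logvar = self.post(torch.cat([s, h], -1)).chunk(2, dim=-1)
        p = Normal(p_mu, (0.5 * p_logvar).exp())
        q = Normal(q_mu, (0.5 * q_logvar).exp())
        z = q.rsample()                                 # reparameterized sample
        d_mu, d_logvar = self.dec(torch.cat([h, z], -1)).chunk(2, dim=-1)
        recon = Normal(d_mu, (0.5 * d_logvar).exp()).log_prob(x_t).sum()
        elbo_t = recon - kl_divergence(q, p).sum()      # per-step ELBO
        h = self.rnn(torch.cat([s, z], -1), h)          # update hidden state
        return elbo_t, h

Accumulating elbo_t over time gives the training objective, and a time step
whose ELBO drops well below its recent history is a natural anomaly
candidate; both uses are in line with the accumulated-ELBO and online claims
in the abstract.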
Bridging the Gap between Probabilistic and Deterministic Models: A Simulation Study on a Variational Bayes Predictive Coding Recurrent Neural Network Model
This paper proposes a novel variational Bayes predictive coding RNN model,
which learns to generate fluctuating temporal patterns from exemplars. The
model is trained to maximize a lower bound consisting of a weighted sum of
regularization and reconstruction error terms. We examined how this weighting
affects the development of different types of information processing while
learning fluctuating temporal patterns. Simulation results show that strong
weighting of the reconstruction term causes the development of deterministic
chaos that imitates the randomness observed in the target sequences, while
strong weighting of the regularization term causes the development of
stochastic dynamics that imitate the probabilistic processes observed in the
targets. Moreover, the results indicate that the most generalized learning
emerges between these two extremes. The paper concludes with implications for
the underlying neuronal mechanisms of autism spectrum disorder and of free
action.
Comment: This paper was accepted at the 24th International Conference on
Neural Information Processing (ICONIP 2017). The previous arXiv submission is
replaced by this version because one of the equations contained an error.
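One hedged way to write the weighted bound described above, with a single
trade-off weight w (the notation is assumed for illustration, not taken
verbatim from the paper):

\mathcal{L}(w) = \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big]
                 - w \, D_{\mathrm{KL}}\big(q_\phi(z \mid x) \,\|\, p(z)\big)

Read this way, a small w emphasizes the reconstruction term (the regime where
deterministic chaos develops), while a large w emphasizes the regularization
term (the regime where stochastic dynamics develop).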
A Stochastic Decoder for Neural Machine Translation
The process of translation is ambiguous, in that there are typically many
valid translations for a given sentence. This gives rise to significant
variation in parallel corpora; however, most current models of machine
translation do not account for this variation, instead treating the problem
as a deterministic process. To this end, we present a deep generative model
of machine translation which incorporates a chain of latent variables in
order to account for local lexical and syntactic variation in parallel
corpora. We provide an in-depth analysis of the pitfalls encountered in
variational inference for training deep generative models. Experiments on
several different language pairs demonstrate that the model consistently
improves over strong baselines.
Comment: Accepted at ACL 2018
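As a hedged reading of the "chain of latent variables" (the exact
conditioning structure is an assumption, not the paper's formulation), one
latent variable z_j per target position can evolve as a first-order chain:

p_\theta(y \mid x) = \int \prod_{j=1}^{|y|}
    p_\theta(y_j \mid y_{<j}, z_j, x) \,
    p_\theta(z_j \mid z_{j-1}, y_{<j}, x) \; \mathrm{d}z_{1:|y|}

Each z_j can then absorb local lexical and syntactic variation around
position j; the integral is intractable, so training relies on amortized
variational inference, which is where the pitfalls analyzed in the paper
arise.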
- …