Improving Variational Encoder-Decoders in Dialogue Generation
Variational encoder-decoders (VEDs) have shown promising results in dialogue
generation. However, the latent-variable distributions are usually approximated
by a much simpler model than the powerful RNN structures used for encoding and
decoding, yielding the KL-vanishing problem and an inconsistent training
objective. In this paper, we separate training into two phases: the
first phase learns to autoencode discrete texts into continuous embeddings,
from which the second phase learns to generalize latent representations by
reconstructing the encoded embedding. In this case, latent variables are
sampled by transforming Gaussian noise through multi-layer perceptrons and are
trained with a separate VED model, which has the potential to realize a much
more flexible distribution. We compare our model with current popular models,
and the experiments demonstrate substantial improvement in both metric-based
and human evaluations. Comment: Accepted by AAAI201
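The key sampling step described above, drawing latent variables by pushing Gaussian noise through a multi-layer perceptron, can be sketched minimally as follows. All names, layer sizes, and the NumPy implementation are illustrative assumptions, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    """Random weights for a feed-forward net with the given layer sizes."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def sample_latent(params, noise):
    """Transform Gaussian noise into latent samples via the MLP.

    Unlike sampling from a fixed diagonal Gaussian, the learned
    transform can realize a much more flexible latent distribution.
    """
    h = noise
    for i, (W, b) in enumerate(params):
        h = h @ W + b
        if i < len(params) - 1:   # nonlinearity on hidden layers only
            h = np.tanh(h)
    return h

noise_dim, latent_dim = 16, 32            # hypothetical dimensions
params = init_mlp([noise_dim, 64, latent_dim], rng)
eps = rng.standard_normal((8, noise_dim)) # batch of Gaussian noise
z = sample_latent(params, eps)            # flexible latent samples
print(z.shape)                            # (8, 32)
```

In the paper's two-phase setup, such a transform would be trained jointly with the second-phase VED model rather than held at random initialization as in this sketch.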
Exact Results of Strongly Correlated Systems at Finite Temperature
Some rigorous conclusions about the Hubbard model, the Kondo lattice model, and the
periodic Anderson model at finite temperature are obtained using the
fluctuation-dissipation theorem and the particle-hole transformation. The main
conclusion states that for the three models, the expectation value of will be of order at any finite
temperature. Comment: 8 pages, no figures, LaTeX, corrected some typos, to appear in Phys. Lett.
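For reference, the fluctuation-dissipation theorem invoked in this abstract is, in one common many-body convention (with ħ = k_B = 1; the generic operators A, B here are placeholders, not the paper's notation):

```latex
% Dynamical correlation function of operators A and B:
%   S_{AB}(\omega) = \int dt\, e^{i\omega t}\, \langle A(t)\, B(0) \rangle
% Fluctuation-dissipation theorem at temperature T = 1/\beta,
% relating S_{AB} to the dissipative response \chi''_{AB}:
S_{AB}(\omega) = 2\,\bigl[\,1 + n_B(\omega)\,\bigr]\,\chi''_{AB}(\omega),
\qquad n_B(\omega) = \frac{1}{e^{\beta\omega} - 1}.
```

This relation ties equilibrium fluctuations at any finite temperature to the imaginary part of the response function, which is the kind of constraint such rigorous finite-temperature bounds rest on.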