Effective Approaches to Attention-based Neural Machine Translation
An attentional mechanism has lately been used to improve neural machine
translation (NMT) by selectively focusing on parts of the source sentence
during translation. However, there has been little work exploring useful
architectures for attention-based NMT. This paper examines two simple and
effective classes of attentional mechanism: a global approach which always
attends to all source words and a local one that only looks at a subset of
source words at a time. We demonstrate the effectiveness of both approaches
over the WMT translation tasks between English and German in both directions.
With local attention, we achieve a significant gain of 5.0 BLEU points over
non-attentional systems which already incorporate known techniques such as
dropout. Our ensemble model using different attention architectures has
established a new state-of-the-art result in the WMT'15 English to German
translation task with 25.9 BLEU points, an improvement of 1.0 BLEU points over
the existing best system backed by NMT and an n-gram reranker.
Comment: 11 pages, 7 figures, EMNLP 2015 camera-ready version, more training details
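The global/local distinction above translates directly into code. The following is a minimal NumPy sketch, not the authors' implementation: it assumes a dot-product score, a fixed window half-width D, and an alignment point p_t that is simply passed in rather than predicted; the function names global_attention and local_attention are illustrative.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(h_t, src_states):
    # Score every source state against the decoder state (dot product),
    # then return a weighted average over ALL source states.
    scores = src_states @ h_t          # shape (S,)
    weights = softmax(scores)
    return weights @ src_states        # context vector, shape (d,)

def local_attention(h_t, src_states, p_t, D=2):
    # Look only at the window [p_t - D, p_t + D] around alignment point p_t,
    # with a Gaussian centred on p_t down-weighting the window edges.
    S = src_states.shape[0]
    lo, hi = max(0, p_t - D), min(S, p_t + D + 1)
    window = src_states[lo:hi]
    weights = softmax(window @ h_t)
    positions = np.arange(lo, hi)
    sigma = D / 2
    weights = weights * np.exp(-((positions - p_t) ** 2) / (2 * sigma ** 2))
    return weights @ window

# Toy usage: 6 source states, hidden size 4.
rng = np.random.default_rng(0)
src = rng.normal(size=(6, 4))
h_t = rng.normal(size=4)
print(global_attention(h_t, src))
print(local_attention(h_t, src, p_t=3))
```

The only structural difference is that the local variant restricts the softmax to a window around p_t and reweights it with a Gaussian centred there, which is what keeps its cost independent of source-sentence length.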
Novel insights into transfer processes in the reaction 16O+208Pb at sub-barrier energies
The collision of the doubly-magic nuclei 16O+208Pb is a benchmark
in nuclear reaction studies. Our new measurements of back-scattered
projectile-like fragments at sub-barrier energies show that transfer of two
protons (2p) is much more probable than α-particle transfer. The 2p transfer
probabilities are strongly enhanced compared to expectations for the
sequential transfer of two uncorrelated protons; at energies around the fusion
barrier, absolute probabilities for two-proton transfer are similar to those
for one-proton transfer. This strong enhancement indicates strong pairing
correlations in 16O and suggests the occurrence of a nuclear supercurrent of
two-proton Cooper pairs in this reaction, already at energies well below the
fusion barrier.
Comment: 5 pages, 3 figures
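A back-of-the-envelope reading of the enhancement claim, given here as an illustration rather than a formula quoted from the paper: if the two protons transferred independently and sequentially, the two-proton probability would be expected to scale roughly as the square of the one-proton probability, so observing comparable 1p and 2p probabilities near the barrier implies a large enhancement factor.

```latex
% Illustrative, assumed baseline for uncorrelated sequential transfer
% (not quoted from the paper):
\[
  P_{2p}^{\mathrm{uncorr}} \approx \bigl(P_{1p}\bigr)^{2},
  \qquad
  \frac{P_{2p}^{\mathrm{obs}}}{P_{2p}^{\mathrm{uncorr}}}
  \approx \frac{P_{1p}}{\bigl(P_{1p}\bigr)^{2}}
  = \frac{1}{P_{1p}} \gg 1
  \quad \text{when } P_{2p}^{\mathrm{obs}} \approx P_{1p},
\]
% which is large because P_{1p} is itself small at sub-barrier energies.
```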
