While end-to-end neural machine translation (NMT) has made remarkable
progress recently, it still suffers from the data scarcity problem for
low-resource language pairs and domains. In this paper, we propose a method for
zero-resource NMT by assuming that parallel sentences have close probabilities
of generating a sentence in a third language. Based on this assumption, our
method can train a source-to-target NMT model ("student") without any
source-target parallel corpus, guided on a source-pivot parallel corpus by an
existing pivot-to-target NMT model ("teacher"). Experimental results show that
the proposed method significantly improves over a baseline pivot-based model by
+3.0 BLEU points across various language pairs.
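
As an illustrative sketch in our own notation (the corpus symbol $D_{x,z}$ and the parameters $\theta$, $\theta_{\text{teacher}}$ are ours, not taken from the abstract): the assumption says that for a source-pivot sentence pair $(x, z)$, $p(y \mid x) \approx p(y \mid z)$ for any target sentence $y$. The student can therefore be trained by minimizing, over the source-pivot corpus, the divergence between the teacher's target distribution given $z$ and the student's distribution given $x$:

\[
\mathcal{J}(\theta) = \sum_{(x,z) \in D_{x,z}} \mathrm{KL}\big( p(y \mid z; \theta_{\text{teacher}}) \,\big\|\, p(y \mid x; \theta) \big),
\]

where $\theta_{\text{teacher}}$ is held fixed. In practice the sum over target sentences $y$ must be approximated, e.g. at the sentence level (sampling translations from the teacher) or at the word level (matching the teacher's per-word output distributions).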