### opus-2021-01-03.zip

* dataset: opus
* model: transformer
* source language(s): deu
* target language(s): fra
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download: opus-2021-01-03.zip
* test set translations: opus-2021-01-03.test.txt
* test set scores: opus-2021-01-03.eval.txt

## Benchmarks

| testset | BLEU | chr-F |
|---------|------|-------|
| euelections_dev2019.de-fr-deufra.deu.fra | 32.2 | 0.594 |
| newssyscomb2009-deufra.deu.fra | 26.8 | 0.555 |
| news-test2008-deufra.deu.fra | 26.6 | 0.549 |
| newstest2009-deufra.deu.fra | 25.4 | 0.539 |
| newstest2010-deufra.deu.fra | 29.1 | 0.575 |
| newstest2011-deufra.deu.fra | 27.1 | 0.555 |
| newstest2012-deufra.deu.fra | 27.5 | 0.556 |
| newstest2013-deufra.deu.fra | 29.7 | 0.562 |
| newstest2019-defr-deufra.deu.fra | 35.9 | 0.628 |
| Tatoeba-test.deu.fra | 48.6 | 0.661 |
| Tatoeba-test.deu-fra.deu.fra | 48.6 | 0.661 |
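## Usage

The checkpoint follows the standard OPUS-MT transformer + SentencePiece setup, so it can be run with Marian NMT directly or, once converted, through the Hugging Face `transformers` MarianMT classes. A minimal sketch, assuming the converted German-to-French checkpoint is available under a hub id such as `Helsinki-NLP/opus-mt-de-fr` (the exact model name is an assumption; substitute the checkpoint you actually use):

```python
# Minimal translation sketch with the Hugging Face MarianMT classes.
# The hub id below is an assumption, not necessarily this exact release.
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-de-fr"  # assumed hub id for a deu->fra OPUS-MT model
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

src_texts = ["Das Wetter ist heute schön."]  # German input sentences
batch = tokenizer(src_texts, return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))  # French output
```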
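The BLEU and chr-F columns are corpus-level scores of the kind reported in `opus-2021-01-03.eval.txt`. A sketch of how such numbers could be recomputed with `sacrebleu` from system outputs and references, assuming plain-text files with one segment per line (file names here are illustrative):

```python
# Scoring sketch with sacrebleu: corpus BLEU and chrF.
# File names are illustrative; point them at the released test set
# translations and the corresponding reference files.
import sacrebleu

with open("newstest2019.hyp.fra", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]
with open("newstest2019.ref.fra", encoding="utf-8") as f:
    references = [line.strip() for line in f]

bleu = sacrebleu.corpus_bleu(hypotheses, [references])
chrf = sacrebleu.corpus_chrf(hypotheses, [references])
print(f"BLEU: {bleu.score:.1f}")
print(f"chrF: {chrf.score:.1f}")  # recent sacrebleu reports chrF on 0-100; the table above uses a 0-1 scale
```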