ID | Benchmark | Output | OPUS-MT model | OPUS-MT BLEU | External model | External BLEU | Diff |
---|---|---|---|---|---|---|---|
0 | flores101-devtest | compare | eng-ara/opus...2022-02-25 | 29.4 | facebook/nllb-200-3.3B | 24.8 | 4.6 |
1 | flores200-devtest | compare | eng-ara/opus...2022-02-25 | 29.7 | facebook/nllb-200-3.3B | 24.2 | 5.5 |
2 | tatoeba-test-v2020-07-28 | compare | en-ar/transl...-hplt_opus | 22.8 | facebook/nllb-200-3.3B | 19.7 | 3.1 |
3 | tatoeba-test-v2021-03-30 | compare | en-ar/transl...-hplt_opus | 22.5 | facebook/nllb-200-3.3B | 19.3 | 3.2 |
4 | tatoeba-test-v2021-08-07 | compare | en-ar/transl...-hplt_opus | 22.5 | facebook/nllb-200-3.3B | 19.2 | 3.3 |
5 | tico19-test | compare | deu+eng+fra+...2024-05-30 | 30.3 | facebook/nllb-200-3.3B | 27.3 | 3.0 |
average | | | | 26.2 | | 22.4 | 3.8 |
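
As a sanity check on the summary row, here is a minimal Python sketch that recomputes the per-benchmark differences and the column averages from the BLEU values reported above; nothing is re-evaluated, the numbers are simply copied from the table:

```python
# Recompute Diff and the average row from the BLEU scores in the table above.
scores = {
    # benchmark: (OPUS-MT BLEU, external BLEU)
    "flores101-devtest": (29.4, 24.8),
    "flores200-devtest": (29.7, 24.2),
    "tatoeba-test-v2020-07-28": (22.8, 19.7),
    "tatoeba-test-v2021-03-30": (22.5, 19.3),
    "tatoeba-test-v2021-08-07": (22.5, 19.2),
    "tico19-test": (30.3, 27.3),
}

for name, (opus, ext) in scores.items():
    print(f"{name}: diff = {opus - ext:+.1f}")

n = len(scores)
avg_opus = sum(o for o, _ in scores.values()) / n
avg_ext = sum(e for _, e in scores.values()) / n
print(f"average: {avg_opus:.1f} vs {avg_ext:.1f} (diff {avg_opus - avg_ext:.1f})")
```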