Language | Benchmark | BLEU |
---|---|---|
acm-eng | flores200-devtest | 26.9 |
apc-eng | flores200-devtest | 26.1 |
ara-eng | flores101-devtest | 34.6 |
ara-eng | flores200-devtest | 34.6 |
ara-eng | tatoeba-test-v2020-07-28 | 44.7 |
ara-eng | tatoeba-test-v2021-03-30 | 44.4 |
ara-eng | tatoeba-test-v2021-08-07 | 44.4 |
ara-eng | tico19-test | 32.8 |
ara_Latn-eng | flores200-devtest | 1.2 |
arq-eng | tatoeba-test-v2020-07-28 | 8.0 |
arq-eng | tatoeba-test-v2021-03-30 | 8.0 |
arq-eng | tatoeba-test-v2021-08-07 | 8.0 |
ary-eng | flores200-devtest | 15.4 |
arz-eng | flores200-devtest | 21.9 |
average | | 25.071 |
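The reported average appears to be the unweighted mean of the 14 BLEU scores above, rounded to three decimals. A minimal sketch to reproduce it (the score list is copied from the table):

```python
# BLEU scores from the table above (Arabic-variant -> English benchmarks).
scores = [26.9, 26.1, 34.6, 34.6, 44.7, 44.4, 44.4,
          32.8, 1.2, 8.0, 8.0, 8.0, 15.4, 21.9]

# Unweighted mean over all rows, rounded to three decimals as reported.
average = round(sum(scores) / len(scores), 3)
print(average)  # 25.071
```

Note this averages across benchmarks of very different difficulty (e.g. the romanized `ara_Latn-eng` row at 1.2 pulls it down), so the per-benchmark rows are the more informative numbers.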