Language pair | Benchmark | BLEU
---|---|---
fin-bul | flores101-devtest | 25.1
fin-bul | flores200-devtest | 25.1
fin-bul | ntrex128 | 19.8
fin-ces | flores101-devtest | 20.1
fin-ces | flores200-devtest | 20.1
fin-ces | ntrex128 | 17.4
fin-hrv | flores101-devtest | 20.3
fin-hrv | flores200-devtest | 20.3
fin-hrv | ntrex128 | 18.5
fin-pol | flores101-devtest | 15.9
fin-pol | flores200-devtest | 15.9
fin-pol | ntrex128 | 16.6
fin-pol | tatoeba-test-v2021-03-30 | 50.2
fin-pol | tatoeba-test-v2021-08-07 | 50.3
fin-rus | flores101-devtest | 20.6
fin-rus | flores200-devtest | 20.6
fin-rus | ntrex128 | 17.2
fin-rus | tatoeba-test-v2021-08-07 | 46.2
fin-slv | flores101-devtest | 21.6
fin-slv | flores200-devtest | 21.6
fin-slv | ntrex128 | 19.1
fin-srp_Cyrl | flores101-devtest | 21.4
fin-srp_Cyrl | flores200-devtest | 21.4
fin-srp_Cyrl | ntrex128 | 11.9
fin-srp_Latn | ntrex128 | 14.8
fin-ukr | flores101-devtest | 17.8
fin-ukr | flores200-devtest | 17.8
fin-ukr | ntrex128 | 15.3
**Average** | | 22.246
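The reported average is the unweighted mean over all 28 benchmark rows (each language-pair/test-set combination counts once). A quick sketch to reproduce it, with the scores copied from the table above:

```python
# BLEU scores from the table, one entry per benchmark row.
scores = [
    25.1, 25.1, 19.8,               # fin-bul
    20.1, 20.1, 17.4,               # fin-ces
    20.3, 20.3, 18.5,               # fin-hrv
    15.9, 15.9, 16.6, 50.2, 50.3,   # fin-pol
    20.6, 20.6, 17.2, 46.2,         # fin-rus
    21.6, 21.6, 19.1,               # fin-slv
    21.4, 21.4, 11.9,               # fin-srp_Cyrl
    14.8,                           # fin-srp_Latn
    17.8, 17.8, 15.3,               # fin-ukr
]

# Unweighted mean across all rows.
average = sum(scores) / len(scores)
print(round(average, 3))  # 22.246
```

Note that this is a simple per-row mean, so language pairs with more test sets (e.g. fin-pol with five rows) weigh more heavily than pairs with a single row.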