ID | Language pair | Benchmark | BLEU |
---|---|---|---|
0 | nno-cmn_Hans | flores200-devtest | 0.2 |
1 | nno-cmn_Hant | flores200-devtest | 0.2 |
2 | nno-yue | flores200-devtest | 0.1 |
3 | nno-yue | ntrex128 | 0.0 |
4 | nob-cmn_Hans | flores101-devtest | 0.2 |
5 | nob-cmn_Hans | flores200-devtest | 0.2 |
6 | nob-cmn_Hant | flores101-devtest | 0.2 |
7 | nob-cmn_Hant | flores200-devtest | 0.2 |
8 | nob-yue | flores200-devtest | 0.1 |
9 | nob-yue | ntrex128 | 0.0 |
10 | nor-zho | tatoeba-test-v2020-07-28 | 29.9 |
11 | nor-zho | tatoeba-test-v2021-03-30 | 30.0 |
12 | nor-zho | tatoeba-test-v2021-08-07 | 29.8 |
average | | | 7.008 |
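The reported average appears to be the unweighted mean of the thirteen BLEU scores above, rounded to three decimals. A minimal check, with the scores copied from the table:

```python
# BLEU scores from the table above, in row order
# (flores/ntrex rows for nno-*/nob-*, then the three nor-zho tatoeba rows).
scores = [0.2, 0.2, 0.1, 0.0, 0.2, 0.2, 0.2, 0.2, 0.1, 0.0, 29.9, 30.0, 29.8]

# Unweighted mean over all benchmark rows, rounded to three decimal places.
average = round(sum(scores) / len(scores), 3)
print(average)  # 7.008
```

Note that the three high-scoring nor-zho tatoeba rows dominate the mean; the per-benchmark scores in the table are more informative than the single averaged figure.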