On Creating an English-Thai Code-switched Machine Translation in Medical Domain
Subjects: cs.LG, cs.AI, stat.ML
Release date: October 21, 2024
Authors: Parinthapat Pengpun, Krittamate Tiankanon, Amrest Chinkamol, Jiramet Kinchagawat, Pitchaya Chairuengjitjaras, Pasit Supholkhan, Pubordee Aussavavirojekul, Chiraphat Boonnag, Kanyakorn Veerakanjana, Hirunkul Phimsiri, Boonthicha Sae-jia, Nattawach Sataudom, Piyalitt Ittichaiwong, Peerat Limkonchotiwat

Results across model variants (rows marked "+ Mask" apply the masking step). CER and WER are error rates, so lower is better; CS F1, BLEU, chrF, COMET, METEOR, and Fact. are higher-is-better.

| Model Variant | CS F1 | BLEU | chrF | CER | WER | COMET | METEOR | Fact. |
|---|---|---|---|---|---|---|---|---|
| Gemini-Pro-CS | 0.132 | 0.353 | 0.595 | 0.599 | 0.686 | 0.849 | 0.622 | 5.750 |
| Gemini-Pro-MN | 0.110 | 0.352 | 0.599 | 0.426 | 0.526 | 0.854 | 0.630 | 5.375 |
| Google-NMT | 0.119 | 0.385 | 0.617 | 0.392 | 0.480 | 0.815 | 0.650 | 3.125 |
| GPT-3.5-CS | 0.141 | 0.208 | 0.504 | 0.833 | 0.987 | 0.671 | 0.491 | 3.625 |
| GPT-3.5-MN | 0.114 | 0.205 | 0.504 | 0.599 | 0.775 | 0.687 | 0.494 | 3.875 |
| GPT-4-CS | 0.340 | 0.314 | 0.601 | 0.636 | 0.757 | 0.850 | 0.593 | 6.250 |
| GPT-4-MN | 0.132 | 0.282 | 0.581 | 0.511 | 0.660 | 0.847 | 0.597 | 5.125 |
| Llama2-13B-CS | 0.058 | 0.012 | 0.189 | 5.226 | 6.104 | 0.153 | 0.129 | 1.125 |
| Llama2-13B-MN | 0.082 | 0.022 | 0.224 | 3.326 | 4.139 | 0.163 | 0.174 | 1.000 |
| Llama2-7B-CS | 0.074 | 0.012 | 0.171 | 5.361 | 6.141 | 0.159 | 0.117 | 0.500 |
| Llama2-7B-MN | 0.086 | 0.015 | 0.197 | 3.852 | 4.761 | 0.162 | 0.143 | 0.500 |
| OpenThaiGPT-13B-CS | 0.039 | 0.094 | 0.446 | 2.343 | 2.538 | 0.394 | 0.388 | 2.125 |
| OpenThaiGPT-13B-MN | 0.036 | 0.094 | 0.465 | 1.978 | 2.208 | 0.425 | 0.396 | 2.375 |
| OpenThaiGPT-7B-CS | 0.045 | 0.046 | 0.308 | 12.620 | 13.224 | 0.310 | 0.237 | 2.875 |
| OpenThaiGPT-7B-MN | 0.027 | 0.068 | 0.344 | 9.954 | 10.223 | 0.369 | 0.282 | 2.750 |
| SeaLLM-7B-CS | 0.035 | 0.017 | 0.242 | 11.678 | 11.165 | 0.235 | 0.188 | 2.000 |
| SeaLLM-7B-MN | 0.076 | 0.032 | 0.329 | 8.340 | 8.259 | 0.321 | 0.259 | 1.705 |
| Typhoon-7B-CS | 0.021 | 0.012 | 0.220 | 18.946 | 18.434 | 0.186 | 0.168 | 1.875 |
| Typhoon-7B-MN | 0.023 | 0.013 | 0.239 | 18.111 | 19.020 | 0.174 | 0.176 | 1.875 |
| NLLB | 0.107 | 0.140 | 0.432 | 0.610 | 0.906 | 0.530 | 0.405 | 2.500 |
| NLLB-1 | 0.475 | 0.253 | 0.487 | 0.491 | 0.593 | 0.678 | 0.502 | 4.375 |
| NLLB-2 | 0.230 | 0.262 | 0.548 | 0.448 | 0.612 | 0.720 | 0.546 | 3.375 |
| NLLB-3 | 0.380 | 0.257 | 0.520 | 0.472 | 0.604 | 0.702 | 0.521 | 4.000 |
| NLLB-4 | 0.452 | 0.272 | 0.520 | 0.461 | 0.577 | 0.710 | 0.532 | 3.875 |
| NLLB-5 | 0.193 | 0.255 | 0.544 | 0.458 | 0.627 | 0.715 | 0.546 | 3.250 |
| NLLB-6 | 0.286 | 0.264 | 0.539 | 0.456 | 0.606 | 0.711 | 0.541 | 4.000 |
| Gemini-Pro-CS + Mask | 0.628 | 0.301 | 0.512 | 0.668 | 0.716 | 0.704 | 0.543 | 5.500 |
| Gemini-Pro-MN + Mask | 0.644 | 0.314 | 0.529 | 0.461 | 0.517 | 0.726 | 0.562 | 5.750 |
| Google-NMT + Mask | 0.647 | 0.327 | 0.531 | 0.458 | 0.509 | 0.656 | 0.564 | 5.000 |
| GPT-3.5-CS + Mask | 0.574 | 0.212 | 0.463 | 0.839 | 0.953 | 0.631 | 0.468 | 5.250 |
| GPT-3.5-MN + Mask | 0.536 | 0.215 | 0.474 | 0.662 | 0.755 | 0.623 | 0.478 | 5.000 |
| GPT-4-CS + Mask | 0.612 | 0.265 | 0.500 | 0.682 | 0.758 | 0.724 | 0.515 | 6.000 |
| GPT-4-MN + Mask | 0.619 | 0.275 | 0.517 | 0.556 | 0.634 | 0.705 | 0.535 | 4.750 |
| Llama2-13B-CS + Mask | 0.052 | 0.011 | 0.164 | 6.050 | 7.205 | 0.142 | 0.110 | 1.000 |
| Llama2-13B-MN + Mask | 0.100 | 0.023 | 0.199 | 4.201 | 5.363 | 0.156 | 0.148 | 0.750 |
| Llama2-7B-CS + Mask | 0.013 | 0.005 | 0.127 | 6.091 | 7.175 | 0.144 | 0.079 | 0.500 |
| Llama2-7B-MN + Mask | 0.024 | 0.008 | 0.150 | 4.188 | 5.712 | 0.161 | 0.101 | 0.750 |
| OpenThaiGPT-13B-CS + Mask | 0.052 | 0.072 | 0.369 | 1.831 | 2.215 | 0.275 | 0.313 | 2.250 |
| OpenThaiGPT-13B-MN + Mask | 0.078 | 0.066 | 0.384 | 2.119 | 2.715 | 0.293 | 0.309 | 1.375 |
| OpenThaiGPT-7B-CS + Mask | 0.043 | 0.038 | 0.266 | 11.545 | 12.430 | 0.226 | 0.202 | 1.250 |
| OpenThaiGPT-7B-MN + Mask | 0.063 | 0.062 | 0.307 | 6.760 | 7.068 | 0.271 | 0.258 | 2.125 |
| SeaLLM-7B-CS + Mask | 0.048 | 0.016 | 0.223 | 10.167 | 9.953 | 0.204 | 0.166 | 1.375 |
| SeaLLM-7B-MN + Mask | 0.163 | 0.033 | 0.306 | 8.119 | 8.009 | 0.259 | 0.240 | 1.625 |
| Typhoon-7B-CS + Mask | 0.080 | 0.011 | 0.199 | 18.283 | 18.291 | 0.170 | 0.147 | 1.875 |
| Typhoon-7B-MN + Mask | 0.113 | 0.010 | 0.218 | 17.891 | 18.786 | 0.172 | 0.150 | 1.750 |
| NLLB + Mask | 0.523 | 0.183 | 0.423 | 0.556 | 0.719 | 0.533 | 0.424 | 4.125 |
| NLLB-1 + Mask | 0.578 | 0.237 | 0.457 | 0.515 | 0.605 | 0.645 | 0.479 | 4.625 |
| NLLB-2 + Mask | 0.637 | 0.240 | 0.475 | 0.506 | 0.612 | 0.644 | 0.489 | 4.750 |
| NLLB-3 + Mask | 0.605 | 0.237 | 0.464 | 0.511 | 0.608 | 0.648 | 0.481 | 5.125 |
| NLLB-4 + Mask | 0.599 | 0.250 | 0.472 | 0.502 | 0.596 | 0.651 | 0.493 | 4.875 |
| NLLB-5 + Mask | 0.642 | 0.242 | 0.478 | 0.504 | 0.609 | 0.645 | 0.493 | 3.625 |
| NLLB-6 + Mask | 0.628 | 0.241 | 0.473 | 0.505 | 0.605 | 0.646 | 0.489 | 4.750 |
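For readers unfamiliar with the error-rate columns: CER and WER are edit distance (insertions + deletions + substitutions) between hypothesis and reference, normalized by reference length, computed over characters and words respectively. The sketch below is a minimal, self-contained illustration of that standard definition, not the paper's evaluation code (the authors' exact tokenization and tooling are not stated here).

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences via dynamic programming."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))  # dp[j] = distance between ref[:i] and hyp[:j]
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(
                dp[j] + 1,                            # deletion from hyp
                dp[j - 1] + 1,                        # insertion into hyp
                prev + (ref[i - 1] != hyp[j - 1]),    # substitution (0 if equal)
            )
            prev = cur
    return dp[n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

Because the denominator is the reference length, both rates can exceed 1.0 when a model emits much longer or heavily erroneous output, which is consistent with the CER/WER values above 1 reported for several of the 7B/13B open models in the table.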