English–Korean Neural Machine Translation
This is a live demo of an English–Korean translation system powered by a Transformer model that I developed.
One caveat: the demo is hosted on the free plan of Hugging Face Spaces, so the server goes to sleep after 48 hours of inactivity. If that happens, visit my Hugging Face Space once to wake the model up; after that, the demo should work normally.
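If you prefer to wake the Space up (and query it) programmatically, a minimal sketch using `gradio_client` is shown below. The Space ID and `api_name` here are placeholders, not the demo's actual endpoints; check the Space's "Use via API" panel for the real values.

```python
# Minimal sketch: querying a Gradio-based Hugging Face Space from Python.
# The Space ID and api_name below are placeholders, not the demo's real ones.
from gradio_client import Client

client = Client("your-username/eng-kor-translation")  # hypothetical Space ID

# The first call also wakes the Space up if it has gone to sleep,
# so expect it to take noticeably longer than subsequent calls.
result = client.predict(
    "I am a student.",    # input text (remember the 64-character limit)
    api_name="/predict",  # default Gradio endpoint name; may differ
)
print(result)
```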
There are a couple of important limitations to keep in mind:
- The current models can only process up to 64 input characters.
- You may notice that Kor→Eng translation performs better than Eng→Kor. That impression is correct: the Kor→Eng model achieves a higher BLEU score (34.03 vs. 9.32), likely due to linguistic asymmetry between the two languages.
In particular, when translating from English to Korean, the model must infer honorifics and politeness levels, which are explicitly encoded in Korean but largely absent in English. This ambiguity naturally leads to lower BLEU scores, as illustrated by the sketch after the example below.
For example:
- 저는 학생입니다. (formal)
- 저는 학생이에요. (polite, informal)
Both sentences convey the same meaning (“I am a student”) but differ in their level of politeness, making accurate inference more challenging for the model.
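To make this concrete, here is a minimal sketch (not the demo's evaluation code) of how BLEU penalizes a perfectly acceptable translation that simply uses a different politeness level than the reference. It assumes the `sacrebleu` package and its default tokenizer rather than a Korean-specific one, so the numbers are only illustrative.

```python
# Minimal sketch (not the demo's evaluation code): scoring two equally valid
# Korean translations against a single formal reference.
# Assumes `pip install sacrebleu`; default tokenizer, illustrative numbers only.
import sacrebleu

reference = ["저는 학생입니다."]       # formal reference translation
exact_match = "저는 학생입니다."       # same politeness level as the reference
polite_informal = "저는 학생이에요."   # equally correct, different politeness level

for hypothesis in (exact_match, polite_informal):
    score = sacrebleu.sentence_bleu(hypothesis, reference)
    print(f"{hypothesis!r}: BLEU = {score.score:.1f}")

# The exact match scores ~100, while the equally valid informal variant scores
# far lower, because BLEU only rewards n-gram overlap with the one reference
# it is given. The same effect depresses corpus-level Eng→Kor BLEU.
```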