Why is WeChat translating the Canadian flag emoji into “He’s in prison”?

A glitch in the translation tool for Tencent’s messaging app generates unusual results

This article originally appeared on ABACUS

Emoji transcend languages, so they usually don’t require translation. But WeChat will do it for you anyway if you’re typing in Chinese, with some pretty bizarre results. 

Tencent’s popular messaging app comes with a built-in translator, so you can tap on any Chinese message to get an instantaneous English translation. Most of the time it works fine, but it’s different when you throw emoji into the mix.

While emoji by themselves don’t translate into anything, when they’re accompanied by a Chinese message, some emoji appear to be associated with specific words or phrases.

I'm so sorry, says WeChat. (Picture: WeChat)

Flag emoji give some especially strange results, and some bring up seemingly random names.

In a statement to Abacus, WeChat said it’s taking immediate action to fix the translation bug, which was first discovered by Twitter users.

“We thank users for flagging this matter and apologize for any inconvenience caused,” it said.

WeChat’s translation function appears to be based on machine learning. In a similar translation glitch earlier this year involving celebrity names, WeChat explained that the error arose because the system wasn’t trained on certain English words. WeChat also apologized in 2017 when it was discovered that “black foreigner” in Chinese was translated as a racial slur.

With machine learning, a system learns by reading a large amount of text in one language and comparing it with the corresponding translation in another language. Since the system is trained on full sentences rather than individual words, it should ideally result in more accurate and natural translations.
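To give a sense of how this kind of sentence-level system behaves, here is a minimal sketch, not WeChat’s actual pipeline, that runs a mixed Chinese-and-emoji message through an openly available Chinese-to-English model (Helsinki-NLP/opus-mt-zh-en, loaded through the Hugging Face transformers library). The choice of model and library is an assumption made purely for illustration.

```python
# Illustrative sketch of sentence-level neural machine translation using a
# public Chinese-to-English model. This is NOT WeChat's internal system,
# which has not been made public.
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-zh-en"  # assumed example model
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# A flag emoji embedded in a Chinese sentence: a token the model may have
# rarely (or never) seen paired with English text during training.
text = "你今天去哪里了 🇨🇦"
inputs = tokenizer([text], return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```

Whatever a model like this produces for the flag token is essentially a guess drawn from whatever associations its training data happened to contain, which is consistent with WeChat’s explanation that errors appear when the system hasn’t been trained on certain words.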

But that’s not always the case. Last year, users on Reddit and elsewhere discovered that when Google Translate was asked to translate a string of nonsense words and phrases, it spewed out gibberish that sounded like ominous religious prophecy. It turned out that Google Translate was trained partly on religious texts, and it resorted to them when it got confused.

