
    WeChat Apologizes for Translating ‘Black Foreigner’ as N-Word

The racist slur appeared when the term was paired with negative adjectives.

    Chinese messaging app WeChat has apologized for an error in its algorithm that provided the N-word as a translation for a neutral Chinese term for black foreigners.

    “We’re very sorry for the inappropriate translation,” a WeChat spokesperson told Sixth Tone. “After receiving users’ feedback, we immediately fixed the problem.”

    The issue was discovered by Shanghai-based theater director Ann James, who texted her Chinese colleagues in their messaging group this morning to say she was running late. As usual, the black American typed in English, using WeChat’s in-app translation feature to read the Chinese responses. She gasped when she saw that the next message contained a racist obscenity: “The n----- is late.”

    “I was just horrified,” James told Sixth Tone. Yet she doubted that her colleague would use such a slur, and another friend confirmed that the original Chinese message used a neutral term: hei laowai, or “black foreigner.”

    A local English-language media outlet, That’s Shanghai, reported the story and found that the translator gave neutral translations in some instances but used the slur when the phrase in question included a negative term, such as “late” or “lazy.” Sixth Tone’s own testing on Wednesday evening found similar results.

    WeChat, an app with an estimated 1 billion active users, is ubiquitous in China, not only as a messaging tool but also as a cashless payment provider and a social media and online publishing platform. The company behind it, Tencent, is now the world’s 10th most valuable public firm, worth $275 billion, according to The Economist. But as Chinese technology firms expand globally, their cross-cultural aptitude will be put to the test.

    The Chinese colleague who had sent the original message was shocked, but the theater director reassured her. “I said, ‘No problem, I know it’s not you — it’s something in the programming,’” James recalled, though she questions how the algorithm came to present such a profanity in the first place: “Why is that word even in the translator?”

The spokesperson from WeChat explained that the app used neural machine translation, adding that the engine was constantly being refined to provide “more accurate, faithful, expressive, and elegant” results.

Many modern translation apps take advantage of big data sets and machine learning techniques, basing their translations on existing usage without a human filter. But such processes risk the artificial intelligence picking up offensive associations and language. In 2016, Twitter users taught an AI account to enthusiastically support Hitler, and a report that examined software used by U.S. courts claimed it picked up the racial bias of the institutions whose data it used.
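The mechanism described above — a translator that simply reproduces whatever wording is most common in its training data — can be sketched with a toy phrase table. This is illustrative only: the pinyin source phrases and the placeholder token `<SLUR>` are invented for the example, and this is not WeChat's actual data or method.

```python
from collections import Counter, defaultdict

# Toy "training data": (source phrase, English translation) pairs, as they
# might be scraped from the web without a human filter. The offensive word
# is replaced by the placeholder <SLUR>; "hei laowai" is the neutral term
# "black foreigner" mentioned in the article. The negative-context pairs
# below are hypothetical, constructed to show how bias in the data leaks
# into the output.
corpus = [
    ("hei laowai", "black foreigner"),
    ("hei laowai", "black foreigner"),
    ("hei laowai chidao", "the <SLUR> is late"),
    ("hei laowai chidao", "the black foreigner is late"),
    ("hei laowai chidao", "the <SLUR> is late"),
]

# A minimal phrase-based "translator": for each source phrase, emit
# whichever target string occurred most often in the training pairs.
table = defaultdict(Counter)
for src, tgt in corpus:
    table[src][tgt] += 1

def translate(src: str) -> str:
    return table[src].most_common(1)[0][0]

print(translate("hei laowai"))         # neutral phrase -> neutral output
print(translate("hei laowai chidao"))  # negative context -> offensive output
```

Because the offensive rendering outnumbers the neutral one in the negative-context pairs, the majority vote reproduces the slur — no one ever programmed the word in deliberately, yet it surfaces whenever the surrounding context matches the biased examples.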

James, who recently played a role in the Chinese blockbuster “Wolf Warrior 2,” found the translation issue disheartening but unsurprising. “If you’re a black person in China, you’ve come up against some craziness,” she said, explaining that she was often touched and photographed in public without her consent even before her film appearance.

    In 2016, a Chinese laundry detergent advertisement was widely criticized for showing a black man “washed” into a light-skinned Chinese man. And on Saturday, international skin care and cosmetics brand Dove apologized for a similar gaffe.

    “I know there’s a lot of curiosity and a lot of ignorance about black people [in China],” said James, who was quick to emphasize that she loves the country she’s called home for five years. “I just think that we need to have more open discussion between Chinese people and black people.”

    Additional reporting: Wang Lianzhang; editor: David Paulk.

    (Header image: Xiao Mu/VCG)