    VOICES & OPINION

    It’s Time to Pop China’s Online Filter Bubbles

    When Eli Pariser first coined the term “filter bubble” a decade ago, he still hoped tech companies would realize the error of their ways and self-correct. That hasn’t happened.
    Jan 07, 2021 · #technology

    One afternoon in mid-November, a university journalism professor confided in me a not-so-shocking secret: “I only found out today what ‘Versailles literature’ actually means.” China’s latest buzzword was already all over the internet and on every student’s lips, but he had never encountered it during his daily online browsing sessions. Eventually his students’ conversations left him feeling so out of touch that he looked it up.

    It’s a common experience in our digital world. In 2011, the American internet activist Eli Pariser argued in his book, “The Filter Bubble: What the Internet Is Hiding From You,” that the internet, social media, search engines, and content recommendation algorithms were creating a “state of intellectual isolation”: an “Adderall society” dominated by targeted algorithms. Pariser saw users, increasingly cosseted within their own small worlds, becoming unable to hear different voices, unaware of events outside their bubbles, and incapable of identifying shared interests. Over time, as they grew more and more disconnected, groups would cease to have common understandings, and society would gradually start to break apart.

    Despite a very different digital landscape, China is not immune to this phenomenon. More than 700 million Chinese get their news online, mostly through algorithmic feeds. On Jinri Toutiao, the country’s most popular algorithm-driven recommendation app, highly personalized recommendations match information to individual users and their environments. This sometimes produces “information gaps.” Take city-targeted algorithms, for example: A friend of mine who owns two cellphones once flew into Shanghai to find that one of his devices had yet to automatically update its location data. He soon realized the two phones had completely different news feeds, not just for local items, but national ones, too.
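
    A minimal sketch makes the mechanism concrete. In the toy model below, items tagged with a city are shown only to devices reporting that city, and even shared national items are re-ranked by local interest weights. The articles, cities, and weights are all invented for illustration; this is the general technique, not Jinri Toutiao’s actual system.

```python
# Minimal sketch of city-targeted feed filtering. Articles, cities, and
# topic weights are invented; this illustrates the general technique,
# not Jinri Toutiao's actual system.

ARTICLES = [
    {"title": "Subway extension opens", "city": "Shanghai", "topic": "transit"},
    {"title": "Housing prices dip", "city": "Beijing", "topic": "property"},
    {"title": "National GDP figures released", "city": None, "topic": "economy"},
    {"title": "New tech park announced", "city": "Shanghai", "topic": "tech"},
]

# Hypothetical per-city interest weights, as might be learned from local clicks.
CITY_TOPIC_WEIGHTS = {
    "Shanghai": {"transit": 0.9, "tech": 0.8, "economy": 0.4},
    "Beijing": {"property": 0.9, "economy": 0.7},
}

def build_feed(device_city):
    """Return the feed for a device reporting `device_city`."""
    weights = CITY_TOPIC_WEIGHTS.get(device_city, {})
    # Local items from other cities are filtered out entirely,
    # and even shared national items are re-ranked by local weights.
    visible = [a for a in ARTICLES if a["city"] in (None, device_city)]
    visible.sort(key=lambda a: weights.get(a["topic"], 0.0), reverse=True)
    return [a["title"] for a in visible]

# The same person's two phones, one with stale location data:
print(build_feed("Shanghai"))
print(build_feed("Beijing"))
```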

    Of course, simple content filters don’t always result in bubbles. For that to happen, tech companies must build an entire ecosystem around them, which is exactly what many of them are doing. China’s tech scene has long been characterized by a prevalence of “walled gardens,” as big firms seek to capture users and keep them contained within their own do-everything apps. The resulting mass of user data makes it easy for companies to serve up targeted ads and experiences: Even formerly clean, sleek, and largely open platforms like Douban are now inundated with ads based on users’ browsing histories. As a result, a site that once offered a variety of new content to curious users is increasingly flooded with ads showing them what the algorithms assume they’re already interested in.
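
    The underlying mechanic is simple enough to sketch in a few lines. The toy example below assumes a generic content-based filter rather than Douban’s actual code: ads are scored by how heavily their tags overlap with a user’s browsing history, which is exactly why such a feed converges on what the user has already seen.

```python
# Illustrative content-based ad targeting: ads are ranked by how many
# tags they share with the user's browsing history. Generic sketch only,
# not any platform's real system.

from collections import Counter

def score_ads(history_tags, ads):
    """Rank ads by tag overlap with a user's browsing history."""
    profile = Counter(history_tags)  # tag -> how often the user viewed it
    return sorted(
        ads,
        key=lambda ad: sum(profile[t] for t in ad["tags"]),
        reverse=True,
    )

history = ["film", "film", "jazz", "film", "novels"]
ads = [
    {"name": "Arthouse cinema pass", "tags": ["film"]},
    {"name": "Rock festival tickets", "tags": ["rock", "live"]},
    {"name": "Jazz vinyl box set", "tags": ["jazz", "vinyl"]},
]
for ad in score_ads(history, ads):
    print(ad["name"])  # the film ad wins; little else ever rises
```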

    Such filtering has a social impact, leading to the sensation of being alone, together. Take the Chinese Tinder clone Tantan, which markets itself as a positive “social app.” It claims to match users with people nearby based on their personal profiles, locations, interests, and other information, helping them meet like-minded friends. Unsurprisingly, according to the app’s own data, 92% of the platform’s users aged 18 to 35 are unmarried. But if the app seems to encourage young people to be social, it also has incentives to strengthen and maintain their status as singletons. In addition to reinforcing users’ tastes and prejudices, it can even hamper their ability to develop intimate relationships, isolating them from the outside world even as it feeds them a steady stream of mirror images of themselves to interact with.
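
    Match-ranking systems of this general kind score candidates by interest overlap and proximity. The sketch below uses invented profiles and weights, and is not Tantan’s algorithm; it simply shows why such a system keeps surfacing people who mirror the user’s existing tastes.

```python
import math

# Generic nearby-match ranking: hypothetical weights and profiles,
# not Tantan's actual scoring.

def match_score(user, candidate, max_km=50.0):
    """Combine interest overlap (Jaccard) with a simple distance decay."""
    a, b = set(user["interests"]), set(candidate["interests"])
    overlap = len(a & b) / len(a | b) if a | b else 0.0
    km = math.dist(user["loc"], candidate["loc"]) * 111  # rough degrees -> km
    proximity = max(0.0, 1.0 - km / max_km)
    return 0.7 * overlap + 0.3 * proximity  # shared tastes dominate the ranking

me = {"interests": ["indie film", "hiking", "coffee"], "loc": (31.23, 121.47)}
candidates = [
    {"name": "A", "interests": ["indie film", "coffee"], "loc": (31.24, 121.48)},
    {"name": "B", "interests": ["opera", "chess"], "loc": (31.22, 121.46)},
]
for c in sorted(candidates, key=lambda c: match_score(me, c), reverse=True):
    print(c["name"], round(match_score(me, c), 2))  # the mirror image ranks first
```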

    Search engines present further problems. On Baidu, entering a harmless term like “Henan people” might return problematic auto-fill prompts like “Why do Henan people steal manhole covers?” These prompts are often based on keywords from previous searches, but they shape users’ impressions of the things they search for. And if a user actually clicks on the admittedly eye-catching suggestion, that action is fed back into the algorithm’s next cycle. In the long run, this fosters stereotypes and leads to real-life discrimination. One forthcoming study analyzed 937 prompt words related to all 34 province-level regions across China. It found that Baidu’s auto-complete algorithm both aggravates and reinforces the regional discrimination faced by people in central and western regions.
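
    The feedback loop itself is easy to picture in code. The simplified model below assumes frequency-ranked auto-completion in general, not Baidu’s implementation: every click on a suggestion raises the very count that decides whether it gets shown again.

```python
from collections import Counter

# Hypothetical historical query counts; the offensive prompt starts out popular.
query_counts = Counter({
    "why do henan people steal manhole covers": 120,
    "henan people cuisine": 80,
    "henan people dialect": 60,
})

def suggest(term, k=3):
    """Return the k most frequent past queries containing `term`."""
    matches = [q for q in query_counts if term in q]
    return sorted(matches, key=query_counts.__getitem__, reverse=True)[:k]

def click(suggestion):
    """A click is fed straight back into the counts used by the next cycle."""
    query_counts[suggestion] += 1

print(suggest("henan people"))
click("why do henan people steal manhole covers")  # one curious click...
print(suggest("henan people"))  # ...and the stereotype's lead only grows
```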

    Similar issues plague the country’s social media platforms, which often seek to present themselves as windows into the day’s events, even if they conceal as much as they reveal. Microblogging platform Weibo’s “trending” function is considered one of the most important real-time indicators of what Chinese people are following on the internet, but topics don’t make the trending list solely on the strength of users’ clicks and votes. Behind the algorithm is a collaborative filtering mechanism. Sometimes this results in important topics being removed: Researchers in 2018 found that most terms on the trending list were “positive.” At other times, the list is manipulated by businesses or celebrities for commercial gain.
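
    What such a curated list might look like under the hood can be sketched as follows. The engagement numbers, editorial weights, and topics are pure invention, not Weibo’s actual mechanism; the point is only to show how a topic with high raw engagement can still never trend once a filtering layer sits between the clicks and the list.

```python
# Toy model of a weighted, curated trending list. All numbers and
# weights are invented; this is not Weibo's actual mechanism.

raw_engagement = {
    "celebrity wedding": 900_000,        # possibly boosted for commercial gain
    "holiday box office record": 700_000,
    "factory safety scandal": 800_000,
    "new subsidy policy": 300_000,
}

editorial_weight = {
    "celebrity wedding": 1.2,
    "holiday box office record": 1.1,
    "factory safety scandal": 0.0,       # weighted to zero: it never trends
    "new subsidy policy": 1.0,
}

def trending(k=3):
    """Rank topics by weighted engagement, dropping zero-weight topics."""
    scored = {t: v * editorial_weight.get(t, 1.0)
              for t, v in raw_engagement.items()}
    ranked = sorted((t for t in scored if scored[t] > 0),
                    key=scored.get, reverse=True)
    return ranked[:k]

print(trending())  # the scandal never appears, whatever its raw numbers
```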

    It’s clear that filter bubbles reinforce isolation, prejudice, disconnectedness, and alienation in our lives. So, what can be done? Perhaps the first step in trying to burst filter bubbles is becoming aware of them and their root causes. In China, that means looking at the power of the country’s tech giants. While much of the focus of recent draft anti-monopoly regulations has been on Ant Group and financial technology, one key target is less a particular company and more the practice of companies misusing the data at their disposal to reorder society and individual behavior to their advantage.

    Many apps, for example, are designed to get users hooked and convince them to spend ever more time in-app. Developers want users to be reliant on their product, then use that reliance to extract ever greater profits through practices like differential pricing and “big-data backstabbing.”
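
    What that looks like in practice can be hedged into a short sketch. Every signal and multiplier below is invented for illustration and taken from no real platform; the point is that data a user never thinks about, such as their device, their loyalty, or their urgency, can quietly move the price.

```python
# Illustrative differential pricing ("big-data backstabbing"). All
# signals and multipliers here are hypothetical.

def quote_price(base, user):
    """Quote a price using signals that proxy for willingness to pay."""
    multiplier = 1.0
    if user.get("loyal_customer"):        # regulars comparison-shop less
        multiplier += 0.10
    if user.get("device") == "premium":   # a pricey phone suggests deeper pockets
        multiplier += 0.05
    if user.get("searched_repeatedly"):   # urgency means they will pay more
        multiplier += 0.08
    return round(base * multiplier, 2)

new_user = {"loyal_customer": False, "device": "budget"}
regular = {"loyal_customer": True, "device": "premium",
           "searched_repeatedly": True}

print(quote_price(100.0, new_user))  # 100.0
print(quote_price(100.0, regular))   # 123.0: the loyal customer pays more
```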

    Pariser hoped that internet companies and programmers would eventually engage in greater introspection about such practices. The reality, however, is that greater external oversight may be necessary. Filter bubbles may well be a technological trend that cannot be reversed. That doesn’t mean they can’t be popped.

    Translator: David Ball; editors: Wu Haiyun and Kilian O’Donnell; portrait artist: Wang Zhenhao.

    (Header image: Visual elements from miakievy, saemilee, and Shijue/People Visual, re-edited by Ding Yining/Sixth Tone)