2021-08-12 16:06:14 Voices

On August 2, state-run news agency Xinhua published two major reports on China’s “unhealthy” fan culture. The first detailed the powerful Cyberspace Administration of China’s recent crackdown on unruly fan groups, which has so far resulted in the removal or closure of 814 hashtags and more than 1,300 groups engaged in channeling traffic to celebrities. The second article covered new guidelines meant to push content platforms away from a reliance on online traffic — and the algorithms that drive it — and toward artistic values and the public good.

The government’s renewed scrutiny of fan culture, algorithms, and digital platforms comes less than a month after celebrity rapper Kris Wu was detained on suspicion of rape. Perhaps the most famous member of a generation of pop idols known more for their ability to mobilize vast online fandoms than for their musical talent, Wu was the quintessential “traffic star,” or liuliang mingxing. These idols work together with dedicated fan groups to boost their social media profiles through hashtags, shares, and the policing of negative comments. The most successful are placed at the top of microblogging platform Weibo’s influential “star power list” — proof of their marketability and a powerful draw for film producers, investors, and brands.

For years, platforms — even listed companies like Weibo, which face tighter scrutiny from regulators — have adopted a laissez-faire attitude toward fan groups’ manipulation of social media data, partly because of their own stakes in the traffic economy. Now, however, as Chinese regulators rethink their approach to the country’s booming digital economy and to internet giants’ excessive influence over almost every aspect of Chinese society, platforms are under mounting pressure to change their business practices. On August 6, just days after Xinhua published its reports on fan culture, Weibo announced it was taking down its star power list and would more closely align itself with the authorities by introducing new indicators of cultural influence into its algorithms, including a “positive energy index.”

The reach of the new regulations is not limited to China’s celebrity economy. Despite its association with the pop idol system, the use of traffic manipulation techniques to sway public opinion is common in other fields where “influence,” however nebulously defined, is the first and foremost metric of success. Short videos, online retail, and even car-hailing platforms have all been linked to traffic manipulation, often to the detriment of consumers.

To a degree, the current moment, with its scandals and escalated regulatory scrutiny, is a sort of Cambridge Analytica scandal with Chinese characteristics. In China, digital platforms, algorithms, and the tech industry more generally have long been portrayed in a positive light by officials and the media, partly because of their outsize contributions to the country’s economic growth. Now, there’s a growing realization among the public that algorithms don’t represent who we are or what we need, but the economic interests of largely unregulated digital platforms.

Take popular blog network Mimeng, for example. At least until it was shut down for fabricating stories, the network was successful as a result of its traffic-oriented content production model: It surveyed which topics and headlines were most likely to provoke engagement and interaction, then relentlessly exploited them. “100,000-plus clicks” — the maximum traffic number shown by the major messaging platform WeChat — became the benchmark by which an article was judged economically, if not journalistically, successful. WeChat tacitly encouraged this shift, which increased user engagement with the app and helped transform it from primarily a messaging service into a major news source for Chinese readers.

Meanwhile, principles such as fact-checking and balanced storytelling were abandoned altogether, since content that perpetuates readers’ biases and anxieties has a better chance of getting picked up by platforms’ algorithms and going viral. Even traditional newspapers and magazines are not immune to these ethical issues, as they are locked in fierce competition with content mills like Mimeng for advertisement revenue.

These problems are arguably most noticeable on Weibo, which has transformed over the past decade from a social network and center of political discussion into a celebrity-driven entertainment site with outsize influence over the country’s cultural industries. Producers, looking for sure things in an uncertain market, began to treat Weibo’s star power list as a proxy for marketability, incentivizing idols and their teams to game the system however they could. To cement its status as film industry kingmaker, Weibo established its own “Weibo Movie Night” awards, with the winners determined based on who had the most rabid fan groups in their corner.

Needless to say, this system did not always produce blockbusters, much less high-quality films. One of the most famous examples of idol stunt casting, 2019’s Lu Han vehicle “Shanghai Fortress,” was a box office and critical bomb.

However, it is equally important to caution against reductive narratives that position algorithms and tech companies as one-dimensional villains. The question at hand isn’t how to replace one opaque, imperfect ranking system with another, but how to regulate algorithms to ensure they stimulate economic development without infringing on public interest.

The country’s newly published guidelines call for the government and platforms to emphasize expert artistic reviews of new and promoted content — a measure meant to help replace or balance algorithms and traffic as the standard for what makes a cultural work “good.” But expertise itself is no guarantee of objectivity, and it’s hard to imagine how expert panels are supposed to judge the merits of new and emerging forms like short videos.

Rather than repeating past failed approaches by forcing algorithms to include subjective data points like “positive energy” or expert feedback, we should start by making them more transparent. Scholars are increasingly arguing for algorithms to be open to audit. Companies are unlikely to go along willingly; in the digital economy, algorithms are prized business secrets. But casting the same light on all algorithms is likely to yield better results than trusting firms to implement regulations in secret. Here, third-party ethics panels, rather than expert panels, may be more useful in ensuring digital platforms balance their economic interests with the law and social morality.

Editors: Cai Yineng and Kilian O’Donnell.

(Header image: Fans hold signs with their favorite idols’ names in Hangzhou, Zhejiang province, 2018. People Visual)