
    Keeping an Ear to Weibo’s Suicidal Whispers

    Artificial intelligence is helping volunteers to intervene when people post troubled messages in online ‘tree hollows.’

    SHANGHAI — When Lü Xiaokang logged into social media app Weibo one day in February, he sent out a message of despair: “Cut my wrist again, still cannot find the artery.” He placed it below an old post, where, perhaps, nobody would ever see it.

    But a global team of volunteers was monitoring the platform for messages just like his.

    Their software picked up on Lü’s desperate phrasing and flagged his comment. A volunteer reached out to Lü, who said he was depressed and shared some of his personal information. It was enough that when, five days later, Lü said he’d just overdosed on sleeping pills, the volunteers could alert police, who sent the 21-year-old to the hospital.

    Suicide is the primary cause of death among Chinese people aged 15 to 35. On Weibo, many young users with depression post about their feelings in so-called tree hollows, or shudong: places to whisper secrets and alleviate psychological burdens. The practice has unclear origins but was popularized by the Hong Kong movies “In the Mood for Love” and “2046,” in which characters speak of confiding secrets to a hole in a tree.

    The most famous tree hollow on Weibo is a digital suicide note from 2012, written by a university student who later killed herself. In the seven years since, it has garnered over 1 million comments, many of which are people expressing despair, grief, or suicidal thoughts. It is in such a tree hollow that Lü — a pseudonym used in media reports — posted his initial message, only to be noticed by the volunteers of the Tree Hollow Rescue Movement (THRM).

    The nonprofit group consists of some 220 members, including experts such as psychiatrists, as well as many volunteers who want to help those with depression. It was started by Huang Zhisheng, an artificial intelligence professor at Vrije Universiteit Amsterdam. Wanting to find practical AI applications to benefit society, he set out to write an algorithm that could pick up on suicidal intentions.

    During a visit to Shanghai, Huang shows Sixth Tone how THRM’s system works. When he opens a file named “tree hole agent” on his laptop, lines of code flash to life. “It’s set to monitor 120 pages of comments,” Huang says, referring to the digital tree hollows that the software watches for new comments. The algorithm assigns each Weibo comment a suicide risk level ranging from 1 to 10 based on word usage and produces a report listing all messages at or above Level 6. “Usually it finds six to 10 comments (per day),” Huang says, adding that the latest iteration of the software is accurate enough that 82% of flagged comments are indeed about suicide plans.
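    The mechanism Huang describes can be sketched roughly. The snippet below is an illustrative toy, not THRM’s actual code: the real system’s lexicon, weights, and model are not public, so every keyword, weight, and function name here is invented for demonstration. It only mirrors the described behavior of assigning each comment a level from 1 to 10 based on word usage and reporting everything at or above Level 6.

```python
# Hypothetical sketch of a keyword-based risk scorer, loosely mirroring
# the behavior described in the article. All keywords and weights are
# invented; THRM's real model is not public.

RISK_WORDS = {
    "overdose": 4,
    "wrist": 3,
    "die": 3,
    "pills": 2,
    "goodbye": 2,
}

def risk_level(comment: str) -> int:
    """Assign a risk level from 1 to 10 based on word usage."""
    text = comment.lower()
    score = 1 + sum(weight for word, weight in RISK_WORDS.items() if word in text)
    return min(score, 10)

def daily_report(comments, threshold=6):
    """Return (comment, level) pairs at or above the reporting threshold."""
    scored = [(c, risk_level(c)) for c in comments]
    return [(c, level) for c, level in scored if level >= threshold]
```

    A real system would of course use a trained classifier rather than a fixed word list, but the reporting logic — score every new comment, surface only those above a threshold — follows the same shape.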

    The system started round-the-clock monitoring on July 27. With nearly one year of data, Huang says most users who post suicidal messages are aged between 18 and 23, female, and deal with school bullying, study pressure, or relationship problems. “A few are directly caused by economic problems, such as debt,” Huang adds.

    THRM is supported by a growing network of volunteers who, on average, attempt more than 90 suicide interventions per month. One or several people will reach out to a user through private messages, in conversations that can go on for months. So far, 320 interventions have been deemed effective. In other cases, either the person did not respond, or they continued to post messages that were merely gloomy rather than outright suicidal.

    Some volunteers, including Huang, are based in Europe. As most suicidal posts are made between 10 p.m. and 2 a.m., the time zone difference is a benefit. “Because a lot of people with depression will post on Weibo at night, (doing the work from) Europe, where it’s daytime, has its advantages,” Huang says.

    Using AI or algorithms to prevent suicides is not new. Facebook began applying AI to identify suicidal content in 2017; it either prompts such users to contact a friend or support services, or contacts authorities when it believes there is “imminent harm.”

    A Weibo spokesperson tells Sixth Tone that the company sometimes also informs authorities of suicidal individuals, citing a successful 2017 rescue case. “We monitor all the content that users put on Weibo, mostly because of safety considerations and because we need to make sure all content is legal,” the spokesperson says. “Content related to suicide is a part of that.” Users are also encouraged to alert Weibo to such content.

    Facebook’s approach has raised privacy concerns, and so has THRM’s. “Facebook is different,” Huang says. “Firstly, they already have users’ private information.” In contrast, THRM relies on what users have already made public or have directly told volunteers. Still, the decision of whether to send that private information to police to save a life can be complicated, says Shanghai-based psychological consultant and THRM volunteer Zhou Zihan.

    “From the professional point of view of a psychological consultant, THRM’s privacy issue is open for discussion,” Zhou says. “Tree hollows are a place for people to speak their minds.” Misinterpreting a message and reporting it to the police might prevent a person with depression from speaking about their feelings at all, she adds. Zhou was the volunteer who contacted Lü back in February, a case in which authorities were ultimately called to intervene.

    Earlier this year, after the software flagged a comment in which a user said she planned to kill herself after her birthday, Zhou kept a close watch. She stayed in contact with the user through private messages and learned her surname and her address in the southern province of Hainan. When the user resigned from her job the day after her birthday, Zhou thought the police should be informed. A Hainan-based volunteer objected, however. “That (volunteer) thought she wasn’t in danger and said we’d disturbed her too much,” Zhou says.

    Zhou went ahead anyway, asking police to check on the woman rather than disturb her. Instead, police disregarded the woman’s privacy and informed all of her former colleagues about her depression. The heavy-handed approach disappointed Zhou, and the Hainan-based volunteer quit the group.

    Zhou later spoke with a local police officer several times about how best to communicate with people who are contemplating suicide. The next time the woman was in trouble, the police sent a female staff member to speak with her. “She decided (again) not to go through with it,” Zhou says.

    “If I could choose again, I’d still report (the case) to the police,” Zhou recalls. “The problem isn’t whether or not to report it; it’s how the police deal with it.” But Professor Huang says it’s not just the police: Parents, teachers, and others also don’t know how to act around suicidal people.

    Huang also knows THRM’s actions can only help so much. Even in the case of Lü, hailed by media as a success story, there’s no guarantee of a good outcome. “Whether (they) will attempt suicide in the future is another issue,” Huang says. “We are not responsible for treatment and cannot solve all problems.”

    Thirty-year-old Ye Chaoqun felt the same way. After reading domestic media reports of Lü’s rescue, Ye applied to volunteer in March. “I felt huge pressure. I cared about these people in the tree hollows and could not sleep well. In this period, I sometimes only slept three or four hours a night,” Ye tells Sixth Tone. “I wanted to help these people, but I know there are many practical issues I cannot solve.”

    The World Health Organization has set a goal for all member countries to build comprehensive, community-based social care services for mental health by 2020. Huang hopes to go further and build a rescue ecosystem in which people with depression or suicidal thoughts can obtain the services they need.

    “We discovered that, as it keeps rolling, the snowball gets bigger and bigger,” Huang says. “We have a lot of things we need to do.”

    In China, the Beijing Suicide Research and Prevention Center can be reached for free at 800-810-1117 or 010-82951332. In the United States, the National Suicide Prevention Lifeline can be reached for free at 1-800-273-8255. A fuller list of prevention services by country can be found here.

    Editor: Kevin Schoenmakers.

    (Header image: Ding Yining/Sixth Tone)