    VOICES & OPINION

    How Women Are Being Coded Into the Background

    From Siri to Xiaoice, women are everywhere in the virtual world. That’s not necessarily a good thing.
    May 19, 2022

    Whether due to structural sexism or hostile work environments, the tech industry remains a male-dominated space. Take a look at the futures tech companies are building for us, however, and you’ll find women everywhere. Boot up your virtual assistant, and the voice that greets you will almost certainly be feminine. Use Microsoft’s Bing search engine, and you may get a pop-up message from “Xiaoice,” a blushing artificial intelligence chatbot developed by the company, asking if you’ve missed her. And when the prestigious Tsinghua University announced it had enrolled its first-ever “virtual student,” Hua Zhibing, the AI came with a headshot of a sweet-faced freshman.

    The latest example of this trend in China is Wen Yaoyao. Created by Baidu AI Cloud in partnership with Art Exhibitions China and officially unveiled Wednesday, Wen is a virtual museum tour guide. Depicted as a young woman in tight-fitting traditional clothing, with a high-pitched voice, she promises to take visitors through the collections of China’s top museums. The press release announcing her creation referred to her as a “national-style girl”; media reports hailed her as the perfect mix of “good looks plus real skills.”

    Although sometimes presented as antidotes to the troubles of the real world, our new virtual worlds reflect old stereotypes. Regardless of whether the proliferation of female-coded virtual assistants is a product of male fantasies or merely unconscious bias, the consequences are very real. As the UNESCO report “I’d Blush If I Could” notes, virtual assistants like Siri don’t just mirror gender biases; they actively propagate them, modeling tolerance of sexual harassment and female obedience.

    The irony here is that programming was once considered women’s work. From the 19th-century mathematician Ada Lovelace to the female programmers who operated the first programmable electronic computer, ENIAC, in the 1940s, women laid the theoretical and practical foundations of computational programming.

    In her book “Technologies of the Gendered Body,” Anne Balsamo uses her own mother’s experiences as a computer worker in the 1930s to explore how programming went from stereotypically women’s work to a male-dominated industry. As computers grew increasingly powerful, they required ever more complex algorithms and more detailed divisions of labor to run. Men, believing themselves to possess superior intelligence and greater faculties of reason, gradually pushed women out of the industry they had helped build. This new paradigm required new role models, and the industry lionized exacting engineer-entrepreneurs like Bill Gates and Steve Jobs, while women were relegated to the fringes.

    That division of labor carried over into China during the reform and opening-up period. Even today, women still comprise a tiny minority of programmers, and coders’ public image is that of a geeky man. To fit in, Chinese female programmers have little choice but to adapt to a work culture obsessed with flannel and male-pattern baldness; they call themselves “female losers” and “programmer-ettes” (both plays on self-deprecating terms widely used by their male counterparts) and try to avoid being written off as eye candy for their coworkers.

    They are rarely in a position to push back as they witness the encoding of gender bias into new technologies. There is a tendency to think of things like the metaverse as an act of meta-creation, the formation of something wholly new. But the development and application of technology is neither pure nor innocent, and what we think of as a meta-creation is often just another level of cloning.

    That’s not to say no one is trying to address these problems. For example, in 2019, Stanford University professor Fei-Fei Li analyzed 14 million images in the ImageNet database, which is frequently used to train deep learning algorithms. Her team found that 45.4% of the database’s images originated in the United States, and that many of the images labeled as “work scenes” centered on male figures, an oversight that led algorithms trained on the set to overlook female professionals.

    As a corrective, Li and her team deliberately added female-centric images to the data set to ensure that the computer would learn to recognize work scenes with women as well as men. Their goal was to make their data more diverse, better suited to different contexts, and more representative of the mix of races, genders, and ages in modern workplaces. While admirable, these scattered efforts must be expanded and brought into the mainstream if they are ever to fully counterbalance the biases of the rest of the industry.
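
    For readers curious what such a corrective looks like in practice, below is a minimal, hypothetical sketch of an audit-and-rebalance step over a labeled image dataset. The record schema (the "scene" and "subject_gender" fields) and the shortcut of oversampling by duplication are illustrative assumptions only; they are not Li’s team’s actual data format or method, which the reporting describes simply as adding female-centric images to the set.

    ```python
    # Hypothetical sketch: audit gender balance in a labeled image dataset,
    # then rebalance by oversampling the underrepresented group.
    # Field names and the duplication approach are illustrative assumptions.
    import random
    from collections import Counter

    def audit(records):
        """Count subject genders among images labeled as work scenes."""
        return Counter(r["subject_gender"] for r in records if r["scene"] == "work")

    def rebalance(records, target_ratio=0.5, seed=0):
        """Oversample female-centric work-scene images (by duplication, a crude
        stand-in for collecting new images) until they make up roughly
        target_ratio of all work-scene images."""
        rng = random.Random(seed)
        work = [r for r in records if r["scene"] == "work"]
        female = [r for r in work if r["subject_gender"] == "female"]
        if not female:
            return records  # nothing to duplicate; new images must be collected
        additions = []
        while (len(female) + len(additions)) / (len(work) + len(additions)) < target_ratio:
            additions.append(rng.choice(female))
        return records + additions

    if __name__ == "__main__":
        # A toy corpus in which work scenes are 80% male-centric,
        # mirroring the kind of skew the ImageNet audit uncovered.
        data = ([{"scene": "work", "subject_gender": "male"}] * 80
                + [{"scene": "work", "subject_gender": "female"}] * 20)
        print("before:", audit(data))             # Counter({'male': 80, 'female': 20})
        print("after: ", audit(rebalance(data)))  # roughly 80 male, 80 female
    ```

    In real curation work, duplication is a stopgap: a model shown the same image twice learns little new, which is why Li’s team sourced additional images rather than simply reweighting the ones they had.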

    Translator: Matt Turner; editor: Wu Haiyun.

    (Header image: A promotional image for Wen Yaoyao. From @我是文夭夭 on Weibo)