United Nations Study Reveals The Problem With Female Voice Assistants
Most voice assistants are gendered. Amazon’s Alexa, Apple’s Siri, and Microsoft’s Cortana all default to female voices. Now, a new report by the United Nations Educational, Scientific and Cultural Organization (UNESCO) highlights why that’s such a problem.

The report — “I’d Blush If I Could: Closing Gender Divides in Digital Skills Through Education” — examined gender bias coded into technology, describing the problem as “pervasive in the technology sector and apparent in digital skills education.”

Critiques of female virtual assistants have often noted how using a female voice perpetuates the idea that women are meant to be docile servants — or even digital slaves. After all, voice assistants absorb all manner of abuse without ever being able to snap back.

It may seem trivial, but researchers homed in on one disturbing aspect: voice assistants’ responses to sexual harassment. For example, when Siri is told, “You’re a slut,” one of its responses is “I’d blush if I could” — the line that gave UNESCO’s report its title.

For anyone who has experienced sexual harassment, it’s clear that “I’d blush if I could” isn’t an appropriate response. And even when voice assistants’ replies don’t play along with harassment, they still never condemn it.

“Beyond engaging and sometimes even thanking users for sexual harassment, voice assistants – ostensibly non-gendered, despite a female voice – seemed to show a greater tolerance towards sexual advances from men than from women,” the report added.

Specifically, researchers cited a 2017 Quartz article by Leah Fessler that tested Siri and Alexa to see how they’d respond to sexual harassment. Fessler found that Siri responded more receptively to sexual requests from men than to the same requests from women.

“What emerges is an illusion that Siri – an unfeeling, unknowing, and non-human string of computer code – is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment,” the researchers wrote. “It projects a digitally encrypted ‘boys will be boys’ attitude.”

Although many voice assistants will say they don’t have a gender, the researchers pointed out that they have still been feminized. So, even if Siri or Cortana never claims to be a woman, feminization can be seen “in name, in voice, in patterns of speech and in personality.”

The report calls on companies like Google, Amazon, Apple, and Microsoft to stop making their voice assistants female by default. Genderless voice assistants are possible: a Denmark-based team demonstrated as much with Q, a gender-neutral voice.

Gender inequity in tech runs deep, from the underrepresentation of women in the sector to skills gaps: according to the report, women are 25 percent less likely than men to have basic digital skills. But the impact of virtual assistants reaches far beyond tech alone.

Kids are interacting with and learning from these systems, and it’s a problem if children grow up learning that women-coded assistants are meant to endure abuse and harassment. It’s up to tech companies to consider the long-term impact their devices can have — and to do better.