Male-Sounding Siri Voice Characterized As A Black Person Rated 'Less Competent And Less Professional,' Survey Finds


iPhone users around the world enjoy the convenience of Siri. From looking up information for a random project to identifying who sings a song you may have overheard at brunch, the digital assistant is there to provide the answers.

With Siri being a core feature on all Apple products, its functionality matters to every user. But has anyone ever attributed human characteristics to Siri? Is it a Black person? Does the digital assistant have a gender?

Who Said That?

Digital assistants are not an Apple exclusive. Amazon's Alexa and Google Assistant both ship with standard voices that are typically perceived, in my opinion, to sound like white women. Recently, however, tech companies have begun offering voices that sound different.

In 2021, Amazon offered a male-sounding voice as an option for Alexa, and Alexa can also be set to sound like certain celebrities. Just this year, Apple added voice options for Siri that do not sound traditionally masculine or feminine.

According to Consumer Reports, citing surveys by a linguist at the University of Pennsylvania, "Apple released two voices that users were more likely to say sounded Black compared to the original voices."

Consumer Reports also found that BIPOC users were more welcoming of the voice option that aligned with their own ethnicity. Nevertheless, even with this kind of diversity and representation in sound, some listeners' biases persist.

We Hear What We Experience

Nicole Holliday, Ph.D., an assistant professor of linguistics at the University of Pennsylvania, conducted a survey asking more than 480 English speakers to rate the four available Siri voice options on four character traits: funniness, professionalism, competence, and friendliness.

From the survey, Holliday discovered that users' reactions to the voice options directly reflected the gender and racial stereotypes they held. For example, the voice users believed sounded most like a Black man was rated the funniest but was also perceived as the "less professional" and "less competent" voice.

“The voice gets the same negative stereotypes we assign to Black men,” Holliday says.

The standard Siri voice was not judged much differently from the other voices perceived as white. Holliday found that many people assign identities to voices based on their lived experience with other people.

“Whenever you hear a voice, and you don’t see a body attached to it, you imagine one,” Holliday explained. “Our ability to classify voices like this is part of our language faculty.”

While the classification of voices can be frustrating given the history of forced assimilation, Holliday suggests there is no practical way to disassociate gender or race from a digital voice. Intentional or not, people automatically assign an identity to what they hear.