Google Assistant, Siri, and Alexa are better known for their female voices than their male ones. According to a UN report, this is a problem: hearing an assistant systematically respond with a female voice reinforces the sexist stereotype that women must obey orders and do what they are told without flinching.
Do Google, Amazon, and Apple help spread sexist clichés through their respective voice assistants? That is, in any case, the conclusion of a report by UNESCO (the United Nations Educational, Scientific and Cultural Organization), one of the branches of the United Nations. The authors argue that the fact that Google Assistant, Alexa, and Siri generally sport a female voice "reflects and reinforces" the idea that the role of assistant should fall to women.
Google Assistant, Siri, Alexa: default voices are too often female and reinforce clichés, according to the UN
The report is titled "I'd blush if I could," a response Siri used to give when told "You're a bitch." "Because most voice assistants use a female voice, it sends a signal that women are obliging, docile and eager-to-please helpers, available at any time at the touch of a button or with a blunt voice command like 'hey' or 'OK.' The assistant holds no power beyond what is asked of it. It honors commands and responds to queries regardless of their tone or degree of hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of abuse," the document argues.
Read also: Connected speakers: Google Assistant, Alexa and Siri understand men better than women
The effects on children particularly concern researchers. Calvin Lai, a Harvard University researcher who specializes in unconscious bias, explains that "the gender associations people adopt depend on the number of times people are exposed to them. As digital assistants proliferate, the frequency and volume of associations between 'woman' and 'assistant' increase dramatically." Professor Noble adds that "commands barked at voice assistants, such as 'find x', 'call x', 'change x' or 'order x', function as powerful socialization tools and teach people, and in particular children, that the role of women and girls is to respond to their demands. Constantly representing digital assistants as female gradually codifies a link between a woman's voice and subservience."
UNESCO recommends that the digital giants take the issue seriously and come up with more suitable solutions. Note that an assistant's default voice can vary by language: Siri usually defaults to a female voice, but now uses a male voice in British English, Dutch, Arabic, and Spanish. Alexa, for its part, currently offers only a female voice.