According To A Recent UN Study, Digital Assistants Like Siri and Alexa Reinforce Harmful Gender Stereotypes

Amazon’s Alexa and Apple’s Siri are accused of perpetuating antiquated, harmful ideas about women through their submissive responses to queries

A recent UNESCO report found that Siri, Alexa, and other voice AI assistants such as Microsoft’s Cortana and the Google Assistant, all of which use a female voice by default, perpetuate harmful gender stereotypes that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command.” The report also highlights the passive, polite responses the assistants give when users make sexually abusive remarks, responses that end up encouraging sexist and abusive language. The report notes:

“The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment. What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted ‘boys will be boys’ attitude.”

The report, authored by the United Nations Educational, Scientific and Cultural Organization (UNESCO), is titled “I’d Blush If I Could,” after the response Siri once gave to sexually explicit commands. It outlines the effects of bias in AI research and product development and the potential long-term consequences of conditioning society, particularly children, to accept these norms. Because the voice assistants speak with female voices and carry female names, tech companies are once again preconditioning users to fall back on antiquated and harmful perceptions of women.

The study makes the first official UN recommendations regarding AI personal assistants, urging companies and governments to end the practice of making digital assistants female by default, to explore the possibility of voices that are “neither male nor female,” to program assistants to discourage abusive or sexist language, and to require them to “announce the technology as non-human at the outset of interactions with human users.”