Apple’s Siri can rattle off the height of celebrities at the drop of a hat. Amazon’s Alexa can order a fresh batch of toilet paper to your door. Google’s Assistant can help tune your guitar, while Microsoft’s Cortana does impressions on request.
Just say the word, and these digital assistants will respond to your queries with synthetic pep and nary a hint of complaint. But ask the devices about their gender, and most seem to demur.
Pose the question to Siri, Cortana or Assistant, and each will insist it transcends such human constructs. Alexa, for its part, replies, “I’m female in character.”
But all of these artificially intelligent aides will answer in a woman’s voice, at least according to their default settings for Canadian users.
Critics argue this chorus of fembots reinforces stereotypes about women being servile, while members of the tech industry insist they’re simply catering to consumers.
Jodie Wallis, managing director of artificial intelligence at consulting firm Accenture in Canada, says there’s truth to both sides of the debate: sexist attitudes may be at play, but developers shouldn’t be held responsible for society’s views.
“It absolutely reinforces stereotypes, but it’s reinforcing those stereotypes based on research done on what we respond well to,” said Wallis.
A Microsoft spokeswoman said the company thought “long and hard” about gender and did extensive research about voice in crafting Cortana’s “personality.”
Microsoft concluded there were “benefits and trade-offs to either gender-oriented position,” but found there is a “certain warmth” to the female voice that’s associated with helpfulness — a quality the company wanted in its product, according to the spokeswoman.
Representatives for Amazon and Google did not respond to email inquiries about how gender factored into product development, while an Apple spokeswoman declined to comment.
A 2011 study by researchers at Indiana University found that both men and women preferred female computerized voices, rating them as “warmer” than male machine-generated speech.
But the late Clifford Nass, co-author of the 2005 book “Wired for Speech,” suggested these gender-based preferences can change depending on a machine’s function. Male voice interfaces are more likely to command authority, he wrote, while their female counterparts tend to be perceived as more sensitive.
While Alexa and Cortana offer only female voices, male options were added to Assistant and Siri after their initial rollouts. Siri even defaults to a male voice for some languages and dialects, including Arabic, British English and French.
Ramona Pringle, a Ryerson University professor who studies the relationship between humans and technology, acknowledged these developments seem promising, but said that if companies are going to pass the buck to their customers, users should at least be able to select a voice during setup.
“The tech industry often does put the onus back on users, whereas it’s not up to an individual user to bring about this kind of change,” said Pringle. “The way that we perpetuate any stereotype is by saying, ‘That’s what people expect.’”
Pringle said there’s a “clear power dynamic” between users and digital assistants, with the female-voiced devices being placed in a subservient role, perpetuating stereotypes that women should be “docile and doing our bidding at our beck and call.”
This form of digital sexism may seem innocuous, but it could become more insidious as AI develops, Pringle warned.
If we’re conditioned to hurl commands, or even verbal abuse, at female-coded devices, then as robots become more human-like, the risk is that women will be dehumanized, she said.
“It’s almost a step backwards, because it changes the way we engage,” said Pringle.
“It’s very concerning, because there’s no reason … for (these devices) being gendered the way that they are.”
Adina Bresge, The Canadian Press