Soon Siri won’t default to a female voice anymore — here’s why that’s a big deal



Hey Siri — why did you and your fellow AI assistants have to be female for so long? 

The virtual assistants built into smartphones and smart home devices that millions of people worldwide now depend upon have long defaulted to female-sounding voices. Close your eyes and listen to the sound of Amazon’s Alexa, Alphabet’s Google Assistant and Microsoft’s Cortana in your head; if you’re American, it’s likely that you’re hearing the feminine voice that was set as the default when you first activated the device.

But some tech companies are breaking away from reinforcing the dated stereotype of a subservient woman catering to a person’s every need. Most recently, Apple announced on Wednesday that Siri will stop defaulting to a female-sounding voice in the latest beta version of iOS, according to a TechCrunch report. Instead, anyone setting up Siri on their iOS device will choose from a range of voices that vary by tone and regional accent from the start.

Apple is also adding two new Siri voices for English speakers. Currently, Apple lets users of U.S. devices pick between a male and a female voice, along with six accents, including American, British and Indian. But it defaults to a female voice in the U.S. (whereas it defaults to a male voice in some countries, such as the U.K.), and you have to go into your settings to change it.

Representatives from Apple were not immediately available for comment. But in a statement shared with TechCrunch, the company said, “We’re excited to introduce two new Siri voices for English speakers and the option for Siri users to select the voice they want when they set up their device. This is a continuation of Apple’s longstanding commitment to diversity and inclusion, and products and services that are designed to better reflect the diversity of the world we live in.”

It’s a small but important step toward gender equality. A 2019 United Nations report warned that female-voiced AI assistants perpetuate the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command.”

These female AI assistants even get sexually harassed, as Wired and Quartz have reported — and the U.N. noted that these assistants have often been programmed to give passive or polite responses to sexually suggestive or abusive remarks. For example, before the #MeToo movement, if you called Siri a “bitch,” she would respond with “I’d blush if I could,” or, “There’s no need for that.” Quartz noted that Siri has since been updated to say “Language!” in response.

“What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted ‘boys will be boys’ attitude,” the U.N. wrote. 

What’s more, a 2017 industry report by MindShareWorld found that more than a quarter of users actually fantasize about having sex with their voice assistant.

“Our technologies reflect our culture,” Dr. Miriam E. Sweeney, an assistant professor at the University of Alabama who specializes in digital media, previously told MarketWatch. “And the fact that we end up with female voices, or females portrayed in these various types of service roles, actually does reinforce the feminization of a certain labor force of servitude [like being a personal assistant or working a call center] that is often seen as less skilled, less valuable and that can be paid less.”

The U.N. has recommended that companies and governments stop making digital assistants female by default. And it suggested making these voices “neither male nor female,” as well as programming the assistants to discourage abusive or sexist language.  

Some early changes include Alexa now also shutting down sexual harassment or explicit questions with “I’m not going to respond to that,” or “I’m not sure what outcome you expected.” And many devices have been expanding the voices that they offer, or adding more male-sounding voices — including celebrities like Samuel L. Jackson speaking for Alexa and John Legend lending his voice to the Google Assistant in the past. Amazon’s recent Super Bowl commercial also flipped the sexualization of AI assistants on its head by casting Michael B. Jordan as Alexa’s “body.”


