UNESCO highlights the danger of female virtual assistants

People just don’t want Alexa to sound like Danny Dyer for some reason

WHILE WOMEN ARE criminally underrepresented in the world of tech, they have a 100 per cent dominance of one area: being the disembodied voice of the digital slaves that live inside Amazon Echoes, Google Homes and Apple HomePods.

That’s less of a glass ceiling to shatter and more of a glass trap door to tumble through. Virtual assistants are essentially slaves designed to do human bidding, and while nobody is seriously suggesting they are sentient, the fact that we’re happier with the role being feminine says something about us. There’s also a risk that it reinforces some pretty unpleasant behaviour.

That’s the verdict of a paper from UNESCO entitled “Hey Siri, you’re a bitch”. The eye-catching title isn’t letting off steam at a thoroughly useless AI, as we all have from time to time: it’s actually something Apple has programmed Siri to respond to. Not, as you might have hoped, by dialling the speaker’s mother so she can explain how they ended up being such an unpleasant cockweasel, but with a more coquettish “I’d blush if I could.” Oh how we laughed. Or threw up. One of the two.

Anyway, UNESCO highlights five problems with our female default for virtual assistants, which are that it:

  1. reflects, reinforces and spreads gender bias;

  2. models acceptance of sexual harassment and verbal abuse;

  3. sends messages about how women and girls should respond to requests and express themselves;

  4. makes women the ‘face’ of glitches and errors that result from the limitations of hardware and software designed predominately by men; and

  5. forces a synthetic ‘female’ voice and personality to defer questions and commands to higher (and often male) authorities.

The paper suggests five remedies to this. It implores companies to:

  1. end the practice of making digital assistants female by default;

  2. explore the feasibility of developing a neutral machine gender for voice assistants that is neither male nor female;

  3. programme digital assistants to discourage gender-based insults and abusive language;

  4. encourage interoperability so that users can change digital assistants, as desired; and

  5. require that operators of AI-powered voice assistants announce the technology as non-human at the outset of interactions with human users.

A couple of these are doable, but good luck with the overarching point. Amazon has previously explained why Alexa is female, and it comes down to human tastes. “We carried out research and found that a woman’s voice is more ‘sympathetic’ and better received,” Amazon’s Daniel Rausch told Business Insider last year. A couple of academic studies on the subject back this up.

So these suggestions could and probably should be implemented, but will they be? Well, equality is nice and all, but selling more future e-waste for short-term gain is even nicer. µ

Source : Inquirer
