VOICE ASSISTANTS might be all the rage right now, but it turns out Alexa, Google Assistant and Siri can be manipulated using ‘inaudible’ commands.
According to a report in the New York Times, researchers at the University of California, Berkeley, have demonstrated that they can stuff ‘silent’ commands into music or spoken text that could potentially get a voice assistant to add something to a shopping list, control an IoT device… or worse.
The hidden instruction is inaudible to the human ear, so there’s no easy way of telling when Alexa might be tricked into adding an item to your Amazon shopping cart or unlocking your front door, for example.
The researchers claim they were even able to hide the command, “OK Google, browse to evil.com” in a recording of the spoken phrase “Without the data set, the article is useless.”
Speaking to the New York Times, Nicholas Carlini, a fifth-year PhD student in computer security at UC Berkeley and one of the paper’s authors, said that while such attacks haven’t yet been reported, it’s possible that “malicious people already employ people to do what I do.”
“We want to demonstrate that it’s possible, and then hope that other people will say, ‘Okay this is possible, now let’s try and fix it’,” Carlini added.
In response to the report, Amazon told the newspaper that it has taken steps to ensure its Echo smart speaker is secure, while Google said its Assistant has features to mitigate undetectable audio commands.
Apple said its HomePod smart speaker is designed to prevent commands from doing things like unlocking doors, and it noted that iPhones and iPads must be unlocked before Siri will act on commands.
This isn’t the first time that so-called silent commands have been used to fool artificial intelligence assistants. A technique developed by Chinese researchers, dubbed DolphinAttack, went one step further by muting the target phone before issuing inaudible commands, so the owner wouldn’t hear the device’s responses.
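For readers curious how an attack like DolphinAttack can be inaudible yet understood by a device: the published technique amplitude-modulates a voice command onto an ultrasonic carrier, which microphone hardware non-linearity then demodulates back into the audible band. The toy sketch below only illustrates that modulation step with a pure tone standing in for speech; all signal parameters are illustrative assumptions, and this is nothing like a working attack.

```python
import numpy as np

# Toy illustration of the DolphinAttack principle: a baseband signal
# (a 400 Hz tone standing in for a spoken command) is amplitude-
# modulated onto an ultrasonic carrier (25 kHz), above human hearing.
# Sample rate, tone, carrier, and modulation depth are all
# illustrative choices, not values from the research.

fs = 96_000                  # sample rate high enough to carry 25 kHz
t = np.arange(fs) / fs       # one second of samples
baseband = np.sin(2 * np.pi * 400 * t)       # stand-in for speech
carrier = np.sin(2 * np.pi * 25_000 * t)     # ultrasonic carrier
modulated = (1 + 0.8 * baseband) * carrier   # classic AM

# Check that the transmitted energy sits in the ultrasonic band,
# i.e. the dominant frequency is the 25 kHz carrier:
spectrum = np.abs(np.fft.rfft(modulated))
freqs = np.fft.rfftfreq(modulated.size, 1 / fs)
peak_hz = freqs[np.argmax(spectrum)]
print(round(peak_hz))  # → 25000, well above the ~20 kHz hearing limit
```

A real microphone's amplifier squares such a signal slightly, recreating the 400 Hz envelope in the audible range, which is why the device responds while a person standing nearby hears nothing.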
Source: Inquirer