Researchers Demonstrate Subliminal Smart Device Commands That Have Potential for Malicious Attacks

Researchers in the United States and China have been performing tests in an effort to demonstrate that "hidden" commands, or those undetectable to human ears, can reach AI assistants like Siri and force them to perform actions their owners never intended. The research was highlighted today in a piece by The New York Times, which suggests that, in the wrong hands, these subliminal commands could be used to dial phone numbers, open websites, and perform other potentially malicious actions.

A group of students from the University of California, Berkeley and Georgetown University published a research paper this month stating that they could embed commands into music recordings or spoken text. When the audio is played near an Amazon Echo or Apple iPhone, a person hears only the song or the spoken text, while Siri and Alexa "might hear an instruction to add something to your shopping list," or, more dangerously, commands to unlock doors, wire money from a bank account, or purchase items online.


The method by which the students were able to accomplish the hidden commands shouldn't be a concern for the public at large, but one of the paper's authors, Nicholas Carlini, believes malicious parties could already be making inroads with similar technology.
“We wanted to see if we could make it even more stealthy,” said Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the paper’s authors.

Mr. Carlini added that while there was no evidence that these techniques have left the lab, it may only be a matter of time before someone starts exploiting them. “My assumption is that the malicious people already employ people to do what I do,” he said.
Last year, researchers based at Princeton University and Zhejiang University in China performed similar tests, demonstrating that AI assistants could be activated through frequencies not heard by humans. In a technique dubbed "DolphinAttack," the researchers built a transmitter to send a hidden command that dialed a specific phone number, while other tests took pictures and sent text messages. DolphinAttack is said to be limited in terms of range, however, since the transmitter "must be close to the receiving device."

DolphinAttack could inject covert voice commands at 7 state-of-the-art speech recognition systems (e.g., Siri, Alexa) to activate always-on system and achieve various attacks, which include activating Siri to initiate a FaceTime call on iPhone, activating Google Now to switch the phone to the airplane mode, and even manipulating the navigation system in an Audi automobile.
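
For those curious about the mechanics, the core trick reported for DolphinAttack is amplitude-modulating an ordinary voice command onto an ultrasonic carrier; nonlinearities in a device's microphone hardware can then demodulate it back into the audible band even though a listener hears nothing. The sketch below illustrates only that modulation step, assuming Python with NumPy/SciPy, a hypothetical recorded command file, and an arbitrary 25 kHz carrier; it is not the researchers' actual tooling.

import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000   # assumed carrier, above the ~20 kHz limit of human hearing
OUT_RATE = 96_000     # output sample rate high enough to represent the carrier

# "hey_siri_command.wav" is a hypothetical recording of the spoken command.
rate, command = wavfile.read("hey_siri_command.wav")
if command.ndim > 1:
    command = command[:, 0]            # keep one channel for simplicity
command = command.astype(np.float64)
command /= np.max(np.abs(command))     # normalize to [-1, 1]

# Crude nearest-neighbor resampling of the voice signal up to OUT_RATE.
idx = np.arange(0, len(command), rate / OUT_RATE)
baseband = command[np.minimum(np.round(idx).astype(int), len(command) - 1)]

# Classic amplitude modulation: the voice rides on the ultrasonic carrier.
t = np.arange(len(baseband)) / OUT_RATE
modulated = (1.0 + 0.5 * baseband) * np.cos(2 * np.pi * CARRIER_HZ * t)
modulated /= np.max(np.abs(modulated))  # keep within int16 range

wavfile.write("ultrasonic_command.wav", OUT_RATE, (modulated * 32767).astype(np.int16))

Whether a particular microphone and speech stack actually demodulate and accept such a signal depends on the hardware, which is part of the range and device limitation the researchers describe.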
In yet another study, a group at the University of Illinois at Urbana-Champaign showed that this range limitation could be overcome, demonstrating commands received from 25 feet away. As for the most recent group of researchers from Berkeley, Carlini told The New York Times that he was "confident" his team would soon be able to deliver successful commands "against any smart device system on the market." He said the group wants to prove to companies that this flaw is a potential problem, "and then hope that other people will say, 'O.K. this is possible, now let's try and fix it.'"

For security purposes, Apple is stringent with certain HomeKit-related Siri commands, locking them behind device passcodes whenever users have passcodes enabled. For example, if you want to unlock your front door with a connected smart lock, you can ask Siri to do so, but you'll have to enter your passcode on an iPhone or iPad after issuing the command. The HomePod, on the other hand, purposely omits support for these unlock commands altogether.

Tag: Siri


Top Rated Comments


23 weeks ago
This is really clever. I wouldn’t have thought that the AIs would respond to non-vocal frequencies as they’re intended to listen to humans only. I would think that checking the frequency range of the command would be enough to counteract this problem fairly simply.
Rating: 18 Votes
23 weeks ago
In fairness, Apple has made great strides in tackling this issue already through Siri's continued uselessness.
Rating: 12 Votes
23 weeks ago
That is NOT "subliminal".

I think you're looking for another word.
Rating: 12 Votes
23 weeks ago

That is NOT "subliminal".

I think you're looking for another word.


My thought is that the word should be "inaudible" and NOT "subliminal". As in, "The devices can react to inaudible commands."
Rating: 7 Votes
23 weeks ago
HomePod directs me to use my phone to unlock my front door or open my garage doors. This potential issue seems to be somewhat under control with iOS.
Rating: 7 Votes
23 weeks ago

This is really clever. I wouldn’t have thought that the AIs would respond to non-vocal frequencies as they’re intended to listen to humans only. I would think that checking the frequency range of the command would be enough to counteract this problem fairly simply.


Agreed. But why wouldn't Apple have foreseen this and limited the frequency range in the first place? There's literally no need for phone mics to detect anything below/above human voice frequencies.
Rating: 6 Votes
23 weeks ago
Finally a way to get Siri to properly recognize voice commands - embedding them in pop music.
Rating: 5 Votes
23 weeks ago
No problem, Siri is so terrible that I already have Siri disabled on all of my devices.
Rating: 3 Votes
23 weeks ago
Reminds me of the time (long ago, pre-Siri) my Mac's display was waking randomly while I was in the same room watching TV. I never knew what was causing it until, one day, the headphones happened to be disconnected from it.

TV: "....we don't have time for that..."
Mac: <wakes display> "It's 7:15"

Feels funny hearing your appliances having a conversation and not being involved in it!
Rating: 3 Votes
23 weeks ago
What they don't point out is that they must have "trained" Siri on the victim phone with the voice of the attacking device, since Siri will not respond to anyone saying "Hey Siri" except the owner. That is for the iPhone, of course; HomePod is a different story.

They really need to show more info. I can only assume they picked the iPhone for more of a scare factor but left that part out. Alexa and many Android phones, on the other hand, still respond to anybody.
Rating: 2 Votes
