Researchers say a muffled voice hidden in a YouTube video can take over your smartphone
Researchers have discovered that distorted voice commands hidden in innocuous YouTube videos can compel unprotected Android or iOS smartphones nearby to carry out malicious operations, without the owner even knowing it.
The threat is described in a paper the researchers will present next month at the USENIX Security Symposium in Austin, Texas.
Thanks to services like Google Now and Apple’s Siri, voice recognition has taken off quickly on smartphones, according to Micah Sherr, a professor at Georgetown University and one of the paper’s authors. However, voice interfaces have also made it simpler to attack devices.
Last year, two security researchers from the French security agency ANSSI controlled smartphones running Siri or Google Now by using radio waves to send them hidden voice commands. But that attack was possible only if the smartphone had its headphones plugged in.
A team of seven researchers from the University of California, Berkeley, and Georgetown University has developed a variation of this attack that uses distorted voice commands hidden in YouTube videos.
The attack works when a tainted YouTube video containing the hidden commands is playing within earshot of the target phone, whether on a nearby PC, laptop, smart TV, tablet, or another smartphone.
Once the target phone picks up the mangled audio, the sound-filtering features included with Siri or Google Now clean up the signal, recognize the commands, and execute them.
“Ok Google, Open XKCD.com,” the voice says, and a nearby smartphone opens that URL.
It is easy to imagine how a hacker might direct a smartphone to a website serving malware, or instruct it to take a photograph.
If attackers know the internal workings of the voice recognition software itself, they can craft voice commands that are even harder for humans to decipher.
The researchers have recorded a video of their attack, which shows that a human paying close attention can make out some of the mangled voice commands, but not others.
The hidden commands embedded in such videos range from simple Google searches to instructions to download and install malware, ultimately allowing an attacker to take full control of the device.
The researchers argue that several defences could be put in place, such as notifying the user whenever a voice command is accepted, or adding a verbal challenge-response step.
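A verbal challenge-response defence could work along these lines: before acting on a sensitive command, the assistant speaks a short random word that the user must repeat back. An attacker who pre-recorded audio into a video cannot predict the challenge. The sketch below is purely illustrative; the word list, function names, and matching rule are assumptions, not the paper's design.

```python
import random

# Hypothetical word pool for spoken challenges (illustrative only).
CHALLENGE_WORDS = ["apple", "river", "candle", "mountain", "silver"]

def issue_challenge(rng=random):
    """Pick a random word the assistant would speak aloud."""
    return rng.choice(CHALLENGE_WORDS)

def verify_response(challenge, transcribed_reply):
    """Accept the pending command only if the user's transcribed
    reply matches the spoken challenge word."""
    return transcribed_reply.strip().lower() == challenge

challenge = issue_challenge()
print(verify_response(challenge, " " + challenge.upper()))  # True: user repeated it
print(verify_response(challenge, "open xkcd.com"))          # False: attacker's audio
```

Because the challenge is chosen at playback time, a static command hidden in a video fails the check even if the recognizer transcribes it perfectly.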
Similarly, developers of voice recognition software could add filters that distinguish human speech from computer-generated sounds to guard against the threat, the paper states.
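One crude way such a filter might distinguish natural speech from heavily mangled audio is a spectral measure: voiced speech concentrates energy in harmonics, while obfuscated, noise-like audio spreads energy across the spectrum. The toy detector below uses spectral flatness (geometric over arithmetic mean of the power spectrum) on synthetic signals; the threshold and the test signals are assumptions for illustration, not anything from the researchers' paper.

```python
import cmath
import math
import random

def spectral_flatness(signal):
    """Geometric mean / arithmetic mean of the power spectrum.
    Near 1.0 for noise-like audio, near 0 for tonal audio."""
    n = len(signal)
    power = []
    for k in range(1, n // 2):  # naive DFT, skipping the DC bin
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        power.append(abs(s) ** 2 + 1e-12)  # epsilon avoids log(0)
    log_mean = sum(math.log(p) for p in power) / len(power)
    return math.exp(log_mean) / (sum(power) / len(power))

def looks_synthetic(signal, threshold=0.3):
    """Toy detector: flag noise-like (obfuscated) audio.
    The threshold is illustrative, not tuned on real speech."""
    return spectral_flatness(signal) > threshold

rng = random.Random(0)
tone = [math.sin(2 * math.pi * 5 * t / 256) for t in range(256)]  # harmonic, voiced-like
noise = [rng.uniform(-1, 1) for _ in range(256)]                  # mangled, noise-like

print(looks_synthetic(tone))   # False
print(looks_synthetic(noise))  # True
```

A production filter would of course use far richer features than a single flatness score, but the sketch shows the shape of the idea: measure how "speech-like" the audio is before handing it to the recognizer.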
Technical details about the attack are available in the researchers’ paper, “Hidden Voice Commands,” found on the project’s official website. Check out the embedded video below.