José Lopes Esteves and Chaouki Kasmi are two French researchers working for ANSSI, a French government agency devoted to information security. The duo have discovered an ingenious exploit that could allow a hacker to inject voice commands into Google Now, and similar technologies such as Siri, using electromagnetic waves. The hack is essentially a hardware attack: it uses the headphone cord as an antenna to convert electromagnetic waves into electrical signals that the operating system registers as sound coming from the microphone. As you can see from the picture at the bottom of this article, the equipment would look a little conspicuous on a table in your local Starbucks or Cheesecake Factory!
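The core signal-processing idea, modulating a spoken command onto a radio carrier so the headphone cord picks it up as if it were microphone audio, can be sketched in software. The following is a minimal, illustrative FM modulator in Python; it is a toy baseband version, not the actual radio-frequency rig the researchers built, and the 440 Hz tone, the 10 kHz carrier, and the deviation value are all invented stand-ins for a recorded voice command and real transmission parameters:

```python
import numpy as np

def fm_modulate(audio, fs, carrier_hz, deviation_hz):
    """Frequency-modulate a baseband audio signal onto a carrier.

    The instantaneous frequency is carrier_hz + deviation_hz * audio;
    the transmitted phase is the running integral of that frequency.
    """
    # Discrete integration of instantaneous frequency -> phase
    phase = 2 * np.pi * np.cumsum(carrier_hz + deviation_hz * audio) / fs
    return np.cos(phase)

# Toy "voice command": one second of a 440 Hz tone standing in for speech.
fs = 48_000
t = np.arange(fs) / fs
command = np.sin(2 * np.pi * 440 * t)

# Hypothetical carrier and deviation, chosen only so the example runs.
modulated = fm_modulate(command, fs, carrier_hz=10_000, deviation_hz=2_000)
```

In the real attack the same principle operates at radio frequencies: the demodulated signal induced in the headphone wire is what the phone interprets as a voice speaking into the headset microphone.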
What could this attack be used for? As it happens, anything that Google Now can be used for. This means that the hacker could tell your smartphone to remind you to buy milk when you get home, browse to a malware site, send spam or phishing messages to your contacts, or dial a premium-rate number. However, the radio-wave hack has a few serious limitations beyond the necessary equipment: the basic setup has a range of only around six feet. With a more powerful radio and higher-capacity batteries, which would need to be mounted in a van, the range could be increased to around sixteen feet. The exploit also requires that microphone-enabled headphones be physically plugged into the victim's smartphone. The victim would also have to be looking away from the screen and not listening closely, because when Google Now is activated it pauses whatever media is playing and beeps a confirmation. Even if a potential victim failed to notice the radio antenna on the next table, they would surely notice their device activating Google Now and could stop it from performing whatever malicious command the hacker was trying to inject.
Google has been notified of the potential exploit, and José and Chaouki have given a number of suggestions as to how Google could modify Google Now to help prevent this kind of attack. Better electromagnetic shielding on the headphone cables would help (although this could prevent them from acting as the FM radio antenna), together with allowing users to create their own custom wake words for Google Now. Android also has some built-in protection: by default it does not allow Google Now to work from the lock screen, and it can easily be trained to recognize the user's voice. Things are not so great for iPhone customers, as Siri is enabled on the lock screen by default and does not support different voice profiles. There's more bad news for iPhone customers: the researchers state that with some reverse engineering and spoofing, it would be possible to convince the device that its button had been pressed in order to activate Siri.