Wednesday, September 13, 2017

Hey Alexa, can you hear this dog whistle?

It seems that the major voice command products - Siri, Alexa, Google Now, and others - use hardware where the microphone can detect sound frequencies that your ears cannot.  Humans generally cannot hear sounds above 20 kHz, but microphones can.

As a result, someone who can play a recorded message in those frequencies can essentially send commands to your voice command system without you being any the wiser, even if you're in the same room:
Speech recognition (SR) systems such as Siri or Google Now have become an increasingly popular human-computer interaction method, and have turned various systems into voice controllable systems (VCS). Prior work on attacking VCS shows that the hidden voice commands that are incomprehensible to people can control the systems. Hidden voice commands, though ‘hidden’, are nonetheless audible. In this work, we design a completely inaudible attack, DolphinAttack, that modulates voice commands on ultrasonic carriers (e.g., f > 20 kHz) to achieve inaudibility. By leveraging the nonlinearity of the microphone circuits, the modulated low-frequency audio commands can be successfully demodulated, recovered, and more importantly interpreted by the speech recognition systems. We validate DolphinAttack on popular speech recognition systems, including Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana and Alexa. By injecting a sequence of inaudible voice commands, we show a few proof-of-concept attacks, which include activating Siri to initiate a FaceTime call on iPhone, activating Google Now to switch the phone to the airplane mode, and even manipulating the navigation system in an Audi automobile.
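The trick the abstract describes - amplitude-modulating a voice command onto an ultrasonic carrier and letting the microphone's own nonlinearity demodulate it - can be sketched in a few lines of numpy. This is not the paper's actual attack chain; the 25 kHz carrier, the square-law nonlinearity coefficient, and the 1 kHz tone standing in for a voice command are all illustrative assumptions:

```python
import numpy as np

fs = 192_000       # sample rate high enough to represent an ultrasonic carrier
fc = 25_000        # assumed carrier frequency, above the ~20 kHz hearing limit
f_voice = 1_000    # stand-in for the "voice command": a plain 1 kHz tone
t = np.arange(0, 0.1, 1 / fs)

baseband = np.sin(2 * np.pi * f_voice * t)      # the hidden command
carrier = np.cos(2 * np.pi * fc * t)
transmitted = (1 + 0.5 * baseband) * carrier    # AM onto the ultrasonic carrier

# Model the mic amplifier's nonlinearity with a small square-law term.
# Squaring an AM signal regenerates a component at the baseband frequency.
received = transmitted + 0.1 * transmitted ** 2

# Crude low-pass filter (moving average) standing in for the mic's own
# filtering, which strips the ultrasonic components and leaves the command.
kernel = np.ones(96) / 96
recovered = np.convolve(received, kernel, mode="same")

# Look for energy in the audible band: the 1 kHz "command" reappears
# even though everything transmitted was above 20 kHz.
spectrum = np.abs(np.fft.rfft(recovered))
freqs = np.fft.rfftfreq(len(recovered), d=1 / fs)
band = (freqs > 100) & (freqs < 5000)
peak_hz = freqs[band][np.argmax(spectrum[band])]
print(round(peak_hz))   # peak lands at the 1 kHz baseband tone
```

Drop the nonlinear term (set the 0.1 to 0) and the audible-band peak disappears - which is exactly why your ears, being annoyingly linear, hear nothing.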
Sigh.  If architects designed buildings the way engineers design software, the first woodpecker would destroy civilization.

What's that, Lassie?  Something told Siri that Timmy fell down the well?  That's funny - I didn't hear anything!


I love the nav attack on the car.  Given that voice commands are becoming the Interface Of The Future, given that there's lousy protection on automotive CAN networks, given the rapid movement towards self-driving cars, and given the continued spread of malware on mobile phones, we're looking at the possibility of a Perfect Storm of self-driving mischief.  And survivors would swear under oath that nobody ordered the car to make an emergency stop.  After all, they wouldn't have heard a thing even though the command was right there in the cabin.

4 comments:

  1. Now add in a hack to monitor the audio system regardless of permissions and even if you've got "Hey, Siri" turned off, and think of the possibilities for mischief.

  2. Heh... when I'm not using Alexa, it's unplugged from the power. Period.

  3. There are two chances that manufacturers will institute any defenses: slim and none. I work for a company that is heavily involved in driver-less car technologies. The security is again laughable.

    Replies
    1. So many techies are inspired by sci-fi -- the Sony Walkman was inspired by Fahrenheit 451, of all things! Many say they were inspired by Star Trek; did these self-driving car developers not see The Wrath of Khan and the end of the battle in the electrical storm?

