13

A few days ago I was driving home and suddenly Siri started to talk by herself in reply to a command I never gave. As you can imagine I have the "hey siri..." feature active, so I can make calls without moving my hands from the wheel.

I was wondering...

Is it possible for a radio announcer to say something like "Hey Siri, open site www.victimsite.com" generating a DDoS or even say something like "Hey siri tweet #radiocool is a great station" creating a trending topic in seconds?

AcidRod75
  • 243
  • 1
  • 5

1 Answers1

7

It is absolutely possible. There was a publicized case of this in a Xbox ad that demonstrated the voice activation features that seemed to have worked a bit too well.

Some quick-and-dirty remedies for this are:

  • Allowing users to program their own initiation phrase. This isn't perfect as the radio can still "get lucky" and hit the initiation phrase, causing an accidental triggering. Also, there's a time window between when you've initiated Siri and issued a command where the radio can talk to Siri.
  • Requiring confirmation for some/all actions. Unless the confirmation is physical, there is still an opportunity for the radio to confirm the task as well.

I guess the best solution would be for Siri to be able to distinguish your voice from the voices of others. That seems trickier though as, according to my understanding of the implementations, the initiation phrase (eg: "Hey Siri") is recognized locally by the phone while the full voice recognition utilizes cloud computing resources and the phone's local power is limited by CPU and battery.

I suspect that all that is needed is one successful attack for vendors like Siri and Android to come up with better defenses.

Neil Smithline
  • 14,842
  • 4
  • 39
  • 55