Word from Apple, ahead of the big rollout of the iPhone 8 and iOS 11 on September 12, is that its voice assistant Siri is going to sound more like a person and less like a robot.
Great for the user experience. But based on a report published just last week by a team of researchers at Zhejiang University in China, perhaps Apple should
have spent more of its time on what Siri hears instead of what users hear.
Because they demonstrated that Siri – along with every other voice assistant (VA) they tested – will respond to commands that don’t come from a human – commands that are not only outside the human vocal range, but also inaudible to humans.
Which means your dog could probably hear it. But it also means an attacker could give your VA a command, and you won’t know about it.
In the report, titled “DolphinAttack: Inaudible Voice Commands”, the researchers said they were able to validate the attack on Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana and Amazon’s Alexa. Using ultrasonic voice commands at frequencies of more than 20kHz, they got the VAs to:
- Visit a malicious website, “which can launch a drive-by-download attack or exploit a device with 0-day vulnerabilities”.
- Spy on the user by initiating outgoing video/phone calls, thereby getting access to the image/sound of the device’s surroundings.
- Inject fake information, by instructing the device, “to send fake text messages and emails, to publish fake online posts, to add fake events to a calendar, etc”.
- Impose a denial of service, through a command to turn on airplane mode, disconnecting all wireless communications.
- Conceal attacks by dimming the screen and lowering the volume of the device.
“Tested attacks include launching FaceTime on iPhones, playing music on an Amazon Echo and manipulating the navigation system in an Audi automobile,” the team wrote – which means an attacker could change the destination on your GPS.
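The trick that makes these commands inaudible is amplitude modulation: the voice command is shifted onto an ultrasonic carrier above 20kHz, and the microphone hardware itself demodulates it back into the audible band. The sketch below (a simplified illustration, not the researchers’ actual tooling – it uses a 1kHz tone as a stand-in for real speech) shows how modulation pushes all of a signal’s energy above human hearing:

```python
import numpy as np

fs = 96_000                       # sample rate high enough to represent ultrasound
t = np.arange(0, 0.1, 1 / fs)    # 100 ms of signal

# Stand-in for a voice command: a 1 kHz tone (real speech is a band-limited mix)
baseband = np.sin(2 * np.pi * 1_000 * t)

# Amplitude-modulate the "command" onto a 25 kHz ultrasonic carrier
fc = 25_000
modulated = (1 + 0.8 * baseband) * np.cos(2 * np.pi * fc * t)

# All the energy now sits at the carrier and its sidebands (24-26 kHz),
# which is above the ~20 kHz ceiling of human hearing
spectrum = np.abs(np.fft.rfft(modulated))
freqs = np.fft.rfftfreq(len(modulated), 1 / fs)
dominant = freqs[np.argmax(spectrum)]
print(round(dominant))  # → 25000
```

A speaker plays `modulated`; you hear nothing, but nonlinearities in the target device’s microphone act as a demodulator, recovering the baseband command for the voice assistant to act on.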