That innocent-sounding bird singing out its morning song may not be so innocent after all. According to scientists at Ruhr-Universität Bochum in Germany, audio manipulated to mimic the sound of birds chirping could be used to launch an attack against voice assistants.
According to the researchers, the manipulated audio files are part of what is called an “adversarial attack,” which is designed to confuse the deep neural networks that help artificial intelligence-powered assistants like Apple’s Siri, Google’s Assistant, and Amazon’s Alexa function.
Using the sound of birds chirping, or edited versions of songs and human speech, manipulated so that the hidden command registers with the microphone on your smart speaker or smartphone but remains imperceptible to human ears, the attack slips past listeners and begins meddling with the A.I. assistant. What sounds like an ordinary bird's song could actually be one of these attacks, quietly delivering commands to your voice assistant of choice.
The researchers suggest the attacks, which use psychoacoustic hiding to mask their true intentions, could be played via an app or hidden in another type of broadcast. For instance, the sound could be hidden in a commercial that plays on TV or the radio to hit thousands of targets at once.
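To make the idea of psychoacoustic hiding a little more concrete, here is a toy Python sketch. It is not the researchers' method; it only illustrates the general principle that a perturbation (standing in for an encoded command) can be scaled, frequency bin by frequency bin, so that it stays well below a louder carrier signal (standing in for birdsong). The sample rate, the synthetic signals, and the 1-percent threshold are all illustrative assumptions.

```python
# Toy illustration of psychoacoustic hiding (NOT the researchers' method):
# the perturbation is only allowed energy where the loud "carrier" already
# has energy, so the added signal is hard for a listener to notice.
import numpy as np

fs = 16_000                      # sample rate in Hz (assumed)
t = np.arange(fs) / fs           # one second of audio

# Carrier: a warbling tone standing in for birdsong.
carrier = 0.8 * np.sin(2 * np.pi * (3000 + 1500 * np.sin(2 * np.pi * 4 * t)) * t)

# Perturbation: white noise standing in for an encoded voice command.
rng = np.random.default_rng(0)
perturbation = rng.standard_normal(fs)

# Crude frequency-domain "masking threshold": 1% of the carrier's magnitude
# per bin. Real psychoacoustic models are far more sophisticated.
C = np.fft.rfft(carrier)
P = np.fft.rfft(perturbation)
threshold = 0.01 * np.abs(C)

# Scale each perturbation bin down so it never exceeds the threshold,
# then return to the time domain and mix it into the carrier.
scale = np.minimum(1.0, threshold / (np.abs(P) + 1e-12))
hidden = np.fft.irfft(P * scale, n=fs)
mixed = carrier + hidden

print(f"carrier RMS: {np.sqrt(np.mean(carrier**2)):.4f}")
print(f"hidden  RMS: {np.sqrt(np.mean(hidden**2)):.4f}")
```

In this simplified picture, the hidden signal ends up orders of magnitude quieter than the carrier, which is why a listener hears only the birdsong while a microphone and speech-recognition model can still be steered by the addition.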
“[In] a worst-case scenario, an attacker may be able to take over the entire smart home system, including security cameras or alarm systems,” the researchers wrote, per Fast Company. They also put together a demonstration to show how such an attack could be used to deactivate a security camera.
There is a catch to this theoretical attack: the researchers have not launched it through a broadcast yet. Instead, they have fed the doctored files containing the hidden audio command directly into the assistants so the devices register the message clearly. However, the scientists are confident that the attack could be carried out by other means. “In general, it is possible to hide any transcription in any audio file with a success rate of nearly 100 percent,” they concluded in their paper.
There aren’t many ways to defend against such an attack, which, to be clear, would take considerable effort to execute and remains unlikely even if it is possible. In the meantime, use the security settings on your voice assistant of choice to limit access to sensitive information. On Alexa, for example, you can require a PIN before a purchase is completed.