Your friendly living room smart speaker may not be quite as innocent as it seems. At least, not if there’s a malevolent owner pulling the strings. That’s now been proven by Alexander Reben, an artist and Massachusetts Institute of Technology-trained roboticist who specializes in identifying (and exploiting) the potential pitfalls of everyday objects that are often meant to be helpful. As originally reported by The Washington Post, Reben created a system that causes a Google Home to command a separate device to fire a gun.
The idea, Reben told the Post, was to raise “fundamental questions” around responsibility when it comes to smart speakers and their built-in virtual assistants.
“How much should a company be able to foresee how their technology will be used and how much can they ultimately control?” Reben told the Post. “Even more interestingly, what happens when machines start making the decisions?”
So how does the setup work? Reben begins the sequence by saying, "OK, Google, activate gun." The command prompts a separate device, a rig built around a light switch and a solenoid salvaged from a laundromat change dispenser, which pulls the trigger. We should point out that while the DIY-er chose the Google Home as his smart speaker, the same result could have been achieved with an Echo or even a smartphone.
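For readers wondering how such a trigger chain might look in software, here is a minimal sketch. To be clear, it is not Reben's build, which by all accounts was purely electromechanical and involved no custom code; the example assumes a Raspberry Pi running the gpiozero library, a sense line wired to the output of the Assistant-controlled smart switch, and a solenoid driver on a separate pin, all of which are hypothetical stand-ins.

```python
# Illustrative only: not Reben's rig, which used no custom code. This sketch
# assumes a Raspberry Pi with gpiozero, a sense line on GPIO 17 wired (through
# a relay) to the Assistant-controlled smart switch, and a solenoid driver
# stage on GPIO 27. All pins and wiring are hypothetical.
from time import sleep
from signal import pause
from gpiozero import Button, OutputDevice

switch_sense = Button(17, pull_up=False)          # goes high when the smart switch closes
solenoid = OutputDevice(27, initial_value=False)  # drives the solenoid via a transistor

def fire():
    # Pulse the solenoid briefly, much like a change-dispenser actuator.
    solenoid.on()
    sleep(0.2)
    solenoid.off()

switch_sense.when_pressed = fire  # fire once whenever the switch turns on
pause()                           # keep the script alive, waiting for events
```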
Reben said the entire project took him about 30 minutes to complete and was built using materials he already had at home. But before you go running around destroying all the smart speakers in your home, we should note that it wasn't the Google Assistant that pulled the trigger.
“This appears to be a homebrew project that’s controlled by a smart outlet, not something that’s programmed into the Assistant or uses any type of A.I.,” Google said in a statement. “This isn’t condoned by Google and could not launch as an Action for the Assistant because it’s against our Terms of Service, which prohibits Actions that promote gratuitous violence or dangerous activities.”
Reben agreed with Google’s analysis but said the point of his project was to catalyze a broader conversation.
“Part of this project was a response to some of the news about deep learning and artificial intelligence being used for military applications,” Reben said. “This is a provocation, sure, but the simplicity of it is a good way to jump-start a conversation.
“I don’t have the answers, but I think we need to have a conversation.”