Engineers developing autonomous cars certainly have their work cut out as they try to perfect the technology to make the safest vehicles possible, but it’s often the unexpected issues that pop up along the way that can leave them scratching their heads.
A few months ago, for example, it was revealed that bird poop had been causing havoc with the sensors on autonomous cars, with a direct hit obscuring their ability to “see,” making the vehicle about as safe as a human driver tootling along with their eyes closed. While Waymo has overcome the poop problem with the development of tiny water squirters and wipers that spring into action the moment the gloop hits the sensor, another issue has just reared its ugly head that clearly requires urgent attention if we’re ever to see self-driving technology rolled out in a meaningful way.
Interested in testing the all-important sensors that help a car to make sense of its surroundings and make decisions at speed, security researchers at the University of Washington recently tampered with a street sign — under lab conditions, of course — to see if it would confuse the technology.
It did.
The researchers said that by printing stickers and attaching them in particular patterns to different street signs, they were able to confuse the cameras used by “most” autonomous vehicles, Car and Driver reported.
Rather worryingly, the team managed to confuse a self-driving car into thinking a regular “stop” sign was a 45-mph speed limit sign, simply by adding a few carefully placed stickers to it.
The alterations can be small enough that people barely notice them because the camera’s software relies on an algorithm to understand the image, and that algorithm interprets it in a profoundly different way from a human. The sign used in the test still clearly shows the word “stop,” yet the graffiti-like stickers trick the car into reading it as something else entirely.
The researchers suggest that if hackers are able to get hold of the algorithm that classifies the images, they could use a photo of a road sign to generate a customized, subtly altered version designed to confuse the car’s camera.
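The trick at work here is what machine-learning researchers call an adversarial example. The team’s sticker attack is more sophisticated than this, but as a rough, hypothetical illustration of the white-box idea, the sketch below applies a one-step gradient-sign perturbation (FGSM) to a toy PyTorch classifier; the model architecture, class index, and random “image” are all placeholders rather than anything from the actual study.

```python
# Hypothetical sketch of a white-box adversarial perturbation (FGSM-style),
# assuming the attacker has access to the classifier's weights. The model,
# class index, and image below are placeholders, not the researchers' setup.
import torch
import torch.nn as nn

class TinySignClassifier(nn.Module):
    """Stand-in convolutional classifier over 32x32 sign images (illustrative only)."""
    def __init__(self, num_classes: int = 43):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, num_classes),
        )

    def forward(self, x):
        return self.net(x)

def fgsm_attack(model, image, true_label, epsilon=0.03):
    """Nudge each pixel in the direction that increases the loss on the true label."""
    image = image.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(image), true_label)
    loss.backward()
    # One signed-gradient step, clamped back to the valid pixel range.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()

if __name__ == "__main__":
    model = TinySignClassifier().eval()
    stop_sign = torch.rand(1, 3, 32, 32)   # placeholder for a real photo
    stop_label = torch.tensor([14])        # hypothetical "stop" class index
    perturbed = fgsm_attack(model, stop_sign, stop_label)
    print("Prediction before:", model(stop_sign).argmax(dim=1).item())
    print("Prediction after: ", model(perturbed).argmax(dim=1).item())
```

In the physical sticker version of the attack, the perturbation is confined to a few printable patches and has to keep fooling the classifier as distance and viewing angle change, which is what makes it workable outside the lab.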
The implications of such confusion aren’t hard to imagine. A self-driving car that speeds through a stop sign it mistook for a speed limit sign could end up in the path of an oncoming vehicle, though if that vehicle also carries self-driving tech, its systems should help prevent a catastrophic collision. In cases like that, tampering with street signs has the potential to cause huge amounts of chaos on the roads rather than anything more serious.
But what happens if the entire sign is fake, put up by pranksters? It’s something that does happen from time to time. How will a driverless car tell the difference between a fake sign and a genuine one? While the car’s mapping technology will add to its knowledge of its immediate surroundings, information on temporary signs for construction or incidents may have to be transmitted to driverless cars ahead of time to avoid problems. The technology could also take contextual information into account, prompting it to ignore, say, a fake 80-mph sign in a residential area.
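One way to picture that kind of contextual check: compare the sign the camera thinks it saw against the speed limit the map expects for the current stretch of road, and discard readings that are wildly implausible. The sketch below is purely hypothetical; the road-segment IDs, limits, and tolerance are invented for illustration.

```python
# Hypothetical sanity check: reject sign detections that clash badly with
# the speed limit the map expects for the current road segment.
# Segment names, limits, and the tolerance below are illustrative, not real data.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignDetection:
    kind: str                 # e.g. "speed_limit" or "stop"
    value_mph: Optional[int]  # numeric value for speed-limit signs
    confidence: float         # classifier confidence (not used in this simple check)

# Toy stand-in for map data keyed by road-segment ID.
EXPECTED_LIMIT_MPH = {
    "residential_17": 25,
    "highway_4": 65,
}

def accept_detection(det: SignDetection, segment_id: str,
                     max_deviation_mph: int = 15) -> bool:
    """Accept the detection only if it is plausible for this road segment."""
    expected = EXPECTED_LIMIT_MPH.get(segment_id)
    if det.kind != "speed_limit" or det.value_mph is None or expected is None:
        return True  # nothing to cross-check against; defer to other logic
    # An 80 mph reading on a 25 mph residential street is almost certainly wrong.
    return abs(det.value_mph - expected) <= max_deviation_mph

if __name__ == "__main__":
    fake_sign = SignDetection("speed_limit", 80, confidence=0.97)
    print(accept_detection(fake_sign, "residential_17"))  # False: implausible here
    print(accept_detection(fake_sign, "highway_4"))       # True: within tolerance
```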