Researchers hack a self-driving car by putting stickers on street signs

7 August 2017, Autoblog
Researchers at the University of Washington, the University of Michigan, Stony Brook University, and UC Berkeley have figured out how to hack self-driving cars by putting stickers on street signs.

Starting by analyzing the algorithm the vision system uses to classify images, they developed a number of attacks that manipulate physical signs in order to trick machine learning models into misreading them. For instance, they printed stickers that caused the kind of vision system an autonomous car would use to read a stop sign as a 45-mph speed limit sign, with obviously dangerous consequences in the real world.
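
The paper's physical attack is considerably more involved, but the core idea of an adversarial perturbation can be sketched in a few lines. The following is a minimal illustration, not the authors' method: a standard fast-gradient-sign (FGSM) step against a hypothetical PyTorch sign classifier, where `model`, the tensor shape, and `epsilon` are all assumptions made for the example.

```python
# Minimal FGSM sketch (not the paper's algorithm): given white-box access
# to a differentiable classifier, nudge each pixel in the direction that
# most increases the loss for the sign's true label.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, true_label, epsilon=0.03):
    """Return an adversarially perturbed copy of `image`.

    `model` is any differentiable classifier returning logits;
    `image` is a (1, C, H, W) float tensor in [0, 1];
    `epsilon` bounds the per-pixel change, keeping the perturbation
    subtle to a human observer.
    """
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), torch.tensor([true_label]))
    loss.backward()
    # Step each pixel in the direction of the loss gradient's sign.
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()
```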

In the paper, "Robust Physical-World Attacks on Machine Learning Models," the authors demonstrated four different ways to disrupt how machines read and classify these signs using only a color printer and a camera. The most troubling part of these experiments is that the attacks all look innocuous to the human eye, camouflaged as graffiti or art, or incorporated into the sign's imagery.

The first method involves printing a full-size poster to cover the sign; to a human it looks normal, if a little faded in places. From a range of angles and distances, this caused the machine vision system to classify a stop sign as a speed limit sign 100 percent of the time in tests. Stickers placed on a stop sign to look like graffiti spelling the words "love" and "hate" caused the stop sign to read as a speed limit sign two-thirds of the time (and once as a yield sign). An "abstract art" attack, just a few small, strategically placed stickers, had the same effect as the poster cover-up. On a right turn sign, the researchers masked the arrow with gray stickers and got it to read as a stop sign two-thirds of the time, and as an added-lane sign the rest of the time.
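
The "abstract art" variant suggests how such an attack can be framed as an optimization problem: restrict changes to a sticker-shaped mask and optimize only those pixels toward a target class. The sketch below is a simplified, hypothetical version of that idea in PyTorch; the paper's actual attack also had to hold up across many viewing distances and angles and survive printing, which this sketch omits.

```python
# Hedged sketch of a masked "sticker" attack: only pixels inside a fixed
# sticker mask may change, and they are optimized to push the classifier
# toward a chosen target class. Model, mask, and labels are illustrative.
import torch
import torch.nn.functional as F

def sticker_attack(model, image, mask, target_label, steps=200, lr=0.1):
    """Optimize sticker pixels so `model` reads `image` as `target_label`.

    `mask` is a (1, 1, H, W) tensor of 0/1 marking where stickers may go;
    pixels outside the mask are left untouched.
    """
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    target = torch.tensor([target_label])
    for _ in range(steps):
        adv = (image + delta * mask).clamp(0.0, 1.0)
        # Minimizing this loss drives the prediction toward the target.
        loss = F.cross_entropy(model(adv), target)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return (image + delta.detach() * mask).clamp(0.0, 1.0)
```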

To make the attack work, hackers would have to know the algorithm the car's vision system uses to classify the road signs it sees (or be able to approximate the model based on feedback from the system). After figuring out how to confuse the system, though, they only need a photo of the target sign, a color printer, some sticky paper, and the willingness to live with the consequences of causing someone to blow through a stop sign at 45 miles per hour.
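
Approximating the model from feedback is itself a known technique, usually called training a substitute or surrogate model. As a rough illustration only, assuming an attacker can observe the target system's classifications, a substitute could be fit like this (the `target_query` interface and all names are hypothetical):

```python
# Illustrative black-box approximation sketch: query the target system
# for labels on probe images, fit a local substitute model to them, then
# craft attacks against the substitute and rely on transferability.
import torch
import torch.nn.functional as F

def train_substitute(substitute, target_query, images, epochs=10, lr=1e-3):
    """Fit `substitute` to mimic the black-box `target_query` function.

    `target_query(batch)` returns the target system's predicted labels;
    `images` is a (N, C, H, W) tensor of probe photos of road signs.
    """
    labels = target_query(images)  # one query pass to harvest labels
    opt = torch.optim.Adam(substitute.parameters(), lr=lr)
    for _ in range(epochs):
        loss = F.cross_entropy(substitute(images), labels)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return substitute
```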

Tarek El-Gaaly, senior research scientist at autonomous driving startup Voyage, tells Car and Driver that there are solutions to these sorts of attacks that can be incorporated into autonomous driving systems. Context is one fix: a car could tell that it misidentified a sign based on, say, its location, and know that it shouldn't be going highway speeds in an urban area. "In addition," he said, "many self-driving vehicles today are equipped with multiple sensors, so failsafes can be built in using multiple cameras and lidar sensors."
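
As a rough illustration of the context check El-Gaaly describes, a system might sanity-check a classified sign against map data before acting on it. Everything below (the road classes, speed thresholds, and function names) is an assumption made for the sketch, not any real vehicle's logic.

```python
# Hedged sketch of a context check: before acting on a classified sign,
# compare it against what a map says is plausible at the car's current
# location. Road classes and limits here are illustrative assumptions.
ROAD_CLASS_MAX_SPEED_MPH = {"residential": 30, "urban": 45, "highway": 70}

def plausible_sign(sign_type: str, sign_value: int, road_class: str) -> bool:
    """Reject classifications that contradict the mapped road context."""
    if sign_type == "speed_limit":
        return sign_value <= ROAD_CLASS_MAX_SPEED_MPH.get(road_class, 45)
    return True

# Example: a "45 mph" reading on a mapped residential street is flagged
# as implausible, so the planner can fall back to other sensors.
assert not plausible_sign("speed_limit", 45, "residential")
```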
