Hackers steer a Tesla with game controller, trick it with stickers

4 April 2019, Autoblog
They took control, fooled Autopilot, and made the wipers watch TV

Using a game controller, a TV set, and some red markings on a road surface, hackers at the Tencent Keen Security Lab targeted a Tesla Model S 75 running Autopilot hardware 2.5 and software version 2018.6.1. They managed to (1) remotely control the steering system; (2) make the windshield wipers think it is raining; and (3) trick the car into thinking its lane had shifted, causing it to steer out of the lane it was in.

Elon Musk responded on Twitter, praising the researchers' "solid work."

The researchers focused on getting into the vehicle's APE, or "Autopilot ECU."

For the steering takeover, they had to "dynamically inject malicious code into cantx service and hook the 'DasSteeringControlMessageEmitter::finalize_message()' function of the cantx service to reuse the DSCM's timestamp and counter to manipulate the DSCM with any value of steering angle."

In other words, they made the processor believe the injected steering instructions were legit; a rough sketch of why that trick is necessary follows.
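Keen's exploit ran inside the APE itself, and the point of hooking 'finalize_message()' was to inherit a live, valid timestamp and rolling counter; without those, the vehicle rejects forged frames. As a rough illustration only, here is what composing such a message could look like with the python-can library. Every CAN ID, byte layout, and the checksum scheme below is invented, since Tesla's actual DSCM wire format was never published.

```python
# Hypothetical sketch of forging a steering-control CAN frame.
# The real DSCM layout, ID, and checksum are NOT public; values here are invented.
import struct
import can

DSCM_ARBITRATION_ID = 0x488  # hypothetical CAN ID for the steering message

def build_dscm(angle_deg: float, counter: int) -> can.Message:
    """Pack a fake steering command with a rolling counter and checksum."""
    raw_angle = int(angle_deg * 10) & 0xFFFF       # invented 0.1-degree encoding
    payload = struct.pack(">HB", raw_angle, counter & 0x0F)
    checksum = sum(payload) & 0xFF                 # invented checksum scheme
    return can.Message(arbitration_id=DSCM_ARBITRATION_ID,
                       data=payload + bytes([checksum]),
                       is_extended_id=False)

# Without hooking the live emitter, the rolling counter would not match what
# the vehicle expects, which is exactly why Keen reused the real service's
# timestamp and counter instead of replaying stale frames.
bus = can.interface.Bus(channel="can0", interface="socketcan")
bus.send(build_dscm(angle_deg=5.0, counter=7))
```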

Their setup: a gamepad connected to a mobile device, which in turn talked to the compromised APE over 3G/WiFi.
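Keen did not publish their bridge code, but the idea is simple enough to sketch. Below is a minimal, hypothetical version of the first hop (gamepad to phone) using the pygame joystick API; the relay address and the one-line text protocol are made up for illustration.

```python
# Hypothetical sketch: read a gamepad axis and relay steering commands over
# TCP, roughly mirroring Keen's gamepad -> phone -> APE chain.
# The host/port and the text protocol are invented for illustration.
import socket
import pygame

PHONE_RELAY = ("192.168.1.50", 9000)   # hypothetical relay on the mobile device
MAX_ANGLE_DEG = 30.0                   # hypothetical steering range to request

pygame.init()
pygame.joystick.init()
stick = pygame.joystick.Joystick(0)
stick.init()

with socket.create_connection(PHONE_RELAY) as sock:
    clock = pygame.time.Clock()
    while True:
        pygame.event.pump()                        # refresh joystick state
        angle = stick.get_axis(0) * MAX_ANGLE_DEG  # left stick X -> degrees
        sock.sendall(f"STEER {angle:.1f}\n".encode())
        clock.tick(50)                             # ~50 Hz command rate
```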

As for the wipers, the Keen researchers took advantage of the fact that, unlike other automakers, which use optical sensors to detect raindrops on the windshield, Tesla uses a 120-degree fisheye camera that feeds images to a neural network; the network decides whether it is raining and, if so, turns on the wipers.

It turns out that neural networks can be tricked by small perturbations in the images they process. So the researchers created an "adversarial" image, displayed it on a TV screen within the fisheye camera's view, and on went the wipers. (They pointed out that "it is well known that the traditional autowipers solution without neural network does not have such a problem.")
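Keen crafted their adversarial image against Tesla's own network with a custom optimization, which they did not release. The underlying technique is well documented in the research literature, though. Here is a minimal sketch of the classic fast gradient sign method (FGSM, Goodfellow et al.), assuming a hypothetical 'rain_classifier' model and an input image labeled "no rain":

```python
# Minimal FGSM sketch (Goodfellow et al., 2015): nudge each pixel in the
# direction that most increases the classifier's loss on the true label.
# The model here is a stand-in; Keen's actual attack targeted Tesla's own
# network with a more elaborate optimization.
import torch
import torch.nn.functional as F

def fgsm_perturb(model: torch.nn.Module,
                 image: torch.Tensor,       # shape (1, C, H, W), values in [0, 1]
                 true_label: torch.Tensor,  # shape (1,), e.g. the "no rain" class
                 epsilon: float = 0.03) -> torch.Tensor:
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # Step away from the true label ("no rain") to flip the prediction.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()
```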

Finally, lane detection. Or maybe that's lack of lane detection. According to the researchers, "Tesla uses a pure computer vision solution for lane recognition." So they pasted small red stickers on the road surface, which caused the Autopilot system to steer out of its lane and into the adjacent one.
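Tesla's lane-recognition pipeline is proprietary, so any code here is necessarily a stand-in. Still, the failure mode is easy to picture with a toy model: if lane keeping fits a curve through whatever marks the vision system accepts as lane paint, a few well-placed fake marks drag the fitted lane sideways. A deliberately simplified numpy illustration, not Tesla's algorithm:

```python
# Toy illustration only, NOT Tesla's lane algorithm. If a planner fits a
# polynomial to detected lane-mark points, a handful of spurious "sticker"
# detections ahead of the car pull the whole fitted lane toward them.
import numpy as np

# Genuine lane-mark detections: a straight lane at lateral offset 0 m,
# sampled at distances 0..40 m ahead of the car.
distance = np.array([0, 10, 20, 30, 40], dtype=float)
lateral = np.zeros_like(distance)

# Three fake marks (the stickers) placed 45-55 m ahead, offset 1.5 m left.
distance = np.append(distance, [45, 50, 55])
lateral = np.append(lateral, [1.5, 1.5, 1.5])

coeffs = np.polyfit(distance, lateral, deg=2)    # fit the lane as a quadratic
predicted_at_60m = np.polyval(coeffs, 60.0)
print(f"fitted lane offset at 60 m: {predicted_at_60m:.2f} m")  # veers left
```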

Here's the genuinely disturbing part. The Tencent Keen Security Lab researchers write, "Tesla's Autopilot module's lane recognition function has a good robustness in an ordinary external environment (no strong light, rain, snow, sand and dust interference)." Which is good. Yet that same system was fooled by a few red stickers on the road.

They point out, "This kind of attack is simple to deploy, and the materials are easy to obtain."

Which seems to mean that even if you don't know what a DasSteeringControlMessageEmitter or a cantx service is, you can still trick a Tesla.
