Tesla Cybertruck Driving On FSD Fails To Detect Mannequin

19 November 2024, InsideEVs
In a series of tests run by an independent YouTuber, the Tesla Cybertruck repeatedly failed to detect pedestrians and obstacles in its path.

In an independent pedestrian safety test, the Tesla Cybertruck's Full Self-Driving (FSD) system failed to detect multiple rolling and stationary objects. It failed to detect a mannequin and almost drove over it before the driver intervened. FSD also drove the Cybertruck dangerously close to an actual human when it was expected to come to a halt.

Imagine a self-driving pickup truck made of bulletproof stainless steel, weighing over 6,600 pounds, hurtling down the road as you stand in its path. You'd expect its cameras and sensors to detect you and bring it to a stop. However, at least one independent test suggests the Cybertruck may fail to identify several large and small objects on the road, including humans in some cases.

Chris, the host of the YouTube channel Dirty Tesla, tested the Cybertruck’s Full Self-Driving (FSD) system on a dirt road, and the results were alarming. With FSD engaged and the truck traveling between 20 and 30 miles per hour, Chris placed various objects in its path. The Cybertruck failed to detect most of these items, drove uncomfortably close to others and even displayed a ghost-like figure instead of a mannequin.

Tesla started Cybertruck deliveries roughly a year ago. Production has ramped up since, and the truck was the third best-selling EV in the U.S. in the third quarter, behind the Model Y and Model 3. However, Tesla only started rolling out FSD on the Cybertruck around September this year. Despite its misleading label, FSD is not a certified self-driving system. It’s a Level 2 advanced driver assistance system that requires full supervision.

Initial tests of the Cybertruck’s FSD revealed troubling inconsistencies when small objects were placed in its path. The stainless steel pickup ran over items like an exercise ball, a small Amazon-style delivery box and a white bucket without even detecting them. When larger objects were placed, the results varied. In both stationary and moving tests, the truck detected and stopped for a kid’s bike, thanks to Automatic Emergency Braking (AEB).

However, the test showed FSD's drawbacks when a human-sized mannequin was placed in its path. Instead of recognizing the figure, the Cybertruck displayed an amorphous, almost invisible shape on its screen. The driver had to intervene to navigate around it manually. Even when the mannequin's arms were raised to improve visibility, FSD failed to detect it.

When Chris himself stood directly in front of the truck, FSD reacted inconsistently. In one instance, the Cybertruck maneuvered around him at an uncomfortably close distance. In the following test run, it stopped correctly when Chris stood squarely in its path, again relying on AEB.

We don't recommend trying such experiments at home if you own a Tesla (or any other ADAS-equipped vehicle). FSD has been linked to dozens of deaths and hundreds of crashes and is under multiple federal investigations. 

This is the same tech that will underpin the Tesla Robotaxi, which CEO Elon Musk unveiled at a Hollywood-style event in Los Angeles in October. Scientists in AI and automation, including several interviewed by InsideEVs, have raised red flags about the system’s reliability.

Now, Musk is leading a new Department of Government Efficiency under the Trump administration. Dubbed DOGE, it could streamline regulations for autonomous vehicles, including Tesla's Robotaxi business. We’ll have to wait and see if FSD sees any safety improvements before Tesla gets approval for and releases its unsupervised version. But at this point, its safety remains an open and urgent question.
