For groups that have lobbied for stronger safety rules, that’s precisely what’s wrong with U.S. regulators’ increasingly anything-goes approach.
“Allowing automakers to do their own testing, with no specific guidelines, means consumers are going to be the guinea pigs in this experiment,” said Jackie Gillan, president of Advocates for Highway and Auto Safety and a longtime Washington consumer lobbyist who has helped shape numerous auto-technology mandates. “This is going to happen again and again and again.”
Tesla’s use of technology still in development, while common in its Silicon Valley home, contrasts with the cautious approach of General Motors Co. and other automakers, which have restricted their semi-autonomous cars to test tracks and professional drivers. The practice is permitted because U.S. regulators have taken a deliberately light hand to encourage innovation.
DVD Player
The May crash under investigation involved a 40-year-old Ohio man who was killed when his 2015 Model S drove under the trailer of an 18-wheeler on a highway near Williston, Florida, according to the Florida Highway Patrol. The truck driver told the Associated Press that he believes the Ohio man may have been watching a movie. Authorities recovered a portable DVD player but don’t know whether it was playing at the time of the crash.
The National Highway Traffic Safety Administration said Thursday that it is investigating the crash, which comes as the regulator says it is looking for ways to collaborate with the industry. The agency negotiated an agreement earlier this year to speed the introduction of automatic emergency braking, frustrating safety groups that said they had no input and that carmakers’ pledges to install the technology couldn’t be enforced by law.
NHTSA is expected to announce guidelines as soon as this month that will set some parameters for self-driving cars on U.S. roads. Transportation Secretary Anthony Foxx told reporters Wednesday the agency would be as exact as it could without being overly prescriptive.
“We’re crafting a Declaration of Independence, not a Constitution,” Foxx said.
In January, Foxx and NHTSA chief Mark Rosekind announced in Detroit that they’d allow automakers to demonstrate the safety of autonomous vehicles and apply for exemptions to existing safety rules. They said the government shouldn’t stand in the way of technological progress.
In the Florida crash, Tesla’s “Autopilot” semi-autonomous driving feature failed to detect the white side of the tractor-trailer against a brightly lit sky, so it didn’t hit the brakes, according to the company.
The company says the cars are safer than conventional ones. In a blog post, Tesla said the May accident was the first known fatality in more than 130 million miles of Autopilot driving. That compares with one fatality in every 94 million miles among all U.S. vehicles, according to Tesla.
Highway Deaths
In fact, highway deaths are on the rise. Preliminary data released Friday showed a 7.7 percent increase in U.S. highway fatalities in 2015, to 35,200, compared with the year before. NHTSA attributed the increase, in part, to an improving economy and falling gas prices. But it said that human error is a factor in 94 percent of car crashes.
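The statistics above reduce to simple arithmetic. The sketch below reproduces them as a rough consistency check; the per-100-million-mile framing and the derived 2014 baseline are our calculations, not figures published by Tesla or NHTSA.

```python
# Back-of-the-envelope check of the statistics cited above. Mileage and
# fatality figures are as reported; the framing below is ours.

AUTOPILOT_MILES_PER_FATALITY = 130e6  # Tesla: first fatality in 130M+ Autopilot miles
US_MILES_PER_FATALITY = 94e6          # Tesla: one fatality per 94M miles, all U.S. vehicles

def per_100m_miles(miles_per_fatality: float) -> float:
    """Convert 'one fatality per N miles' into deaths per 100 million miles."""
    return 100e6 / miles_per_fatality

print(f"Autopilot:    {per_100m_miles(AUTOPILOT_MILES_PER_FATALITY):.2f} deaths per 100M miles")
print(f"U.S. average: {per_100m_miles(US_MILES_PER_FATALITY):.2f} deaths per 100M miles")

# NHTSA: 35,200 deaths in 2015, a 7.7 percent rise, implying a 2014 baseline of roughly:
print(f"Implied 2014 total: about {35_200 / 1.077:,.0f} deaths")  # ~32,700
```

Note that the Autopilot side of the comparison rests on a single fatality, so the implied rate carries wide statistical uncertainty.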
“Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” the company said. “Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”
Using customers to help refine a technology is common in the tech industry, where software features are rolled out in “beta” phase to consumers who provide programmers with real-time feedback used to find and fix bugs. The industry has concluded that accepting some flaws is the price of getting new products into users’ hands more quickly.
Beta Testing
That kind of trial-and-error technique is far different from accepted practice in the auto industry, said Joan Claybrook, another veteran auto-safety advocate, who ran NHTSA in the 1970s. Automakers work for months, even years, to refine safety technology so it’s reliable enough for the public. If they get it wrong, car companies know they’ll be on the hook for costly safety recalls.
“They shouldn’t be doing beta-testing on the public,” Claybrook said. “The history of the auto industry is they test and test and test. This is a life-and-death issue.”
Tesla prides itself on its safety record. In August 2013, the Model S sedan was awarded a 5-star safety rating by NHTSA. The company’s website states that “Model S comes with Autopilot capabilities designed to make your highway driving not only safer, but stress free.”
Semi-Autonomous
Autopilot doesn’t make the cars fully autonomous. Designed chiefly for highway use, it takes over steering and engages adaptive cruise control to adjust speed without driver input. The car will change lanes if the driver puts on a turn signal, and automatic emergency braking engages if the system senses a crash is imminent. It uses cameras, radar and digitally controlled brakes to take over those functions from the human driver.
Autopilot alerts drivers if it loses confidence in its ability to drive, Tesla said in an e-mailed response to questions. If the system doesn’t sense that the driver has reengaged, the car gradually slows, comes to a complete stop and flashes its hazard lights.
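The escalation Tesla describes (warn the driver, then slow to a stop with hazards flashing if no one responds) can be pictured as a simple control loop. What follows is a hypothetical sketch of that logic against a simulated car; the class, method names and thresholds are all invented for illustration and are not Tesla’s implementation.

```python
# Hypothetical sketch of the fallback behavior described above: warn the
# driver, and if no response is detected, slow gradually to a stop and
# flash the hazard lights. SimulatedCar and every value here are invented
# for illustration; this is not Tesla's code.

from dataclasses import dataclass

ALERT_CYCLES = 3          # assumed number of warnings before the car intervenes
SLOWDOWN_STEP_MPH = 10.0  # assumed speed reduction per control cycle

@dataclass
class SimulatedCar:
    speed_mph: float = 65.0
    hands_on_wheel: bool = False
    hazards_on: bool = False

def fallback(car: SimulatedCar) -> None:
    """Escalate when the system loses confidence and the driver is absent."""
    for _ in range(ALERT_CYCLES):
        print("ALERT: please take the wheel")
        if car.hands_on_wheel:    # driver reengaged; return control normally
            return
    car.hazards_on = True         # no response: flash hazards...
    while car.speed_mph > 0:      # ...and slow gradually to a complete stop
        car.speed_mph = max(0.0, car.speed_mph - SLOWDOWN_STEP_MPH)
        print(f"slowing to {car.speed_mph:.0f} mph (hazards on)")

fallback(SimulatedCar())
```

A production system would gate this on far richer sensing and road context; the sketch captures only the shape of the escalation the company describes.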
BMW AG, the German luxury-car maker, announced its own self-driving car venture Friday, partnering with Intel Corp. and Mobileye NV, aiming for cars on the road by 2021. Even on the day of the announcement, company executives were cautious about the limits of technology that allows people to drive hands-free.
“First of all, the news of the accident is very sad,” BMW Chief Executive Officer Harald Krueger said at an event in Munich Friday. “Technology for highly autonomous driving isn’t ready for series production yet.”
In February, a Lexus-model Google self-driving car hit the side of a bus near the company’s Silicon Valley headquarters. The vehicle was in autonomous mode going about 2 miles per hour around sandbags in the road. Google’s software detected the bus but predicted that it would yield, which it did not, according to a company report about the incident. There were no injuries reported at the scene, the company said. “In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision,” Google said in its report.
Google says its cars have covered 1.5 million miles and are currently being tested on streets in Mountain View, California; Austin, Texas; Kirkland, Washington; and the Phoenix metropolitan area. It’s testing both the specially built, iconic “koala” cars and Lexus SUVs retrofitted with Google sensors and software.
A test driver accompanies every vehicle “for now,” according to the company’s website.
Consumer groups have been accused of being anti-technology, even though they’ve pushed for mandates for technologies such as air bags and electronic stability control, said Gillan of Advocates for Highway and Auto Safety. What they oppose, she said, is allowing technology onto the market before it has been proven to work.
“We want to minimize the road kill,” Gillan said. “You set standards for testing so everyone is abiding by the same rules. You can’t just let these companies put this technology on the road and let owners find their own way. That’s not good enough.”