
US Consumers Union calls on Tesla to improve “Autopilot” safety


Consumers Union, the advocacy division of Consumer Reports (CR), has called on Tesla to move quickly to improve the safety of its “Autopilot” driver-assist system and to publicly release the detailed data behind the company’s safety claims. Tesla recently acknowledged that the technology was engaged at the time of a fatal crash in California, and it reportedly commented directly on the safety of its system and on what it described as “moral and legal liability” in the crash. This is at least the second fatal crash to occur with the Autopilot system engaged.

CR experts have determined that Autopilot does not restrict its use to driving conditions in which the system can operate safely. They have also found that Tesla’s method of monitoring whether a driver’s hands are on the wheel fails to effectively address the safety risks of foreseeable uses of the system, unlike certain other driver-assist systems that automate steering and braking.

The National Transportation Safety Board (NTSB) and the National Highway Traffic Safety Administration (NHTSA) have launched investigations into the March 23 collision of a Tesla Model X on a California highway, in which a 38-year-old engineer and father of two lost his life. The NTSB announced the removal of Tesla as a party to the NTSB’s investigation because Tesla violated the party agreement by releasing investigative information before it was vetted and confirmed by the NTSB. Such releases of incomplete information often lead to speculation and incorrect assumptions about the probable cause of a crash, which does a disservice to the investigative process and the travelling public.

David Friedman, Director of Cars and Product Policy and Analysis for Consumers Union, said, “After another tragedy involving Autopilot, Tesla should commit to put safety first and to stop using consumers as beta testers for unproven technology. While the results of the crash investigations will be critical to understanding all that contributed to this tragedy, previous NTSB findings already showed that Autopilot should do more to protect consumers. We see no excuse: Tesla should improve the safety of Autopilot without delay.”

“Tesla markets itself as an innovator. It should not put lives at risk, damage its reputation, or risk the success of its systems, or driver assist technology as a whole, by failing to take steps that would better protect consumers’ safety. Further, the company should not make either specific or broad safety claims without providing the detailed data to back them up. They should show, not just tell, us how safe their system is,” Friedman added.

“Instead of issuing a defensive Friday evening blog post or statements blaming the victim, Tesla should fix Autopilot’s design and be transparent about their safety claims. The company should publicly provide detailed data to demonstrate conditions for which its Autopilot system can safely operate. It should limit Autopilot’s operation only to those conditions, and have a far more effective system to sense, verify, and safely react when the human driver’s level of engagement in the driving task is insufficient or when the driver fails to react to warnings. If other companies can do it, so should Tesla and this would fulfil the NTSB recommendations made more than six months ago.”

This article courtesy of Russell Thrall III, publisher of CollisionWeek. Check out their website at CollisionWeek.
