Published on February 11th, 2022

Tesla recall over “rolling stop” feature in self-driving mode

In the USA, Tesla is recalling 53,822 vehicles equipped with a test version of its Full Self-Driving software that can allow the vehicle to roll through four-way stop signs, the National Highway Traffic Safety Administration (NHTSA) announced.

NHTSA said the recall covers some 2016-2022 Model S and Model X, 2017-2022 Model 3, and 2020-2022 Model Y vehicles. The agency said Tesla would disable the setting that allows rolling stops through an over-the-air software update.

Tesla makes the Full Self-Driving, or FSD, feature available by monthly subscription as part of its Autopilot or Enhanced Autopilot self-driving systems. Autopilot allows hands-off driving under certain circumstances, but still requires the driver to be paying attention and ready to take control of the vehicle.

Tesla, which has disbanded its media relations department, did not comment.

FSD allows drivers to choose between three “profiles,” labelled “Chill,” “Average” and “Assertive.” Tesla documentation says that in both Average and Assertive modes, the vehicle “may perform rolling stops.” In Assertive mode, the vehicle will also “have a smaller follow distance, perform more frequent speed lane changes, and will not exit passing lanes….”

According to NHTSA, the feature allows Tesla vehicles to roll through four-way stop signs at speeds of up to 5.6 mph, without first coming to a full stop, under certain circumstances. The idea of a semiautonomous driving system that could deliberately violate traffic laws has generated much discussion in automotive circles in recent weeks.

Regulators and safety advocates have recently focused attention on the potential for misuse of advanced driver assistance system (ADAS) technology, both through deliberate recklessness and misunderstandings about the systems’ capabilities.

The Insurance Institute for Highway Safety (IIHS) recently announced that it will push the industry for adequate safeguards to make certain that the drivers of these vehicles are still paying attention to the road.

“Partial automation systems may make long drives seem like less of a burden, but there is no evidence that they make driving safer,” said IIHS President David Harkey. “In fact, the opposite may be the case if systems lack adequate safeguards.”

The Tesla FSD situation appears to be different, in that the vehicle can perform in a way considered unsafe by NHTSA even if the driver is paying attention and using the self-driving system as the OEM intended.

This article is courtesy of John Huetter of Repairer Driven Education.
