By Akash Sriram
(Reuters) - The National Highway Traffic Safety Administration on Friday said it was opening an investigation into 2.4 million Tesla vehicles equipped with the automaker’s Full Self-Driving software after four reported collisions, including a fatal crash.
The U.S. auto safety regulator said it was opening the preliminary evaluation after four reports of crashes in which FSD was engaged during reduced roadway visibility conditions such as sun glare, fog or airborne dust. In one crash “the Tesla vehicle fatally struck a pedestrian. One additional crash in these conditions involved a reported injury,” NHTSA said.
The probe covers 2016-2024 Model S and X vehicles with the optional system as well as 2017-2024 Model 3, 2020-2024 Model Y, and 2023-2024 Cybertruck vehicles.
The preliminary evaluation is the first step before the agency could demand a recall of the vehicles if it determines they pose an unreasonable risk to safety.
Tesla says on its website that its “Full Self-Driving” software for on-road vehicles requires active driver supervision and does not make vehicles autonomous.
NHTSA is reviewing the ability of FSD’s engineering controls to “detect and respond appropriately to reduced roadway visibility conditions.”
The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in reduced roadway visibility conditions.
NHTSA said the “review will assess the timing, purpose, and capabilities of any such updates, as well as Tesla’s assessment of their safety impact.”
Tesla CEO Elon Musk is seeking to shift Tesla’s focus to self-driving technology and robotaxis amid competition and weak demand in its auto business.
The company did not immediately respond to requests for comment. Its shares were down 0.5% before the bell.
Last week, Musk unveiled Tesla’s two-seater, two-door “Cybercab” robotaxi concept, which has no steering wheel or pedals and would use cameras and artificial intelligence to navigate roads. Tesla would need NHTSA approval to deploy a vehicle without human controls.
Tesla’s FSD technology has been in development for years and aims for high automation, in which the vehicle can handle most driving tasks without human intervention.
But the technology has faced legal scrutiny after at least two fatal accidents in which it was involved, including an April incident in which a Tesla Model S in Full Self-Driving mode hit and killed a 28-year-old motorcyclist in the Seattle area.
Some industry experts have said Tesla’s “camera-only” approach to partially and fully autonomous driving systems could cause problems in low-visibility conditions, as the vehicles lack a set of back-up sensors.
“Weather conditions can impact the camera’s ability to see things and I think the regulatory environment will certainly weigh in on this,” said Jeff Schuster, vice president at GlobalData.
“That could be one of the major roadblocks in what I would call a near-term launch of this technology and these products.”
Tesla’s rivals that operate robotaxis rely on expensive sensors such as lidar and radar to detect driving environments.
In December, the company recalled more than 2 million vehicles in the U.S. to install new safeguards in its Autopilot advanced driver-assistance system. NHTSA is still probing whether that recall is adequate.
(Reporting by Akash Sriram in Bengaluru and David Shepardson in Washington; Editing by Shilpi Majumdar, Arun Koyyur and Chizu Nomiyama)