The National Highway Traffic Safety Administration (NHTSA) just announced it has upgraded its probe into Tesla’s Full Self-Driving (FSD) system to a full-blown engineering analysis.
The advanced investigation could ultimately lead to a recall of Tesla’s popular driver-assistance technology. The investigation now covers roughly 3.2 million Tesla vehicles equipped with FSD. At the center of the engineering probe are concerns that Tesla’s camera-based system may struggle in low-visibility conditions, including glare, dust, or other airborne obstructions.
According to NHTSA documents, Tesla’s system may fail to properly detect hazards or warn drivers when camera performance is compromised, in some cases issuing alerts only seconds before a crash, or not at all.

The safety agency said it has identified nine incidents potentially linked to the issue, including one fatal crash and two involving injuries. An additional six crashes are also under review to determine whether they are related, including several instances where FSD appeared to lose track of or fail to recognize vehicles ahead.
The investigation builds on an earlier evaluation launched in October 2024, which initially looked at 2.4 million vehicles; the new probe adds a further 800,000 vehicles to the scrutiny.
Part of the problem stems from Tesla’s shift to a camera-only system, creatively known as Tesla Vision, which replaced radar-based inputs starting in 2021.

While Tesla subsequently introduced a system designed to detect when camera visibility is degraded, regulators are questioning whether it works effectively in all conditions. The engineering analysis will examine how well the system identifies reduced visibility and whether it provides an adequate warning to drivers.
NHTSA will also review software updates Tesla has introduced, including whether newer versions improve detection and alert performance, and how widely those updates have been rolled out across the fleet.
This is just the latest headline involving regulatory actions over Tesla’s advanced driver-assistance features, including both Autopilot and Full Self-Driving. In a separate investigation launched last year, NHTSA looked at 2.9 million Tesla vehicles following reports of traffic safety violations and crashes linked to automated driving behavior.
That’s before we mention the 2.6 million cars captured by NHTSA’s investigation into the automaker’s “Actually Smart Summon” driverless feature after a series of crashes.

NHTSA’s engineering analysis is usually the final step before regulators decide whether a formal recall is necessary. For now, the investigation remains ongoing, with NHTSA continuing to collect data and evaluate system performance.
Tesla has not publicly commented on the latest development.
AutoGuide’s Take:
Fix it now, because the rest of us are absolutely sick of being part of a beta test we didn’t consent to. We’re several years into watching Tesla vehicles behave badly by themselves on our roadways. It’s gotten bad enough that I actively move away from Teslas when I encounter them on the highway. I don’t trust the automaker’s system, and I certainly don’t trust the androgynous driver with rounded shoulders and glazed-over eyes to react quickly enough in the event of an emergency.