The National Highway Traffic Safety Administration (NHTSA) announced April 6 the closure of its investigation into nearly 2.6 million Tesla vehicles equipped with a remote-controlled driving feature. The federal safety regulator determined that the system, marketed as “Actually Smart Summon,” was linked exclusively to low-speed incidents and that Tesla had implemented sufficient software mitigations to address the identified risks.
The probe was initiated in early 2025 following reports of several collisions involving the technology. “Actually Smart Summon” allows Tesla owners to move their vehicles over short distances—typically in parking lots or on private property—using a smartphone application. While the system was designed to improve convenience, it came under regulatory scrutiny after users reported varying degrees of property damage during remote operation.
Following a comprehensive review, the NHTSA concluded that the feature’s involvement in accidents was limited to low-speed environments. According to the agency, approximately 100 reported crashes were linked to the feature; however, none resulted in injuries or fatalities.
“Most reported incidents involved vehicles striking obstacles such as parked cars, garage doors, or gates,” the agency said. The findings indicated that these collisions frequently occurred early in a Summon session, when visibility or situational awareness was limited.
The regulator highlighted the absence of severe outcomes during the investigative period. The NHTSA confirmed that no incidents involved major crashes, airbag deployment, or vehicles being towed. Given the “low frequency and severity” of the events, the agency determined that further regulatory action—or a formal recall specific to this feature—was not warranted at this time.
The closure of the investigation follows a series of over-the-air software updates deployed by Tesla to refine system performance. According to the NHTSA, these updates were aimed at improving obstacle detection, identifying camera blockages, and enhancing vehicle response to dynamic objects such as gates. The updates also addressed operational errors caused by environmental factors, including snow or condensation that can obstruct the vehicle’s camera-based vision system.
Despite the resolution of this probe, Tesla’s broader suite of automated driving technologies remains under heightened federal oversight. Last month, the NHTSA escalated a separate investigation into Tesla’s “Full Self-Driving” (FSD) system to an engineering analysis, a more advanced stage in the regulatory process that can precede a potential recall. The expanded review now covers approximately 3.2 million vehicles.
This broader scrutiny stems from ongoing concerns over visibility limitations and the effectiveness of system warnings in real-world conditions. In a separate statement on FSD, the NHTSA said the software has, in some cases, “induced vehicle behavior that violated traffic safety laws.” That assessment prompted an earlier investigation in October involving 2.9 million vehicles, following more than 50 reports of traffic-safety violations and crashes.