There are edge cases in autonomous driving, and then there are literal wooden barriers stretched across the road with flashing lights attached. One of these is a challenge. The other is just… a thing that’s there.

A dashcam video that went viral over the weekend shows a Tesla Model 3 running in Full Self-Driving mode doing something that would make any traffic safety engineer reach for their antacids: driving clean through a lowered railroad crossing gate without so much as tapping the brakes.

The footage, posted to Threads by a user named Laushi Liu on Sunday, March 8, shows his Model 3 approaching an active railroad crossing near West Covina, California, at about 23 mph. The barriers are fully down. The lights are flashing. The FSD system, by all visual evidence, is completely unbothered. The car punches straight through the gate like it isn’t there — because, as far as FSD was concerned, apparently it wasn’t.

Liu captioned the post “Tesla FSD almost killed me today,” which, given the footage, is not an unreasonable assessment.

There’s one detail that makes the failure especially hard to brush aside: the crossing barriers came down at roughly the same height as Tesla’s front-facing cameras. The system wasn’t just slow to react — it showed no indication it detected the barrier at all. The driver did press the brakes, but not soon enough to avoid impact.

This Isn’t a New Problem

Here’s the uncomfortable part: this isn’t a one-off glitch or some rare cosmic alignment of unfortunate conditions. Railroad crossings have been a known weak spot for Tesla’s FSD for years. NBC News dug into the issue and turned up more than 40 reported incidents on social media alone, speaking with six Tesla drivers who had firsthand experiences — four of whom had video to back them up.

One of the more harrowing documented cases involved a Tesla Model 3 in eastern Pennsylvania whose FSD guided the vehicle onto the tracks, where it was then struck by a train. The driver and passengers had gotten out beforehand, so no one was hurt — but the car certainly was.

The pattern became concerning enough that U.S. Senators Ed Markey and Richard Blumenthal formally wrote to NHTSA urging a dedicated investigation into exactly this kind of failure.

The Timing Could Not Be More Awkward for Tesla

Tesla Model 3 Long Range RWD

Image Credit: Tesla.

The day Liu’s video went viral — March 9 — was the exact deadline NHTSA had set for Tesla to hand over crash data as part of a sweeping federal investigation into FSD traffic violations, an investigation that, not coincidentally, specifically includes railroad crossing failures.

NHTSA launched the probe back in October 2025 after linking 58 separate incidents to FSD, including 14 crashes and 23 injuries. By December, that number had climbed to 80 documented violations pulled from driver complaints, Tesla’s own reports, and media accounts. The investigation covers roughly 2.88 million Tesla vehicles equipped with FSD — which is a lot of cars to be asking some pretty serious questions about.

Tesla has not exactly been sprinting toward cooperation. The company asked for and received two deadline extensions — pushing the original January 19 due date first to February 23, then to March 9. The reason given? Tesla said it had 8,313 records requiring manual review and could only get through about 300 per day. Whether that math works out in anyone’s favor remains to be seen.

NHTSA wants detailed incident timelines going back 30 seconds before each traffic violation, including which software version was active, whether the driver received any warnings, and whether crashes or injuries resulted.

The Name Is Still Doing a Lot of Heavy Lifting

The reliable counterargument any time an FSD video surfaces is that it’s a Level 2 driver-assistance system, not a fully autonomous one — and that drivers are required to remain attentive and ready to take control. That’s technically accurate, and yes, Liu could have intervened sooner.

But Tesla has also charged up to $15,000 for a product called “Full Self-Driving.” The California DMV has already formally determined that Tesla’s marketing doesn’t match the product’s actual capabilities. When a system with that name and that price tag fails to acknowledge a physical barrier sitting directly in front of its cameras, the “the driver should have been watching” defense starts to feel like it’s carrying more weight than it should.

And while all of this is happening, Tesla is running a limited unsupervised Robotaxi service in Austin using the same FSD software currently sitting under a federal microscope. That’s the kind of timeline overlap that makes risk management professionals lose sleep.

The railroad crossing video joins a growing recent highlight reel that also includes FSD attempting to steer a Tesla into a lake in February, and a head-on crash during a Chinese owner’s FSD demonstration livestream in December.

Flashing lights. Lowered gates. Painted road markings. These aren’t puzzles — they’re the clearest possible signals a road can give. If FSD can’t solve them consistently, the question isn’t whether the driver should have been paying closer attention. It’s whether any of this should be happening on public roads in the first place.