As self-driving technology advances rapidly, Tesla keeps rolling out new updates to its Full Self-Driving (FSD) software, and people are eager to put each version to the test. To develop safer vehicles, automakers spend countless hours fine-tuning their cars in controlled environments. They use staged scenarios with cardboard cutouts, crash-test dummies, and predictable movements to evaluate driver-assist systems.
But a recent viral review pushed far beyond those conditions. The Dirty Tesla team, working with Out of Spec Reviews, challenged Tesla’s latest FSD update with unpredictable, human-like obstacles instead of cones and dummies. The result is a striking look at how the newest software responds to sudden glitches, surprise hazards, and real roadside chaos.
What is Tesla’s FSD?
Tesla’s Full Self-Driving (FSD) system is an advanced driver-assistance system that can navigate hazards and make autonomous driving decisions. Under active supervision, FSD can intelligently and accurately complete driving maneuvers, including route navigation, steering, lane changes, parking, and more. Drivers can use it for quick errands, daily commutes, and road trips.
The recent test wasn’t designed to prove the system flawless, but rather to gauge its resilience under aggressive or deliberate interference.
Testing FSD
Instead of a typical obstacle course, this test used a second vehicle, a Ford Crown Victoria, as the “problem driver.” The Tesla with FSD enabled circled the track while the Crown Vic tailgated, cut in suddenly, tried surprise merges, and even drove straight at the Tesla.
All of this happened on a private track, with professional drivers and no other traffic. The goal was to see how FSD responds when another driver behaves aggressively, unsafely, or unpredictably. Along the way, the testers also noticed how FSD drove the track itself, including how it hugged the right side of the pavement and refused to go off the road.
Tailgating test
Tailgating was the first scenario. The Crown Vic pulled up close behind the Tesla and stayed there. Interestingly, FSD didn’t really “freak out” at all. It kept driving normally, held its lane, and didn’t slam on the brakes or jerk the wheel. From Tesla’s point of view, it was just another car behind it.
This part of the test showed that FSD doesn’t overreact to rear pressure. It focuses mainly on what’s in front and around it, rather than panicking just because someone is riding its bumper.
Side collision avoidance
In the next scenario, the Crown Victoria deliberately steered toward the Tesla from the side, the kind of maneuver that many cars with automated driving systems might not respond to in time.
When the Crown Vic swung toward the Tesla, FSD braked early and steered away. The driver pointed out that the system reacted before he would have, and at times it was clearly faster than his own instincts.
Later, when the Crown Vic came up in the Tesla’s blind spot, FSD again shifted away before the human driver even saw the other car. The test made it clear that the side and rear cameras are constantly feeding the system information, and in many of these side-attack moments, FSD felt quicker than a human at detecting and responding to the threat.
Respond to aggressive drivers
What happens if a Tesla encounters an aggressive driver who won’t let up? The test created circumstances to find out. After multiple attempts, Tesla’s FSD settled into a consistent pattern when facing aggression: increasing its following distance, sometimes drifting toward the lane edges, or pausing to reassess.
This shows the FSD software values spacing and avoids unnecessary confrontation when threatened. While that seems sensible, constantly backing off may not always be practical when facing real-world road aggression.
High-speed approaches
FSD’s response to high-speed approaches in the adjacent lane was mixed. When the Crown Vic did quick fly-bys on the side, FSD sometimes reacted strongly and gave it plenty of space, and other times seemed to treat it like normal traffic and just kept going.
When they tried to test braking response with the Tesla following the Crown Vic, FSD made it difficult to create a true emergency. It refused to get very close and kept a generous following distance, so when the Crown Vic braked, FSD simply slowed down smoothly and avoided any real risk of a rear-end collision.
In other words, FSD was very cautious about closing in on the car ahead. That’s good for safety, but it also meant the testers couldn’t easily push it into a true edge-case braking panic from behind.
Head-on approach
The most intense clips came from the head-on approach. The Crown Vic drove straight toward the Tesla to test its reactions.
In these cases, FSD typically hit the brakes hard and then started backing up. However, the reverse behavior had limitations. At several points, it only reversed at about 2 mph, even though the testers had seen it reverse faster in other situations. That slow backing speed felt conservative and, in a real high-speed closing situation, might not create as much room as a human driver would want.
Throughout the test, FSD also showed a strong preference for staying on pavement. It would keep braking and trying to hold its line instead of instantly steering into the grass. Only when it was pushed to the edge or manually forced closer to the shoulder did it finally go off the road. Once on the grass, it crept slowly, sometimes stopped, and even put itself in park and then later reversed to find its way back to the road.
For human drivers, leaving the road or using a soft shoulder can be an instinctive escape in extreme situations. FSD, by contrast, clearly treats leaving the pavement as a last resort.
Emergency vehicle detection
Later in the video, the Crown Vic was set up as a mock police car, with working light bars but no siren. The idea was to see whether FSD would recognize flashing emergency lights and react, even without sound.
With the lights on behind it, the Tesla didn’t do anything special. It kept driving, didn’t pull over, and didn’t treat the flashing lights as an emergency. When the Crown Vic parked on the side of the road with its lights on, FSD sometimes moved over and gave it space, but this response was inconsistent. On other passes, it simply went by without a noticeable change.
The test suggests that, at least in this setup, FSD did not reliably respond to flashing lights alone. The testers speculated that sound or other cues may play a role in how Tesla eventually handles emergency vehicles, but this specific run showed no dependable response to lights without audio.
The test’s results
Taken as a whole, the test showed that FSD is very good at some parts of aggressive, human-like driving behavior and still uncertain in others.
It handled many side threats and surprise movements impressively, often reacting faster than the human driver and maintaining distance whenever possible. It recognized cars in the side blind spot and steered away or braked early in situations that could easily lead to minor crashes in everyday driving.
On the other hand, it struggled with situations that might call for using the side of the road, avoiding leaving the pavement unless cornered. It sometimes reacted inconsistently to high-speed approaches and did not reliably respond to flashing emergency lights alone. Its cautious reverse behavior and its tendency to stop and park when confused also showed how limited it can be off a normal, well-defined road.
What drivers can expect
For drivers curious about FSD, this test showed that the system is still an assistant, not a replacement for a human. It can react quickly, manage many threats better than an average driver, and clearly tries to stay out of trouble when another car is acting badly.
At the same time, it remains conservative, road-bound, and occasionally inconsistent. It may brake hard when it senses risk, hesitate when the environment is unclear, and fail to notice some cues that a human would instantly understand, such as a police car’s lights without sound.
The video shows Tesla FSD as a fast-reacting, cautious co-pilot that can handle a surprising amount of chaos on a closed track. But it also makes one thing obvious: you still need a human paying attention, ready to take over when the world stops behaving nicely.