Tesla North America’s official account on X promoted a video interview of a new Cybertruck owner who says his ophthalmologist recommended he buy a Tesla with “Full Self-Driving” because he is losing his eyesight.

The problem is that Tesla itself classifies FSD as a Level 2 driver-assist system that requires driver monitoring at all times — and the driver is responsible for the vehicle at all times. Those two things are fundamentally incompatible.

Tesla North America reposted a video from a content creator known as Captain Eli, who self-describes as a “Tesla investor supporting Elon Musk”. In this particular clip, a Cybertruck buyer named Ricky explains that his deteriorating eyesight is what led him to Tesla.

Ricky says he went to his ophthalmologist to discuss his ability to keep driving. According to Ricky, the doctor, who owns two Teslas himself, told him he needed a car with Tesla’s “Full Self-Driving” software. The ophthalmologist then set up a test drive for Ricky and even met him on a Saturday to walk him through the system.


During the test drive, Ricky says the car “drove itself” for an hour and a half across three different routes, and he “never touched the wheel.” That experience sold him on the Cybertruck, which he bought specifically for the FSD software.

“So you buy a software that costs you some amount of money — a lot of money — and it comes with the car,” Ricky told the interviewer when asked to confirm he bought the vehicle for FSD.

Tesla North America’s decision to amplify this particular testimonial is where things go from concerning to reckless.

Why this is a massive problem

Tesla’s own support page for “Full Self-Driving (Supervised)” states plainly that the system is an SAE Level 2 partial automation system. That classification means the driver must remain fully attentive and engaged in the driving task at all times. The driver is responsible for the vehicle, whether FSD is enabled or not. The system “does not make the vehicle autonomous,” according to Tesla’s own documentation.

This is not ambiguous. Level 2 means the human is the driver. The software is an assistant. If the human cannot safely drive the vehicle, say, because they are losing their eyesight, then a Level 2 system does not solve that problem. It makes it worse, because it creates the illusion that the car is handling driving duties that legally and technically remain the human’s responsibility.

We have covered extensively how FSD creates dangerous complacency even in expert drivers. Raffi Krikorian, Mozilla’s CTO and the former head of Uber’s autonomous vehicle division, a man who literally built self-driving cars and trained safety drivers, crashed his Tesla Model X while using FSD because the system’s near-perfect performance lulled him into a false sense of security. Research shows drivers need 5 to 8 seconds to mentally re-engage after an automated system hands control back, but emergencies unfold faster than that window allows.

If a former self-driving chief can’t maintain proper supervision, what happens when someone with deteriorating vision is behind the wheel, trusting FSD to be their eyes?

This comes at a time when NHTSA has escalated its investigation into FSD to cover 3.2 million vehicles — the step that typically precedes a recall. The agency is already running a separate probe into 80+ FSD traffic violations, and Tesla has had difficulty even turning over the requested crash data. A Cybertruck owner has already filed a lawsuit alleging FSD caused a crash. Tesla’s own CEO has claimed that FSD drivers should be able to text and drive — while keeping FSD classified as Level 2 to avoid liability.

The pattern is clear: Tesla markets FSD as if it’s autonomous, tells customers they can relax, and then points to the Level 2 classification as a legal shield when something goes wrong.

Electrek’s Take

I’ll wager everything I have that this Tesla North America tweet will end up as evidence in a court case about a Tesla FSD crash at some point. A plaintiff’s attorney will hold it up and say: “Tesla’s own official account promoted a testimonial from a vision-impaired buyer who purchased the vehicle specifically because he believed FSD could drive for him — and Tesla amplified that message to millions.”

This is reaching genuinely dangerous levels. Consider the chain of events that Tesla just endorsed: an ophthalmologist allegedly recommends Tesla FSD to a patient who is losing his eyesight, that patient follows the recommendation, buys a Cybertruck specifically for the software, proudly recounts on camera that FSD drove for an hour and a half while he never touched the wheel, and Tesla's official account promotes this as a feel-good customer story.

All while Tesla’s own documentation says FSD is a Level 2 driver-assist system that requires a fully attentive driver who is “responsible for the vehicle at all times.” All while NHTSA is running three concurrent investigations into FSD safety. All while we’ve documented how even autonomous driving experts get dangerously complacent behind the wheel of a Tesla on FSD.

If Tesla genuinely believes FSD is safe enough for someone with failing eyesight to rely on, then it should classify FSD as Level 3 or higher and accept the liability that comes with it. But Tesla won’t do that, because Level 2 lets it keep the marketing upside while pushing all the risk onto the driver. Promoting this video while maintaining that legal position is madness.

It’s true that FSD can now drive a few hours, a few hundred miles, even a few thousand miles without a critical intervention, but that’s nowhere near reliable enough for Level 4 autonomy. What happens when that critical intervention is needed after a couple of thousand miles, and the driver is not capable of taking over?

What makes this so dangerous is how clearly it shows that the idea of FSD being more than a driver-assist system is now embedded in the Tesla community. The ophthalmologist may know eyes, and he may own two Teslas, but that doesn’t mean he has a good grasp of FSD’s actual capabilities. And Tesla promoting the use of FSD for people losing their eyesight is not helping.



FTC: We use income earning auto affiliate links. More.