Q and A
Who is David Moss and why does that matter?
David is not a Tesla engineer, influencer, or professional driver. He sells LiDAR scanners for a living and spends a lot of time on the road. That matters because this was not a controlled experiment or a company demo. It was a normal person using the car the way real people actually do: errands, hotels, chargers, parking lots, bad weather, boredom, fatigue, and real consequences.
Was this really zero intervention?
Yes. Zero disengagements of Full Self Driving during the 11,000+ mile streak. That does not mean zero supervision. Hands were ready, eyes were up, and safety always came first. But at no point did he take over steering, braking, or throttle because the system failed.
Was this mostly highway driving?
No. Highways were part of it, but not the majority of the challenge. The streak included city streets, dense urban traffic, parking garages, gated communities, office parks, hotels, superchargers, and complex parking lots. The system parked itself repeatedly, sometimes more than a dozen times in a single day.
Did he avoid difficult situations on purpose?
He avoided situations that would force a disengagement regardless of performance, such as border checkpoints. That was intentional and responsible. He did not avoid weather, traffic, construction, or unfamiliar environments. If anything, he added miles to stay safe rather than cut corners for the sake of a streak.
What FSD version was used?
The bulk of the streak occurred on Tesla Full Self Driving versions 14 and 14.2. The visible autonomy stats counter appeared with 14.2, which is when the uninterrupted miles became publicly trackable.
What does FSD still struggle with?
Parking lots remain its weakest area. It is cautious and sometimes slow, and drivers cannot yet select specific parking spots. Speed limit interpretation occasionally lags behind human intuition, and many experienced users miss manual speed adjustment via the scroll wheel. It also cannot yet handle fast-food drive-thrus. These are usability issues, not fundamental driving failures.
Was this safe or just lucky?
Luck always exists on the road, with or without autonomy. What makes this meaningful is that the system repeatedly demonstrated awareness, prediction, and control in situations where human error is common. This was not about pushing limits. It was about reducing risk over time.
Did other drivers react negatively?
Surprisingly, no. David reports being rarely honked at and never aggressively confronted. Most people around him had no idea the car was driving itself. The behavior was smooth, legal, and predictable enough to blend into normal traffic.
Is FSD actually better than a human driver?
David believes it is, and says so bluntly. Not because it is perfect, but because it is consistent, tireless, and always paying attention in every direction at once. Unless someone believes they are an outlier among drivers, it is hard to argue the system is not already safer in many scenarios.
Why does 11,000 miles matter specifically?
The average American drives about 12,000 miles per year. This single streak represents roughly a full year of driving for most people. Compressing that experience into one uninterrupted run makes the implications harder to dismiss.
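As a back-of-the-envelope check, the comparison above works out like this (the 12,000-mile figure is the commonly cited U.S. annual average, taken as an assumption here):

```python
# Rough check: what fraction of a typical driving year the streak covers.
# Assumes ~12,000 miles/year, the commonly cited U.S. average.
streak_miles = 11_000
avg_annual_miles = 12_000

fraction_of_year = streak_miles / avg_annual_miles
print(f"{fraction_of_year:.0%} of an average driver's annual mileage")
```

In other words, the streak alone covers roughly nine-tenths of what a typical American drives in an entire year.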
Why did this resonate so strongly online?
Because it was not supposed to happen yet. Tesla had talked about coast-to-coast autonomy as far back as 2017. Years passed. Skepticism grew. Then one person quietly did it without fanfare, without Tesla staging it, and without planning to become a headline.
What is his advice to skeptics?
Try it yourself. Demo it at a service center. Rent a Tesla. Experience it firsthand. The debate changes quickly when it is no longer theoretical.
What comes next?
David plans to drive with FSD in all 50 states, including Hawaii and possibly Alaska. But the bigger shift is not about records anymore. It is about normalization.


