Tesla's Full Self-Driving (FSD) Progress in 2025: Faking the Numbers or the Real Deal? (Critical Analysis)
As of early 2025, Tesla isn't very close to L5 FSD autonomy... but HW4 upgrades could make a big difference.
My aim here was to compile objective evidence analyzing Tesla’s ($TSLA) progress in Full Self-Driving (FSD) as of February 2025 to gauge how far they’ve come, identify current challenges & limitations, and assess whether the FSD hype from Tesla investors and Elon Musk is consistent with reality.
I should preface this by stating that I am a fan of Elon Musk and have always liked his vision with Tesla… Elon singlehandedly carried the EV industry on his back and pushed it forward at an accelerated rate.
That said, analyses of FSD floating across the internet are typically biased by 2 types of people:
Tesla cultists: They’re in a Tesla-or-bust investment cult and have their entire net worth tied up in TSLA stock. I wouldn’t bet against Elon, but many of these people look for any reason to hype FSD without even analyzing the data and/or potential flaws in data quality/bias. (Their brains filter out anything remotely critical of Tesla).
Tesla haters: These people have some combination of Elon Derangement Syndrome (EDS) and/or hate that Tesla stock is clearly overvalued relative to fundamental “value investing” metrics. Some of these individuals have short positions. (Read: Shorting Tesla is Stupid — even if Tesla stock is in a bubble.)
My report provides a full-spectrum audit of Tesla’s Full Self-Driving (FSD) program, analyzing its evolution from the early days of Autopilot (circa 2014) to February 2025. It covers:
Historical context of FSD development, from Tesla’s initial promises through multiple hardware and software iterations.
Technological framework, including Tesla’s decision to rely on cameras instead of lidar, and the controversy surrounding that choice.
Safety performance based on Tesla’s self-reported data, third-party investigations, and regulatory probes by entities like the National Highway Traffic Safety Administration (NHTSA).
Allegations of data manipulation and prioritizing “VIP” users’ driving routes in FSD training data.
Technical/operational challenges and whether Tesla is truly closing in on “Level 4+” autonomy or still stuck at supervised driving.
Investment implications, including the financial risks and opportunities attached to Tesla’s FSD pursuit.
Note: The point of this report is NOT to sway investment decisions re: Tesla. I think it is probably smart to have some Tesla exposure, but you shouldn’t FOMO in at a random time without proper due diligence.
Tesla FSD (2025): High-Level Overview
Tesla has achieved incremental, noteworthy advances but has not delivered the true self-driving experience repeatedly promised over the last decade.
The timeline to robust, unsupervised FSD remains uncertain, with hardware limitations, regulatory scrutiny, and ethical questions all playing major roles in 2025.
Mixed Track Record on Promises: Despite repeated assurances since 2016 that “full autonomy” was just around the corner, Tesla’s FSD remains a Level 2 driver-assistance system in early 2025—one requiring constant human supervision.
Vision-Only vs. Multi-Sensor Approaches: Tesla’s camera-centric methodology has matured but continues to face corner-case issues (phantom braking, difficulty with odd road layouts, etc.). Competitors using lidar and high-definition mapping (e.g., Waymo, Cruise) have arguably made more reliable progress in limited operational domains.
Safety Data vs. Reality Check: Tesla’s internal safety metrics often appear better than national averages, but independent audits question the context and transparency of these stats. NHTSA investigations and documented crashes raise concerns about whether FSD is truly safer in real-world scenarios.
Controversial “VIP Training” Allegations: Reports indicate Tesla devotes extra labeling and development resources to driving routes used by Elon Musk and other “high-profile” owners, potentially skewing performance in favor of those geographies and limiting more generalizable improvements.
Progress vs. Credibility: While Tesla’s FSD is more capable than it was a few years ago—especially on simple or well-learned routes—unresolved hardware constraints and repeated missed timelines have eroded public trust among some users and regulators. Nevertheless, many owners remain optimistic about an eventual FSD breakthrough.
Historical Overview of Tesla FSD
1. Autopilot Genesis (2014–2016)
Initial Launch: Tesla’s Autopilot began around 2014 with hardware from Mobileye (a camera and sensor provider). Early capabilities included lane-centering and adaptive cruise control.
Early Excitement & Overreach: Tesla marketed Autopilot as a “semi-autonomous” suite, but high-profile crashes generated controversy about driver inattention. By 2016, Elon Musk began referencing a fully driverless future, setting the stage for Tesla’s own in-house development.
2. FSD Ambitions (2016–2020)
2016 Hardware Announcement: Tesla declared that all vehicles produced from October 2016 onward had “all the hardware needed for full self-driving,” implying only software updates would be required for autonomy. This was Hardware 2.0 (eight cameras, upgraded radar, and ultrasonic sensors).
Repeated Target Dates & Missed Milestones:
Musk promised a coast-to-coast, hands-off FSD demonstration by late 2017, which never materialized.
In 2019, “Autonomy Day” showcased Tesla’s custom “Hardware 3.0” (FSD computer) and a vision of 1 million Tesla “robotaxis” on the road by 2020—another target that was not met.
Mobileye Break & Software Rewrite: Tesla ended its partnership with Mobileye in 2016, forcing Tesla to rebuild Autopilot software from scratch. During 2017–2018, new features rolled out slowly, leading to a period where older (HW1) cars briefly performed more reliably than the newer (HW2) ones.
3. The “Rewrite” & FSD Beta Launch (2020–2022)
Fundamental Software Changes: Around late 2020, Musk described a “fundamental rewrite” of Tesla’s neural network stack, moving from a collection of disparate models to a more unified approach capable of city-street maneuvers.
FSD Beta Introduction (October 2020): A small group of owners began testing city-driving capabilities—such as making turns, navigating local streets—but required constant vigilance. Over time, Tesla incrementally expanded Beta access.
Navigate on Autopilot & Early Summons: Basic highway autonomy (suggesting lane changes, merging, off-ramp selections) and Smart Summon in parking lots indicated progress, but also sparked viral videos of near-miss incidents.
4. Unified Stack & Hardware Upgrades (2022–2024)
City + Highway Single-Stack (v11+): By 2023, Tesla merged highway and urban driving code into a unified “stack,” aiming to simplify AI development.
Pivot Away from Radar to Pure Vision: Starting mid-2021, Tesla controversially removed radar from new Model 3/Y cars, fully committing to camera-only perception (“Tesla Vision”). Phantom braking reports and official investigations followed.
Hardware 4 (Late 2023/Early 2024): Tesla began rolling out improved cameras and a more powerful FSD computer (HW4), partly reversing the “pure vision” stance by adding a next-generation radar. Nonetheless, earlier owners who were assured “full self-driving hardware” in 2016 or 2019 might still require retrofits.
5. Status as of Early 2025
Tesla’s FSD Beta user base has grown to hundreds of thousands, logging an alleged 1 billion+ cumulative FSD miles (based on Tesla’s internal counts).
Yet, in February 2025, the system remains “Level 2” by SAE standards—drivers must keep their hands on the wheel and eyes on the road.
Musk’s newest timeline suggests a robotaxi pilot in Austin with “limited geofenced unsupervised driving” by mid-2025, but skepticism persists given past missed timelines. (LINK)
Tesla’s Sensor Strategy & the LiDAR Debate
A.) Tesla’s Vision-First Philosophy
Musk’s Anti-Lidar Stance: Elon Musk famously declared that “lidar is doomed,” insisting that humans drive with vision alone, so cameras + AI can replicate (or exceed) human perception. (LINK)
Cost & Scalability Argument: Lidar, while providing accurate 3D depth data, historically was expensive (often thousands of dollars per unit). Tesla has long sought a mass-market approach, believing a camera-based system is cheaper and more scalable.
Radar Removal & Reintroduction: After removing radar in 2021, Tesla faced an uptick in phantom braking complaints. In 2024, the company introduced “Hardware 4” with an upgraded “HD radar,” suggesting partial reconsideration of purely vision-only. However, official statements still dismiss lidar.
B.) Industry Comparisons
Waymo (& formerly Cruise): Rivals rely on multi-sensor setups (lidar, radar, cameras, and advanced maps) within geofenced areas (Phoenix, San Francisco, etc.). They have achieved Level 4 driverless taxis in controlled zones—demonstrably operating without human drivers onboard.
Pros & Cons: Waymo shows remarkable safety records in its specialized domain, while Tesla’s approach aims for broad coverage anywhere on Earth but often requires active driver oversight.
Scaling Debate: Musk claims Tesla’s approach will eventually scale better to all roads globally. Critics counter that advanced sensors provide safety redundancy that Tesla is missing.
C.) Likelihood Tesla Revisits LiDAR
Tesla’s public stance remains strongly against lidar. Despite falling lidar costs (some units now <$1,000), Tesla continues to invest in camera neural networks and specialized radars.
Analysts believe it would be a major strategic U-turn for Tesla to adopt lidar, entailing redesigns in hardware, data pipelines, and core AI.
Regulatory or legal pressures could theoretically push Tesla to reconsider, but no concrete steps suggest that as of early 2025.
Prioritization of VIP & Influencer Data & Alleged Data Manipulation
A.) Reports of “VIP Queues”
Insider Accounts: Multiple current/former Tesla annotation employees have stated that footage from Elon Musk’s personal vehicles and prominent influencers gets “extra scrutiny.” Labelers are directed to spend more time ensuring top-notch performance on those routes.
Targeting Influencers’ Routes: YouTubers who post FSD content (like Chuck Cook, Tesla Raj, etc.) have reported Tesla test drivers frequently replicating their routes, presumably to fix the system’s failures.
Potential Overfitting: By focusing heavily on specific streets (e.g., roads near Musk’s home or favored test routes), FSD might show “superhuman” performance in those areas while lagging in broader, unmonitored regions.
B.) Alleged Safety & Data Graph Manipulations
1-Billion-Mile Milestone Controversy (2024): Tesla touted crossing “1 billion FSD Beta miles,” but critics online noted suspicious graph scaling and allegedly manipulated timelines designed to suggest exponential growth.
Inconsistent X-Axis Labeling: Analysts identified irregularities in Tesla AI’s posted charts—uneven date intervals, conflating partial data sets. They argue Tesla was painting a rosier rate of improvement than the data actually supported.
Tesla’s Response: Officially, Tesla has not commented in detail on these alleged manipulations; historically, the company publishes high-level metrics without revealing raw data or methodology.
C.) Implications for Systemic Reliability
Unequal Resource Allocation: If Tesla’s data-annotation workforce disproportionately focuses on VIP routes, that may improve the system specifically where high-profile drivers travel—but delay improvements elsewhere.
Regulatory & Ethical Questions: Regulators could view selective data prioritization as misleading if Tesla publicly claims broad FSD improvements but actually tailors them to certain routes.
Consumer Transparency: Critics argue that Tesla should clarify how training data is prioritized, to ensure prospective buyers understand potential gaps in the system’s performance away from well-trodden areas.
Tesla FSD Safety Metrics & Real-World Performance (2025)
A.) Tesla’s Self-Reported Statistics
Quarterly Safety Reports: Tesla often cites “1 crash every X million miles on Autopilot,” comparing favorably to the broader US average of ~1 crash per 0.6–1 million miles.
Highway vs. City Bias: Most Autopilot miles occur on relatively simpler highways. Tesla typically does not separate highway-only data from city-driving FSD Beta performance in these public metrics.
Limitations & Context:
No breakdown of crash severity or fault.
No standard methodology for reporting miles or near-misses.
Autopilot usage often happens on well-marked roads, so direct comparisons to all US drivers can be misleading.
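The highway-weighting caveat above is essentially a road-mix (Simpson's paradox) effect, and a toy calculation makes it concrete. Every rate and mileage split below is an invented assumption for illustration, not a Tesla or NHTSA figure:

```python
# Toy illustration of how road mix alone can flatter a crash-rate comparison.
# All rates and mileage splits are assumptions, not real Tesla/NHTSA data.

# Assumed human baseline rates (crashes per million miles, illustrative):
human_highway_rate = 0.5
human_city_rate = 2.0

# Assumed mileage mix: Autopilot engaged mostly on highways,
# the overall US fleet spread across both road types.
autopilot_mix = {"highway": 0.9, "city": 0.1}
us_fleet_mix = {"highway": 0.4, "city": 0.6}

def blended_rate(mix, highway_rate, city_rate):
    """Overall crashes per million miles for a given highway/city mix."""
    return mix["highway"] * highway_rate + mix["city"] * city_rate

# A system exactly as safe as a human, mile for mile, still looks
# better than the fleet average purely because of WHERE it drives:
rate_on_autopilot_mix = blended_rate(autopilot_mix, human_highway_rate, human_city_rate)
rate_on_fleet_mix = blended_rate(us_fleet_mix, human_highway_rate, human_city_rate)

print(f"human-equivalent rate on Autopilot's road mix: {rate_on_autopilot_mix:.2f}/M mi")
print(f"human fleet-average rate:                      {rate_on_fleet_mix:.2f}/M mi")
print(f"apparent advantage from road mix alone:        {rate_on_fleet_mix / rate_on_autopilot_mix:.1f}x")
```

Even with identical per-road safety, the mix alone manufactures a roughly 2x apparent advantage in this toy example, which is why un-normalized comparisons to the all-roads US average are weak evidence either way.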
B.) Independent & Regulatory Data
NHTSA Investigations: The agency has documented numerous crashes where Tesla Autopilot or FSD Beta was active—some involving fatalities. Many revolve around collisions with stationary emergency vehicles, prompting ongoing federal scrutiny.
Insurance Data & Third-Party Studies: Some analyses show Tesla’s safety advantage is inconclusive once you control for factors like driver demographics, advanced driver-assistance features (collision avoidance, etc.), and the typical driving environment.
Crash Examples (2023–2024):
A fatal accident in Seattle (mid-2024) allegedly occurred while FSD was active, highlighting driver overreliance on the system.
Instances of Teslas hitting road barriers or misidentifying obstacles spur repeated calls for more robust sensor fusion.
C.) Crowdsourced Trackers & Beta-User Data
Community FSD Trackers (e.g., Reddit-based trackers, TeslaScope): Some owners volunteer data on disengagements, near-misses, etc. Preliminary stats sometimes show improved miles-per-critical-intervention for each new FSD software release, but:
Small Sample & Biases: Enthusiasts are more likely to report, and data collection can be inconsistent (some drivers are meticulous, others not).
“Distribution Skews”: If Tesla releases a major update in Texas first, or to highly skilled drivers, the data may not represent the average user.
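To see how much uncertainty a handful of volunteer logs carries, here is a hedged sketch of estimating miles per critical intervention from a small crowdsourced sample. The per-driver numbers are invented, and bootstrap resampling is just one simple way to quantify the spread:

```python
# Hypothetical sketch: why small crowdsourced samples give wide uncertainty
# on "miles per critical intervention". All figures below are made up;
# real community trackers report far messier data.
import random

random.seed(42)

# Assumed per-driver logs: (miles driven, critical interventions reported)
logs = [(320, 1), (150, 0), (900, 3), (60, 1), (480, 1),
        (220, 0), (700, 2), (130, 1), (510, 1), (90, 0)]

def miles_per_intervention(sample):
    miles = sum(m for m, _ in sample)
    events = sum(e for _, e in sample)
    return miles / events if events else float("inf")

point_estimate = miles_per_intervention(logs)

# Bootstrap: resample drivers with replacement to see how much the
# estimate moves with a sample this small.
boot = sorted(
    miles_per_intervention([random.choice(logs) for _ in logs])
    for _ in range(5000)
)
lo, hi = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot))]

print(f"point estimate: ~{point_estimate:.0f} miles/intervention")
print(f"95% bootstrap interval: ~{lo:.0f} to ~{hi:.0f}")
```

With only ten drivers and ten events, the interval is wide enough that release-to-release "improvements" smaller than that spread are indistinguishable from noise.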
D.) Do the Numbers Add Up?
Competing Narratives: Tesla’s official stance is that FSD is safer than human driving based on aggregated crash rates; outside experts counter that Tesla’s comparisons are incomplete and highway-weighted.
Contextual Gaps: True real-world safety is still ambiguous without standardized, transparent reporting akin to California’s DMV disengagement reports (which Tesla largely avoids by not testing “driverless” in CA’s official AV program).
Ongoing Investigations: Regulatory pressure in 2025 focuses on whether Tesla’s claims about FSD’s safety advantage are “deceptive” marketing or defensible in broad real-world contexts.
Technical & Operational Challenges for Tesla (2025)
A.) Edge Cases & “Long Tail” Scenarios
Unprotected Left Turns, Complex Intersections: FSD Beta has made strides in simpler environments but can still fail in chaotic intersections or nonstandard traffic signals.
Phantom Braking: Tesla’s camera-based system sometimes misidentifies hazards (like shadows or overpasses), abruptly slowing the car. Radar reintroduction may mitigate this.
Weather & Visibility: Heavy rain, fog, snow, or sun glare can overwhelm camera input, leading to forced handover or dangerous misinterpretations. Lacking lidar/thermal sensors could worsen performance in these conditions.
B.) Hardware Constraints & Upgrades
Multiple Generations of “FSD Hardware”:
HW2 (2016) → HW2.5 (2017) → HW3 (2019) → HW4 (2023/24).
Each iteration involved new cameras, processors, or sensor changes. Despite 2016’s claim that all Tesla vehicles were “FSD-ready,” later admissions proved older hardware insufficient.
Retrofit Costs: Tesla has indicated that HW3 owners who purchased FSD will need an HW4 upgrade for truly autonomous features. Logistical complexity and cost of these retrofits remain unclear.
Compute Power vs. Real-Time Inference: Tesla’s new Dojo supercomputer trains the neural nets offline, but the final FSD must run on embedded vehicle hardware. Achieving near-human (or better) performance with limited on-board compute is a persistent challenge.
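A back-of-the-envelope calculation shows why the real-time constraint bites. The eight-camera count matches Tesla's public hardware description; the frame rate and the stage split below are assumptions for illustration only:

```python
# Back-of-the-envelope sketch of the on-board inference budget.
# Camera count is a rough public figure; the frame rate and the
# pipeline split are illustrative assumptions, not Tesla's actual design.

cameras = 8          # Tesla vehicles carry eight exterior cameras
fps = 36             # assumed per-camera frame rate

total_frames_per_second = cameras * fps          # raw perception throughput
frame_budget_ms = 1000 / fps                     # time per synchronized frame set

print(f"raw throughput: {total_frames_per_second} frames/s")
print(f"budget per frame set: {frame_budget_ms:.1f} ms")

# If perception, planning, and control must all fit in that window,
# each stage's share is even tighter (assumed 60/25/15 split):
for stage, share in [("perception", 0.60), ("planning", 0.25), ("control", 0.15)]:
    print(f"{stage}: {frame_budget_ms * share:.1f} ms")

# Reaction-distance intuition: at 70 mph the car covers ~31 m/s,
# so one frame budget corresponds to just under a meter of travel.
mph = 70
meters_per_second = mph * 1609.344 / 3600
print(f"distance per frame at {mph} mph: {meters_per_second * frame_budget_ms / 1000:.2f} m")
```

Whatever the exact split, the point stands: every neural-network pass must finish in tens of milliseconds on fixed embedded silicon, which is a very different constraint from offline training on Dojo.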
C.) Software: End-to-End Neural Networks
Shift from Rule-Based to Deep Learning: Tesla increasingly trusts large neural networks for perception, object tracking, and even path planning.
Scaling Up with Data: The premise: as Tesla collects billions of miles of video, the networks will learn to handle rare scenarios. However, data biases (e.g., overrepresentation of certain VIP routes or states with many Teslas) could hamper truly universal coverage.
Rewrite(s) & Rethink(s): Tesla has performed multiple “fundamental rewrites” of the code to break out of stagnation. Each time, short-term regressions often appear as neural nets re-learn old behaviors in new frameworks.
D.) Regulatory & Legal Landscape
NHTSA’s Ongoing Investigation: The agency has not mandated a recall banning FSD yet, but has demanded changes (e.g., “rolling stop” fix). If further hazards are uncovered, Tesla could face stricter measures.
State-Level Hurdles: States like California demand certain disclosures or permit processes for AV testing. Tesla’s approach—deploying Beta to consumer vehicles—skirts typical “testing” rules, creating friction with regulators.
International Considerations:
Europe: Strict regulations and type-approval processes have slowed FSD Beta rollout.
China: Data security laws limit Tesla from freely collecting/streaming vehicle footage, complicating FSD development.
Tesla & Elon: Overpromising vs. Reality Check
A.) Historical Timelines & Missed Targets
Repeated Claims of Imminent Autonomy:
2016–2017: Elon Musk projected a coast-to-coast “hands-off” Tesla drive by late 2017, which did not happen.
2019: Tesla’s “Autonomy Day” introduced the idea of 1 million “robotaxis” by 2020. By 2025, there are still no driverless “robotaxis” operating under Tesla’s brand.
Cumulative Effect on Credibility: Each postponement, paired with constant public statements that “full self-driving is just a year away,” has chipped away at Tesla’s reliability in the eyes of regulators, some consumers, and media. Among fans, however, optimism remains high, with many believing Tesla’s data-driven approach will eventually prevail.
B.) Consumer Expectations vs. Beta Reality
Branding & Terminology: The term “Full Self-Driving” implies a capability beyond Level 2 or 3, potentially misleading drivers into complacency or misuse.
Liability & Lawsuits: Multiple class-action and fraud-related suits allege Tesla overhyped FSD’s abilities. These cases point to marketing that suggested near-future autonomy on cars sold as far back as 2016.
Driver Overreliance: Anecdotes of drivers napping, watching movies, or otherwise misusing Autopilot/FSD highlight a consistent gap between Tesla’s disclaimers (“hands on wheel”) and how some owners interpret “Full Self-Driving.”
C.) VIP Prioritization as a Possible Marketing Tactic
VIP “Showcase” Routes: By perfecting specific influencers’ or Elon Musk’s favorite paths, Tesla can generate impressive demonstration videos that go viral, fueling consumer excitement.
Limited Generalization?: Overfitting the system to specific roads or driver styles can mask broader deficiencies. Ultimately, if FSD is highly tuned for a small set of public demos, it may underperform in less-mapped regions.
D.) Investor & Public Sentiment
Stock Volatility: Tesla’s market valuation often factors in expectations of a future AV/robotaxi revolution. Delays in achieving unsupervised driving can cause swings in investor confidence.
Enthusiast Community vs. Skeptic Press: Tech enthusiasts champion Tesla’s iterative “learn in public” approach, while critical press and analysts focus on the persistent missed milestones, regulatory probes, and disclaimers.
Where Tesla Stands in 2025 with FSD
A.) Latest FSD Version & Performance Snapshots
FSD Beta v12.x+ (or v13):
Claimed improvements in unprotected turns and smoother behavior in congested urban settings.
Still requires driver supervision, although Tesla occasionally references “nearly hands-off” performance for extended periods under optimal conditions.
Community data trackers show a drop in phantom braking incidents compared to earlier “pure vision” phases, partly aided by reintroduced radar (Hardware 4).
Challenging Edge Cases Remain: Construction zones, unusual signage, unpredictable pedestrians, and inclement weather remain problem areas. Some testers note improved smoothness but still multiple interventions in complex routes.
B.) Hardware 4 Rollout
Camera Upgrades & Additional Compute: HW4 cars (starting in late 2023) have higher-resolution cameras, more robust processing power, and a new radar unit. Tesla states that these changes will allow for “long-term autonomy,” although it has made no official promise of Level 4 by a specific date.
Retrofits for HW3 Owners: Tesla has indicated free or low-cost HW4 retrofits might be offered to those who purchased the FSD package—yet the actual retrofit process has faced delays, with limited clarity on scheduling and parts availability.
C.) Elon Musk’s 2025 Statements
Ongoing Optimism: Musk reiterated plans for a “robotaxi pilot” in Austin by mid-2025, involving geofenced, unsupervised operations. Skeptics recall past robotaxi deadlines that came and went.
Acknowledgment of Complexity: In several interviews, Musk admitted “true self-driving is a harder problem than I initially anticipated,” a shift from past declarations of near-term readiness.
Deflecting Regulatory Hurdles: He frequently cites regulatory obstacles as key bottlenecks—particularly in Europe and some US states—suggesting technology will be “ready,” but laws need to catch up.
D.) Comparisons to Competitors
Waymo: Has limited, but real, fully driverless services in select cities (Phoenix, San Francisco, Austin, etc.). Relies on lidar & high-definition maps, focusing on safe, small-scale expansions.
Tesla’s Data Moat vs. Practical Deployment: Tesla has more global mileage data, but it hasn’t yet fielded a “no human driver onboard” service at scale. Critics argue that “numbers of miles” are less important than “miles in driverless mode,” which Tesla has not systematically pursued under typical AV test permits.
Tesla FSD: Rate of Improvement & Future Projections
A.) Historical Trends in Disengagement Miles
Incremental Gains, Occasional Regressions: Each major FSD Beta iteration appears to improve certain aspects (like smoother merges) but can regress in others (e.g., new bugs in stop-sign interpretation).
Disengagement Frequency: User logs suggest that from 2020 to late 2024, average miles between “critical interventions” roughly quadrupled (e.g., from ~100 miles to ~400 miles), but that is still far short of the hundreds of thousands of miles the average human statistically drives between major incidents.
Implications: A 10–100x jump is likely needed before FSD can rival average human crash rates, depending on what fraction of critical interventions would otherwise have escalated into actual crashes.
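The size of that jump can be sanity-checked with quick arithmetic. Every input below is an assumption chosen to match the ranges quoted in this report (~400 miles per intervention, ~1 human crash per 0.6 million miles); the intervention-to-crash fractions are hypothetical:

```python
# Rough arithmetic behind the improvement gap. All inputs are assumptions
# drawn from the ranges quoted in this report, not measured values.

fsd_miles_per_intervention = 400      # upper end of crowdsourced estimates
human_miles_per_crash = 600_000       # ~1 crash per 0.6M miles (US average, low end)

# Naive ratio: if EVERY critical intervention would have become a crash.
naive_gap = human_miles_per_crash / fsd_miles_per_intervention
print(f"gap if every intervention = crash: {naive_gap:.0f}x")

# More charitable: only a fraction of interventions would have
# escalated into an actual crash (hypothetical fractions below).
for crash_fraction in (0.1, 0.02):
    effective_miles_per_crash = fsd_miles_per_intervention / crash_fraction
    gap = human_miles_per_crash / effective_miles_per_crash
    print(f"if {crash_fraction:.0%} of interventions = crash: {gap:.0f}x gap")
```

Under these assumptions the gap lands anywhere from roughly 30x to 1,500x, depending entirely on how often an uncorrected intervention would have become a real crash, which is precisely the parameter no public dataset pins down.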
B.) Remaining Obstacles to “Level 4–5” Autonomy
Edge Cases & Urban Complexity: Dense city driving (pedestrians, cyclists, confusing signage) is far more challenging than highway cruising.
Sensor Redundancy: Without lidar or even robust radar coverage in older cars, vision-based FSD can be fragile in bad weather or low-visibility conditions.
Tesla’s Progress Toward Level 5 in 2025 (Recap)
Despite years of development and numerous software/hardware upgrades, Tesla’s Full Self-Driving (FSD) capability remains at SAE Level 2—the vehicle can steer, accelerate, and brake in certain conditions, but a human driver must constantly supervise and be ready to intervene.
Tesla has repeatedly stated aspirations for Level 4 or 5 autonomy (where human supervision is largely unnecessary), but as of early 2025, these milestones have not been achieved or formally certified.
Performance vs. Human Driving (High-Level Averages)
Crash/Accident Rates: Tesla reports that on average, when Autopilot/FSD is engaged (mainly highway miles), its vehicles experience fewer crashes per million miles than the general U.S. driving population—often claiming a ratio of about 1 crash per 4–7 million miles for Autopilot vs. roughly 1 crash per 0.6–1 million miles for all U.S. drivers. However, these comparisons are not fully controlled: Autopilot is mostly used on well-marked highways, which already have lower crash rates than city streets.
Intervention Rates: Independent trackers (small sample) suggest that FSD Beta still requires one driver intervention every few hundred miles (depending on traffic complexity). By contrast, an average human driver may go tens or even hundreds of thousands of miles between serious incidents or near-miss interventions.
Edge Case Handling: In straightforward highway driving or predictable environments, Tesla’s system can exceed average human performance in reaction time and maintaining lane discipline. In complex urban settings—unprotected turns, construction, unpredictable pedestrians—FSD still struggles and often needs human takeover. (Exhibit A)
In sum, Tesla’s current system can significantly reduce driving workload on well-structured roads but does not yet match the consistency and adaptability of a fully alert human driver in more chaotic or nuanced scenarios.
While FSD appears to be improving (with software updates and new hardware releases), Tesla’s progress toward genuine Level 4–5 autonomy remains a work in progress, falling short of its longstanding marketing promises for truly driverless operation.
Primary References & Notes:
Tesla investor materials (2024) regarding HW5/AI5 timeline.
Public statements by Elon Musk (2016–2025) on autonomy targets.
Analysts from Guggenheim, Navigant, and others forecasting L4 timelines.
NHTSA investigations, official crash data, and Tesla’s self-reported safety reports.
Media and user community trackers (YouTube, social media, etc.) documenting real-world FSD Beta performance.