Risk of Tesla camera-only self-driving

“I don’t know” is a valid answer to questions you don’t know the answer to. Some might say it is the honest answer.

2 Likes

As needed for what? Good safety?

Anecdotes and small sample examples tell us almost nothing about safety at scale.

According to Waymo:

Unless there are legitimate criticisms of Waymo’s safety data (none I have seen here, but maybe there are some?), there is no basis to question Waymo’s overall safety record with anecdotes.

That would be like recommending a medication for everyone based on one person’s experience (we do clinical trials instead).

It’s not statistically valid.
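A quick way to see why anecdotes tell us so little: with zero incidents observed in n trials, the "rule of three" puts the 95% upper confidence bound on the underlying incident rate at roughly 3/n. A minimal sketch (the ride counts below are made-up round numbers for illustration, not anyone's real data):

```python
# Rule of three: if zero incidents are observed in n independent trials,
# the 95% upper confidence bound on the per-trial incident rate is ~3/n.
# The ride counts below are illustrative round numbers only.

def rule_of_three_upper_bound(n: int) -> float:
    """95% upper bound on the per-trial incident rate after n clean trials."""
    return 3.0 / n

for n in (5, 1_000, 1_000_000):
    print(f"{n:>9} incident-free rides -> rate could still be as high as "
          f"{rule_of_three_upper_bound(n):.6f} per ride")
```

Five clean rides are consistent with an incident rate as high as 60% per ride; it takes millions of rides before the bound becomes meaningfully small.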

If you don’t disagree, that means you agree. Let’s examine scale.

Waymo:

| Year | Total Paid Rides (Annual) | Year-End Weekly Volume |
|------|---------------------------|------------------------|
| 2023 | ~800,000 – 1 million      | ~20,000 rides/week     |
| 2024 | ~4.6 million              | ~150,000 rides/week    |
| 2025 | ~15 million               | ~450,000 rides/week    |

Tesla:

| Year | Total Paid Rides (Annual) | Year-End Weekly Volume |
|------|---------------------------|------------------------|
| 2023 | 0.0                       | 0.0 rides/week         |
| 2024 | 0.0                       | 0.0 rides/week         |
| 2025 | 0.0                       | 0.0 rides/week         |

One of these products indeed seems slow to scale.
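For what it's worth, the growth multiples implied by the Waymo column can be computed directly. The figures are the approximate annual totals from the table above (the 2023 figure is taken at the 1 million end of its range), not independently verified:

```python
# Year-over-year growth implied by the approximate Waymo annual ride
# totals quoted in the table above (2023 taken at the 1 million end).
annual_rides = {2023: 1_000_000, 2024: 4_600_000, 2025: 15_000_000}

years = sorted(annual_rides)
for prev, curr in zip(years, years[1:]):
    growth = annual_rides[curr] / annual_rides[prev]
    print(f"{prev} -> {curr}: {growth:.1f}x")
```

Roughly 4.6x and then 3.3x year over year, which is hard to square with "slow to scale."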

1 Like

Tesla and Waymo started out solving problems at different times and in a different order. Waymo went to robotaxis first, and then later to a system for personally owned vehicles. Tesla started with personally owned vehicles, and then later moved to robotaxis.

Waymo’s robotaxi start with safety drivers was 2016 (Source). It wasn’t until 6 years later that they actually started offering driverless rides for a fee to non-employees. Tesla’s equivalent start was June 2025. We’re only a half year in, so Tesla has 5 years for its equivalent milestone.

Tesla started its self-driving for personally owned vehicles in Sept 2020 (Source). We could say Waymo started that in April 2025 (Source). We’re still waiting on both to achieve a level beyond “L2++,” as Nvidia recently coined it.

One aspect of the robotaxi scaling is time from first city area to enlarged area to next city and then next city, etc. As of today, about 6 years after its first city, Waymo is still only in 5 or 6 cities. Yes, more are coming, and they’re coming faster than they have in the past. Once Tesla gets to its first city, let’s see how long it takes them to get to 5 or 6 cities. Then we’ll have an idea which system scales up faster.

One aspect of the POVs (“Personally Owned Vehicles,” as coined by Waymo-Toyota in the link above) is how many miles between driver interventions. Tesla is at about 800 miles, Waymo-Toyota is at exactly 0 miles. Additionally, Tesla is now at 98% of drives with no critical disengagements, and 70% of drives having no disengagements at all. Waymo-Toyota is at 0%.

We’ll have to wait and see once the Waymo-Toyota partnership offers any product for POVs, then we’ll see which is scaling that faster.

Separately, for L3 service for POVs, Mercedes is in the lead, with approvals for parts of Nevada and California on certain roads under very specific conditions. I’ve not seen any reporting as to how many vehicles or how many L3 miles have been logged, nor any incidents over those miles. If Tesla or Waymo-Toyota make an L3 offering, we can maybe start tracking that data as well.

1 Like

That’s what I like: data! Here’s some more:

Full Self-Driving sounds like a cheat code for traffic. The real crash numbers are a lot less glamorous. Since 2021, federal crash reports collected under NHTSA’s [Standing General Order](https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting) have shown one pattern again and again: Tesla racks up the bulk of serious incidents involving driver-assist systems, especially fatal crashes where Autopilot or FSD was in play.

At the same time, a new Waymo safety study over 56.7 million driverless miles shows big drops in injury crashes compared with human drivers in the same cities. Fewer serious injuries. Fewer pedestrian hits. Fewer cyclists on the ground. That doesn’t make robotaxis perfect, but it proves something important: sensor-heavy, tightly supervised automation behaves very differently from camera-only systems that lean on the driver as the final safety net.

A Reuters analysis of federal crash reports found Tesla involved in the vast majority of fatal crashes reported under those rules, even as the company talks up safety stats on its own site. That tension is the whole story in one picture: the marketing says “safer than humans,” while independent data keeps regulators glued to Tesla’s every move.

Not the same, but kinda. (Note: the Waymo study is not a comparison with Tesla, but a comparison with human drivers.)

How many jurisdictions agree on what the standards specifically are?

First, “Since 2021” mostly covers reports from before FSD V14.

Second,

Tesla has its own reports, for instance,

But, again, these are different applications. Once Tesla has taxis without humans in them (or Waymo-Toyota offer cars with some level of autonomy) we can then start to make actual comparisons.

And everybody knows Tesla’s reports are crap, since they measure different things but pretend they’re the same. Tesla’s data, since the beginning, has been largely from limited-access highways, which are much safer, but they compare it to “all driving” by humans, including congested city driving where more accidents occur because of the start/stop, driveways, and turning cars. It’s bunk.

And yes, Tesla has used data which includes turning off the FSD just seconds before a crash, thereby making it “not count” - which was revealed only under duress by the NHTSA.

Yes, data from the National Highway Traffic Safety Administration (NHTSA) and investigations indicate that Tesla's Autopilot system often disengages less than a second before an impact, a pattern highlighted in crashes involving stopped emergency vehicles, leading to questions about whether this behavior allows Tesla to shift blame to the driver despite the software being active moments before the crash.

(Google Search)

If they’re so proud of their data, why don’t they let independent researchers examine it?

1 Like

Yeah, no argument from me that Tesla’s data isn’t up to snuff. Once they start deploying actual Robotaxis, though, they’ll be forced to provide real data.

As it is, we’re comparing oranges to apples, is my main point.

Waymo didn’t have an enormous number of real-world miles when it began successful autonomous city driving. Now they have 100+ million rider-only miles, and they are still not at even 1 billion real-world miles today.

Without massive real-world miles, and if, as you say, simulation doesn’t help with edge cases:

How did Waymo get to autonomous city driving, now past 100 million miles (with supporting data claiming much-better-than-human safety)?

How did Waymo solve edge cases (to good enough safety, they say, with data) with so little real-world data and with simulation that doesn’t help with edge cases?

I don’t know exactly how many, but quite a few. Maybe all. For example, in the recently passed Texas robotaxi law, the text specifically incorporates SAE International Standard J3016 by reference. This means Texas ensures that its legal definitions for “Level 4 Automation,” “Dynamic Driving Task,” and “Operational Design Domain” match exactly what the industry’s engineers are using.

Specific to this conversation, in SAE L4, if the computer fails or leaves its geofenced area, it must be able to pull over safely without human intervention. The Texas law makes this a permit requirement.

To put it another way, a company must prove to the Texas DMV that their car can reach an MRC (e.g., pulling onto a shoulder and putting on hazards) in the event of a system failure.
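The fallback behavior the permit requires can be sketched as a tiny state machine: on a system fault or an ODD exit, the vehicle must execute the fallback and reach a minimal risk condition (MRC) without human intervention. Everything here (state names, triggers) is my own illustration, not language from SAE J3016 or the Texas statute:

```python
# Illustrative sketch of L4 fallback logic: a fault or ODD exit forces
# the fallback maneuver; a safe stop completes the minimal risk condition.
# States and triggers are assumptions, not text from SAE J3016 or Texas law.
from enum import Enum, auto

class DrivingState(Enum):
    NOMINAL = auto()
    FALLBACK = auto()   # executing the DDT fallback (e.g., pulling to shoulder)
    MRC = auto()        # stopped safely, hazards on

def next_state(state: DrivingState, fault: bool, inside_odd: bool,
               stopped_safely: bool) -> DrivingState:
    if state is DrivingState.NOMINAL and (fault or not inside_odd):
        return DrivingState.FALLBACK
    if state is DrivingState.FALLBACK and stopped_safely:
        return DrivingState.MRC
    return state

# A fault forces the fallback, and a safe stop completes the MRC:
s = next_state(DrivingState.NOMINAL, fault=True, inside_odd=True, stopped_safely=False)
s = next_state(s, fault=True, inside_odd=True, stopped_safely=True)
print(s)  # DrivingState.MRC
```

The point of the permit requirement is exactly the second transition: the system itself, not a human, has to get from FALLBACK to MRC.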

As discussed above, existing Tesla vehicles can’t do this except in very limited conditions.

Lots of other states have also incorporated the SAE standards into legislation by reference. So unless Tesla vehicles can meet the SAE standards (which again, they don’t except under very limited conditions), they won’t be usable as robotaxis in most or all states.

1 Like

Depends on how you look at it. Back in 2016, Elon Musk wrote in Master Plan Part Deux that robotaxi was a core part of the business model.

Here we are ten years later and there is no robotaxi business. Maybe it is all part of the process, but you can’t argue Waymo is “slow to scale” when the alternative is much slower to scale.

On the other hand, it could be as you suggest: Tesla is five years behind Waymo. Five years behind is the equivalent of an ice age in the tech world. I don’t think Tesla is that far behind, but you could be right.

1 Like

Way to miss the word “start.”

Spin much? Yup.

1 Like

This is incorrect. At best, it’s a guess based on incomplete information. At worst, it’s a guess based on incorrect assumptions. But it is a guess. If it weren’t for my posts, you wouldn’t know that Tesla uses separate power supplies, separate servo motors for steering, etc. And we both know Tesla has redundant AI chips on board.

To think that Tesla went to the trouble of equipping its vehicles with this kind of redundancy, yet missed something that an armchair observer, a non-participant and non-engineer not privy to the design and implementation of Tesla’s architecture, somehow caught is, well, just wrong.

And, even if by some miracle you found something hundreds of Tesla engineers overlooked in their existing vehicles, Tesla could probably add it, or may have already added it to the CyberCab, which may be the vehicle for which they apply.

1 Like

Perhaps - but this overlooks part of the Tesla valuation story.

From the start, FSD has been supposed to be a massive value-add to Tesla’s big existing fleet. Tesla was going to flip a switch one day, and all the millions of cars they’ve sold would now be fully autonomous. The biggest increase in value in history. This was going to yield enormous returns to the company. They’d be able to sell more of their current product line-up - no need for updated cars or multiple model types if you’re the only true AV people can own. They’d generate massive revenue (nearly all profit) from existing owners racing to buy FSD. And that massive stock of existing vehicles would cheaply fill their new robotaxi service - because used cars are cheap cars. If you can build a robotaxi fleet with a bunch of 4-6 year-old vehicles, you’re going to have a huge advantage both on cost and capital structure - on cost because you can buy those cars more cheaply, and on capital structure because you will be able to get some people to retain ownership of their older cars and just put them in your taxi pool.

All of that goes away if existing cars can’t be autonomous because they lack the necessary hardware. HW3 already puts a big dent in that model. Most Teslas (and all of the old Teslas) have HW3 rather than HW4, and it’s increasingly looking like HW3 won’t ever be fully autonomous. The lack of front bumper camera for all but the newest Model 3 and Model Y might also pose a problem. If there are other hardware deficiencies, then Tesla’s massive existing fleet is no longer much of an asset. To the contrary, it becomes something of a liability - because all of those millions of owners were promised that their cars had the hardware necessary to “self-drive” some day with just a software upgrade, and that promise carries both a reputational and perhaps a legal problem if it isn’t met.

Tesla gives up a huge economic advantage it was counting on if it can’t implement actual autonomy on previously-sold cars - both generally and specifically in the robotaxi segment.

3 Likes

The word “specifically” was in there exactly because much of the SAE standard is not specific, i.e., incorporating it incorporates general goals, not specific behaviors.

1 Like

Which takes me back to my question of long ago. Is Tesla trying to be in the Uber business - using everyone else’s hardware and taking a cut of revenues - or are they trying to own the business, including the metal rolling stock on the streets? Those are two very different businesses, and each comes with its own challenges:

One requires a lot of capital and oversight; the other, renting (or giving) the software to others in exchange for a piece of the action, is an area Tesla has not been in before. Both are certainly doable, and maybe they get away with trying to do both, but generally businesses succeed when they have a clear vision of where they’re going, pardon the allusion to self-driving automobiles.

1 Like

There’s always been a fair amount of ambiguity/flexibility in that, but from their public descriptions they’ve wanted to be in both. Or rather, a hybrid version that has both aspects and isn’t entirely either one.

Based on their public statements, a huge chunk of the value of AV is in direct sales to car owners for their own use. They would pay a huge premium for the car’s self-driving ability.

There weren’t a lot of specifics of “Tesla Network,” where people would enroll their cars in a Tesla-run robotaxi business. But it seemed to be something of a hybrid between what Uber does and Tesla owning the business (but not the cars) themselves. There’s no driving/driver at all, so the car owner wasn’t likely making any choices about how the car is run or where it goes or anything - they would just be choosing “in or out” at any given time. So Tesla would be running much more of the actual business than Uber (or Airbnb) does, and positioned less as a market maker for private parties entering into a contract - you would probably be getting a ride from TeslaNetwork, not from an individual car owner.

Now, it looks like they’re aiming at something more like a cab company, where they (or some third-party corporate entities) own rolling stock that’s used solely for taxi rides. Especially if actual autonomy and/or TaaS requires physical features that will be present on a Cybercab but not on existing privately owned Teslas, like self-closing doors (apart from Model X) or more advanced hardware that they don’t yet want to roll out to every Tesla. None of the stock would be owned by individuals who also use the vehicles for their own transportation. That might just be an interim step, though - they might go back to the first arrangement if we enter a world where people are buying fully AV cars for their own use.

1 Like

The SAE standard is almost entirely silent about how the vehicle has to achieve certain behaviors, but is clear about what those behaviors are.

Lots of regulations are written like this. Building codes might specify the R-value in your ceiling, but they don’t say what the insulation needs to be made of.

In the current case, the Texas law says that in order to obtain an autonomous vehicle ride-hailing permit, the vehicle must be L4 or L5 as defined by SAE. Neither the Texas law nor the SAE standard says anything about the need for lidar or geofencing. However you want to get to L4 is fine by them. But what L4 means is clearly defined.

1 Like

Unfortunately, you left out a lot of important details, some of which we’ve discussed previously. Only the Cybertruck has a fully redundant steering system. It has to; it is drive-by-wire. The steering systems of all other Teslas have some redundant features, but there is no second steering motor, no second belt, and no second ball-screw. All steering assist and Autopilot steering depend on a single mechanical drive path. The video below gives a good breakdown of the steering system for those interested.

Is the steering motor part of the ADS? You bet it is. If the steering motor goes out can the car safely achieve an MRC, like by pulling off to the side of the road? Possibly in some conditions. Big trouble if it happens on a curve.

We talked about the driving computers previously. As we discussed, there are indeed two chips, but both chips sit on the same physical board. If the board itself suffers a hardware fault for any number of reasons (over-temperature, a bad connector, liquid ingress, physical damage), both chips can fail. And for the record, the boards can fail. Tesla issued a recall for this very issue.

We can go on down the list. The cameras are not fully overlapping. Failure of a single camera can leave a blind spot.

You missed the most obvious, simple, and logical conclusion: Tesla engineers weren’t trying to design an L4 system. They were designing an L2 system. They did a great job. FSD works great as L2.

The disconnect is that Elon Musk claims it is L4 capable. Elon Musk has been known to say things that… well, let’s just say he’s not a credible actor when it comes to predictions.

So my conclusion is the Tesla engineers did just fine and Elon is doing what he always does: telling tall tales.

Throughout this thread I’ve used the term “existing Teslas” because it should go without saying that cars can and will be designed differently in the future. But apparently I had to say it.

That said, adding features to future vehicles directly conflicts with one of the main Tesla bull cases: that existing Teslas are only a software update away from becoming L4 capable. This unlocks tons of value, not only from the robotaxi element, but from autonomy in general: DoorDash and other types of delivery, accessibility for people who can’t drive, on and on. There would potentially be millions of current owners willing to pay for high-margin software subscriptions.

But that kind of goes away if we’re only talking about future vehicles.

3 Likes