Musk craves Buffett's blessing even as Ajit Jain scoffs at his claims

Not all that strong. The accident rate is lower than the national average, but that’s to be expected, because the national average isn’t an appropriate baseline for comparison. Tesla cars are vastly newer than average (most are less than four years old), they have higher-quality equipment than average (brakes and the like), they’re more expensive than average (making them less likely to be driven by teens), and they’re disproportionately not driven in snowy or icy weather. Couple that with the fact that FSD is a self-selected system - drivers choose whether to activate it, and the system deactivates in scenarios it’s not great at - so the comparison isn’t between all driving scenarios (like the national figures), but just the ones that the drivers and the FSD system choose to operate in. Those are almost certainly going to be the safer/easier scenarios.

And there’s this - a very large proportion of accidents, especially bad ones, are caused by teens and drunk/impaired drivers. The overall accident rate is much higher than the rate for sober adults. So if you’re a sober adult, the fact that FSD has a lower accident rate than the national average doesn’t tell you whether FSD is safer than you are - because you’re already safer than the national average, simply by virtue of being a sober adult.
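To make that base-rate point concrete, here’s a toy calculation. Every number below is made up purely for illustration - these are not real Tesla or NHTSA figures:

```python
# Hypothetical crash rates per million miles -- illustrative only,
# not real Tesla or NHTSA figures.
teen_and_impaired_rate = 12.0   # high-risk drivers
sober_adult_rate = 3.0          # low-risk drivers
high_risk_share = 0.25          # fraction of miles driven by the high-risk group

# The "national average" blends both groups.
national_average = (high_risk_share * teen_and_impaired_rate
                    + (1 - high_risk_share) * sober_adult_rate)
print(national_average)  # 5.25

# An FSD rate of 4.0 beats the national average...
fsd_rate = 4.0
print(fsd_rate < national_average)  # True

# ...while still being worse than the sober-adult baseline.
print(fsd_rate < sober_adult_rate)  # False
```

In other words, "better than the national average" and "better than you, a sober adult" are two different claims, and the first doesn’t imply the second.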

So, no. We (the public) have no idea whether FSD is safer than humans. The reason I say Tesla might not know is that I don’t know whether they have access to comparison data. They know what their cars are doing, but they might not have data on the national accident rate for a <4 year old, >$60K vehicle (once you add in the FSD cost). They only cite the national averages - which makes them look good, but they have to know that it’s not an appropriate comparison set.


We do know that cars being driven with FSD have a significantly lower accident rate than other Teslas being driven by humans. That covers most of your issues.


It narrows them down, certainly - but that comparison just makes the self-selection more significant.

I assume you’re referring to this data (the chart was made by ARKK based on Tesla’s release of FSD data at an investor day last year):

You can see some of the factors I discussed in comparing the center and right bars. Teslas get in fewer crashes than cars generally - the cars are newer, have better equipment, probably fewer teens driving an expensive new car, etc.

Comparing Teslas in FSD to manually driven Teslas, though, is going to have huge self-selection issues. Drivers choose when to activate FSD, and drivers presumably both want to avoid collisions and have some sense of which scenarios are safe for FSD and which aren’t. Unsafe situations where crashes are more likely (bad weather and poor visibility, disrupted traffic patterns, unusual movements) are far less likely to be handed over to FSD.

And of course, remember that when FSD runs into a situation that’s hard for it to handle, it passes control to the driver. We don’t know how Tesla accounts for crashes that occur when FSD made a mistake, or encountered a “needs intervention” situation and handed control over without enough time for the driver to respond properly. The car won’t be in FSD mode at the time of the crash - so unless Tesla makes a point of putting those crashes in the FSD bucket, crashes caused by the FSD/human handoff not working well will get put in the wrong column.

Again, these are questions that could be resolved if Tesla were more transparent about their data. These are the questions that NHTSA is trying to get answers on, so perhaps Tesla’s mandatory disclosure of that data in that investigation might provide some clues (though I think they’re only looking at AP right now).

Was there other data you were referring to, though?


I think it has been widely written that Tesla counts FSD accidents if FSD was engaged up to 30 seconds prior to the event.

Mike


Could be, but I’d question “widely written.” I went looking for that information and couldn’t find it and I’m usually pretty good at the Google machine.

There’s a general requirement that crashes have to be reported to NHTSA if a Level 2 (or higher) ADAS was engaged up to 30 seconds prior to the event - but not whether Tesla counts them as FSD accidents or manual accidents. NHTSA only requires they be reported, not that the automaker treat them any particular way.
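The attribution choice matters a lot for the headline numbers. Here’s a sketch of the two possible classification rules; the crash records and the rules themselves are hypothetical - all we know from the source is NHTSA’s 30-second reporting window, not which rule (if either) Tesla uses internally:

```python
# Sketch of how the attribution choice changes the headline numbers.
# The crash records and classification rules below are hypothetical; we
# don't know which rule (if either) Tesla actually uses internally.
REPORT_WINDOW_S = 30  # NHTSA's reporting window for Level 2+ ADAS crashes

crashes = [
    # seconds between FSD disengagement and impact (None = FSD still on)
    {"secs_since_disengage": None},  # crash with FSD engaged
    {"secs_since_disengage": 2},     # handoff moments before impact
    {"secs_since_disengage": 600},   # long after manual takeover
]

def fsd_attributed(crash, window):
    """Count a crash against FSD if it was engaged within `window` seconds."""
    t = crash["secs_since_disengage"]
    return t is None or t <= window

# Rule A: only crashes with FSD actively engaged count against FSD.
rule_a = sum(fsd_attributed(c, 0) for c in crashes)
# Rule B: a 30-second window also sweeps in last-moment handoffs.
rule_b = sum(fsd_attributed(c, REPORT_WINDOW_S) for c in crashes)
print(rule_a, rule_b)  # 1 2
```

Under Rule A the last-second handoff lands in the “manual” column; under Rule B it counts against FSD. Without knowing which bucketing is used, the published FSD rate is hard to interpret.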

Like Goofy, I couldn’t find anything that confirmed how Tesla classifies those types of incidents. Do you have a link?

I’ve seen it a number of times, so I don’t know where I first read it. Here is a link that discusses it that I had not previously read, but it was pretty easy to find (though it’s not a definitive, authoritative source).

Mike


I suppose that, countering your suggestions, there are things like the driver using FSD when perhaps he shouldn’t, in order to see whether it can handle the situation. Far more significant, of course, is the driver paying less attention than he’s supposed to.

Not completely related, but is FSD programmed to protect the driver/car, or to protect the pedestrian/biker/colliding vehicle/object?

As in if the choice is veering off a road or bridge vs hitting a bicyclist or school bus.

And is this setting adjustable?

Of course it’s adjustable. You just use the foot pedal…

sorry

Lol, I guess I was envisioning an FSD where I could someday be asleep for the drive, or even in the back seat.

Personally, that’s the appeal of FSD to me. If I have to be awake and monitoring traffic, I may as well drive. We’ll get there sooner or later, though.

Neural network AI does not work like traditional heuristic programming; there are no “settings” as such. The behavior is determined by the training data - it depends on how the data are curated.
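A toy contrast makes the point (purely illustrative - nothing like a real driving stack). In a heuristic program, the priority is an explicit knob you can flip; in a learned system, the “policy” is whatever falls out of the training examples, so changing behavior means re-curating the data and retraining:

```python
# Toy contrast between a heuristic setting and a learned policy.
# Everything here is illustrative, not a real driving system.

# Heuristic programming: behavior lives in an explicit, tweakable setting.
PROTECT_OCCUPANT_FIRST = True  # flip this and the rule changes

def heuristic_choice(options):
    key = "occupant_risk" if PROTECT_OCCUPANT_FIRST else "pedestrian_risk"
    return min(options, key=lambda o: o[key])

# Learned system: no such knob exists. The behavior is implicit in the
# curated examples, so you change it by changing the data, not a config.
training_examples = [
    # (situation features, action a human demonstrator took)
    ((0.9, 0.1), "swerve"),
    ((0.2, 0.8), "brake"),
]

def learned_choice(situation):
    # Nearest-neighbor lookup as a stand-in for a trained network:
    # the decision is determined entirely by the curated examples.
    def dist(example):
        feats, _ = example
        return sum((a - b) ** 2 for a, b in zip(feats, situation))
    return min(training_examples, key=dist)[1]

print(learned_choice((0.85, 0.2)))  # "swerve" -- because the data said so
```

That’s why “is this setting adjustable” doesn’t quite map onto a trained network: there’s no single parameter to turn, only the choice of what goes into the training set.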

It’s in the making.

The Captain


Does that mean that everyone in an AI car is going to be stuck at the speed limit? While everyone else is passing them (or getting frustrated that they’re driving exactly the speed limit)?

I mean, I expected that for legal reasons these cars would be stuck at the legal limit - but it seems that they might be locked into that by their programming as well.

That’s not how it works. The maps and the reading of road signs set the speed limit as you drive, but the driver can adjust this number up or down via the steering-wheel scroll wheel - just like you can in most cars that have cruise control.
So, no, you are not stuck at the speed limit.
And, legally, well that is a grey area just like in any car that allows you to use cruise control.

Mike

Ah…but these cars are different than cruise control. In a car with cruise control, the car isn’t driving - you are. In a dumb car, the car doesn’t know what the speed limit is - it has to be the driver’s decision. The manufacturer isn’t responsible if the driver chooses a speed that violates the speed limit, because the car can’t know what the speed limit is.

That changes with an AV. The car knows what the speed limit is (I’m incorrectly anthropomorphizing, but it’s just easier to write that way) - and the car is driving.
Which means the manufacturer probably can’t give you the choice to override it. I mean, they physically could. But they wouldn’t just be exposing themselves to massive civil liability. They would know you’re breaking the law (not in real time - just any time this option was exercised), and they would be allowing the car to do it anyway (unlike with dumb cruise control, the car could override your speed choice and hold to the speed limit). That’s a recipe for criminal liability if anyone gets hurt.

Seems really unlikely.

Ignoring the similarity (or not) to cruise control: no, autonomous cars aren’t stuck going the speed limit while everyone else is going faster.
In a pure Level 5 robotaxi, I suspect the speed limit will be the max.

Mike

We’ll see. From a liability standpoint, I can’t see how they will be able to release an AV driver that can break the traffic laws. I don’t think they’ll be able to let the AV driver exceed the speed limit, any more than they’ll be able to let it choose to run stop signs or traffic lights. Unlike a “dumb” car, it’s the AV driver that’s in control of the driving - so it’s going to have to follow the law, else the manufacturer is in a world of problems.


There will be a few phases of AV (if it ever really happens).

Phase 1: Mixed full AV, partial AV, human driving. Minimal communication between vehicles and vehicles and between vehicles and infrastructure.

Phase 2: Mixed full AV, partial, and human driving (which will always be augmented by the AV for safety). Some communications between everything.

Phase 3: Mixed full AV, and partial AV, very few human drivers. More communications between everything.

Phase 4: All AV, no human driving, full communications. Once you reach this phase (if ever), there will be no such thing as a “speed limit” or “traffic lights” because everything will be communicating at all times and flow will be managed based on instantaneous and predicted demand.

There may also be all sorts of intermediate phases, and likely sections of roadway that are in a different phase than other sections.

I mean, adoption of AVs will certainly take place over time - so there will be phases. But the issue of speed limits comes up right there in the first phase, with the very first full Level 4 or 5 AV vehicles. If they can’t exceed the speed limit, that’s going to be very annoying, both to the owners and to other drivers. But I can’t see how a manufacturer could allow them to deliberately violate traffic rules.

By this comment, I meant that in a car like a Tesla with FSD (Supervised) you will be able to exceed the speed limit… because they already can do this. You just engage FSD and then roll the scroll wheel to whatever speed you want, up or down. You can’t go higher than, IIRC, 85 mph anywhere.
You can also tap the go pedal as you approach a stop sign at 3 mph and not come to a complete stop.

This, of course, is the human driving (or overriding the FSD defaults).
One could argue these are safety features. For example, a car behind you is about to rear-end you and you want to get out of their way.

Mike
