How Elon Musk Knocked Tesla's "Full Self-Driving" Off Course

An infantile approach. Tesla will see a lot of courtroom time with that approach.

Um, no. I don’t see that at all. Seriously. I can see a million Tesla owners being p*ssed off that, after spending all that money, they still don’t have something they can use that passes regulatory hurdles.

Have a million Tesla owners paid for the FSD option? I highly doubt it. I think somewhere around 10 or 15 percent of purchasers buy that option.

Does anyone know what those regulatory hurdles are? Does anyone have a link to documents specifying exactly what those hurdles are? (I’m pretty sure those regulations don’t exist yet, so it’s impossible for any car company to make sure their designs meet yet-to-exist regulations.)

3 Likes

Quite true. And when Rob Maurer talked about numbers not being available, that’s exactly what he meant. Making a meaningful comparison is tricky, and not really possible with what Tesla provides.

On the other hand, Tesla provides enough data to make it extremely clear that people claiming that Autopilot is dangerous are wrong. It’s relatively safe, and clearly improves safety in the exact same vehicles when drivers choose to use it.

This is just you making stuff up again. Behind whom? Nobody has gotten even a shadow of real autonomy to market. And I would argue that the path most purveyors of attempted autonomy are taking can never get there (requiring high-definition mapping is a dead end), so they’re really exactly nowhere.

Tesla is actually way ahead of everybody else.

This is just silly. Being ahead in “official testing” is meaningless. And what regulators say is also meaningless until something comes close to actually working. The miles that other “AV entrants are racking up” are puny. Tesla will collect data on more miles driven in a single week than all other testers put together over their entire testing effort. You’re totally out of touch, as well as being misinformed.

Also, we know from Tesla’s recent “recall” of FSD Beta that there are currently somewhere around 400,000 FSD Beta users, racking up testable miles every day. Huge numbers compared to everybody else.

You keep writing about stuff you know nothing about. Maybe less than nothing. Tesla is collecting data all the time from (almost) every vehicle they have on the road, because they are all equipped with the hardware and they are all running some version of Autopilot. They run the software in the background when the driver doesn’t have it on. Tesla knows all the time when a driver’s decision differs from what Autopilot would do.

Please, please, please consider not posting about things you know nothing about. You’re just spreading misinformation.

-IGU-

3 Likes

But what does this actually simulate?

Does it mean that at every instant, say time t, the software simulates making a driving decision and records it (along with sensor data), while at each prior instant, say times t-1, t-2, etc., the driver actually made the decision? If so, the Autopilot decision will depend on all of the driver’s prior decisions, because all of those prior decisions determine the vehicle’s current state (position, velocity, etc.), and Autopilot must respond to the vehicle’s current state.

This isn’t autonomy, or even a simulation of autonomy, because of the dependence on prior driver decisions, I think.

Or are you saying the software is just collecting data (images, other sensor data, etc)? But even then, this collection of images and other sensor data is measuring what is sensed under driver behavior, not what is sensed under autonomy.

I’m not saying that simulating autonomy in parallel with human driver behavior, or collecting sensor data in this way, is uninformative; I’m just wondering how well it mimics autonomy. I’m thinking not very well, because in machine learning the software needs to learn from its own decisions (and collect the sensor data that arises from those decisions).

What does this mean exactly?

JimA

What I mean by knowing less than nothing, in this context, is that there is more misinformation than information. That is, stepping backwards on the path to being less wrong.

-IGU-

2 Likes

I don’t know exactly how Tesla’s Autopilot works. I can tell you that the driver can engage it at pretty much any time that it is usable, so it has to be ready to start instantly in whatever the current situation is. That means it has to be continuously reevaluating everything.

Currently there are many situations in which it cannot be engaged (e.g. really bad weather, at excessive speeds, in certain places, if its sensors aren’t working). But regardless, it is always running in “shadow mode”, deciding what it would do and noticing when the driver does something different.

It’s neither autonomy nor a simulation of autonomy; it is collecting data on how it differs from the human driver. The driver may be a good driver or not, but the car is aware of how the driver is doing, at least to the extent of crashing or having to brake hard or whatever.

But regardless, Autopilot is running at all times just the same as if it were driving. But it isn’t controlling anything. And it reevaluates everything continuously, so every time is time 0.

It’s not mimicking anything. It’s not controlling the car unless you ask it to.

I don’t think you know how machine learning works (at least currently). The training happens before the software is installed in the car. The car doesn’t learn anything; at best it collects data that goes into the next round of training.

So what “shadow mode” can collect is such things as “I didn’t disagree with what the driver did in the ten seconds before the crash, so I would have crashed too,” and “I would have taken that right turn from a different position, angle, and speed than the driver did.” That might indicate that the way the software decides the turn geometry needs some adjustment. Or not.
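
To make the “shadow mode” idea concrete, here’s a rough sketch of the kind of disagreement logging being described. It’s purely illustrative - the names, structure, and threshold are my own invention, not anything from Tesla’s actual software:

```python
# Illustrative sketch only - invented names and thresholds, not Tesla's code.
from dataclasses import dataclass

@dataclass
class Action:
    steer: float   # steering command, arbitrary units
    brake: float   # braking command, arbitrary units

def divergence(a: Action, b: Action) -> float:
    """Crude measure of how different two control actions are."""
    return abs(a.steer - b.steer) + abs(a.brake - b.brake)

def shadow_mode_step(sensor_frame, driver_action: Action, planner, log, threshold=0.2):
    """Evaluate the planner on the current state without controlling the car,
    and record cases where its choice differs meaningfully from the driver's."""
    planned = planner(sensor_frame)            # what the system *would* do right now
    gap = divergence(planned, driver_action)
    if gap > threshold:
        # The snapshot goes into the pile of data available for the next
        # offline training round; the car never acts on the shadow decision.
        log.append({"frame": sensor_frame, "driver": driver_action,
                    "planned": planned, "gap": gap})
```

Consistent with the point above about machine learning, nothing in that loop learns in the car: the logged disagreements would feed the next offline training round, and an updated model comes back later in a software update.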

It’s complicated and I don’t know much about it beyond what I can infer from having used it for years and watched it slowly improve. Besides, it keeps changing.

You can take a look at the release notes from the latest FSD Beta 11.3.2 and see what you can make of it. This release is currently on about 9% of the vehicles that have FSD Beta. I don’t have it yet.

-IGU-

3 Likes

I heard 15%, but you can also get FSD on a monthly subscription, so there’s no “spending all that money” to be p*ssed off about.

The Captain

1 Like

Heh heh … I am planning on trying it some month when I have a lot of driving to do. However, they pulled it back last month, and as far as I can tell the software version that supports FSD Beta for monthly subscribers will only be enabled again after they fix the bugs from the recent recall.

Two steps forward, one step back. :upside_down_face:

When you do, I would love to hear about your experience with FSD Beta.

The Captain

2 Likes

How do you know? Tesla doesn’t release data to support that conclusion. They compare accident rates in Teslas with and without Autopilot engaged - but since drivers get to choose when they engage Autopilot, you don’t know whether the lower rates are because Autopilot makes the cars safer or because people selectively engage Autopilot in scenarios where accidents are less likely (highway vs. urban, good weather vs. bad, etc.).
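
As a purely illustrative aside (the numbers below are made up for the example, not Tesla’s actual statistics), here’s how that selection effect can make the “with Autopilot” miles look safer even if the system adds no safety at all:

```python
# Made-up numbers purely to illustrate the selection-bias point.
crash_rate = {"highway": 1.0, "city": 5.0}     # crashes per million miles, by road type
autopilot_mix = {"highway": 0.9, "city": 0.1}  # Autopilot engaged mostly on highways
manual_mix = {"highway": 0.4, "city": 0.6}     # manual miles skew toward city streets

def blended_rate(mix):
    """Overall crash rate given how miles are split across road types."""
    return sum(share * crash_rate[road] for road, share in mix.items())

print(blended_rate(autopilot_mix))  # 1.4 crashes per million miles
print(blended_rate(manual_mix))     # 3.4 crashes per million miles
# The Autopilot miles look roughly 2.4x safer here even though, by construction,
# the system changes nothing about the per-road crash rates.
```

That doesn’t prove Autopilot adds no safety - it just shows why the aggregate comparison, on its own, can’t settle the question.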

Behind Waymo and Cruise. Are you really that unaware of what other companies are doing? Those two companies have been offering fully autonomous robotaxis in California for years. They were among the first to obtain permits to run driverless testing programs, and then they were among the first to obtain permits to run driverless deployment programs. Cruise just recently applied to expand that permit to the entire state.

They’ve now compiled several years’ worth of data on how their driverless systems perform in the real world with full autonomy - pickup-to-destination autonomy, without a driver being able to pick and choose when to engage the system. Actual full self driving. So they can show regulators how often the system works (and doesn’t) across the entire spectrum of vehicle use, not just in selected situations. This is already happening.

Tesla doesn’t have that. Yes, Tesla has gobs of other data that are useful for its internal purposes of training the AI system and other things - and yes, it’s always collecting data about driver behavior and road conditions and what have you. But the engagement of Autopilot is up to the drivers’ choice. Because of that, Tesla only has data on how the system reacts in the real world in the scenarios that drivers feel comfortable enough turning the system on. Not only is that only a subset of driving scenarios - it’s a subset that’s selected to be more favorable to the system, because drivers are going to turn it on more often in more favorable driving conditions.

I can’t see any scenario where regulators accept that data as being sufficiently informative of the overall safety of the system. They’re going to make Tesla go through the steps that Cruise and Waymo have already satisfied. They’re going to have to test their system in the real world going “door-to-door,” not driver-selected segments of the trip, and measure how well it functions in that context before being allowed to turn it on for the whole fleet.

So, yes - they’re years behind Waymo and Cruise in compiling data that would be sufficient to get regulators to approve their “full self driving.”

4 Likes

Let me fix that for you:

Two steps back, one step forward. :grin:

1 Like

Are you really unaware of the fact that these systems have been geo-fenced?
Are you unaware that for much of the time they were time-of-day restricted as well?

So these systems are possibly ahead of Tesla in some ways, but we just don’t know the full scope since we have no Tesla data that matches the restrictions that these systems operate under.
You are probably right that Tesla data leans toward the driver-chosen easy miles, such as highway driving. But the competitors, at least initially, were prevented from highway driving and from any high-speed driving, AFAIK.
So, IMO, there is no basis for declaring anyone ahead. Certainly Tesla is behind in paid taxi trips.

Mike

3 Likes

Actually, I got that from the Maurer video, 6:43-6:47.

1 Like

And it’s not uncommon for repairs to take months - you can search TMC for similar threads.

1 Like

I don’t know. But it’s obvious. You see, I’ve actually used FSD. I engage it everywhere I can. And it, along with not causing any problems, has avoided accidents several times, including at least one crash I could not have avoided. These incidents, of course, don’t appear in any released data as such because there’s really nothing to compare them to. Nobody has specific data reporting crashes they didn’t have.

You choose to nitpick: if you twist your face up funny and squint just right, you can see ways in which the system could be worse while the data Tesla reports would still look better. Sure. I choose to believe the data is reasonably presented, because it matches my own experience from using it.

And Tesla, since they are collecting data from the use of Autopilot in shadow mode, has (but doesn’t publish) all the relevant data. So they know. If you imagine that, with people coming and going from the development team all the time, they would expose themselves to unambiguous proof that they are hiding such data, you’re out of your mind.

Waymo and Cruise got nothing. HD Mapping and Lidar are dead ends. They’ll get close with enormous expense and effort, but never produce a system you would use to send your kid to visit grandma five hours away. Never.

There’s no such thing as “years” of data; that’s not the relevant metric. What matters is miles and time on the road, and by that measure they have a puny amount of data. And calling it “full autonomy” is a joke, last I checked: safety drivers, regular abandonment, limiting destinations to avoid difficult places, …

Once again, you’re undoubtedly talking about something you’ve never tried. But at least we’re on an equal ignorance footing on this one, as I haven’t tried it either.

Sure, that’s the majority of miles. But Tesla FSD is tried in all sorts of wild conditions, pretty much anywhere Tesla will allow it. A small subset of people testing it out are pretty extreme and seek out the edge cases rather than avoiding them. With over 400,000 actual cars on the road using FSD Beta, I’m quite sure that just that small subset of users drives far more miles than all other systems combined.

For where Tesla’s FSD is now, the regulators are (mostly) irrelevant.

Please stop spreading misinformation.

-IGU-

2 Likes

No. What Rob said was that Tesla has the data on the one specific scenario under discussion (frequency of phantom braking), but doesn’t publish it. You then characterized that as “if accidents were less Tesla would proclaim that information. They haven’t so we, the public, are still in the dark.”

One thing has absolutely nothing to do with the other. BS, as I said. Or maybe you just don’t understand what you read and hear.

-IGU-

Like I said, no worse than anyone else, probably better. Actual information on the subject would look like data on similar repair situations across many time periods, manufacturers, and repair shops. I’ve never seen that, although I imagine something exists.

Everybody has failures to fix things immediately, for many reasons.

The particular case you pointed to at TMC was more a repair shop problem than a Tesla problem anyway. The guy’s car was drivable, so he could have been using it the entire time until the parts came in. Been there, done that.

One of my two Tesla crashes that required body work involved a similar scenario. I took the car to a Tesla-approved body shop and got the same story. The car was drivable, but badly damaged. They told me they would have to take it apart to figure out all the parts they would have to order and to get insurance company approval for the repair, and they wouldn’t put it back together until they got all the parts. Maybe months. So I took it to a Tesla collision repair facility (one actually run by Tesla) instead.

They explained that it would indeed take a while to get the parts, but that they would just order whatever they thought they might need without taking the car apart, and quote the likely cost to the insurance company (which would probably change later). Meanwhile, I could keep driving the car.

A few weeks later they told me to bring it in, and they did the repair. They said they simply refused to give the insurance company final numbers until the repair was done, because the loop of estimating for the insurance company and waiting for approval, over several iterations, was very time-consuming. And they would have had to take the car apart to be certain, and then I couldn’t have driven it while waiting for parts. The whole reason the insurance company did it that way was to avoid fraud by repair shops, and Tesla wasn’t going to play that game because it slowed down repairs.

Worked fine for me. But, of course, this is all anecdotal. Without actual useful data it’s impossible to say anything definitive.

-IGU-

1 Like

So what? If Level 4 autonomy - using LiDAR and maps - is able to cover virtually all of the trips within a broad metro area, then the inability to send a kid five hours away will be pretty irrelevant. And why are LiDAR and maps dead ends? They’ve gotten those companies to actual robotaxis years before Tesla.

To say nothing of the fact that it would be pretty irresponsible to put a kid alone in a car for five hours - what if it got a flat? Or the kids got carsick and threw up or had to go to the bathroom?

That may be true - but the regulators have no way of identifying who those folks are. They can’t pick and choose which drivers’ accident rates to look at based on Tesla’s claim that these are the hardcore drivers. They will need Tesla to actually participate in the type of pre-deployment testing programs that Cruise and Waymo have been enrolled in. They’re not going to take Tesla’s word that any given 1-2% (or whatever) are pushing the edge cases - they’ll want a controlled study, where drivers aren’t picking and choosing when to activate the system.

Again, Tesla is years behind the competition in terms of being prepared to get regulatory approval for their system to be used for autonomy rather than driver assist.

3 Likes