How Elon Musk Knocked Tesla's "Full Self-Driving" Off Course

Not only will all of those companies compete head-on with Tesla over the coming months, but Tesla (read: Musk) does not know what it is doing. Buying Twitter proves that hands down.

Is Musk more brilliant than the rest of us? Of course he is, but that does not mean he knows what he is doing this year. Pay attention, Elon.

Ah, the anti-Tesla echo chamber is back in session.

3 Likes

Exactly!

The Captain

1 Like

It is not anti-Tesla to be realistic about the competitive landscape.

1 Like

Rob Maurer at Tesla Daily reveals that the author of the article is a Tesla ex-employee who was fired for making unauthorized videos about FSD.

In MacroNews…

The Captain

3 Likes

No, the author is Faiz Siddiqui:

Since joining the tech team, he has focused on Tesla’s rollout of driver-assistance technology, labor and workplace issues inside the company, and the decisions of its chief executive, Elon Musk. Prior to joining the tech team, he covered the D.C. Metro and local transportation scene, including the system’s chronic safety and reliability issues.

Bernal was the fired employee.
Maurer, a Tesla proponent, pooh-poohs the accident data but concedes that neither he nor we know the data; only Tesla has it. We don’t know whether any of the multiple individuals interviewed [about a dozen] have a summary of the accident data. I would think that if accidents were fewer, Tesla would proclaim that information. They haven’t, so we, the public, are still in the dark.

Tesla & other EV makers have some mountains to climb.
Some buyers of the FSD system are cheesed off & have filed a class-action suit against Tesla.
Tesla has a parts-supply problem for vehicles involved in crashes; there are long wait times to repair those vehicles. And there is the insurance problem when an EV is involved in an accident, spelled out here:

An EV would be a valuable second vehicle for commuting & shopping if one has a 240-volt charging system at home. But as a sole vehicle in the hinterlands [where I reside]…nah, more time needs to pass. Especially if one is a late adopter of technology, as I am. I did not own a computer until 1995 or a cellphone until 2014. And I was late to online bill payment [2014]. I still get the paper statement for my credit cards. And I still like reading physical books, no Kindle for me. I like to see the vast majority of bugs worked out of any purchased technology. Yep, I is a fuddy-duddy Luddite! LOL

1 Like

Owning a gas-powered vehicle will likely get increasingly inconvenient. The number of gas stations has been in decline for a number of years now. There were over 200,000 stations in the US in 1994; that has declined to a little over 100,000 today. At least one analyst projects that 80% of gas stations will be out of business by 2035.

There will be some threshold of BEV adoption that, when reached, will result in a sharp decline in ICE-related industries, including gas stations, mechanics, auto parts, etc.
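For what it’s worth, a quick back-of-the-envelope check (a sketch in Python, using the rounded figures above) shows how much steeper that 2035 projection is than the historical trend, i.e., it already bakes in something like that threshold effect:

```python
# Rough arithmetic on the gas-station figures quoted above
# (~200,000 stations in 1994, ~100,000 today, and an analyst
# projection of 80% gone by 2035 -- all rounded).

stations_1994 = 200_000
stations_now = 100_000          # "today" taken as 2023
years_elapsed = 2023 - 1994

# Historical compound annual decline, 1994 -> today
hist_rate = 1 - (stations_now / stations_1994) ** (1 / years_elapsed)
print(f"historical decline: {hist_rate:.1%}/yr")        # ~2.4%/yr

# Decline implied by "80% out of business by 2035"
stations_2035 = 0.2 * stations_now
proj_rate = 1 - (stations_2035 / stations_now) ** (1 / (2035 - 2023))
print(f"implied decline:    {proj_rate:.1%}/yr")        # ~12.6%/yr
```

Roughly 2.4% a year historically versus about 12.6% a year implied by the projection, a five-fold acceleration of the shakeout.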

1 Like

Thanks for the correction!

The Captain

1 Like

Actually, I have seen reports on accident data from Tesla multiple times over the years. The one problem with them is the question of whether Tesla with FSD can really be compared to the average car because of the likely difference in the driver.

2 Likes

You do seem to love to parrot BS you read somewhere. Tesla publishes its accident statistics, and has for years. It’s hugely better than average, and much better still when using its Autopilot. The link has been posted here many times. Rob Maurer’s comment about only Tesla having the data refers to having more detailed insight into how exact routes compare.

https://www.tesla.com/VehicleSafetyReport

Some buyers of [anything you care to name] are cheesed off and …

No worse than anybody else. Probably better. Provide numbers for your bogus claims.

There is no Tesla insurance problem. The article is just more BS. Tesla is even available to provide insurance directly to about half its US customers at this point, and the number is growing as they expand the program.

The silliness about battery packs is just ignorant speculation. Batteries are easily recycled. Essentially, old battery packs can simply be treated as high-grade ore. Check out Redwood Materials for how it’s currently going.

-IGU-

4 Likes

IGU,

Thank you for the facts and analysis. The jury had still been out in my mind on those items before your post.

That does not mean the competition won’t show up very strong in the next three years.

Which doesn’t tell you whether Autopilot makes the cars safer.

Tesla’s cars should have fewer accidents than average even if they didn’t have Autopilot, because they’re not representative of the average car. They’re vastly newer than average - meaning less time for them to start suffering the safety declines that come with vehicle aging, like worn brakes and tires. Some of the biggest contributors to accidents are weather and teenaged drivers - with Teslas being geographically concentrated in fair-weather California and economically concentrated among people who can afford $45K-120K new cars, which will disproportionately exclude teenagers. To say nothing of the fact that expensive cars are spec’d out better than the average car - so being better than average doesn’t tell you whether Tesla’s any better than the average $45K-120K new car.

And as for Autopilot…well, it’s entirely up to the driver to choose whether to engage Autopilot or not. Which means that the results aren’t representative of average driving. The driver is going to disproportionately choose to engage Autopilot in circumstances that Autopilot will do well in (such as fair-weather driving on normal highway conditions), and less so in the circumstances where accidents are more prevalent (bad weather conditions, unusual traffic conditions or disrupted traffic patterns, complicated environments to navigate).

IOW, Tesla’s comparison of its average accident rate to the national average accident rate tells us virtually nothing about whether Teslas in general, or Autopilot-driven Teslas in particular, are an improvement over baseline. Which is one reason why Tesla is so, so, so far behind in getting autonomy to market. Other AV entrants are racking up the miles in official Level 4 testing (where the car is self-driving all the time, not just when a driver picks and chooses to engage it), so regulators can actually assess whether they’re safer or not. Tesla’s losing years on getting regulatory approval by refusing to start doing that.
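To make the fleet-mix point concrete, here’s a toy sketch in Python. The numbers are entirely made up for illustration; the point is that two fleets with identical per-stratum crash rates can report very different overall rates purely because of where their miles are driven:

```python
# Toy Simpson's-paradox illustration: identical per-stratum crash
# rates, different overall rates, driven only by exposure mix.
# All numbers are invented for illustration.

# stratum -> ((miles_M, crashes) for national fleet,
#             (miles_M, crashes) for newer/premium fleet)
strata = {
    "new car, fair weather": ((100, 100),  (800, 800)),   # 1.0 /M mi each
    "old car, bad weather":  ((900, 2700), (200, 600)),   # 3.0 /M mi each
}

def overall_rate(fleet):
    """Crude crash rate (crashes per million miles) across all strata."""
    miles = sum(s[fleet][0] for s in strata.values())
    crashes = sum(s[fleet][1] for s in strata.values())
    return crashes / miles

print(f"national fleet:      {overall_rate(0):.2f} crashes/M mi")  # 2.80
print(f"newer/premium fleet: {overall_rate(1):.2f} crashes/M mi")  # 1.40
```

Both fleets crash at exactly 1.0 per million miles in good conditions and 3.0 in bad; the apparent 2x safety edge comes entirely from where the miles are driven.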

10 Likes

I agree with all your points on how we have no statistics that allow for a true apples to apples comparison. Maybe the problem, though, is that the national statistics don’t include all the required details, such as the car age, cost, driver age, etc.

I don’t see how your first sentence leads to Tesla being so so so far behind, can you explain?

This is factually false.
I took two different autonomous taxi rides in Las Vegas. The safety driver said he had to drive the car while on the hotel property (the harder part of the trip, IMO) because their city approval was only for city streets, not private property. I don’t know about other cities, but the first and last hundred feet can require fairly difficult driving, though it’s usually slow and not dangerous.

Mike

2 Likes

IGU,

Al’s analysis is even better, sorry.

Sure. Because they keep relying on the “apples to oranges” comparison, Tesla is years behind in compiling the product-safety data that will be necessary to obtain regulatory approval for a Level 4 or 5 autonomy program.

Cruise and Waymo have been operating fully-driverless Level 4 programs in certain California cities for several years now (Cruise just got permission to expand statewide). So they have years of “apples to apples” data - how the cars operate when a driver isn’t picking and choosing when to activate self-driving.

Tesla doesn’t have that kind of data. And I think it’s very unlikely that once the FSD system is “feature-complete” to Tesla’s internal satisfaction they will be allowed to just turn it on in a million cars. They’ll have to “show their work” to the government - which will mean doing the types of iterative programs to demonstrate safety that Cruise and Waymo have been doing for the last few years in California (not Nevada).

Given Musk’s disdain for regulators in other contexts, it may be that he genuinely is planning to just turn the system on once he thinks it’s done - counting on a million Tesla owners to pressure the government into letting them use the feature-complete system that cost them so much money. But I don’t think that’s going to be successful.

5 Likes

An infantile approach. Tesla will see a lot of courtroom time with that approach.

Um, no. I don’t see that at all. Seriously. I can see a million Tesla owners being p*ssed off that, after spending all that money, they still don’t have something they can use that passes regulatory hurdles.

Have a million Tesla owners paid for the FSD option? I highly doubt it. I think somewhere around 10 or 15 percent of purchasers buy that option.

Does anyone know what those regulatory hurdles are? Anyone have a link to the documents specifying what those hurdles are exactly? (I’m pretty sure those regulations don’t exist yet, so it’s impossible for any car company to make sure their designs meet those yet to exist regulations.)

3 Likes

Quite true. And when Rob Maurer talked about numbers not being available, that’s exactly what he was talking about. Making a meaningful comparison is tricky, and not really possible with what Tesla provides.

On the other hand, Tesla provides enough data to make it extremely clear that people claiming that Autopilot is dangerous are wrong. It’s relatively safe, and clearly improves safety in the exact same vehicles when drivers choose to use it.

This is just you making stuff up again. Behind whom? Nobody has gotten even a shadow of real autonomy to market. And I would argue that the path that most purveyors of attempts at autonomy are taking cannot ever get there (high-def mapping as a requirement is a dead end), so they’re really exactly nowhere.

Tesla is actually way ahead of everybody else.

This is just silly. Being ahead in “official testing” is meaningless. And what regulators say is also meaningless until something comes close to actually working. The miles that other “AV entrants are racking up” are puny numbers. Tesla will collect data on more miles driven in a single week than all other testers put together over their entire testing effort. You’re totally out of touch, as well as being misinformed.

Also, we know from Tesla’s recent “recall” of FSD Beta that there are currently somewhere around 400,000 FSD Beta users, racking up testable miles every day. Huge numbers compared to everybody else.

You keep writing about stuff you know nothing about. Maybe less than nothing. Tesla is collecting data all the time from (almost) every vehicle they have on the road, because they are all equipped with the hardware and they are all running some version of Autopilot. They run the software in the background when the driver doesn’t have it on. Tesla knows all the time when a driver’s decision differs from what Autopilot would do.
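Conceptually, the background comparison works something like this sketch (all names and thresholds here are hypothetical, just to illustrate the idea; nobody outside Tesla knows the actual implementation):

```python
# Hypothetical sketch of "shadow mode" logging: the planner runs on
# the same inputs the human sees, but its output is never applied to
# the vehicle -- it is only compared against what the human did.

from dataclasses import dataclass

@dataclass
class Action:
    steering: float  # steering command, radians
    accel: float     # acceleration command, m/s^2

STEER_TOL = 0.05     # divergence thresholds (assumed values)
ACCEL_TOL = 0.5

def shadow_step(sensor_frame, human_action: Action, planner, log: list) -> bool:
    """Compare the planner's would-be action to the human's action;
    log the frame when they meaningfully diverge."""
    planned = planner(sensor_frame)  # hypothetical planner callable
    diverged = (abs(planned.steering - human_action.steering) > STEER_TOL
                or abs(planned.accel - human_action.accel) > ACCEL_TOL)
    if diverged:
        # candidate for upload and offline triage / training data
        log.append((sensor_frame, human_action, planned))
    return diverged
```

The fleet-scale value is in the divergence log: the frames where the software would have done something different from the human are exactly the ones worth reviewing and retraining on.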

Please, please, please consider not posting about things you know nothing about. You’re just spreading misinformation.

-IGU-

3 Likes

But what does this actually simulate?

Does it mean that at every instant, say time t, the software simulates making a driving decision and records it (along with sensor data), while at each prior instant (times t-1, t-2, etc.) the driver actually made the decision? If so, the Autopilot decision depends on all of the driver’s prior decisions, because those prior decisions determine the vehicle’s current state (position, velocity, etc.), and Autopilot must respond to that current state.

This isn’t autonomy, or even a simulation of autonomy, because of the dependence on prior driver decisions, I think.

Or are you saying the software is just collecting data (images, other sensor data, etc.)? But even then, this collection of images and other sensor data measures what is sensed under driver behavior, not what would be sensed under autonomy.

I’m not saying that simulating autonomy in parallel with human driving, or collecting sensor data this way, is uninformative; I’m just wondering how well it mimics autonomy. I’m thinking not very well, because in machine learning the software needs to learn from its own decisions (and collect the sensor data that arises from those decisions).
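For what it’s worth, the worry I’m describing has a name in the machine-learning literature: distributional shift (it’s the motivation for techniques like DAgger). Here’s a toy sketch, with made-up numbers, of why a policy can score well in shadow mode yet drift badly once it acts on its own decisions:

```python
# Toy illustration: a policy whose per-step error looks small when
# scored on states the HUMAN produced (shadow mode) can still fail
# closed-loop, because its own small errors push it into states it
# never saw in training. All numbers are invented.

import random
random.seed(0)

def expert(pos):
    """Expert steers back toward lane center (pos = lateral offset, m)."""
    return -0.5 * pos

def learned(pos):
    """Imitation policy: good near center (where the training data is),
    with a small bias; extrapolates badly once outside that region."""
    if abs(pos) < 0.15:
        return -0.5 * pos + 0.05   # small systematic bias
    return 0.3 * pos               # off-distribution: steers the wrong way

# 1) Shadow-mode scoring: states are generated by the expert
pos, errs = 0.0, []
for _ in range(300):
    errs.append(abs(learned(pos) - expert(pos)))
    pos += expert(pos) + random.gauss(0, 0.03)
print(f"mean shadow-mode action error: {sum(errs)/len(errs):.3f} m/step")

# 2) Closed-loop rollout: the policy now acts on its own states
pos = 0.0
for step in range(300):
    pos += learned(pos) + random.gauss(0, 0.03)
    if abs(pos) > 3.0:
        print(f"closed-loop: left the lane entirely at step {step}")
        break
else:
    print(f"closed-loop: final offset {pos:.2f} m")
```

The small bias never shows up as a big number in shadow mode because the human keeps returning the car to states the policy knows. Closed-loop, that same bias walks the car out of its training distribution and the errors compound, which is exactly why I suspect shadow-mode agreement data, while useful, isn’t the same thing as evidence of autonomous capability.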