No way. It will have to be much better, perhaps much much better. Failures will be headline stories, as they are now. A significant body of positive evidence will have to be established, and that will take time. There will be lawsuits against people “not driving” and against companies which encourage people not to pay attention (even as their fine print says “Pay Attention”). Legislation will be offered, debated, and perhaps laws will be passed.
It will take years and years to be widely adopted, not least because of legal issues, but also simply because of price: outfitting a car is going to cost thousands more (not even including the possible liabilities). And, of course, there’s the simple turnover of used cars, which takes more than a decade…
None of this is about “when it’s possible”, but even when it’s possible that doesn’t mean it will be in reach of most. (My view on when it will be possible, technically, reaches well out; then add much further out for it to become practical and accepted, and much much further for “common.”)
It’s already been almost a decade since Musk started talking about it, mislabeling the option “AutoPilot”, and then backtracking repeatedly admitting that it’s way harder than he thought. I think it’s even harder than that.
Probably. But once it’s as good, becoming much better will happen quickly. And insurance companies will be quick to tell us what the situation is by what they charge for their policies.
Of course. Truly autonomous vehicles effectively do not exist until the purveyors of such systems indemnify the occupants of the vehicle against any (reasonable) liability. There’s nothing like that on the road now.
Of course. And much already has. Sufficient so that autonomous vehicles can drive in much of the US at least. Well, that’s what I’ve been told. I have not verified.
The price of outfitting a Tesla to be fully autonomous is $0. All cars are sold with all the equipment. It’s just the software that’s not up to the task yet, and the price of the software is completely arbitrary. And people claiming the current hardware will never work would do well to go look at what happened with AlphaZero: once DeepMind figured out their new approach it needed a smaller and slower computer for a massively better result.
As to legal issues, that will be interesting. Elon Musk has been quite clear that he believes that once the use of autonomous software leads to significantly fewer deaths and injuries on the highways, it becomes a moral imperative to deploy that software. He has said that Tesla will deal with whatever lawsuits result. So who knows how that will go, but it seems likely that at least one company will be leading the way.
Tesla has also said that they are currently working on a vehicle that will cost about 1/2 of what a Model 3 costs to build, so they will presumably sell it for much less than a Model 3. I wouldn’t expect that until 2025, but it really depends mostly on how the market for vehicles and autonomy plays out.
Once new cars are much more capable than old cars, people won’t be able to get rid of their old cars fast enough. And, of course, many households that today find it necessary to have two cars will be able to get by with one.
You’ve piloted a plane on autopilot and a Tesla vehicle and found their “autopilot” capabilities very different? (Hint: it’s not mislabeled, it’s misinterpreted by people who’ve never used an autopilot system).
It’s been eight years since the first Teslas were delivered with Autopilot. I got one built the first week they built them in September, 2014. I had no idea I’d be getting it as they just put it in without announcing it – I got lucky that my order was filled a couple weeks later than they originally said it would be. In fact when I took delivery of the car and asked “What do these markings on this stalk mean?” I was told “We can’t tell you, but you’ll know soon.” But they certainly weren’t saying it would be fully autonomous back then, just do some cool stuff on limited access highways.
The actual problem is that he hasn’t backtracked (much). He keeps repeating that for sure it will work next year. So yeah, Musk has been very wrong about how difficult the problem has turned out to be.
A tepid defense is that estimating the time it will take to solve a software problem has always been pretty much impossible. As a software engineer, it was something I avoided to the best of my ability. One vaguely humorous rubric I remember was: whatever time you think it will take, triple that and then use the next units up. So if you think it will take two months, then say it will actually take six years, and a one day quick hack will really take three weeks. One addition to the rubric was: even taking this into account, it will take longer. And then, for extra accuracy, add in the well known rule that adding people to a late software project will make it later.
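For fun, the rubric above can be written down literally. The unit ladder is my own assumption; the rule itself (triple the number, then bump to the next-larger time unit) is as described:

```python
# Tongue-in-cheek implementation of the estimation rubric:
# triple the amount, then move one unit up the ladder.
# The ladder below is an assumption, not part of the original rubric.
UNITS = ["minutes", "hours", "days", "weeks", "months", "years"]

def pad_estimate(amount, unit):
    """Triple the amount and bump to the next-larger unit."""
    i = UNITS.index(unit)
    bigger = UNITS[min(i + 1, len(UNITS) - 1)]
    return 3 * amount, bigger

print(pad_estimate(2, "months"))  # a 2-month estimate -> (6, 'years')
print(pad_estimate(1, "days"))    # a 1-day quick hack -> (3, 'weeks')
```

And per the addendum: even after applying this, it will take longer.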
The real criticism I have of Musk in this regard is that he should have known better, and he should certainly know better by now. He should stop saying anything public about autonomy, as in STFU, or just leave it at “we’ll get there when we get there”. Because Tesla will get there eventually.
There’s no way for us to know this. It is, in fact, not a fact. YOUR Tesla likely requires an MCU2 upgrade if you ever want full autonomy. And there is no concrete reason to believe that more recent Teslas won’t also require hardware upgrades for true full self driving.
In my experience, there are often substantial lags in certain cases which indicate to me a lack of sufficient processing power even today without true autonomy.
You’re right that there’s no way to know this for sure until it happens. And it’s possible that it will never happen. But my concrete reason for believing this is that Elon Musk said early on that people who purchased FSD would get the full thing when it was delivered, and if any hardware upgrades were required they would be done free of charge.
They already upgraded my FSD computer from 2.5 to 3.0 for free (and lots of other people’s), so it’s not as though it’s simply blind faith.
And I already paid to get the MCU2 “infotainment” upgrade. There were enough advantages to it that I decided it was worth paying for it now rather than waiting for something to eventually break. I’ve had the upgrade for almost a year now, and I think it was worth getting, partly because Tesla keeps providing new features (not FSD related) that weren’t even thought of when I got my car almost five years ago, and some of them aren’t available with the older non-upgraded hardware.
Yup. We won’t know for sure until we know for sure.
Unless, as you yourself say, it requires new hardware sensors, differentiated from what already exists. And/or new software, unable to run on the current platform. And/or a whole new suite of hardware, OS, sensors, etc. It seems unlikely that the company will be able to retrofit everything they’ve ever sold. Not perfectly parallel, perhaps, but close enough to how Apple won’t retrofit your 1999 iPod to work with iTunes 14.8, because they can’t.
I was involved in the airline industry for several years. I know what “autopilot” means, and does. And yes, I have a friend with a Tesla and have driven it, and no, it does not do what “autopilot” in an airplane does. It has some of the characteristics, but not all. It’s mislabeled, as even the government has decided.
If they had called it “Driver Assist” I’m sure they would have no trouble, since that’s what it is. It’s not close to “autopilot” in any sense.
No, any hardware upgrades needed for FSD (for somebody who bought FSD) should be $0 regardless. That’s what they’ve said and what I expect they’ll do. But, of course, we won’t know for sure until it happens.
Remember, one aspect of Tesla growing deliveries by >50% every year is that most of the cars on the road are newer. My 2017 vehicles with FSD are from a year when Tesla delivered ~103K vehicles, and in 2021 they delivered ~936K. That’s a CAGR of over 73%. So the upgrade burden to Tesla becomes relatively small over time. Not even close to “everything they’ve ever sold”.
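The CAGR figure above is easy to sanity-check from the two delivery numbers quoted (~103K in 2017, ~936K in 2021, i.e. four years of growth):

```python
# Compound annual growth rate from two endpoints and a span of years.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

rate = cagr(103_000, 936_000, 4)
print(f"{rate:.1%}")  # about 73.6% -- "over 73%" as stated
```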
And, of course, they would only have to upgrade those vehicles for which FSD was purchased, best guess being 10-20% in the US and 1-5% elsewhere. So, relatively small numbers.
Sure. “Involved.” Were you a pilot? Or a liability lawyer? Pretty much any other kind of “involved” doesn’t mean much. But if you do know something, please describe who’s responsible when things go wrong, and under what circumstances. I’m ignorant, having never been a pilot or a lawyer of any kind.
And you drove a friend’s Tesla once (or maybe more)? That certainly makes you more informed than most who post here and pretend to know something. As to your experience, when driving did you use Tesla’s “Autopilot”? What vehicle? What version of the firmware? Was it Autopilot, Extended Autopilot, or Full Self Driving? Capabilities vary.
And in what sense does it not do what an airplane’s autopilot does? In even its basic configuration, Tesla’s autopilot drives nicely in long distance limited access highway situations without much interference needed. Sure, complex and emergency situations require the human to take over, exactly as with airplanes. So please be specific if you’re going to claim it doesn’t measure up.
No, the government hasn’t decided that. The NHTSA is conducting one of its endless attempts to find some problem with Tesla’s systems. They have not concluded anything and have moved from the “should we investigate this?” phase to the “let’s collect some evidence” phase. And it’s not even a “doesn’t measure up” investigation.
As with all of these situations, if you look you’ll find that vehicles that are not Teslas on Autopilot cause far more crashes and mayhem than Teslas on Autopilot. Whether this is also true when adjusted for the relatively small number of cars on the road that are Teslas on Autopilot remains to be seen. My guess (and it’s just a guess at this point) is that they’ll find that Teslas on Autopilot are safer than both Teslas not on Autopilot and cars that aren’t Teslas. We’ll learn more eventually. Maybe. These investigations have a way of just quietly disappearing when the NHTSA gets tired of them.
One problem with this sort of exercise is that if they ever come up with anything it will be long after there is (almost) nothing on the road that looks like what they’ve investigated. Tesla updates its software regularly, often with changes to Autopilot. In the case of my Model S that I got almost five years ago, I’ve accepted 90 software updates.
The NHTSA has forced Tesla to change some things, all for the worse. One was something about custom horn sounds that I didn’t pay much attention to because I don’t have the hardware for that.
The most obnoxious change, one that makes FSD less useful, is that they forced Tesla to not allow “almost” stops at stop signs. For a while, the FSD beta software was driving like a human, doing rolling (2mph) stops at stop signs where there was no traffic of any sort on the cross street; but the NHTSA said that this was illegal and so Tesla couldn’t do it, despite it being the way most humans do it, and it being perfectly safe. What this has meant is that the car now comes to a complete stop even when it’s pointless, annoying any following drivers, so it’s polite to override the system and then reengage it if anybody is behind you. Not a win for anybody.