Is self-driving oversold?

China Pushed a Hard Sell on Autonomous Driving. After a Deadly Crash, It’s Pulling Back.

SHANGHAI—Beijing is slamming the brakes on China’s self-driving marketing frenzy.

As [Tesla] and other automakers have touted their assisted-driving systems with terms such as “autonomous” and “self-driving,” one deadly crash last month involving the technology has sparked a broad debate over its capabilities—and how the features are being portrayed to the public.

Coming in for particular scrutiny by Beijing’s regulators is the question of whether automakers are portraying the artificial intelligence-powered technologies accurately, as well as the related issue of whether the public fully understands how it should be employed.

It is a high-stakes moment for China’s electric-vehicle industry, which has upended the global automotive world with a wave of affordable, sleekly designed cars. Chinese EVs have forced Western automakers to fight for survival in China, the world’s largest automotive market, and increasingly to play defense in their own home markets. Washington has slapped sky-high tariffs on Chinese EVs to keep them out of the U.S., while China leads in [car-battery technology].

https://www.wsj.com/business/autos/china-pushed-a-hard-sell-on-autonomous-driving-after-a-deadly-crash-its-pulling-back-b12072e3

The article notes that Chinese authorities aren’t banning the technology, but they have called in the automakers to ensure that it’s being advertised and promoted accurately, and not with misleading terms like, uh, “full self-driving” or similar.

Farther down, the article mentions that there are over 100 car manufacturers operating in China, and that authorities have contacted 60 of them who are promoting the technology - not to stifle it, but to ensure safety so the public doesn’t sour on the whole idea. They still see “assisted driving” (or whatever term they settle on) as a valuable technology, and one where they can distinguish Chinese technology from most others.

Still, they note that most of these systems are operating at “Level 2”, a far cry from true “self-driving”.

10 Likes

BYD is giving self-driving away for free, but you must keep your hands on the wheel for it to operate.

2 Likes

I drove back and forth to the vascular specialists’ office yesterday using Tesla HW3 FSD. This was a 60 mi round trip on busy Interstate freeways, through 2 construction zones, and past a tradesman on foot in the left breakdown lane who was retrieving a ladder that had fallen from his truck. Never had to intervene once, and FSD was actually driving a bit more aggressively than I would. (I haven’t tried the “Hurry” mode in FSD yet.)

I’d say the technology is about 99% there. Elon’s toxic reputation is now probably delaying approval.

intercst

5 Likes

That is Tesla technology, which is not available to other brands.

The Captain

Good news is that DOT just slashed accident reporting requirements. An accident serious enough to render the car undrivable is no longer, by itself, enough to trigger a reporting requirement.

The logic is that we can’t compete with the Chinese if we have good accident data.

12 Likes

That’s crazy logic. Limiting data rarely results in improved outcomes. Making AD vehicles safer depends on gathering more information, not less.

After years and years of broken promises, how can we say that self-driving hasn’t been oversold? Unless we’re talking about how ridiculous promises made during earnings calls boost the stock price. Oh wait, that’s oversold too.

Of course it’s crazy logic. That was his point. And it is coming from the Trump administration, so it has to be crazy. This was the administration that wanted to stop testing for COVID after all.

4 Likes

You’d have no way of knowing. The typical U.S. driver goes 700K miles without a crash. So a system could still be 10x more dangerous than a typical driver, and the most likely outcome if you drove it a few times - or even a year or two - is that it would work totally fine over that limited window. And still not be 99% of the way there.
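
A quick back-of-the-envelope sketch of that point (the 700K figure is from the paragraph above; the 15,000-mile observation window and the constant-crash-rate assumption are mine, just to illustrate):

```python
# Minimal sketch: even a system 10x more dangerous than a typical driver
# would most likely look flawless over a year of personal driving.
# The 700K figure is cited above; the 15,000-mile window and the
# constant-crash-rate (Poisson) assumption are illustrative, not measured.
import math

typical_miles_per_crash = 700_000
system_miles_per_crash = typical_miles_per_crash / 10   # "10x more dangerous"
observed_miles = 15_000                                  # roughly a year of driving

p_no_crash = math.exp(-observed_miles / system_miles_per_crash)
print(f"P(no crash observed): {p_no_crash:.0%}")         # ~81%
```

So about four times out of five, a year in the driver’s seat would tell you nothing either way.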

11 Likes

Nonsense. The average US driver files a collision claim once every 18 years. If you assume 15,000 miles per year for the average driver, that’s 270,000 miles between collisions.

Tesla reports that vehicles with Autopilot engaged are driving 7 million miles between collisions.

https://www.tesla.com/VehicleSafetyReport

From what I see in the data, and personal experience in the vehicle, FSD is a very skillful driver.

Of course, you’d need to audit Tesla’s data for accuracy (they’re holding it as proprietary), but like I said, they’re 99% there.

intercst
(long a skeptic on Tesla claims)

1 Like

I have told the story before. The pump seal company hired a “management consultant” to do a study of the seal market and their company’s standing. The survey showed their company was not regarded well at all. The field reps were well respected, but the scores for product quality, price, and delivery were way down. So management played the “you don’t understand” card: they questioned the consultant’s methodology and threw the report in the trash, because it didn’t say what they wanted to hear.

So, yes, in my experience, it is perfectly possible for a “JC” to bury data he doesn’t want to see.

Steve

1 Like

Hey, I’m just going by what Tesla has said - 700,000 miles between collisions, which they attribute to NHTSA data. It’s not inconsistent with what you observe, though. Collisions usually involve multiple vehicles - most of them will involve at least two, and some of them will involve several more.
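
Rough arithmetic behind that reconciliation (the roughly-two-vehicles-per-collision figure is my assumption, not something from the thread):

```python
# If the ~700K-mile figure counts total miles driven per crash *event*, and a
# typical crash involves about two vehicles (assumed), then the miles between
# crash *involvements* for any one driver is roughly half that -- in the same
# ballpark as the 270,000 miles-per-claim figure cited earlier in the thread.
miles_per_crash_event = 700_000
vehicles_per_crash = 2                     # assumed average
miles_per_involvement = miles_per_crash_event / vehicles_per_crash
print(f"{miles_per_involvement:,.0f} miles per involvement")   # 350,000
```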

Right - because that’s looking at a Level 2 system. The car isn’t driving itself. The human is driving the car with the help of a driving assistant. That hasn’t been oversold - automated features can definitely help a human drive more safely. That in no way tells you how close the AI is to being able to drive the car by itself (say, with you in a passenger seat).

6 Likes

The car is driving itself if I’m not intervening in its operation.

The question is, how often do I have to intervene to avoid a crash, and how does that compare to an unaided human driver?

intercst

Not legal advice, but that’s not what’s actually happening. At all times, you are legally the driver. The car is just aiding you. If the car gets into an accident while you’re behind the wheel, it’s you that’s on the hook for the consequences of that accident.

That is an important question, but it’s not one that’s answered by Tesla’s “7 million miles between accidents” data. Autopilot does disengage, so that data is measuring the combined human-AP performance, not the AP alone.

Tesla doesn’t release that data. Crowdsourced efforts to collect that data (which Musk has referenced) indicate that FSD is several orders of magnitude short of the miles between incidents needed for safe driving - somewhere south of 700 miles between interventions, rather than 700K. That’s more than enough for you to take many, many trips between interventions - especially if you drive routes with very “ordinary” traffic patterns - even though FSD isn’t anywhere close to being able to function on its own yet.
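
To put rough numbers on that gap (the 700 and 700K figures are the ones cited above; the 30-mile “ordinary trip” is an assumed round number):

```python
# Illustrative arithmetic only: the 700 and 700K figures are cited above;
# the 30-mile trip length is an assumption, not data.
miles_between_interventions = 700          # crowdsourced FSD estimate
miles_needed_for_parity = 700_000          # human miles-between-crashes benchmark
trip_miles = 30                            # assumed typical trip

gap = miles_needed_for_parity / miles_between_interventions
trips_per_intervention = miles_between_interventions / trip_miles
print(f"Gap to the human benchmark: {gap:,.0f}x")                    # 1,000x
print(f"Typical trips per intervention: ~{trips_per_intervention:.0f}")  # ~23
```

Roughly twenty clean trips for every intervention is plenty to make the system feel finished from the driver’s seat, while still leaving it about a thousand times short of the benchmark.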

3 Likes

Then why isn’t it legal?

Are they waiting for 100%? Is that even possible?

1 Like

I understand the legal liability (i.e., that I’m liable for a crash whether FSD is engaged or not).

I’m talking about the actual technology. It may well be that FSD isn’t being approved because it’s too disruptive and would put too many people out of work.

intercst

1 Like

The Teamsters are certainly no fans of self-driving.

DB2

Of note: miles of driving without a critical disengagement* are less than 1,000 for all versions.

The percentage of drives with a critical disengagement is between 3% and 10% for all versions.

In this context, a *critical disengagement means the car will crash if the user does not intercede.

LOL! Seriously.

3 Likes

… and independent truckers.

intercst

(FYI…I don’t agree with the new reporting rules)

But the reporting rules send data to a government agency. This has little to do with gathering more information to make self-driving cars better or safer.
Any self-driving car company can still collect all the data it wants in a non-reportable crash and use it to improve its software. They would still have (I would think) all the collected video, telemetry, GPS, and other data needed to analyze the crash.
A line item in a list isn’t very useful to a company with a different ADAS implementation.

Mike

The community tracker is a problematic source, not only because it is a self-selected community but because there are intrinsic issues with such reporting that the community has not overcome. For example: who decides that a disengagement is critical? What would have happened without the disengagement? Did the driver or FSD initiate the disengagement? And so on.

1 Like