Latest comedic fantasy from Tesla Q2 2025 earnings call

It’s not an absurd statement, because L2 and L4 aren’t just empty labels. Cars that are L4 have the ability to perform actions on their own that L2 cars do not.

A car that is L4 can (and does) maintain 100% of the driving functions even when it encounters a problem. This means that if the car doesn’t know what to do, it doesn’t “give up” and turn over control of driving back to a human in real time - the car will take the necessary steps and continue to drive until it can come to a safe and complete stop.

A Level 2 system cannot or does not do that. If it encounters a problem, it won’t solve the problem by itself - it will revert control back to the human driver. AIUI, FSD is that type of system - if it disengages, it tries to hand control back to the driver. I’ve seen some suggestions on the internet that if the driver doesn’t take control after FSD disengages, the car might just slow down and stop immediately (which isn’t always the right or safe response, but better than continuing to run with no driver at all), but I couldn’t find anything that confirms that’s the case. Tesla’s official discussion is laden with warnings that the driver is always driving when FSD is engaged, that FSD can make the wrong choices and the driver always has to override when that happens, and that the driver has to be ready and able to take over if FSD has a forced disengagement.

If you think that FSD - or any L2 system - can or is actually driving itself, you’re mistaken. It is always relying on a human in the car to be doing part of the driving function. That’s what makes it different from systems that can actually do their own driving.
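If it helps to see that contrast spelled out, here’s a rough sketch in Python-style pseudologic. To be clear, this is purely illustrative - every function and method name in it is a placeholder I made up, not anything from Tesla’s or Waymo’s actual software.

```python
# Purely illustrative sketch of the two fallback philosophies.
# All names here are invented placeholders, not real Tesla/Waymo APIs.

def handle_problem_l2(vehicle):
    """L2: the human is part of the driving system, so hand control back."""
    vehicle.request_driver_takeover()      # chime + "Take over immediately"
    if not vehicle.driver_responded(timeout_s=5):
        # Last-resort behavior, not a planned fallback: slow and stop.
        vehicle.slow_to_stop()

def handle_problem_l4(vehicle):
    """L4: no human in the loop, so the car must resolve the problem itself."""
    vehicle.plan_minimal_risk_maneuver()   # e.g., pull over and out of traffic
    vehicle.execute_until_safe_stop()      # keeps driving until safely stopped
```

Same “problem” in both cases; the difference is who the system is designed to lean on when it happens.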

1 Like

These are absurd statements and assertions.

Why are they absurd? Level 2 systems are more limited than Level 4 systems. They can’t do everything necessary to drive on their own - they have to have a human being in the car to fill in for them for some of their driving functions.

Tesla makes it abundantly clear in its instructions that FSD is a Level 2 system and that there must be a human being actually driving the car at all times. These aren’t just empty words. They’re not just legalese. They’re not just disclaimers. They are an accurate description of the limitations of the system and the way that it is different from an L4 system.

1 Like

I asked ChatGPT if a Tesla could detect a dangerous situation, slow down, pull over, and come to a complete stop.

ChatGPT reports:
:brain: Summary:

| Scenario | Can Tesla pull over & stop? |
| --- | --- |
| Driver unresponsive | :white_check_mark: Yes — slows down and stops safely if possible |
| Software confusion / edge case | :warning: Sometimes — may stop in-lane or try to pull over |
| Dangerous traffic situation | :warning: Partial — may slow or stop but not always pull aside |
| Fully autonomous emergency | :cross_mark: No — Tesla is not Level 4 or 5 autonomous |

:spade_suit:
ralph

Hey, don’t take my word for it…

https://www.tesla.com/ownersmanual/modely/en_us/GUID-2CB60804-9CEA-4F4B-8B04-09B991368DC5.html

Like other Autopilot features, Full Self-Driving (Supervised) requires a fully attentive driver and will display a series of escalating warnings requiring driver response. You must keep your hands on the steering wheel while Full Self-Driving (Supervised) is engaged. While Full Self-Driving (Supervised) is engaged, the cabin camera monitors driver attentiveness (see Driver Attentiveness).

Full Self-Driving (Supervised) is a hands-on feature that requires you to pay attention to the road at all times. Keep your hands on the steering wheel at all times, be mindful of road conditions and surrounding traffic, pay attention to pedestrians and cyclists, and always be prepared to take immediate action (especially around blind corners, crossing intersections, and in narrow driving situations). Failure to follow these instructions could cause damage, serious injury or death. It is your responsibility to familiarize yourself with the limitations of Full Self-Driving (Supervised) and the situations in which it may not work as expected. For more information, see Limitations and Warnings.

NEVER make assumptions and predict when and where Full Self-Driving (Supervised) will stop or continue through an intersection or road marking. From a driver’s perspective, the behavior of Full Self-Driving (Supervised) may appear inconsistent. Always pay attention to the roadway and be prepared to take immediate action. It is the driver’s responsibility to determine whether to stop or continue through an intersection. Never depend on Full Self-Driving (Supervised) to determine when it is safe and/or appropriate to stop or continue through an intersection.

In situations where Autopilot is unable to steer Model Y, a warning chime sounds and the touchscreen displays the following message.

Take over immediately
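“A series of escalating warnings requiring driver response” is a pretty standard driver-monitoring pattern. Here’s a toy sketch of what such an escalation ladder could look like - the thresholds, stages, and end behavior are invented for illustration (only the “Take over immediately” message comes from the manual quoted above), not Tesla’s actual values.

```python
# Toy sketch of an escalating-warning ladder for a supervised (L2) system.
# Thresholds, stages, and end behavior are invented for illustration only.

ESCALATION_LADDER = [
    (3.0,  "visual reminder on the touchscreen"),
    (6.0,  "audible chime"),
    (10.0, "loud alarm: Take over immediately"),
]
DISENGAGE_AFTER_S = 15.0  # past this point, assume a forced disengagement

def warning_stage(seconds_inattentive: float) -> str:
    """Return the most severe stage reached for a given span of inattention."""
    if seconds_inattentive >= DISENGAGE_AFTER_S:
        return "forced disengagement: hand control back to the driver"
    stage = "no warning"
    for threshold, action in ESCALATION_LADDER:
        if seconds_inattentive >= threshold:
            stage = action
    return stage

print(warning_stage(7.0))  # -> "audible chime"
```

The point of the sketch is the shape of the logic: every branch ends with the human back in the loop, which is exactly what makes it an L2 system.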

1 Like

You guys are making more and more irrational statements. It is very obvious you guys have no experience with FSD.

These aren’t irrational statements. These are accurate statements about the limits of the system. Using the system might make you feel like the car is capable of driving itself, but that feeling isn’t an accurate picture of the car’s actual abilities and limitations. Indeed, one of the dangers of a very advanced L2 system (and even more prominently in L3 systems) is that it can mislead a driver into thinking that the car is capable of driving itself, when in fact the car still always requires an alert and active driver to be present.

Clearly a false statement, as Tesla has had a car with no human inside drive itself from the factory to its new owner. So, yeah, literally “a single vehicle” on a single route.

Maybe not much to brag about, but still.

1 Like

You do realize that Waymos get in accidents, including one where the car failed to brake for a delivery truck backing up, right?

It’s a good track record, but it’s not perfect.

Heck, sometimes a Waymo crashes into another Waymo, lol:

1 Like

True, and I forgot about that. They did do that one trip, one time.

Don’t want to get into another debate about the meaning of “deploy,” though, like we did with the humanoid robots back in the day!

1 Like

Irrelevant. At NO TIME can you depend on an L2 to disengage and safely come to a stop on the side of the road. When a Waymo doesn’t do it, it is news.

2 Likes

Whether it’s “news” or not, the fact that it’s not 100% on this “essential function” indicates a failure to meet your expressed criteria. Period. Full Stop.

As for “news,” Tesla accidents are certainly news, too. Matter of fact, I suspect more people hear news about Tesla accidents than have even heard of that Waymo v Waymo crash I linked.

2 Likes

I think that misunderstands the criteria.

For a Level 4 system, the car is not programmed to ever ask a person in the car to take control. That doesn’t mean the car will be perfect, nor that the car will never get into an accident. It means that the car’s functioning does not incorporate a human in the car into the driving process.

Level 2 systems are structured differently, because they do incorporate the person in the car into the system. Level 2 systems can (and do) ask the person in the car to take over driving functions, and can (and do) accordingly require that a person in the car be monitoring the decisions of the car at all times.

They operate differently - and that difference is one of the key differentiators between L2 and L4. Not just error rate, but how the decision-making systems are set up. Humans in the car are part of the driving process in L2, and are not part of the driving process in L4.
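Another way to frame it: SAE J3016 (the standard these level numbers come from) assigns the “fallback” - what happens when the system hits something it can’t handle - to different parties at different levels. This is my rough paraphrase of that breakdown, not the standard’s exact wording:

```python
# Rough paraphrase of how SAE J3016 assigns fallback responsibility.
# Wording is mine, not the standard's exact text.

FALLBACK_RESPONSIBILITY = {
    "L0-L2": "human driver - must supervise and be ready to take over at any time",
    "L3":    "fallback-ready human user - system drives, but may request takeover",
    "L4":    "the system itself, within its operational design domain (ODD)",
    "L5":    "the system itself, under all conditions",
}

for level, who in FALLBACK_RESPONSIBILITY.items():
    print(f"{level}: {who}")
```

That’s the structural difference being described here: at L2 the human is the fallback, at L4 the system is.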

2 Likes

I fully understand the difference between L2 and L4. What remains less clear are your statements here.

1 Like

Hmmm…perhaps I can explain myself better, if you tell me what you find unclear.

3 Likes

Two Waymo cars hit each other 4 days ago. And reportedly there were no contributing causes external to those two vehicles.

Waymo has remote operators, just like Cruise had. In case of issues, remote operators (humans) help.

Let’s grant for the sake of discussion that Waymo is not perfect and that not only have there been failures but there will be more in the future. Certainly no one here has claimed otherwise. But this “whataboutism” as it pertains to Waymo doesn’t change the fact that Tesla L2 is nowhere close to being L4 - as you yourself, a 2x Tesla owner, have eluded to many times.

4 Likes

I didn’t mention anything about Tesla or levels here at all. It just struck me as funny that a Level 4 vehicle would hit a second Level 4 vehicle with no difficult circumstances in the environment. No rain, no fog, no traffic, no bad road markings, no low-angled sunlight, no confusing signs, reportedly no failures in any sensors, etc. Only a failure in the driving system itself, because the first and most important rule of any driving system is “don’t attempt to occupy the same space as something else (another car, a person, an animal, a fence, a pole, etc.).”

You mean “alluded” here, and yes, I’ve said many times that Tesla FSD is neither F nor S yet. But I do have to say that with each new version, FSD does indeed get better and better. It still doesn’t pull into my driveway; I wish it had a way to learn from me how to do certain things for me. That video of a Tesla stopping at the proper position at a tollbooth (one that had no gate or arm), then allowing the driver to complete the transaction, and then automatically driving away after the “thank you” was exchanged, was rather impressive. I suppose that’s why they requested permission to “listen” a few versions back. However, the system they are running in Austin is reportedly very close to being a true autonomous driving system. Though I haven’t been able to try it myself yet.

2 Likes

Really? Can you elaborate on why you think that? What parts of L5 can you say it has already achieved or is close to achieving?

Personally, I think both companies are a long way from having L5, even within the narrow geofence, but I am very interested in why you think otherwise.

3 Likes