Risk of Tesla camera-only self-driving

Not really. Take the Minimal Risk Condition, for instance:

The characteristics of automated achievement of a minimal risk condition at Levels 4 and 5 will vary according to the type and extent of the system failure, the ODD (if any) for the ADS feature in question, and the particular operating conditions when the system failure or ODD exit occurs. It may entail automatically bringing the vehicle to a stop within its current travel path, or it may entail a more extensive maneuver designed to remove the vehicle from an active lane of traffic and/or to automatically return the vehicle to a dispatching facility.

That’s several “may entail”s and not a single “must entail,” and nothing about when one kind of MRC would be preferable to another. There are plenty of examples, but examples aren’t requirements.
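To illustrate how open-ended that language is, here’s a hypothetical sketch of the kind of MRC selection policy the standard leaves entirely to the manufacturer. Nothing like this appears in J3016; the function, inputs, and maneuver names are all invented for illustration:

```python
# Hypothetical MRC selection policy -- J3016 does not mandate anything
# like this; it only says a fallback "may entail" any of these maneuvers.

def choose_mrc(steering_ok: bool, propulsion_ok: bool, shoulder_clear: bool) -> str:
    """Pick a minimal-risk maneuver based on which capabilities survived."""
    if steering_ok and propulsion_ok and shoulder_clear:
        return "pull_to_shoulder"         # the "more extensive maneuver"
    if steering_ok and propulsion_ok:
        return "controlled_stop_in_lane"  # stop within current travel path
    return "immediate_stop_flashers"      # worst case: stop where we are

print(choose_mrc(True, True, True))   # pull_to_shoulder
print(choose_mrc(True, True, False))  # controlled_stop_in_lane
print(choose_mrc(False, True, True))  # immediate_stop_flashers
```

When one maneuver must be preferred over another is exactly the gap the standard leaves open.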

Contrast that with the FMVSS. You’re absolutely correct that it doesn’t specify how something is achieved, only what the performance requirements for that something are. Which is why, for instance, we’re not going to get a regulation that requires LiDAR. But the FMVSS gets very specific on performance requirements: for braking, for instance, there are specific speeds, vehicle weights, stopping distances, pedal forces, etc. But nothing on brake pad material or the shape of the pads.

You’re losing the forest for the trees. Tesla doesn’t need complete redundancy; it needs just enough to perform a DDT Fallback. I’ve not seen a regulation that defines specific DDT Fallbacks for certain conditions, and the lazy incorporation of J3016, as I showed above, doesn’t fill in those gaps.

Ah, so you want to eat your cake and have it, too:

How much of steering is part of the ADS, if nothing in the suspension is part of the ADS? A failed suspension arm may mean the car can’t move at all due to either no propulsion to the wheel, the wheel completely disconnected from the drivetrain, or the wheel just jammed in the wheel well. In any of these cases, the car won’t be able to perform some DDT Fallbacks, “like pulling off to the side of the road.”

As you said, “There nothing about being completely fault tolerant.” Picking and choosing what is or isn’t in the ADS isn’t part of any regulation.


EDIT: I should add that I agree that HW3 vehicles are unlikely to become L4 capable, as Tesla probably won’t even try. Maybe Tesla will take the time to apply for L3 certification for these vehicles, but that’s probably a lot of time and cost with little financial benefit for the company, so I think it’s unlikely to happen.

And yes, I do agree that Musk has and will continue to make promises that won’t ever be fulfilled.

As Tesla is likely focusing on CyberCab production (supposedly the line is close to completion for first vehicles now), it does seem likely that Tesla will apply for robotaxi approval using the CyberCab first. And I won’t disagree that Tesla may never apply for L4/robotaxi approval for existing vehicles, even those being built today. That Tesla already has some CyberCab prototypes with manual controls is only surprising in light of Elon’s statements in Isaacson’s biography, where Musk hammered down on any suggestion of a CyberCab version with manual controls. My guess is that it was subsequently pointed out to him that robotaxi approval means first running trials with human drivers ready to take over, which forced Tesla’s hand.

But, I don’t believe that lack of redundancy is what will prevent today’s HW4 vehicles from being used for driverless robotaxis.


If the board with the driving computers fails (which it has), can the vehicle perform a DDT fallback?

No, it can’t.

It’s actually unclear whether past failures are indicative of complete systems failure to the extent that the vehicle could not perform a DDT Fallback. Taking the Tesla-hating site Jalopnik’s reporting on the past failures, for instance:

When the computer fails, many vehicle features stop working, like active safety features, auto wipers, auto high beams, cameras, and even GPS, navigation, and range estimations. Tesla’s fix was to replace the computer completely, but sources also mentioned a temporary software fix to enable some of the features in the meantime.

That “temporary software fix” to enable some features indicates that Tesla can use software to work around some hardware failures, and that enough might be recoverable to allow a DDT fallback (a low-speed move to the side, or even just stopping in lane with flashers) to be performed safely.
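As a hypothetical sketch only (the subsystem names and the minimal set are invented, not Tesla’s actual architecture), the idea that some features can be lost while a fallback remains possible could look like:

```python
# Hypothetical: after a partial hardware failure, decide whether the
# surviving subsystems still suffice for a minimal DDT fallback
# (low-speed pull-over or stop-in-lane with flashers).

MINIMAL_FALLBACK_NEEDS = {"steering", "braking", "front_camera", "flashers"}

def can_perform_fallback(working_subsystems: set) -> bool:
    """True if every subsystem a minimal fallback needs is still working."""
    return MINIMAL_FALLBACK_NEEDS <= working_subsystems

# Features like auto wipers, GPS, and range estimation can be lost
# without losing the ability to come to a safe stop:
degraded = {"steering", "braking", "front_camera", "flashers"}
print(can_perform_fallback(degraded))                # True
print(can_perform_fallback(degraded - {"braking"}))  # False
```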


I’m pretty sure I read somewhere that Mercedes killed off their Level 3 system (a very limited system, reportedly).

[I searched and found this, but I read it elsewhere - Mercedes pauses Level 3 driving assistance – for now - electrive.com]

I can’t comment about the standard because I have near-zero familiarity with it. I’ve been out of the industry for about half a decade since I retired, and A LOT has changed in those years. But I can comment on what I think a vehicle should do in these types of cases. In general, ALL my comments here and on all social media are about what I think about things.

Yeah, an article on that came out 4 hours ago:

It appears they’re going all-in on the new Nvidia-based system, even though it’s only L2 (they call it L2++). The ODD (Operational Design Domain) for their L3 system was very restrictive.

I wonder what’s going to happen with owners of those vehicles? Will L3 continue to work? Will Mercedes continue to update the mapping for approved highways? Will there be software updates at all?

This is a highly complex topic with important terms like L4, L2, L2++, SAE, J30##, NHTSA, and hyper-exponential.

But if you break it down into a logical sequence, clarity emerges.

It goes like this:

  1. Autonomy is a solved problem.
  2. Release short-form demo video.
  3. Autonomous vehicles are just pending regulatory approval.
  4. Release short-form demo video.
  5. We’re deploying autonomous vehicles to 50% of the country.
  6. Release short-form demo video.
  7. We need 10 billion miles to solve autonomy.

That clears it up nicely.


That’s the way to get lost in the noise, like tracking stock prices hour by hour instead of quarter by quarter.

You don’t need to step back far to see Tesla’s progress over the past few years:
• Miles between FSD interventions have increased by more than an order of magnitude, and are still rising.
• FSD remains the only publicly available “point-to-point L2++” system that is not geo-fenced (Mercedes CLA with Nvidia AGX Drive won’t have point-to-point enabled until later this year, they say).
• Tesla Robotaxi trials have started.
• Tesla is building a production line to produce dedicated robotaxi vehicles at a scale well beyond the current leader in the field, Waymo.

As an investor, you can choose to focus on Musk being overly optimistic, which is certainly true, but you need to ask yourself whether that is actually adversely impacting Tesla’s progress. There’s no doubt that Elon’s timelines have been wrong, very wrong, but every other business leader in the space (from GM to Zoox to Mobileye) has also been wrong. Other leaders have learned not to make new predictions for timelines in this space; Elon hasn’t, because, well, it apparently still works for him in the stock market.

I do agree that Tesla’s promise of the “Tesla Network,” where a car you own can legally perform as a driverless robotaxi, is not in sight, as hardware limitations apply. But there are still two large markets: one for L3 personally owned vehicles and one for robotaxis. If Tesla needs different vehicles for each, that’s still two large and potentially highly profitable markets, and perhaps Tesla can merge the hardware for future vehicles. Yes, some existing owners will rightfully be upset, but the world is littered with cars sold on future capabilities that never materialized.

I think it’s also telling that the arguments for LiDAR, and now even radar and USS, have evolved from being an absolute driving-safety requirement to just potentially fulfilling a system-redundancy need. And so we discuss whether Tesla vehicles today have enough redundancy without them (and, tellingly, not whether Tesla can make them redundant without those technologies).

Those who continue to argue that non-camera sensors are needed for adverse weather conditions need to be aware of two aspects: 1) It’s perfectly allowable for an L4 car to state that weather conditions are beyond its ODD and simply choose not to drive, and 2) If weather conditions mean cameras aren’t good enough, then the system no longer has a redundant backup, since nothing takes the place of cameras for seeing lane lines, traffic lights, etc.
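The first point can be sketched as a simple gate. The thresholds below are invented for illustration; the only real requirement is that the ADS not operate outside its declared ODD:

```python
# Hypothetical ODD gate: an L4 vehicle may simply decline to drive when
# conditions fall outside its declared ODD. Numbers are made up.

def in_odd(visibility_m: float, heavy_precipitation: bool) -> bool:
    """Invented example ODD: decent visibility and no heavy precipitation."""
    return visibility_m >= 150.0 and not heavy_precipitation

def dispatch(visibility_m: float, heavy_precipitation: bool) -> str:
    if in_odd(visibility_m, heavy_precipitation):
        return "accept ride"
    return "decline: conditions outside ODD"

print(dispatch(500.0, False))  # accept ride
print(dispatch(80.0, True))    # decline: conditions outside ODD
```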


I don’t think Smorgasbord reads my posts, but it’s worth noting that the markets he’s describing are vastly different economic opportunities than the original expectation. If Tesla had been able to deliver an L5 system using existing hardware, then it would have had an insurmountable lead on everyone in the field. They’d have millions of vehicles already in the field, many already well-depreciated, that could operate as either self-owned or TaaS cars without any significant supporting infrastructure (a la Uber). You would be many, many years ahead of any competitor even if they had developed an L5 AI driver the same day you did.

But an L4 system that requires a new vehicle doesn’t have those advantages. You can only scale as fast as you can build the new vehicles. You have all the capital issues with having to build a new factory or production line for new cars, and (likely) owning those cars yourself. You have the same hindrance to scale that comes with building out a support network of infrastructure (arrangements for places and people to store, charge, maintain, clean, and ‘rescue’ the cars).

Is that what happened? I might have missed it in the thread, but did anyone backtrack on the arguments that LiDAR and radar and USS might be necessary for achieving a sufficiently high level of safety to enable autonomous use, and which vision alone cannot achieve? That they’re also providing redundancy that isn’t present in Tesla cars doesn’t mean the first argument doesn’t remain possible.


As it pertains to a robotaxi - how can a service be successful if it is so weather-dependent?

If I take a robotaxi to work, I darn sure want to be able to get home at the end of the day, especially if the weather is getting worse.

Granted, we don’t have any evidence of any automated service working in really bad weather yet so this isn’t a Tesla-only problem.


Well, it’s not “so weather-dependent,” because cameras around the car, with multiple front views (wide and far, etc.), are actually better than one set of human eyes.

There are weather conditions today in which no human should drive: whiteout conditions during snowstorms in the Sierras, heavy fog down Hwy 5 in California. Unfortunately, some people not only continue to drive in those conditions, they drive too fast for them. And so we get the 100-car pile-ups every now and then.

During “normal” bad weather conditions, cameras will be just fine - actually even better than humans. But, when conditions are really bad, a robotaxi that refuses to drive is actually a safety feature, not a bug.


What good is a camera if it is covered in snow or just really dirty from the dirty slush that gets tossed around after a snow?

Does Tesla have a way to mitigate that in an automated manner? I know I have to manually clean the cameras on my car if I want to see anything in them in those cases.

Sorry but I can still see to drive even when my cameras cannot. It is not uncommon for my backup camera to be totally obscured to the point that I feel the need to look over my shoulder - out my clean window - to back up safely.


So you’re moving on to something else. OK.

Tesla’s front bumper camera, already shipping in Model Y vehicles (maybe Model 3, too), has a built-in washer. The other cameras have a special slippery coating, but may also need a washer and/or wiper.

Again, you’re talking about something besides weather conditions. In bad weather, the cameras see better than you do. If the cameras aren’t clean, that’s a separate issue; Tesla has some automation for it, and maybe will add more. Also, camera field-of-view overlap can mean one dirty camera doesn’t take a car completely out of service, and in the worst case someone can be dispatched to clean the cameras, just as someone would be dispatched to change a flat tire.
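The overlap argument can be sketched like this (the camera names and angular ranges below are invented for illustration, not Tesla’s actual camera geometry):

```python
# Hypothetical field-of-view coverage check: if camera arcs overlap,
# one dirty camera may still leave the required forward arc covered.

def arc_covered(cameras: dict, required: tuple, dirty: set) -> bool:
    """True if every degree of `required` is seen by at least one clean camera."""
    lo, hi = required
    return all(
        any(a <= deg <= b for name, (a, b) in cameras.items() if name not in dirty)
        for deg in range(lo, hi + 1)
    )

# Invented arcs, in degrees off the vehicle's centerline:
CAMS = {"main": (-25, 25), "wide": (-60, 60), "narrow": (-15, 15)}

print(arc_covered(CAMS, (-20, 20), dirty={"wide"}))  # True: main still covers it
print(arc_covered(CAMS, (-50, 50), dirty={"wide"}))  # False: only wide saw +/-50
```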

At any rate, nothing you’ve brought up seems like an impossible hurdle for Tesla to overcome for its robotaxi efforts.

How is that something else? Aren’t bad weather and snow synonymous?

Exactly. Thank you for answering on topic. That was my point that the cameras may not have sufficient means to overcome bad weather.

I never implied that was an impossible hurdle. You are inferring something neither stated nor implied.


Bad weather and things stuck to a camera (or LiDAR) lens/exterior are not synonymous. They’re a Venn diagram with some overlap and some non-overlap. Dirt, mud, or snow stuck to USS sensors on bumpers can impair their operation, too.

When you write:

That was clearly aimed at dealing with the bad weather itself, not some resulting potential debris on the sensors’ exterior. What you’re now saying you meant was:

How can a service be successful if the sensors get dirty from weather?

Surely you can see the difference. Good luck with your future spins.

Dude, you are being pedantic. It is unbecoming.


Yeah, it’s a bummer when you attempt a spin by calling someone out for conflating “bad weather” and “snow” and being off-topic, but it turns out you yourself went off-topic and confused weather with sensor obstruction.

At any rate, welcome to my ignore list. You can respond with impunity now.

You are getting more and more desperate.

Snow? Bad weather? Yes, and let’s argue about the number of angels that can dance on the head of a pin, while we’re at it. You keep putting people on ignore and you’ll end up talking to no one but yourself.

But perhaps that is your goal?


Maybe you need to check the thread history to see who brought that up and who confused weather with sensor obstructions.

Keep up the personal attacks, or just ask, and you’ll be on my ignore list, too. If you folks want to chase me away, I’ll go away.

I’m already on his ignore list… but is he really not aware that weather is going to be a non-trivial factor in sensor obstructions? If not the main reason for them? Sensors have to be able to deal with water, snow, and mud covering their protective housings - conditions that are almost entirely going to occur when there is precipitation, not on a clear sunny day.


This discussion thread has reached its logical conclusion: talking about the weather and another user on an ignore list.
