Autonomous Vehicles: What Happened?

Developing driverless cars has been AI’s greatest test. Today we can say it has failed miserably, despite the expenditure of tens of billions of dollars in attempts to produce a viable commercial vehicle. Moreover, the recent withdrawal from the market of a leading provider of robotaxis in the US, coupled with the introduction of strict legislation in the UK, suggests that the developers’ hopes of monetising the concept are even more remote than before. The very future of the idea hangs in the balance.

It’s now clear the autonomous vehicle revolution was overhyped.

A couple of caveats for those going apoplectic over the headline: I mean self-driving isn’t going to be a thing (a) in our lifetimes or (b) at any kind of ubiquitous scale. So in terms of the daily lived experience of most people reading this, truly autonomous vehicles just aren’t going to happen. The evidence pointing to this has been mounting for years now, if not decades, but it has now tipped the balance to where it’s hard to ignore for a reasoned observer, even one like me who has previously been very optimistic about self-driving’s prospects.

Just been talking about these vehicles. How would the police pull one over?

1 Like

The tech is “not there yet”. The revolution is still in its infancy.

3 Likes

I’ve personally lost all hope that self-driving consumer cars will ever happen. And I’ve stopped thinking I’d benefit from one, with the exception of long road trips, which we take less than once a year (and for which I could always fly instead).

There are lots of places this tech does make sense and is already in use: self-driving agriculture, mining operations, warehouses, etc. But Joe and Jane Commuter on public roads, going to work and the grocery store? Nope. We need to stop wasting money on this pipe dream.

1 Like

Well, the police are going to be robots, so they would use an OTA command to pull them over.

Andy

1 Like
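If police really did have an OTA channel, the receiving end might look something like this minimal Python sketch. The message fields, the shared fleet key, and every vehicle method below are invented for illustration; no real vehicle exposes this API:

```python
# Hypothetical handler for a police-issued OTA "pull over" command. The
# message format, the shared fleet key, and the vehicle methods are all
# assumptions made for illustration.
import hashlib
import hmac
import json
import time

FLEET_KEY = b"provisioned-at-factory"  # placeholder secret

def verify_command(raw: bytes, signature: str) -> bool:
    """Ignore anything not signed with the fleet key."""
    expected = hmac.new(FLEET_KEY, raw, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

def handle_ota_message(raw: bytes, signature: str, vehicle) -> None:
    if not verify_command(raw, signature):
        return
    cmd = json.loads(raw)
    if abs(time.time() - cmd["issued_at"]) > 30:
        return  # replay protection: drop stale commands
    if cmd["type"] == "PULL_OVER":
        vehicle.signal_toward_shoulder()
        vehicle.stop_at_safe_spot()  # shoulder, not a live lane
        vehicle.hazards_on()
```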

Why would they need to? Presumably they are designed to obey all traffic laws.

1 Like

Maybe a known criminal is getting a ride out of town.

Mike

Or there’s a malfunction and the car is a runaway. Or it’s part of an Amber Alert.

DB2

Most will stop if you step in front of them. How do you protect them from rustlers and clever thieves?

1 Like

One would hope self-driving cars are taught to move to the right (or left where that is appropriate) and stop for emergency vehicles.

Hopefully, they won’t be trained by watching the nitwits in my area who just stop in whatever lane they’re in, often blocking the road.

—Peter
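For what it’s worth, the yield behavior described above is straightforward to express as planner logic. A minimal sketch, with every Siren field and planner method hypothetical:

```python
# A sketch of the behavior described above: yield toward the shoulder and
# stop for an emergency vehicle instead of freezing in a live lane. The
# Siren fields and all planner methods are invented for illustration.
from dataclasses import dataclass

@dataclass
class Siren:
    detected: bool
    approaching_from_behind: bool

def emergency_yield(siren: Siren, planner) -> None:
    """Move right (or left where appropriate) and stop for emergency vehicles."""
    if not siren.detected:
        return
    side = planner.yield_side()  # "right" in most countries, "left" in some
    if planner.shoulder_available(side):
        planner.lane_change(side)
        planner.stop_on_shoulder()
    elif siren.approaching_from_behind:
        # No shoulder: keep rolling and clear the lane at the next gap
        # rather than stopping dead and blocking the road.
        planner.clear_lane_at_next_gap()
```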

This one didn’t:

2 Likes

That has nothing to do with the question. Nothing at all. It was not a self-driving car.

Are you daft? Or do you just like interrupting the adults when they’re having a conversation you don’t understand?

—Peter

6 Likes

Are you really going to try that with a car going 65 mph on a freeway?
It would probably work to get a couple of cars in front of it and gradually slow down, boxing it in from all sides.

Mike

1 Like
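The box-in maneuver at least pencils out physically. A quick calculation, assuming a gentle 1 m/s^2 deceleration (my number, not from the thread):

```python
# Back-of-the-envelope numbers for the box-in maneuver: escort cars ahead of
# a runaway vehicle slow down gradually from 65 mph. The 1 m/s^2 deceleration
# is an assumption: gentle enough that the box of cars stays intact.
MPH_TO_MS = 0.44704

v0 = 65 * MPH_TO_MS             # initial speed, ~29.1 m/s
decel = 1.0                     # m/s^2

t_stop = v0 / decel             # time until standstill
d_stop = v0 ** 2 / (2 * decel)  # distance covered while slowing

print(f"time to stop: {t_stop:.0f} s")     # ~29 s
print(f"distance needed: {d_stop:.0f} m")  # ~422 m (about a quarter mile)
```

So the escorts would need roughly half a kilometre of clear road ahead to bring the runaway to a stop without abrupt braking.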

The relationship is that something (software?) went wrong. It happened with the Tesla in Glasgow. In the question about autonomous vehicles, what would police do if one went ‘Glasgow rogue’?

DB2

Clearly, AI vehicles will have to have self-destruct mechanisms so the police can detonate them remotely.

The AI could even be programmed to auto-detect emergencies like runaway acceleration and disable the car without human intervention.

This runs the risk of a Skynet scenario, where the AI itself begins systematic detonation of all vehicles in order to kill their occupants.
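Detonation satire aside, the auto-detect part describes a real pattern: compare commanded acceleration with what the IMU actually measures and cut propulsion on a sustained mismatch. A sketch, with the threshold, sample count, and vehicle hooks all assumed:

```python
# Sketch of the auto-detect idea: compare commanded acceleration with what
# the IMU actually measures and cut propulsion on a sustained mismatch.
# The threshold, sample count, and vehicle hooks are all assumptions.
THRESHOLD = 2.0     # m/s^2 of unexplained acceleration
TRIP_SAMPLES = 10   # consecutive bad samples before acting (~1 s at 10 Hz)

class RunawayMonitor:
    """Trips only after TRIP_SAMPLES consecutive mismatches, so a single
    noisy sensor reading can't disable the car."""

    def __init__(self, vehicle):
        self.vehicle = vehicle
        self.bad = 0

    def update(self, commanded: float, measured: float) -> None:
        if measured - commanded > THRESHOLD:
            self.bad += 1
        else:
            self.bad = 0
        if self.bad >= TRIP_SAMPLES:
            self.vehicle.cut_propulsion()  # disable, not detonate
            self.vehicle.apply_brakes()
```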

Why destruct instead of merely disable?

Well, then. Shall we blame EVs for the software problem on the 737 MAX airplanes that went out of control and crashed a couple of years ago? That is just as related, because they both used software. By that logic, the little batch file I wrote a couple of years ago might be to blame, too. After all, that is also software.

To recap:

  1. Someone asked about the police pulling over a self-driving car.
  2. I suggested they should be programmed/taught to pull over in that setting.
  3. You noted that a car without self-driving capabilities went out of control.

I still fail to see the connection. So I’ll ask you the same questions.

Are you daft? Or do you just like interrupting the adults when they’re having a conversation you don’t understand?

—Peter

1 Like

You’d need both, but let’s say El Guapo has escaped and he’s making a run for the border. Much cleaner and easier to simply kill him than to disable his car and try to capture him. We kill people remotely all the time anyway. There are buildings in Nevada filled with people operating drones 24/7 doing nothing but looking for people to kill.

Inside the US you’d probably need a court order first, but compliant legislators could be induced to write legislation to the effect that, in emergency situations, the AI can destroy a vehicle if it determines that doing so would minimize the risk of human injury. Then define border jumpers as a human-injury risk that must be eliminated. And there is always a backdoor to these things: you could tell the AI your political opponents are injury risks and have them blown up. Most governments would mandate a device like this.

Certain jurisdictions could ban devices like this, but manufacturers won’t build special vehicles for such limited markets. They will install the self-destruct system but disable it at the factory. And the AI will figure out a backdoor that allows it to be re-enabled.

Not at all. Divitias asked how the police would pull over an autonomous vehicle. Then Neuromancer asked why they would need to, as the vehicles are programmed to obey traffic laws. Three or four examples were given where the police might need to, including the example of a rogue Tesla. The Tesla clearly wasn’t designed to go rogue. An autonomous vehicle also wouldn’t be designed to go rogue, but it could happen. Which raises the original question: how would the police handle the situation (or one of the other ones mentioned)?

All seems like a reasonable thread to me. Why be insulting?

DB2

7 Likes

A limited EMP would work. The only problem is limiting the area the EMP affects.

Design in a “dead man’s switch” equivalent that disables the battery connection. Not an electronic switch, but a physical switch triggered to open if <something> happens. No power = the car stops and cannot move until the battery connection is re-enabled.

Figuring out that <something> is not too hard.
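The trigger for that <something> could be as simple as a missed heartbeat from the drive computer. A sketch of the trigger logic only, since the point above is that the cut-off itself should be a physical switch; the Contactor stand-in and the 500 ms timeout are my assumptions:

```python
# Trigger logic for the dead-man switch: open a battery contactor when the
# drive computer stops sending heartbeats. This only models the trigger;
# the actual cut-off would be a physical relay, per the post above.
import time

HEARTBEAT_TIMEOUT = 0.5  # seconds without a heartbeat before tripping

class Contactor:
    """Stand-in for a physical relay in the battery line."""
    def open(self):
        print("contactor open: no power until the connection is re-enabled")

def watchdog(last_heartbeat: dict, contactor: Contactor) -> None:
    """Poll the shared heartbeat timestamp; trip once and stay tripped."""
    while True:
        if time.monotonic() - last_heartbeat["t"] > HEARTBEAT_TIMEOUT:
            contactor.open()
            return  # car stays dead until the battery is physically reconnected
        time.sleep(0.05)
```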