Maybe a couple excerpts?
“They were supposed to be the future. But prominent detractors—including Anthony Levandowski, who pioneered the industry—are getting louder as the losses get bigger.”
And
“It all sounds great until you encounter an actual robo-taxi in the wild. Which is rare: Six years after companies started offering rides in what they’ve called autonomous cars and almost 20 years after the first self-driving demos, there are vanishingly few such vehicles on the road. And they tend to be confined to a handful of places in the Sun Belt, because they still can’t handle weather patterns trickier than Partly Cloudy. State-of-the-art robot cars also struggle with construction, animals, traffic cones, crossing guards, and what the industry calls ‘unprotected left turns,’ which most of us would call ‘left turns.’”
They are nowhere close to solving the weather problem: rain, snow, fog, and oh yes, darkness. And they’re even further away from having the AI do anything at all about “edge” situations.
What’s NOT an edge situation? Driving between the lines on a limited-access road without such inconveniences as traffic cones, pedestrians, kickballs rolling into the street, animals, drivers doing unexpected things, and lots and lots of others. It’s fairly easy to train on very normal streets; anywhere else, not so much. And, as one proponent points out, even if you get the car to navigate an edge situation once, there’s no guarantee it will do it the next time. Or the next. Or the next.
One edge situation described in the article: a flock of birds walking around on or near the road. A human understands that the birds will fly away; a robot car probably doesn’t, so it brakes. (The most common accident involving self-driving cars: the rear-end collision.) But suppose the birds aren’t birds but are, oh, say, cats, or gophers, or squirrels. The AI doesn’t (yet) distinguish among them, because those encounters are rare enough that there simply isn’t enough training data. (And no, simulations don’t cover all “edge” cases, even if you run them a million times. The “edge” case is the outlier, and you can’t train on a bazillion outliers.)
Warren Buffett famously said something like this: investing in airlines has been a net loss for investors since the beginning of time; from an investment point of view, it would have been better to torch the industry than to try to make a profit from it. It may be that “self-driving cars,” along with “flying cars” and a bunch of other gee-willickers things, are another of those cases: they just may never come true.