Self-driving video/story

The New York Times has an interesting story on Tesla’s continuing development of Full Self Driving, including videos and animations of actual work outside the lab.

Much of the story is positive, but there are some cautionary parts as well, particularly the part where the software leaves the back of the car hanging out into the oncoming traffic lane as it prepares to cross a divided highway. And the near accident in a parking lot. And…

As Andrew Clare, chief technology officer of the self-driving vehicle company Nuro, put it: “It is not something you or I or our kids should be banking on to help them get around in cars.”

Still, at this point Musk was touting that by the end of 2022 it would have fewer accidents than a human driver.

And coming from CT I believe him. :rofl: :rofl: :rofl:


Weird take. Why is the “back of the car hanging out” particularly concerning when it was given as an example of a problem that was solved in an update? Seems more encouraging than concerning. And if the reason behind the improvement was understood, that it was fallout from an entire subsystem being replaced by another with a better understanding of the world, it would be even more encouraging.

Not to mention the further encouraging knowledge that the new subsystem runs faster and takes up less code space. Which further feeds the understanding that maybe the newer, better software can run happily on older hardware.

Again, weird. There’s nothing concerning about the driver intervening to actually drive a level 2 driver assist system like Tesla’s. It’s what you do all the time. If that was in fact a “near accident”, then with well over 100,000 cars using the software now, surely we’d have actual accidents all the time. But we don’t. It’s routine.

You left out the important context that his comment was regarding AGI, not software that autonomously drives your car. Very different things.

This piece was better than most in that it wasn’t filled with outright lies. But it was just another human interest piece really, as the reporters knew nothing about the technology and could only report on what they were shown and told, without any actual understanding.

When the article says “The most telling moment came as the car drove us to lunch,” what they mean is nothing more than that they got something they could report sensationally. That wasn’t even slightly a “telling moment” to anybody familiar with driving the software. It was routine and uninteresting, the sort of failure that’s around for a few releases and then goes away because things are improved.

There were no doubt a ton of “telling moments” during their drive, but they were completely unqualified to recognize them. To perceive them you have to know something of the why behind the behavior.

So, basically a report on a house tour from the society editor, who could tell you it would be a nice place for a party, rather than from a structural engineer, who could tell you important things about the house that you might want to know. And to pick on the headline, this piece certainly tells us nothing about “the Future of Autonomy”.

-IGU-


Because it’s one of those issues that shouldn’t have been an issue in the first place? Don’t all but the most careless human drivers know not to leave their car hanging out in the oncoming traffic lane?

The video clearly shows the car wandering all over the road, nearly clipping a parked car. (This is after taking a wrong turn and driving through a parking lot.)

Yes. Try not to be so defensive. It didn’t purport to be a line-by-line analysis of the code, nor an engineering perspective on how responsive the sensors are, or anything else. It was an interesting piece - written about a guy who is a beta tester (and has been for years). Not every story about a building has to describe what kind of bolts hold it together, and not every car story has to be a hagiography for Tesla.

For the record, there have been several, if not many, Tesla accidents. There is an MIT study showing drivers are less attentive and less cautious when employing the inaptly named “autopilot”. Musk claims it’s safer, but does so by conveniently ignoring and twisting the data. That doesn’t mean it should be proscribed, but I would like to see more oversight of a product that is overhyped and then tested on live, and often unknowing, human drivers and pedestrians outside of the test vehicle.

Anyway, interesting piece, although I see it ruffled your feathers, as everything but a pure A+ recommendation does.


You must have much better drivers where you are. Human drivers do stupid stuff like that all the time.

Chuck is a very experienced driver. He let it do things that a less experienced driver wouldn’t. And stuff he normally wouldn’t, as he’d take over much sooner. So, in real life testing, you never see that sort of thing because you’ve already intervened. As I said, there was nothing concerning. There was no accident. It was a joy ride.

The statistics show a net negative effect on accidents for Tesla on Autopilot, i.e. fewer of them. The baseline is not zero, it’s whatever humans would have done in the same situations on their own.

There is a long history of how to design systems that provide the right number of buzzers and flashing lights. Details matter, and the Tesla system keeps getting better.

Try not to be so offensive. There’s lots to criticize about Tesla’s currently-in-progress attempt to achieve vehicular autonomy. This piece just didn’t get any of it right, yet it was being interpreted in this thread as meaningful.

There’s constant nonsense spewed here and elsewhere, for example by you: inaptly named “autopilot”. That is so far down the list of real problems as to just be laughable.

Nope. Love to see good analysis. Never see it in the mainstream press. The main problem with Tesla’s system is that it isn’t getting better fast enough for me. While it’s much better than it was a year ago, it still has a long way to go.

About the only good thing this piece had was that Chuck Cook had a few things to say.

-IGU-

This is the sort of nonsense that Musk and his fanbois regularly tout, and it’s entirely untrue.

They wave around the total accidents-per-mile stat and compare it with human drivers without mentioning that upwards of 90% of Autopilot driving is on limited-access highways, while the vast majority of human-driver accidents happen in city or congested driving. Not remotely comparable, but a wonderful illustration of how to LIE with statistics to the unsophisticated.
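
To make that concrete, here’s a toy back-of-the-envelope sketch (all the rates and the human road mix below are invented purely for illustration, not real data) showing how a fleet that logs mostly highway miles can post a better aggregate accidents-per-mile number even if it is no safer than humans on any given road type:

```python
# Toy illustration with made-up numbers: the blended accidents-per-mile rate
# rewards whichever fleet drives the easier road mix, even when the
# per-road-type rates are identical for both fleets.

HIGHWAY_RATE = 1 / 1_000_000  # hypothetical accidents per mile, limited-access highway
CITY_RATE = 5 / 1_000_000     # hypothetical accidents per mile, city/congested driving

def blended_rate(highway_share: float) -> float:
    """Aggregate accidents per mile for a fleet with this fraction of highway miles."""
    return highway_share * HIGHWAY_RATE + (1.0 - highway_share) * CITY_RATE

autopilot_mix = blended_rate(0.90)  # ~90% highway miles, per the claim above
human_mix = blended_rate(0.40)      # assumed city-heavy mix for ordinary driving

print(f"highway-heavy fleet: {autopilot_mix * 1e6:.2f} accidents per million miles")
print(f"city-heavy fleet:    {human_mix * 1e6:.2f} accidents per million miles")
# Prints 1.40 vs 3.40: the highway-heavy fleet looks roughly 2.4x "safer"
# despite having exactly the same accident rate as humans on each road type.
```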

(Oh, PS: You asked if I know what a plane does in autopilot. Yeah. I was involved with Motorola, Inmarsat, and Boeing trying to place “live television” in aircraft in the late ’80s/early ’90s. My company already did the audio and video for American, some smaller domestic carriers, and a bunch of foreign ones; we bicycled CNN tapes (it was all tapes then) to dozens of airports every morning and loaded them on board, and we were thrilled at the idea of bouncing the signal into the plane live. We also moved movies and audio channels in and out, a real nightmare working around the world. Our technology was not the one that won, but yeah, I had a fair amount of time in the airframe and communication biz. And “AutoPilot” has a very specific and well understood meaning, and what Tesla offers ain’t it. It ain’t close, actually, and it’s entirely misleading.)


You then proceed to complain about some kind of misdirection that has nothing to do with what I wrote. I wrote “same situations” and you’re complaining that the data comparing different situations is not relevant. I made no such assertion.

Misleading to whom? Certainly not to anybody actually driving the cars, who have by the point they’re allowed to use the features, seen endless warnings that the car doesn’t drive itself, and you have to pay attention at all times. And it warns you again every single time you engage it.

Now I’m not a pilot, but I’m sure you can pull out your copy of the FAA’s Advanced Avionics Handbook (https://www.amazon.com/Advanced-Avionics-Handbook-FAA-H-8083-6-version/dp/1546879463), wherein I’m told the following words appear: “5. Be ready to fly the aircraft manually to ensure proper course/clearance tracking in case of autopilot failure or misprogramming.” Not exactly the same, but surely in the same ballpark.

In any case, there are so many inadequacies in Tesla’s Autopilot and Full Self Driving systems that the names are the least concern of anybody.

-IGU-
