Another *Autopilot* death blamed on the driver, of course

In settling the case of Apple engineer Wei Lun Huang’s death while using its so-called Autopilot, Tesla once again blamed Huang and got away with a sealed settlement that prevents outsiders from learning how much it paid after being held accountable for its false claims about Autopilot.

> Tesla’s lawyers asked the court to seal the settlement agreement so that the exact amount the company paid wouldn’t be made public. The company didn’t want “other potential claimants (or the plaintiffs’ bar) [to] perceive the settlement amount as evidence of Tesla’s potential liability for losses, which may have a chilling effect on settlement opportunity in subsequent cases.”
>
> Tesla confirmed shortly after the accident that Autopilot was switched on at the time of the crash, but it also insisted that Huang had time to react and had an unobstructed view of the divider. In a statement to the press, the company insisted that the driver was at fault and that the only way for the accident to have occurred was if Huang “was not paying attention to the road, despite the car providing multiple warnings to do so.”
>
> In the lawsuit, Huang’s lawyers pointed to Autopilot marketing materials from Tesla suggesting that its cars are safe enough to use on the road without drivers having to keep their hands on the wheel at all times.

> We took the image above from a video on Tesla’s Autopilot page, showing a driver with their hands on their lap.

I don’t think there is any question that the driver was at fault here. No hands on the wheel, distracted driving, and at high speed.

OTOH, and I have argued this for a long time, Tesla is also at fault. They market “auto-pilot”, not “driver assist”, and those have vastly different meanings. You cannot promise auto-pilot and then be shocked when someone takes you at your word.

I’d like to know what the settlement was (wouldn’t everyone?) but I doubt it will be enough to convince Tesla to start marketing their software accurately.


I don’t fault Tesla so much for the use of “autopilot”. The common, and pretty much only, usage of the word “autopilot” refers to a mechanism that assists pilots flying a passenger plane. There are no passenger planes without pilots, and no passenger plane flies without at least one attentive pilot in a seat up front. Also, Tesla Autopilot is a rather simple feature that almost all mid-to-high-end cars have: it is a glorified form of cruise control that maintains a maximum set speed, keeps a minimum distance to the car in front, and centers the car in its lane through gentle curves (it will NOT turn at a 90-degree angle even if that is the only way to go).

I do, however, fault Tesla for their use of “Full Self Driving”. It is not “full” in any sense of the word, and it is clearly not “self” even according to Tesla, though it does handle 99% of driving (quite well most of the time, mediocre sometimes, and terribly only rarely). I used FSD Beta for a few thousand miles last summer and was impressed, though not overly impressed. I wrote about my experience then. I am now using the current version of FSD Supervised (apparently they changed the name with this version) and it is really quite good. I regularly use it for trips of 2, 10, 20, even 50+ miles without any major issues. There are plenty of minor issues, though, and I report them almost instantly, many times per day (so much so that my family jokes that I should take a job at Tesla, since I am “working for free” now anyway).

For example, one trivial issue that I reported a few times yesterday is that FSD, when making turns, often crosses the lines on the inside of the turn. That wouldn’t be a problem at all in many places, but here, nearly everywhere, we have reflectors embedded in the pavement along those lines, so the vehicle ends up bumping over them almost every time. That’s annoying and adds to tire wear. Also, because most cars don’t drive there, road debris collects in those spots, which adds further tire wear and the possibility of damage.

Other trivial issues include extreme hesitancy before entering intersections and then, once in one, rapid acceleration to get out. I suppose it is safer to clear the intersection as soon as possible, but it still feels somewhat like a teen driver still getting the feel of driving.

But when driving down the road, even in traffic, it generally performs admirably and works very well for patient drivers. If you are impatient and change lanes all the time to move ahead a tiny bit faster, then it isn’t for you. I can see why Californians like Teslas so much: a typical CA driver is generally quite docile on the roads; they choose a lane and suffer the traffic in that lane a lot longer than a typical driver in NY, DC, Miami, or even Atlanta. I doubt FSD Supervised can be used in NY or in Miami; it leaves too much of a gap in front of it, and other cars will constantly be pulling in. Even north of Miami, where I’ve driven a few hundred miles with it, people will sometimes pull right in front due to the excess gap.

Another slightly more serious issue that I’ve reported is that it takes turns a little bit too tight, so there is a danger of hitting the curb if the road isn’t configured exactly as expected. So far, my [FSD-driven] car has only hit a curb once, on a slow-speed exit from a parking spot.

On the other hand, it has some astounding new features. Sometimes, when routing to a shopping area, it will literally enter the parking lot, find a spot, and pull into it! It won’t do that in my driveway, though; it just brings me to my street, and after that it’s up to me to park manually (the primitive way).


I have noticed that Tesla now seems to be referring to the program as “Full Self Driving (Supervised)” in some of their communications. Adding the “Supervised” part is at least a nod to alerting the consumer that the car isn’t supposed to drive without active human monitoring. It’s still not great, but at least it’s a bit better.
