
“Humanpilot” Was Super Deadly In 2020

People fought against seatbelts not that long ago. It was long enough ago, though, that an “anti-seatbelt movement” sounds absurd to us now. Cruise control got plenty of backlash a couple of decades ago as well, but now it’s in basically every new car and no one bats an eye. There was a big anti-cellphone movement when cellphones started getting popular. Who doesn’t have a cellphone today?

No matter what has happened in the past, many humans are naturally scared of new things, and that includes new technologies. That isn’t going to change — it’s embedded deep in our DNA. But that doesn’t mean we all have to accept illogical fear, or that all new technologies must be stifled by fearful humans stomping on them. Clearly, technology keeps evolving and rolling out to the masses. However, some tech does face counterproductive obstructionism that can really slow progress, even progress that can help save lives.

On the opposite side of the coin from our fear of new things: if something has been happening for a long time, even something that has obviously plagued us for a long time, humans have a separate tendency to simply accept it. It’s normal. Even if it’s deadly. A recent tweet from one of my favorite tweeters, Not_an_Analyst, pointed this out exceptionally well.

The clever wording here is of course referencing Tesla Autopilot and the more common alternative to Autopilot — “Humanpilot.” If one person per year dies in a crash involving Autopilot, many in the media go crazy, which triggers backlash well beyond the media. Yes, the loss of any life is tragic, but that doesn’t make the scoreboard read Autopilot 1, Humanpilot 0. In 2020, 23,395 passenger vehicle occupants died in the US. That’s more than 64 Americans a day dying in car accidents (just counting the people in the cars, not the many bicyclists and pedestrians hit by cars). Essentially all of those deaths could be called “Humanpilot deaths.”
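A quick sanity check on that per-day figure — a trivial sketch whose only input is the 23,395 total cited above:

```python
# Verify the "more than 64 Americans a day" figure from the 2020 total.
occupant_deaths_2020 = 23_395  # US passenger vehicle occupant deaths in 2020

deaths_per_day = occupant_deaths_2020 / 365
print(f"{deaths_per_day:.1f} occupant deaths per day")  # -> 64.1
```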

Perhaps there were one or maybe two Autopilot-related deaths in the past year, but I can’t find any such stories (I just went through our Autopilot archives for the past year to make sure my memory wasn’t betraying me, and I also searched through broader news reports of Tesla deaths). Also, I’m confident that we’d hear about an Autopilot-related death if there were one. If, indeed, there were no vehicle passenger deaths in the past year from a Tesla with Autopilot engaged, that would mean the actual annual death count was: Autopilot 0 — Humanpilot 23,395.

If we all had Autopilot on our cars by default and some company was introducing or expanding the use of “Humanpilot” as a replacement, is there any chance that effort would succeed?

Yes, these are absolute numbers, not rates of death, but if that zero was the actual total last year, then Autopilot’s death rate was exactly zero, whereas Humanpilot’s rate was clearly much higher.

[Note that there have of course been a number of Tesla drivers and passengers who have died in the past year. I looked at news reports of more than a dozen of these deaths and none indicated Autopilot was in use at the time — except for the false reporting on the Houston area crash that we covered extensively. Some were caused by another car jumping a median or driving in the wrong direction, some by a driver running a red light, and some indicated the Tesla was going at a very high speed and/or slid off the road while trying to make a turn — cases where it’s extremely unlikely Autopilot was engaged. One was a case of a Tesla driving into an overturned semi truck on the 210 Freeway in California, which could have been a Tesla on Autopilot, but there is no word of that in the news report. Of course, even if Autopilot was engaged, it would have been the driver’s responsibility to be paying attention at the time (just as it would be for a driver of another vehicle using cruise control), and presumably no one with enough time to react would see an overturned semi truck on the road and fail to brake or swerve. Overall, remember, be careful out there and don’t drive dangerously. It’s simply not worth the risk.]

Wreck technician James Law told us this week that there have been 38 deaths this year in the US from cars striking people or other cars at emergency vehicle scenes, and that none of those deaths were caused by Tesla vehicles.

There’s approximately one crash every 4.19 million miles for a Tesla driver with Autopilot engaged, per Tesla’s safety report. The national average is approximately one crash every 484,000 (~0.5 million) miles. That means that, mile for mile, for every 1 crash involving a Tesla with Autopilot engaged, there are about 8.7 crashes across the US fleet. Yes, there are flaws with this data comparison — Tesla vehicles sit in higher vehicle classes than the US average, and higher-class vehicles may get into crashes less often in general; the Tesla fleet is much younger than the overall automobile fleet; and more Autopilot miles are driven on highways, relatively speaking, where there’s reportedly (and logically) a lower rate of crashes per mile than in cities. Nonetheless, 1 to 8.7 is a tremendous ratio.
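For anyone who wants to check that ratio, here’s a minimal back-of-the-envelope sketch. The inputs are just the figures cited above (roughly 4.19 million miles per crash with Autopilot engaged, roughly 484,000 miles per crash for the US average), so the output is only as good as those assumptions:

```python
# Back-of-the-envelope comparison of crash rates per mile.
# Inputs are the figures cited in the article (assumptions, not raw data):
#   ~4.19 million miles per crash with Autopilot engaged (Tesla-reported)
#   ~484,000 miles per crash for the US national average

miles_per_crash_autopilot = 4_190_000
miles_per_crash_us_average = 484_000

# Invert "miles per crash" to get "crashes per mile" for each fleet.
crashes_per_mile_autopilot = 1 / miles_per_crash_autopilot
crashes_per_mile_us_average = 1 / miles_per_crash_us_average

# How many US-average crashes happen for every one Autopilot crash, per mile?
ratio = crashes_per_mile_us_average / crashes_per_mile_autopilot
print(f"Crashes per mile, US average vs. Autopilot: {ratio:.1f} to 1")  # -> 8.7 to 1
```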

The irony here is that there are certain people calling for Tesla Autopilot to be banned, investigated, and sent straight to Hell — do not pass go, do not collect $200. It would make more sense to try to get Autopilot into more cars.

There are an estimated 328,000 crashes a year in the US caused by drowsy drivers. People can’t just go out and drive with Tesla Autopilot on while falling asleep or drunk and expect to make it to their destination, but recent stories have shown that Autopilot can actually get such an incapacitated person stopped and rescued safely before something tragic happens. Here’s a case from Norway, for example. It’s hard to deny that Autopilot helped in this case. It’s easy to imagine that driver dying or causing another person’s death if he had been in another car. There’s also this case of a Wisconsin driver who fell asleep while driving and also didn’t end up crashing, thanks to Autopilot. That’s not to say every such story will end well, but “Humanpilot” apparently leads to 328,000 crashes a year from drowsy drivers, and I have to wonder how many of those crashes would have been averted if Autopilot had been on when the drivers were dozing off.

Two people died in a Tesla in the Houston area earlier this year. There were quick claims that Autopilot was on, leading to widespread calls (from both the political right and the political left) to investigate Tesla over this and perhaps shut down Autopilot in the meantime. It turns out that Autopilot wasn’t even on. “Nonetheless, that’s irrelevant — investigate Autopilot” seems to be the call to action some are making anyway. There was no real connection between Autopilot and the crash, but the impression that there was — in this one case — was enough to create or amplify the pressure to investigate Tesla Autopilot.

It’s hard to see how anyone neutral and objective could look at this situation and not think it’s a bit absurd. Everything I’ve seen indicates that Autopilot helps to save lives, and we all know that “Humanpilot” is an absolute disaster with a horrible safety record, yet people who supposedly want to protect lives are up in arms about the need to investigate and shut down Autopilot. As I wrote in April, “Tesla Autopilot Is Just Better Cruise Control — Anyone Who Thinks It Should Be Banned Is Acting Stupid.” Fake controversy and manufactured fear from a non-Autopilot accident are not a good enough reason to launch an all-out offensive on an advanced driver assistance system (ADAS) that is saving lives. Let’s not go down that road.
