Tesla Autopilot, FSD and Robotaxi Safety Versus Human

Here is my analysis of human versus automated driving today, of the safety level needed to be much safer than human in the future, and of why failed disengagements would rarely be fatal. A truly much safer autonomous driving system would eventually cross the point where interventions become more dangerous than non-intervention. Think of a bad human passenger harassing a better human chauffeur.

Self-driving and robotaxi interventions are when human drivers take over from the AI driving system. Under China's regulations, and in Waymo's and Cruise's operations, remote operators handle interventions. China requires one human monitor per three cars.

Tesla operation on highways with Autopilot is already about eight times safer than human driving. Tesla recorded one crash for every 7.63 million miles driven with Autopilot engaged. For Tesla drivers not using Autopilot, there was one crash for every 955,000 miles driven. Tesla will only replace Autopilot with FSD for highway driving when FSD is statistically much superior to Autopilot.
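The "eight times safer" figure follows directly from the two mileage numbers quoted above; a quick sketch of the arithmetic:

```python
# Sanity check of the "eight times safer" ratio using the
# miles-per-crash figures quoted from Tesla's safety report.
autopilot_miles_per_crash = 7_630_000
human_miles_per_crash = 955_000

ratio = autopilot_miles_per_crash / human_miles_per_crash
print(f"Autopilot logs ~{ratio:.1f}x more miles per crash")  # ~8.0x
```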

Tesla FSD/Autopilot is already about eight times safer for highway driving than humans in the same vehicle. Tesla vehicles have superior passive safety systems.

Failed interventions (where humans miss a potential chance to "correct" the self-driving system) would most of the time not lead to a crash. Even failed critical interventions would usually not result in an injury crash. About 1 in 200 human car crashes results in a fatality; about 1 in 5 results in an injury.
There are about 1.2 million deaths and 50 million injuries worldwide each year, over roughly 10-20 trillion miles driven globally, about 3 trillion of them in the USA. In the USA, there is about one collision per 600,000 miles.
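Taking the midpoint of the 10-20 trillion mile range (an assumption; the source gives only the range), the global figures imply roughly the following per-mile rates:

```python
# Rough global per-mile rates implied by the figures above.
# The 15 trillion mile figure is an assumed midpoint of the
# 10-20 trillion range, not a sourced number.
deaths_per_year = 1_200_000
injuries_per_year = 50_000_000
miles_per_year = 15e12

miles_per_death = miles_per_year / deaths_per_year
miles_per_injury = miles_per_year / injuries_per_year
print(f"~1 death per {miles_per_death / 1e6:.1f} million miles")
print(f"~1 injury per {miles_per_injury / 1e3:.0f}k miles")
```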

Humans in San Francisco had the highest rate of reported-injury crashes, with 5.55 incidents per million miles, approximately three times the national average. Human drivers in San Francisco have a reported injury crash about every 180,000 miles, and a crash of any kind (injury or non-injury) about every 30,000 to 40,000 miles.

A Swiss Re study of Waymo demonstrated:

An 85% reduction, or 6.8 times lower rate, of crashes involving any injury, from minor to severe and fatal cases (0.41 incidents per million miles for the Waymo Driver vs. 2.78 for the human benchmark)

A 57% reduction, or 2.3 times lower rate, of police-reported crashes (2.1 incidents per million miles for the Waymo Driver vs. 4.85 for the human benchmark)

Waymo has one disengagement every 17,000 miles but only about one injury incident every 2 million miles.
Waymo has a police-reported crash every 476,000 miles vs. every 206,000 miles for humans.
Waymo has an injury crash every 2.44 million miles vs. every 360,000 miles for humans.
This data is adjusted for the San Francisco and Phoenix areas and for the mix of highway and surface-street driving performed by Waymo.
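The per-mile intervals quoted above can be recovered from the Swiss Re per-million-mile rates (miles between events = 1,000,000 / rate); a small sketch:

```python
# Converting the Swiss Re per-million-mile crash rates into
# "miles between events" and the relative safety multiple.
waymo_injury_rate = 0.41   # injury crashes per million miles
human_injury_rate = 2.78
waymo_police_rate = 2.10   # police-reported crashes per million miles
human_police_rate = 4.85

def miles_between(rate_per_million_miles):
    """Average miles driven between events at the given rate."""
    return 1e6 / rate_per_million_miles

print(f"Waymo injury crash every {miles_between(waymo_injury_rate):,.0f} miles")
print(f"Human injury crash every {miles_between(human_injury_rate):,.0f} miles")
print(f"Injury-rate advantage: {human_injury_rate / waymo_injury_rate:.1f}x")
print(f"Police-report advantage: {human_police_rate / waymo_police_rate:.1f}x")
```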

Tesla FSD is now mimicking the best drivers which means avoiding almost all dangerous situations entirely.

With new AI, voice commands will be possible. Passengers will be able to give voice commands to override and request a safe pause in operation (i.e., pull over) or to inform the system that they perceive an incident.

What are the types of disengagements for Tesla FSD? Mostly lane issues, wrong speed, map errors, or another vehicle.
This is sampled from crowd sourced data at Teslafsdtracker.com.

In 2023, an Ars Technica author read through every crash report Waymo and Cruise filed in California in 2022, as well as reports each company filed about the performance of their driverless vehicles (with no safety drivers) prior to 2023. In total, the two companies reported 102 crashes involving driverless vehicles. That may sound like a lot, but they happened over roughly 6 million miles of driving. That works out to one crash for every 60,000 miles, which is about five years of driving for a typical human motorist.
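The "five years of driving" comparison checks out if a typical US motorist drives roughly 12,000 miles per year (my assumption; the article does not state the annual-mileage figure it used):

```python
# Reproducing the "one crash per ~60,000 miles" figure and the
# "about five years of driving" comparison. The 12,000 miles/year
# figure is an assumed typical US annual mileage.
crashes = 102
driverless_miles = 6_000_000
typical_annual_miles = 12_000

miles_per_crash = driverless_miles / crashes
print(f"One crash per {miles_per_crash:,.0f} miles")
print(f"= {miles_per_crash / typical_annual_miles:.1f} years of typical driving")
```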

In addition to the statistics maintained by the California OTS, the California Highway Patrol’s Statewide Integrated Traffic Record System (SWITRS) also compiles car accident statistics. The SWITRS 2019 annual report, which is the most current report available, shows that there were 187,211 car wrecks involving an injury.

How many car accidents happen a day in California? The answer may not be knowable, but this statistic shows that there are more than 500 injury crashes a day in California. In addition, SWITRS recorded 3,737 people killed as the result of 3,438 total fatality wrecks. The 187,211 injury crashes resulted in 269,031 people being hurt in accidents.

All told, there were 190,649 car accidents in California in 2019 that resulted in either an injury or a death. Another 279,899 crashes resulted in only property damage. These numbers give an answer to those who want to know what percentage of car accidents are fatal in California. Just about 0.7 percent of all traffic collisions result in a fatality.
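The "about 0.7 percent" figure can be reproduced from the SWITRS 2019 counts quoted above:

```python
# Reproducing the California fatal-crash percentage from the
# SWITRS 2019 counts cited in the text.
fatal_crashes = 3_438
injury_crashes = 187_211
property_only_crashes = 279_899

total_crashes = fatal_crashes + injury_crashes + property_only_crashes
pct_fatal = 100 * fatal_crashes / total_crashes
print(f"{pct_fatal:.2f}% of California collisions were fatal")  # ~0.73%
```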

Back in February, 2023, Waymo released a report celebrating its first million miles of fully driverless operation, which mostly occurred in the suburbs of Phoenix. Waymo’s autonomous vehicles (AVs) experienced 20 crashes during those first million miles.

Waymo, with no driver in the car but with remote operators, has an intervention every 17,000 miles in hyper-mapped zones with lidar, and one crash every 50,000 miles.

The vast majority of the collisions were low speed and were due to human error by other drivers or because of illegal human activity, like spinning donuts in large intersections.

As of June 28, 2024, the California DMV has received 722 Autonomous Vehicle Collision Reports.

Here is a link to the California autonomous vehicle disengagement reports.

Here are statistics on Waymo and Cruise robotaxi crashes.

7 thoughts on “Tesla Autopilot, FSD and Robotaxi Safety Versus Human”

    • North America and Northern Europe are the best places to train and implement FSD. In those two places, (1) the majority of drivers are competent (not tunnel-vision like some other cultures) and (2) the majority of drivers actually follow the law/traffic rules.

      I’ve spent months at a time in cultures where the drivers followed the rules but the drivers were uni-taskers. They didn’t constantly check the rear-view mirrors or side mirrors or bother to understand what was going on around them. Their lack of awareness was startling.

      On the opposite end, I’ve also spent months at a time in cultures where the traffic laws were mere suggestions. Driving at high speeds on the shoulders of highways. Games of ‘chicken’ at stop signs and traffic circles. Pedestrians forcing traffic to stop by stepping in front of moving vehicles. Moped/scooters weaving in and out of roads and sidewalks. Crazy.

      I expect different forks of FSD for different countries/cultures. Even within North America and Europe, context matters…

      • Oh, you’ve been to the Philippines? I call it the land of “traffic suggestions”, because they sure as heck don’t drive like they actually had any traffic laws. I once had a taxi taking me to a hotel realize he’d passed the exit, and instead of turning at the next exit, he just drove onto the shoulder and backed up for a half mile on the highway.

  1. “Tesla recorded one crash for every 7.63 million miles driven with Autopilot engaged. Tesla human drivers not using Autopilot, there was one crash for every 955,000 miles driven. ”

    Keep in mind that the humans can decide when to engage the autopilot. It may be that they’re only letting it drive when things look easy, and driving themselves whenever it looks difficult.

    Ideally you’d try to evaluate driving conditions independent of who was in control, and compare human and self-driving under identical circumstances. Actually, there might be enough Teslas on the road to compare self driving and manually driven Teslas traversing the same stretches of road at about the same time.

    I’ve little doubt that the self-driving mode will end up safer than human. Certainly safer than me, I’m aging and my reflexes are slowing. The PR problem is that they’re probably going to have accidents under *different* circumstances than a human would.

    • “Different circumstances than a human would”? Does that mean our self-driving cars will come up with ways to crash us humans have not thought of yet? Really? Just when you thought us humans had the market cornered on the ways to kill others, and ourselves, on the road. From drinking and driving, not getting enough sleep, talking on your phone, to taking your cat to the vet when he was not in a kitty-carrier. Trust me, NEVER, NEVER DO THAT!! The latter is when terror and chaos create the perfect storm. Inside your car.
      Anyway…
      I believe in the not too distant future, insurance companies will give you a discount on rates if you let your car drive you. Technology makes such “control agents” easy to record, know, and monitor. Insurance companies, statisticians, the CDC, the stock market, intelligence agencies and others crave predictable events, and associated costs. A surprise is never fun. Never. As self-driving cars become the norm, insurance companies will be able to more precisely predict payments vs. premiums.

      To some predictable is boring. To others, anything not predictable is terrifying. Remember that, the next time you take kitty to the vet.

      • I said “PR problem”, because as far as I’m concerned, if the car is a safer driver than I am, I don’t much care that it gets into accidents I never would, because it’s doing it less often than I’d get into different accidents.

        But I’m not joking about worrying about what happens when a self-driving car (Without lidar, obviously!) comes to a wall with a tunnel painted on it.

        • I have driven about 1 million miles and been a passenger for another million miles. I have never seen a wall with a tunnel painted on it that was actually connected to a road.

          If the system is mimicking good drivers then it would be adjusting lane selection and speed and other actions that would minimize accidents and minimize the severity of accidents.
