
Is Tesla Miscounting Crashes To Make Autopilot Seem Safer? New Study Suggests It Might Be


We’re in an interesting stage when it comes to automated driving: the honeymoon period is over, and we’ve lost a bit of the hopeful optimism spurred by the rapid progress of the 2010s, the decade that gave us advanced Level 2 driver-assist systems like Tesla’s Autopilot, which has been predicted to go fully self-driving “next year” since 2014. Of course, that hasn’t happened, and Tesla’s Autopilot, like all Level 2 systems, requires constant driver supervision.

The industry seems to have realized just how much work remains to get past Level 2 and into unsupervised levels of driving autonomy, and getting there requires good data about how current systems are performing. To gather that data, the National Highway Traffic Safety Administration (NHTSA) has a General Standing Order requiring carmakers that sell cars with Advanced Driver Assist Systems (ADAS) to self-report crashes that occur while these systems are in use.


Tesla’s data has been particularly important because of the sheer number of ADAS-assisted miles its cars have driven: over three billion miles of on-road driving since 2015. A new safety report just released by Tesla shows improvement in Autopilot crash rates, but a peer-reviewed study in the journal Traffic Safety Research argues that the revisions Tesla has made to how it reports this data do not make statistical sense, and its author suggests that Tesla may be miscounting crashes in order to make Autopilot seem safer than it actually is.

Here’s what Tesla notes in their Q2 2024 Safety Report:

In the 2nd quarter, we recorded one crash for every 6.88 million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded one crash for every 1.45 million miles driven. By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every 670,000 miles.

…and, to compare, here was the statement from the Q2 2023 report:


In the 2nd quarter, we recorded one crash for every 6.18 million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded one crash for every 1.46 million miles driven. By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every 670,000 miles.

So, over the course of a year, the Autopilot figure went from one crash every 6.18 million miles to one every 6.88 million miles, a notable improvement of roughly 11 percent, while crashes for cars not using Autopilot held essentially steady: one per 1.46 million miles in 2023 and one per 1.45 million miles in 2024.

Here’s the abstract of the study, published in the journal Traffic Safety Research:

Abstract: Between June 2018 and December 2023, Tesla released quarterly safety reports citing average miles between crashes for Tesla vehicles. Prior to March 2021, crash rates were categorized as (1) with their SAE Level 2 automated driving system Autopilot engaged, (2) without Autopilot but with active safety features such as automatic emergency braking, and (3) without Autopilot and without active safety features. In January 2023, Tesla revised past reports to reflect their new categories of with and without Autopilot engaged, in addition to making small adjustments based on recently discovered double counting of reports and excluding previously recorded crashes that did not meet their thresholds of airbag or active safety restraint activation. The revisions are heavily biased towards no-active-safety-features—a surprising result given prior research showing that drivers predominantly keep most active safety features enabled. As Tesla’s safety reports represent the only national source of Level 2 advanced driver assistance system crash rates, clarification of their methods is essential for researchers and regulators. This note describes the changes and considers possible explanations for the discrepancies.

Noah Goodall, the author of the paper, is a Senior Research Scientist with the Virginia Transportation Research Council, and has published a large number of other papers about vehicle automation, traffic operations, and vehicular safety.

Goodall also tweeted about the study and its findings.


Essentially, what seems to have happened, according to Goodall, is that Tesla changed the way it classified recorded crashes. At first, Tesla grouped crashes into three categories:

• ‘Autopilot engaged’

• ‘without Autopilot but with our active safety features’

• ‘without Autopilot and without our active safety features’

So, the distinctions here are: crashes with the full Autopilot suite active; crashes with no Autopilot but with background safety features like automatic emergency braking; and crashes with everything turned off, effectively like the old deathtraps I tend to drive.

Later, again according to Goodall, Tesla changed the categories to:

• ‘Autopilot technology (Autosteer and active safety features)’

• ‘not using Autopilot technology (no Autosteer and active safety features)’

…and then finally simplified them again, to be pretty much a simple binary:

• ‘using Autopilot technology’

• ‘not using Autopilot technology’

For reference, the non-Autopilot ‘active safety features’ are systems like automatic emergency braking that run in the background whether or not Autopilot is engaged.


Essentially, what happened is that Tesla combined the non-Autopilot categories; one would expect the crash rate of the merged category to be, essentially, a mileage-weighted average of the two originals. But, as Noah Goodall points out, it isn’t: the combined numbers sit much closer to the crash rates seen with none of the safety features engaged. That would only make sense if people routinely deactivated safety features like automatic braking, which are enabled by default when the car starts and would take about seven button presses to disable at the beginning of each drive. According to Goodall, drivers who willingly turn off all the safety features would need to be driving over six times more than other drivers for the new crash rates to make sense.
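To make the arithmetic concrete: when two categories are merged, the combined miles-per-crash figure is total miles divided by total crashes, so it should land between the two original rates in proportion to how many miles each category contributed. Working backwards from published rates, you can solve for the mileage split needed to produce a given combined number, which is the kind of calculation behind Goodall’s factor of 6.3. Here’s a minimal sketch in Python, using made-up rates rather than Tesla’s actual figures:

```python
def implied_mileage_ratio(rate_with_asf, rate_without_asf, rate_combined):
    """Ratio of no-active-safety-feature miles to with-feature miles that
    would produce the observed combined miles-per-crash rate.

    All rates are miles per crash. The merged rate of two categories is
    total miles / total crashes:
        combined = (M_a + M_n) / (M_a/rate_a + M_n/rate_n)
    Setting r = M_n / M_a and solving for r gives the expression below.
    """
    return (1 - rate_combined / rate_with_asf) / (rate_combined / rate_without_asf - 1)

# Hypothetical rates in millions of miles per crash -- NOT Tesla's numbers:
r = implied_mileage_ratio(rate_with_asf=2.0, rate_without_asf=1.0, rate_combined=1.07)
print(f"no-feature miles would need to be {r:.1f}x the with-feature miles")
# -> about 6.6x with these stand-in numbers
```

The point of the exercise: for a merged rate to land that close to the no-safety-features rate, the drivers with everything switched off would have to be doing the overwhelming majority of the non-Autopilot miles, which is hard to square with features that are on by default.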


The upshot is that these new numbers appear to inflate the rate of non-Autopilot crashes relative to crashes with Autopilot enabled, making Autopilot’s record appear better.

From the conclusions of the paper:


The analysis presented as part of this note revealed counterintuitive crash rates, particularly in the ‘not using Autopilot technology’ category where the data suggested that crashes without active safety features were heavily weighted in the post-revision reports. This finding is surprising, as prior research suggests that drivers usually keep active safety features enabled.

Finally, the possibility of unreported data or exclusions in the revised crash rates was considered. Some evidence suggests that certain data might not have been included in the revised rates, potentially affecting the overall results. In order to produce the revised crash rates when combining with and without active safety features crashes, vehicles would need to have traveled at least 6.3 times further with no active safety features active. This is unlikely, as default active safety features are rarely deactivated by individual drivers. Furthermore, the supposed smaller sample sizes of ‘with active safety features’ do not exhibit smaller variances as would be expected.

In conclusion, Tesla’s revisions to its safety reports raise questions and warrant further investigation. The discrepancies observed in the crash rate data, especially in the ‘not using Autopilot technology’ category, highlight the need for greater transparency and clarity in reporting methodologies. Given the limited data on automated driving system crash rates, a clearer understanding of the data sources and classifications used by manufacturers like Tesla is essential for researchers, regulators, and consumers.
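A quick aside on the paper’s variance point: it rests on a basic property of count data, namely that the amount of driving in a category leaves a statistical fingerprint on its reported crash rates. Here’s a minimal simulation of that idea, treating crashes as a Poisson process; this is my illustration, not the paper’s method, and every number in it is a hypothetical stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)

def quarterly_rate_spread(miles_per_quarter, true_miles_per_crash, quarters=10_000):
    """Standard deviation of quarterly miles-per-crash estimates when crashes
    follow a Poisson process with a fixed underlying rate."""
    expected_crashes = miles_per_quarter / true_miles_per_crash
    crashes = rng.poisson(expected_crashes, size=quarters)
    crashes = np.maximum(crashes, 1)  # guard against a zero-crash quarter
    return (miles_per_quarter / crashes).std()

# Same underlying crash rate, very different exposure (hypothetical numbers):
large = quarterly_rate_spread(miles_per_quarter=500e6, true_miles_per_crash=1.5e6)
small = quarterly_rate_spread(miles_per_quarter=80e6, true_miles_per_crash=1.5e6)
print(f"spread with many miles: {large:,.0f}; with few miles: {small:,.0f}")
```

A category covering only a fraction of the total miles should bounce around noticeably more from quarter to quarter; Goodall’s argument is that Tesla’s published series doesn’t show the statistical behavior its implied mileage split would predict.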

Now, I’m not qualified to give an opinion either way; I’m just reporting what this particular paper says. But I can say that I agree it is extremely unlikely for drivers – of any kind of car, not just Teslas – to go out of their way to disable default active safety systems, unless they’re getting ready for track driving or something similarly specific. It’s just not something people normally do, because why would they? Why go through all that confusing effort just to make yourself less safe?

So, was this deliberate on Tesla’s part, to improve Autopilot’s stats? I can’t say, and Tesla doesn’t like answering reporters’ questions. What I can say is that this report raises some important questions that I agree merit further investigation. Plus, the paper is short, so you should read it for yourself!


68 Comments
Fuzz
1 month ago

This seems obvious to me. Autopilot only works on highway miles, which are easy and add up quickly. Accidents will be more limited. When not using Autopilot (city streets), where the vast majority of collisions occur, with more limited mileage, well, uhm, duh? Of course the easy freeway numbers will have fewer collisions. Far worse, perhaps, but fewer in number. This has been an issue I’ve noticed with Tesla since they began releasing these numbers. They are meaningless to compare, and should not be directly compared. So of course they do.

Drew
1 month ago
Reply to  Fuzz

That’s a known flaw in their comparisons. This appears to be an additional problem with how they’ve compiled data that should already be stacked in their favor.

Vetatur Fumare
1 month ago
Reply to  Fuzz

100% – This study may have uncovered further manipulation of the statistics, but the main issue is much larger and much more blatant. It’s like when charter schools refuse to take dumb/troubled/damaged students and then brag that their student body does better than public schools.

SaabaruDude
1 month ago

“Autopilot” vs “Autopilot technology”

That additional word is doing some very heavy lifting. Whether this is morally worse than any other sort of marketing scheme (Subway’s use of Jared’s weight loss, JD Power awards, etc.) is debatable and will be colored by the fact that Musk will have at least exerted indirect influence over the definition change.

DEcarTrouble
1 month ago
Reply to  SaabaruDude

That is Atlas-holding-up-the-Earth heavy lifting, but certainly not shocking for a major corporation (looking at you, tobacco industry).

John Beef
1 month ago

The other interesting stat here is one crash for every 670,000 miles driven. I’ve been driving for 31 years and likely have not driven 670,000 miles in my lifetime, yet I’ve been in 3 crashes while driving (only one was sort of my fault) and a couple more as a passenger.

Anoos
1 month ago

Is Tesla lying?

Does the Pope sh*t on a Catholic bear in the woods?

IRegertNothing, Esq.
1 month ago
Reply to  Anoos

Those papal butt cheeks aren’t going anywhere near some Protestant bear, I tell you.

Mechjaz
1 month ago
Reply to  Anoos

How do you shout “Martin Luther!!!!” in Latin?

Jason Lee
1 month ago
Reply to  Mechjaz

“Martin Luther!!!!”

MY LEG!
1 month ago
Reply to  Anoos

What a (horrifying) mental image!

Anoos
1 month ago
Reply to  MY LEG!

The video footage is even worse.

Jb996
30 days ago
Reply to  Anoos

Say it ain’t so?! Tesla lying? Well I never…. (Clutches pearls.)

Seriously though, there’s a minor chance someone messed up a spreadsheet equation, but it is Tesla, so most likely it’s an intentional lie.

Harvey Firebirdman
1 month ago

*insert Fry shocked gif* I mean what auto company hasn’t been lying lately? Hah

Space
1 month ago

Studebaker.

Rusty S Trusty
1 month ago

The demographic Elon has been trying to win over ever since he bought Twitter loves it when data is altered to align with their preconceptions or to fit their desired results. As he alienates more and more of Tesla’s customer base each day he needs his new online friends to start buying his cars and just making shit up has proven to be a very effective way to win them over.

Crank Shaft
1 month ago
Reply to  Rusty S Trusty

This description reminds me so very much of another public figure. It’s almost uncanny.

Amberturnsignalsarebetter
30 days ago
Reply to  Rusty S Trusty

Making shit up seems to be in fashion amongst some of Elon’s friends too. He was probably in the imaginary ‘helicopter crash’ too.

Codfangler
1 month ago

Maybe the folks at Tesla didn’t read Jason’s book.

Thevenin
1 month ago

So Tesla has:

  • Used an unusual EPA cycle to give inflated range ratings.
  • Programmed their range estimators to give inflated results that exceed even their EPA ratings when the battery is over 90% full.
  • Manipulated reliability metrics by rejecting warranty claims and fulfilling them as “good faith repairs” instead.
  • Manipulated their stocks by falsely claiming an imminent buyout.
  • Boosted delivery metrics by delivering unfinished vehicles and fixing them later.
  • Suppressed crash reports by shutting off Autopilot shortly before crashes and blaming the drivers.
  • Lumped Autopilot with other safety features to manipulate safety metrics.

I’m starting to think Tesla might just have a thing for fudging numbers.

Mechjaz
1 month ago
Reply to  Thevenin

“fudging” gets a bad rap and unduly denigrates fudge. Can’t we change this to “licoricing” or something? Use a bad thing to describe a bad thing, y’know?

T.B.A.
1 month ago
Reply to  Mechjaz

I for one fully support your proposed change to “licoricing”; I like your ideas and would like to subscribe to your newsletter.

ClutchAbuse
1 month ago
Reply to  Mechjaz

I love black licorice and it’s a scientifically proven fact that only the most attractive and successful people in the world find it enjoyable.

NosrednaNod
1 month ago
Reply to  ClutchAbuse

Those who like salty black licorice also have been scientifically proven to have a significantly above average sized penis.

Vetatur Fumare
1 month ago
Reply to  NosrednaNod

Admittedly only a sample size of one, but your data checks out in this Swedish household.

Mechjaz
1 month ago
Reply to  NosrednaNod

My mom likes black licorice. This raises too many questions, none of which will ever, ever be asked.

Mechjaz
1 month ago
Reply to  ClutchAbuse

*Looks at bank account, unanswered job applications, and mirror*

… Okay yeah you might be on to something here

MY LEG!
1 month ago
Reply to  Mechjaz

Ratfucking the numbers good enough?

Vetatur Fumare
1 month ago
Reply to  Mechjaz

I like salty black licorice and I also like circus peanuts. Is there something wrong with me?

Crank Shaft
1 month ago
Reply to  Thevenin

I’m hearing they will soon be releasing a taxi powered by AI specialized vaporware. It’s just sixty-some short days away. However, an even better version will be scheduled to be released early next year. Although, that version will also require some very slight tweaks to better than perfect so there will be another minor delay. After that there will be the initial public beta-test upon release of version 921.3 of their new Space-Cadet software capable of full-self-autopilot for the release next month of their revolutionary flying taxi which will self-summon from your local airport to your lawn in a future software update. Tesla is pleased to announce they are currently accepting $25,000 deposits on all upcoming planned versions and revisions of the imagined prototypes of the TFT (Tesla Flying Taxi) powered by Space-Cadet FSF (Full Self Flying) software powered by AIV (Artificial Intelligence Vaporware).

Note: All disputes subject to arbitration paid by Tesla.

Drew
1 month ago

It really gives you a level of confidence in the product when the Autopilot, generally used for highway driving, is compared against crash data from all non-Autopilot driving (which, presumably, includes all driving in conditions that are beyond its capability), and then that data is massaged to convey more of an advantage.

That level of confidence is zero. Perhaps less.

Data I’d prefer to see:

Direct comparison of crashes per highway mile driven.
Total crashes per mile driven by Autopilot users and non-users.
How long between last Autopilot use and crash for each recorded accident.

Ben
1 month ago
Reply to  Drew

How long between last Autopilot use and crash for each recorded accident.

This has always been my completely unsupported speculation for how Tesla fudges the safety numbers. Since technically the driver is always responsible for the car, they can just not count anything where Autopilot threw up its hands and shut off prior to a crash.

Crank Shaft
1 month ago
Reply to  Ben

A microsecond is a long time. If you can’t get out of an impending collision in that amount of time, that’s all on you. It says so right in the contract you signed.

Just ask the arbitrators we pay. They’ll tell you it’s all your fault.

Drew
1 month ago
Reply to  Crank Shaft

It says so right in the contract you signed.

Well, not the one you signed, but we pushed the new terms and you clicked through to use your car. But that original contract clearly stated we could change the terms at any time.

Amschroeder5
1 month ago

Shocked. Shocked I tell you.

-2020 MY Owner that would never be caught dead driving FSD and desperately wishes I could just get a normal non-‘smart’ cruise control to stop phantom braking and automatic speed set-point changes.

Gubbin
1 month ago
Reply to  Amschroeder5

I was wondering how they were counting ordinary cruise control usage, but it sounds like there is no such thing on Teslas. Ugh.

Amschroeder5
1 month ago
Reply to  Gubbin

Not since 2020 or so 🙁

RidesBicyclesButLovesCars
1 month ago
Reply to  Amschroeder5

I agree with needing the “dumb” cruise control. My wife and I both had cars with adaptive cruise control prior to our Teslas. Both cars had phantom braking issues and at times it was nice to just flip back to the dumb cruise and not worry about it.

James Carson
1 month ago

Dr Data at your service.

V10omous
1 month ago

RIP Tesla May (8:00 AM – 8:35 AM). We hardly knew ye.

Rad Barchetta
1 month ago
Reply to  V10omous

Wake up, Tesla, I think I got somethin’ to say to you
It’s 2024 and you really should be past Level 2
I know FSD is abused, but I feel I’m being used
Oh, Tesla, I couldn’t have tried any more

I plugged you in at home
Just to save me from going broke
You cooked the books and that’s what really hurts

Hillbilly Ocean
1 month ago
Reply to  Rad Barchetta

#COTD

Nsane In The MembraNe
1 month ago
Reply to  Rad Barchetta

*terrible Glenn Danzig impression*

I’VE GOT SOMETHING TO SAY

I CRASHED MY TESLA TODAY

EmotionalSupportBMW
1 month ago

Just throwing this out there, not a traffic scientist or anything. But Tesla has a history of deactivating Autopilot before a crash (https://www.motortrend.com/news/nhtsa-tesla-autopilot-investigation-shutoff-crash/). If this continued to happen, and Tesla was then placing these events in the “No Active Safety Features” category, would that account for the unexplained mileage?

Canopysaurus
1 month ago

Figures lie and liars figure.

More importantly, what happened to Tesla May? Does she like critters? Can she out-wrestle Elon? Is she related to the former British Prime Minister? Don’t leave us hanging.

Maryland J
1 month ago

Lies, damned lies, and statistics.

Totally not a robot
1 month ago

I’m painting with the biggest of brushes here, but IME Teslas seem to be driven by the worst drivers. Worst awareness of the outside world, worst attention to pesky details like traffic laws. I think the “autopilot” is an attractant for people who actively hate driving, and they see it as some sort of bandaid to do the driving for them, even if it isn’t even enabled.

RidesBicyclesButLovesCars
1 month ago

I’ll pick up your brush and assume the Tesla drivers in your area were former Altima drivers.

Clark B
1 month ago

I hear this about Tesla drivers a lot. Interestingly, that’s not how they are here. They drive like the stereotypical “Prius driver,” rarely exceeding any speed limit and driving so slowly in town that they’d hold up my 50hp Beetle. They drive how I imagine David drives in his Leaf: like they’ll run out of juice any minute.

On the flip side, it’s not unusual to have your doors blown off by a Prius going Mach Jesus in the fast lane.

CUlater
1 month ago
Reply to  Clark B

“… going Mach Jesus…”. Beautiful!

Dan Parker
1 month ago

The weird bit is that you can’t really tell what type of bad driver they’re going to be… Former Prius-driving hypermiler dragging ass? Sure. Would-be M3 owner ripping through traffic on the fwy? Uh huh. Soft-skulled super timid type driving “safely”? Of course! Charger/Challenger driver racing an Altima on the fwy? Absolutely. Once and future Altima driver chasing down a Charger on the shoulder? Also yes. Office drone applying makeup/shaving/eating breakfast on the commute? Why not?

Cerberus
1 month ago

Many are conspicuously like either Prius or Altima drivers, however, there are an awful lot of them I don’t notice, so I think they’ve probably still got a better record around me than CRV drivers (worst non-commercial drivers).

Nhizzat
1 month ago

My experience mirrors yours. If you live in Northern VA, you know I’m talking about a specific demographic.

Arch Duke Maxyenko
1 month ago

Tesla lying about things? My word, what is this world coming to?

Icouldntfindaclevername
1 month ago

The current headline text is making my head hurt LOL

10001010
1 month ago

Apparently there’s an employee named Tesla May who is in charge of tabulating all the crash datas.

Reasonable Pushrod
1 month ago

I’ve had a theory for a while that Autopilot disengages just before impact in a crash. This then allows Tesla to claim Autopilot was not in use during the crash.

Geoff Buchholz
1 month ago

Not just a theory: https://www.motortrend.com/news/nhtsa-tesla-autopilot-investigation-shutoff-crash/

Mind you, I haven’t found any follow-up to this in my brief look around … anyone else?

Reasonable Pushrod
1 month ago
Reply to  Geoff Buchholz

That definitely makes you question Tesla’s data.

My Goat Ate My Homework
1 month ago

I would be skeptical, but Tesla has a history of these sorts of things. For example: they report 99.95% “up-time” for their supercharger sites. But a SITE is a location (maybe a bank of 4 chargers), and they consider a “site” to be “up” if at least 50% of the chargers are working.
So, 49% of actual Supercharger chargers could be down and they could still technically report 100% site “up-time”. Shove the word “SITE” in there and what they are saying is technically true when you read the fine print.

You could roll up to a SITE with 8 chargers. 4 working (and in-use) and 4 broken. And, guess what, that site is still 100% “up”.

You have to read the fine print, not the tweets. That’s where all the truth gets hidden.

I wouldn’t be surprised if they started combining things in a way that hides the truth and only serves to make Tesla look better. Or, maybe this person is a Tesla short-seller and pedo.

Jaded Helmsman
1 month ago

In the last 9 years, I’ve used superchargers at about 75 different sites. All told, probably around 250 supercharging sessions. While there have been fewer than a handful of incidents with charging at any single station at any one of those 75 sites, I’ve never come across a supercharger site that was anywhere close to 50% down. If we use your estimate of 8 chargers at a site (and many I’ve visited were larger than that), that would be about 600 chargers with fewer than 5 individual chargers “down” in 9 years of me using them. I’ve never been unable to fast charge at a supercharger site.

I don’t have any doubt there’s some serious spin on the autopilot safety numbers being reported. I don’t have any reason to doubt that the supercharger availability numbers are much more accurate.

And Elon’s an ass who is actively harming his brand. Despite having owned two Teslas (a 2015 and a 2018), I won’t buy another if it in any way benefits him, and sold one and replaced it with a competitor.

My Goat Ate My Homework
1 month ago
Reply to  Jaded Helmsman

I’m not knocking the supercharger network; it’s the best out there. I was just providing one easily understood example of Tesla misleading consumers.

The issue is not “how accurate is it.” The issue is that it’s misleading, they know it is, and they purposefully choose to keep it that way in order to look better (they could easily provide statistics for individual charger up-time, but they don’t).

And it’s practices like my example that make it hard not to be skeptical of any data they share or statements/claims they make.

A. Barth
1 month ago

Is Tesla May Miscounting Crashes

“Tesla May” sounds like a marketing person from the deep south.

Dead Elvis, Inc.
1 month ago
Reply to  A. Barth

That sort of title gore is giving me flashbacks to the old site.

My Goat Ate My Homework
1 month ago
Reply to  A. Barth

Or maybe a Dr. Seuss character.

Captain Muppet
1 month ago
Reply to  A. Barth

Or a British Prime Minister.

Col Lingus
1 month ago
Reply to  A. Barth

My sister Daisy has entered the chat.

WR250R
1 month ago

Is Tesla may? Nay! In a bed of stocks and stonks they lay!
