
What do people think about the accident data dropped by Tesla?

InvisibleBlueRobot | 2025-11-16 02:28 | 98 views

Has anyone reviewed the accident data Tesla dropped/published? Any takeaways or comments?

Comments (48)
y4udothistome 2025-11-16 02:35

They have grok going over all info. Enough said!

nlaak 2025-11-16 02:52

Given their continual barrage of lies over the years, anything Tesla or Elon says is, IMO, immediately suspect.

Useful_Response9345 2025-11-16 02:54

Accident data, to me, doesn't mean much. It's the intervention rates that matter. How many times are drivers having to step in to correct sudden jerking behaviors? Driver + FSD is obviously always going to be better than driver alone (redundant system). But without the drivers there supervising, how often will the cars crash? I'm guessing Tesla doesn't want us to know, since they don't talk about it. Hence, robotaxis aren't happening any time soon.

ccie6861 2025-11-16 03:26

This is the right answer. Tesla has been publishing meaningless stats for years. They compare FSD miles to overall human miles. The human miles include all variety of conditions and situations. FSD miles are ideal conditions in which FSD will allow you to operate. No rain, no snow, no blinding lights, no weird roads it doesn't understand. I want to see it normalized for that. I would bet my salary it isn't nearly so impressive.
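The normalization point above can be sketched with toy numbers. Everything here is hypothetical for illustration, not taken from Tesla's actual data: the crash rates and mileage mixes are made up to show how an overall comparison can flip even when the system is worse in hard conditions.

```python
# Hypothetical crashes per million miles, by driving condition (illustrative only):
human = {"easy": 2.0, "hard": 10.0}   # human drivers in all conditions
fsd   = {"easy": 1.5, "hard": 12.0}   # FSD assumed *worse* in hard conditions

# Mileage mix: humans drive in everything; FSD mostly runs where it's allowed to.
human_mix = {"easy": 0.70, "hard": 0.30}
fsd_mix   = {"easy": 0.98, "hard": 0.02}

def overall_rate(rates, mix):
    """Mileage-weighted overall crash rate."""
    return sum(rates[c] * mix[c] for c in rates)

human_overall = overall_rate(human, human_mix)  # 2.0*0.70 + 10.0*0.30 = 4.40
fsd_overall   = overall_rate(fsd, fsd_mix)      # 1.5*0.98 + 12.0*0.02 = 1.71

print(f"human overall: {human_overall:.2f}, FSD overall: {fsd_overall:.2f}")
# The headline comparison makes FSD look ~2.6x safer overall, purely because
# of the mileage mix, even though it is worse in hard conditions here.
```

Under these assumed numbers the unnormalized comparison favors FSD while the per-condition comparison does not, which is exactly why normalizing by conditions matters.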

mrbuttsavage 2025-11-16 04:04

You cannot trust anything that comes out of Tesla. Take your pick of lying, lying by omission, or deliberately misleading, they do it all. Tesla accident data is generally the "deliberately misleading" type.

britzsquad 2025-11-16 04:22

Their methodology has been fishy in the past. Independent experts should review it. Elon musk is the king of lying with statistics.

FlipZip69 2025-11-16 04:39

Not only that, FSD only works in decent weather, which is when most accidents don't happen to begin with. How bad would FSD be in poor weather with no driver to step in? At the moment FSD reverts control back to the driver (in decent weather) on average once every 380 miles. That is nowhere near the level of a human driver.

Lacrewpandora 2025-11-16 04:51

Good validation of TSLA's SAE Level 2 driver assist. But zero value in projecting FSD's usefulness as an autonomous driving function.

Spillz-2011 2025-11-16 04:57

I don’t know if both is better. I’m better than 10-year-olds at math, but if I’m casually checking their work, errors are more likely than if I do all the work myself. I think we see similar stuff with LLMs: people trust them too much and, oops, bad stuff slips through.

Quercus_ 2025-11-16 05:30

The thing about the Tesla accident rate is that it is not an accident rate for FSD. It is the accident rate for the integrated system of FSD plus a supervising driver. Which means that accident rate tells us nothing about how good FSD itself is, even if Tesla is reporting it honestly.

Hobbes1001 2025-11-16 05:38

I don't trust any statistics that I haven't manipulated myself.

Hobbes1001 2025-11-16 05:42

How often does a human driver revert control back to a driver?

Hobbes1001 2025-11-16 05:44

> Driver + FSD is obviously always going to be better than driver alone. (redundant system)

I don't know about that. If the driver trusts the system and isn't paying attention, then Driver + FSD would be much worse than driver alone.

bazookateeth 2025-11-16 06:27

Billionaires lie? It's almost like they will do anything for more money, which is how they got to where they are today.

OilAdministrative197 2025-11-16 07:58

I think most open-source data on Tesla cars shows that when using FSD they are much more dangerous than letting people drive. Makes sense given how poor AI is compared to humans at these tasks.

FlagFootballSaint 2025-11-16 09:11

Soooo…. You think you can trust data they publish? https://variety.com/2025/film/global/elon-musk-unveiled-documentary-trailer-whistleblowers-1236574785/

rellett 2025-11-16 09:30

All cars should have a mandatory black box, and after any accident the data should be transferred to read-only storage so it can't be deleted or edited. I don't trust car manufacturers, especially Elon's FSD scam.

gwenver 2025-11-16 11:55

You lost me at "by Tesla"...

gwenver 2025-11-16 11:56

Yup. Recent history has definitely shown us morality is for losers.

Useful_Response9345 2025-11-16 13:25

👍 Yes. I use the student driver analogy a lot. FSD is a lot like a teen driver who can manage well enough with a teacher there to intervene.

Mecha-Dave 2025-11-16 13:42

My wife has called me outside to help her park before

Ouch259 2025-11-16 13:45

The cost to insure a Tesla is about 40% higher than other cars. That should give you a hint.

GiveMeSomeShu-gar 2025-11-16 13:57

Yeah, the reported rate is for FSD, which apparently means "fully supervised driving". Tesla has yet to operate a fully autonomous car, so it's rich when Elon calls Waymo's 2,500 autonomous cars "rookie numbers".

nlaak 2025-11-16 14:12

> Recent history has definitely shown us morality is for losers.

Sadly, that's not recent.

nlaak 2025-11-16 14:14

> It's almost like they will do anything for more money which is how they were able to get to where they are today.

There have been studies showing that highly driven leaders, like CEOs and billionaires, often have traits associated with psychopaths.

Afraid_Sample1688 2025-11-16 14:57

People have looked at this data in detail over time.

1. Rich people drive Teslas. Rich people have fewer accidents.
2. FSD is mostly run on highways. People have fewer accidents on highways.
3. The data went missing for a while. That's when FSD was really struggling.
4. FSD will disconnect right before a collision. That leaves the driver to handle the emergency. It's not clear in their data how that is handled - likely shown as a non-FSD collision.

CptCoe 2025-11-16 16:22

Except that the teenager rarely gives up suddenly on their own!

CptCoe 2025-11-16 16:25

Also a hint about how costly they are to repair

Useful_Response9345 2025-11-16 16:27

True.

olyfrijole 2025-11-16 17:52

They're facing multiple lawsuits related to safety, so you can expect that any data they release is carefully selected to support them in court.

eclwires 2025-11-16 18:14

🐂💩

InvisibleBlueRobot 2025-11-16 19:23

This was my first thought as well.  How can we trust or believe any of it?  I was hoping someone with some expertise in the space might have some interesting comments on what was shared and the questions they have.

[deleted] 2025-11-17 08:49

They hide so many accidents and data. You can't trust a word they say.

[deleted] 2025-11-17 08:51

Amen. And when they have an accident, it turns off. DRIVER'S FAULT!

Few-Masterpiece3910 2025-11-17 12:53

The only thing it's good for is comparing it to itself over time. As long as they apply a consistent methodology, it can inform us how the system is improving. The comparison vs cars without FSD is limited, as Tesla does not have the same data for other cars, so they are unable to make a proper apples-to-apples comparison.

DistributedView 2025-11-17 12:54

EU cars already have this. It's called Event Data Recording and records a minimum of 5 seconds before and after a collision is detected. It must be downloadable from the OBD port. Tesla is uploading full pre-crash data to their servers and then deleting the vehicle's copy. (As discovered in the recent lawsuit where someone did a data recovery and found the deleted log files on a car Tesla claimed had no data.) I have no idea if they leave a stripped-to-the-minimum data set on board for OBD download to comply with EU rules, or are just non-compliant and it hasn't been called out yet.

InvisibleBlueRobot 2025-11-17 17:08

I guess this is exactly what I am asking. Do we know they are hiding accidents? Is there any evidence of this? How? Because Tesla is its own insurance company for a lot of Tesla drivers, they can potentially hide accident information more easily than almost any other carmaker. But this is just a potential issue; I have no way of knowing (other than Musk's history of lies and partial truths) whether it is actually happening.

InvisibleBlueRobot 2025-11-17 17:10

According to the release, they claim the data includes all accidents in which FSD was enabled in the 5 seconds prior to the accident. If Tesla is honest, then these accidents are included; they specifically posted about this and made this exact statement.

InvisibleBlueRobot 2025-11-17 17:15

They stated in the dump (or on the platform formerly known as Twitter) that the data includes accidents in which FSD was used in the 5 seconds prior to an accident, to make sure they were not eliminating FSD-related accidents. It was this post that actually made me pay attention to the data dump and post the question. Otherwise I would have assumed it was all lies, like everything else posted by Musk.

InvisibleBlueRobot 2025-11-17 17:15

The data findings are now.... racist.

InvisibleBlueRobot 2025-11-17 17:17

This is the kind of thing I was interested in hearing about. Terrifying.

[deleted] 2025-11-17 17:57

Hahah, no. The family that sued Tesla over FSD was a prime example of how those things go. Tesla CLAIMS transparency, but the facts are:

- They said they have no accident data (as so often)
- Then it was discovered there IS data on the car's modules
- Tesla sent an engineer who said, 'it's no use, it's damaged!'
- The family hired a hacker, and THERE WAS ALL THE DATA, in top condition too!

That is the short version of it! This was after a years-long battle with Tesla via lawyers!! ANYONE who believes they include accident data is an idiot! TESLA WILL LET YOU AND YOUR FAMILY SUFFER IF IT SAVES THEM A BUCK! And Elon is PROUD OF IT! [https://www.theguardian.com/technology/2025/aug/01/tesla-fatal-autopilot-crash-verdict](https://www.theguardian.com/technology/2025/aug/01/tesla-fatal-autopilot-crash-verdict)

InvisibleBlueRobot 2025-11-17 18:49

I just looked up some data. Tesla claims they have 1/7th the number of accidents of a human driver, which is equivalent to about 14.3% of the human accident rate (assuming everything else is equal). In the real world, only about 16% of accidents occur on interstates and freeways. I am not sure how Tesla calculated the chance of an accident on a freeway/highway, but if they are comparing against all accidents, they could nearly reach that number through data selection alone: if FSD miles are mostly freeway/highway miles, and the baseline is "all accidents" on all road types, then driving only on highways gets you most of the way to 14.3% before FSD does anything impressive.
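The back-of-envelope above can be written out directly. The numbers are taken from the comment itself (the claimed 1/7 ratio and the ~16% freeway share), not verified against Tesla's actual release:

```python
# Numbers from the comment above; illustrative, not verified against Tesla's data.
claimed_ratio = 1 / 7       # Tesla's claimed crash rate relative to the average driver
freeway_share = 0.16        # rough share of all crashes occurring on interstates/freeways

print(f"claimed FSD crash rate: {claimed_ratio:.1%} of the all-roads average")
print(f"freeway share of all crashes: {freeway_share:.0%}")

# If FSD miles are overwhelmingly freeway miles, then comparing against an
# all-roads baseline means road-type selection alone could plausibly account
# for most of the claimed ~14.3%, since 14.3% is already below the ~16%
# freeway share of crashes.
```

This is a rough plausibility check only; a real analysis would need per-road-type crash rates per mile, which the comment (and apparently the release) does not provide.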

InvisibleBlueRobot 2025-11-17 19:01

I have always attributed this to the fact that Teslas are very difficult to repair, and that much of the work to make them cheap to manufacture makes them notoriously difficult to fix. Not that Teslas are necessarily more likely to get in an accident. But it could be both...

y4udothistome 2025-11-17 21:36

Exactly

mafiacopking 2025-11-18 02:21

Rivian is doing it too. If the car sees a car or a kid, it stops. If it doesn't recognize a vehicle or a person, it plows through. I witnessed a Rivian auto-drive into a traffic vehicle because it didn't look enough like a vehicle. The flashing lights confuse the vehicle's autopilot.

levon999 2025-11-18 21:27

🥱 Since approximately 90% of accidents are caused by drunk (impaired) driving, speeding, or distracted driving, comparing the accident rate of an autonomous vehicle to that of the average driver seems like a rather low benchmark. I’d like to see autonomous vehicles compared to “good” drivers.
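The "low benchmark" point can be made concrete with rough arithmetic, using the ~90% figure from the comment above. This is an illustrative sketch, not an authoritative statistic, and it treats "avoiding those behaviors" as eliminating those crashes, which is an oversimplification:

```python
# Rough illustration of why 'average driver' is a low benchmark (toy numbers).
avg_rate = 1.0               # normalize the average driver's crash rate to 1
bad_behavior_share = 0.90    # crashes attributed to impairment/speeding/distraction

# If a 'good' driver avoids all of those, their rate is roughly the remainder:
good_driver_rate = avg_rate * (1 - bad_behavior_share)   # ~0.10x average

tesla_claim = 1 / 7          # claimed FSD+driver rate vs the average driver (~0.14x)

print(f"good driver: ~{good_driver_rate:.2f}x average; Tesla claim: ~{tesla_claim:.2f}x")
# Under these assumptions, beating the average by 7x still would not beat a
# consistently sober, attentive driver.
```

The interesting comparison, as the comment says, is against that ~0.10x "good driver" baseline rather than the average of everyone, drunk drivers included.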

InvisibleBlueRobot 2025-11-19 17:19

"...but people drive unsafely" is the exact issue autonomous driving should solve. Most people can drive safely. Many people do not! A net reduction in accidents and net reduction in deaths would be a big deal. I am not a defender of Tesla, or their data or even autonomous driving in general. I also see several possible issues with the way Tesla is splicing their data related to ***drunk driving*** but it is slightly different than how you phrased it above. As one person pointed out before ... * Tesla is historically driven by higher income people. * Higher income people have better driving records and fewer accidents overall * Higher income people are less likely to drink and drive compared to the general popution. * Therefore: * general data has higher much higher incidence of drunk drivers today, vs Tesla owner data being used and compared to general data. * If more people were using Tesla's, the Tesla data may be much worse than it is today as more people prone to poor driving get involved. * For instance, I would assume a drunk driver would be bad at assuming control of a FSD when an intervention is required.
