What do people think about the accident data dropped by Tesla?
Has anyone reviewed the accident data Tesla dropped/ published? Any takeaways or comments?
They have Grok going over all the info. Enough said!
Given their continual barrage of lies over the years, anything Tesla or Elon says is, IMO, immediately suspect.
Accident data, to me, doesn't mean much. It's the intervention rates that matter. How many times are drivers having to step in to correct sudden jerking behaviors? Driver + FSD is obviously always going to be better than driver alone (redundant system). But without the drivers there supervising, how often will the cars crash? I'm guessing Tesla doesn't want us to know, since they don't talk about it. Hence, robotaxis aren't happening any time soon.
This is the right answer. Tesla has been publishing meaningless stats for years. They compare FSD miles to overall human miles. The human miles include all variety of conditions and situations. FSD miles are the ideal conditions in which FSD will allow you to operate: no rain, no snow, no blinding lights, no weird roads that it doesn't understand. I want to see it normalized for that. I would bet my salary it isn't nearly so impressive.
You cannot trust anything that comes out of Tesla. Take your pick of lying, lying by omission, or deliberately misleading, they do it all. Tesla accident data is generally the "deliberately misleading" type.
Their methodology has been fishy in the past. Independent experts should review it. Elon musk is the king of lying with statistics.
Not only that, FSD only works in decent weather, which is when most accidents don't happen to begin with. How bad would FSD be in poor weather with no driver to step in? At the moment FSD reverts control back to the driver (in decent weather) on average once every 380 miles. That is nowhere near the level of a human driver.
Good validation of TSLA's SAE Level 2 driver assist. But zero usefulness in projecting FSD's usefulness as an autonomous driving function.
I don't know if both is better. I'm better than 10-year-olds at math, but if I'm only casually checking their work, errors are more likely than if I do all the work myself. I think we see similar stuff with LLMs: people trust too much, and oops, bad stuff slips through.
The thing about the Tesla accident rate, is it is not an accident rate for FSD. It is the accident rate for the integrated system of FSD plus a supervising driver. Which means that accident rate tells us nothing about how good FSD itself is. Even if Tesla is reporting it honestly.
I don't trust any statistics that I haven't manipulated myself.
How often does a human driver revert control back to a driver?
\> Driver + FSD is obviously always going to be better than driver alone. (redundant system) I don't know about that. If the driver trusts the system and isn't paying attention, then Driver + FSD would be much worse than driver alone.
Billionaires lie? It's almost like they'll do anything for more money, which is how they were able to get to where they are today.
I think most open-source data on Tesla cars shows that when using FSD they are much more dangerous than letting people drive. Makes sense given how poor AI is compared to humans at these tasks.
Soooo…. You think you trust data they publish? https://variety.com/2025/film/global/elon-musk-unveiled-documentary-trailer-whistleblowers-1236574785/
All cars should have a mandatory black box, and after any accident the data has to be transferred to read-only storage so it can't be deleted or edited. I don't trust car manufacturers, especially Elon's FSD scam.
You lost me at "by Tesla"...
Yup. Recent history has definitely shown us morality is for losers.
👍 Yes. I use the student driver analogy a lot. FSD is a lot like a teen driver who can manage well enough with a teacher there to intervene.
My wife has called me outside to help her park before
The cost to insure a Tesla is about 40% higher than other cars. That should give you a hint.
Yeah, the reported rate is for FSD, which means "fully supervised driving". Tesla has yet to operate a fully autonomous car, so it's rich when Elon calls Waymo's count of 2,500 autonomous cars "rookie numbers".
> Recent history has definitely shown us morality is for losers. Sadly, that's not recent.
> Its almost like they will do anything for more money which is how they were able to get to where they are today. There have been studies done that show highly driven leaders, like CEOs and billionaires, often have traits connected with psychopaths.
People have looked at this data in detail over time. 1). Rich people drive Teslas. Rich people have fewer accidents. 2). FSD is mostly run on highways. People have fewer accidents on highways. 3). The data went missing for a while. That's when FSD was really struggling. 4). FSD will disconnect right before a collision. That leaves the driver to handle the emergency. It's not clear in their data how that is handled - likely shown as a non-FSD collision.
Except that the teenager rarely gives up suddenly on their own!
Also a hint about how costly they are to repair
True.
They're facing multiple lawsuits related to safety, so you can expect that any data they release is carefully selected to support them in court.
🐂💩
This was my first thought as well. How can we trust or believe any of it? I was hoping someone with some expertise in the space might have some interesting comments on what was shared and the questions they have.
They hide so many accidents and data You can't trust a word they say
Amen. And when they have an accident, it turns off. DRIVER'S FAULT!
The only thing it's good for is comparing it to itself over time. As long as they apply a consistent methodology, it can inform us how the system is improving. The comparison vs. cars without FSD is limited because Tesla doesn't have the same data for other cars, so they are unable to make a proper apples-to-apples comparison.
EU cars already have this. It's called Event Data Recording and records a minimum of 5 seconds before and after a collision is detected. It must be downloadable from the OBD port. Tesla are uploading full pre-crash data to their servers and then deleting the vehicle's copy. (As discovered in the recent lawsuit, where someone did a data recovery and found the deleted log files on the car Tesla claimed had no data.) I have no idea if they leave a stripped-to-the-minimum data set on board for OBD download to comply with EU rules, or are just non-compliant and haven't been pulled up on it yet.
I guess this is exactly what I am asking. Do we know they are hiding accidents? Is there any evidence of this? How? Because Tesla is its own insurance company for a lot of Tesla drivers, they can potentially hide accident information more easily than almost any other carmaker. But this is just a potential issue; I have no way of knowing (other than Musk's history of lies and partial truths) whether this is actually happening.
According to the release, they claim the data includes all accidents in which FSD was enabled in the 5 seconds prior to the accident. If Tesla is honest, then these accidents are included; they specifically posted about this and made this exact statement.
They stated in the dump (or on the platform formerly known as Twitter) that the data includes accidents in which FSD was in use within the 5 seconds prior to an accident, to make sure they weren't eliminating FSD-related accidents. It was this post that actually made me pay attention to the data dump and post the question. Otherwise I would have assumed it was all lies, like everything else posted by Musk.
The data findings are now.... racist.
This is the kind of thing I was interested in hearing about. Terrifying.
hahah ~no. The family that sued Tesla over FSD is a prime example of how those things go. Tesla CLAIMS transparency, but the facts are:

- They said they had no accident data (as so often)
- Then it was discovered there IS data on the car's modules
- Tesla sent an engineer who said, "it's no use, it's damaged!"
- The family hired a hacker, and THERE WAS ALL THE DATA, in top condition too!

That is the short version of it, and it came after a years-long battle with Tesla via lawyers!! ANYONE who believes they include all accident data is an idiot! TESLA WILL LET YOU AND YOUR FAMILY SUFFER IF IT SAVES THEM A BUCK! And Elon is PROUD OF IT! [https://www.theguardian.com/technology/2025/aug/01/tesla-fatal-autopilot-crash-verdict](https://www.theguardian.com/technology/2025/aug/01/tesla-fatal-autopilot-crash-verdict)
I just looked up some data. Tesla claims they have 1/7th the number of accidents of a human driver. In the real world, only about 16% of accidents occur on "interstates and freeways." I am not sure how Tesla calculated the chance of an accident on a freeway/highway, but if they are comparing against all accidents, they could almost reach this number through data selection alone. Tesla's claimed 1/7th is equivalent to about 14.3% of the accidents a human driver has, all else being equal. But if only ~16% of all accidents happen on freeways and similar roadways, and the "average" baseline counts accidents on all road types, then a system that mostly drives on highways could almost hit that target just by road selection, while still being compared against all-accident data.
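The selection-bias arithmetic above can be sketched in a few lines. All figures here are illustrative assumptions: the 16% crash share comes from the comment, while the 35% freeway mileage share is a hypothetical placeholder, not a real NHTSA or Tesla number.

```python
# Back-of-the-envelope sketch of the road-selection bias argument.
# Assumptions (not real data): 16% of crashes happen on freeways
# (figure from the comment), and freeways hypothetically carry 35%
# of all miles driven.

def crash_ratio(freeway_crash_share, freeway_mile_share):
    """Per-mile crash rate on freeways relative to the all-roads average."""
    return freeway_crash_share / freeway_mile_share

ratio = crash_ratio(0.16, 0.35)
print(f"Freeway crash rate vs. all-roads average: {ratio:.2f}x")
# -> Freeway crash rate vs. all-roads average: 0.46x
```

Under these assumed numbers, a driver (human or FSD) who sticks to freeways already looks roughly twice as safe as the all-roads baseline before any difference in actual driving quality, which is the core of the objection to comparing FSD miles against all human miles.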
I have always attributed this to the fact that Teslas are very difficult to repair, and that much of the work to make them cheap to manufacture makes them notoriously difficult to fix; not that Teslas are necessarily more likely to get in an accident. But it could be both...
Exactly
Rivian does it too. If the car sees a car or a kid, it stops. If it doesn't recognize a vehicle or a person, it plows through. I witnessed a Rivian auto-drive into a traffic vehicle because it didn't look enough like a vehicle. The flashing lights confuse the vehicle's autopilot.
🥱 Since approximately 90% of accidents are caused by drunk (impaired) driving, speeding, or distracted driving, comparing the accident rate of an autonomous vehicle to that of the average driver seems like a rather low benchmark. I’d like to see autonomous vehicles compared to “good“ drivers.
"...but people drive unsafely" is the exact issue autonomous driving should solve. Most people can drive safely. Many people do not! A net reduction in accidents and a net reduction in deaths would be a big deal. I am not a defender of Tesla, their data, or even autonomous driving in general. I also see several possible issues with the way Tesla is slicing their data related to ***drunk driving***, but it is slightly different from how you phrased it above. As one person pointed out before...

* Teslas are historically driven by higher-income people.
* Higher-income people have better driving records and fewer accidents overall.
* Higher-income people are less likely to drink and drive compared to the general population.
* Therefore:
  * The general data has a much higher incidence of drunk drivers than the Tesla owner data being compared against it.
  * If more people were using Teslas, the Tesla data might be much worse than it is today, as more people prone to poor driving got involved.
  * For instance, I would assume a drunk driver would be bad at assuming control from FSD when an intervention is required.