False news. Most of them were minor, nothing major. Look it up in the NHTSA database.
Even if this is true, I wonder how this compares to human counterparts during the same timeframe
Involved doesn’t mean caused. Someone rear-ending one at a stop sign would count toward this. Classic statistics.
That seems like a low number.
For 40 vehicles?
CBS news, isn’t that an oxymoron?
see BS
Homonym
I remember one accident I saw on YouTube where a robotaxi slightly touched another vehicle and then stopped, and everyone in the news talked about it for at least a week.
That’s what some of these were, hitting a pole/tree, a parked truck and stuff like that. I think it’s on r/electricvehicles, they made a big deal out of it as usual 😂
Are the vehicles driving all day and night?
I've been driving for 30 years with 0 incidents. Tesla has, what, 40 cars in 8 months? 0 incidents in 360 months, vs 14 incidents in the equivalent of 320 months. That's pretty bad.
In Miami there's 14 human car crashes every few hours
I read earlier it was 4x the average driver. But there was no context on how many miles the robotaxis drove vs. humans drove, so it could have been clickbait.
Crash stat is irrelevant without miles driven.
In Sioux Falls there are 1.892 human crashes every few hours
For an autonomous car, hitting a stationary object is, to me, a huge issue. Failing to properly understand some weird street sign, or not reacting to another car correctly, could also happen; but hitting a pole or a tree means the car can't see some stationary objects, or that emergency braking isn't working well. If an elderly driver hit a tree, we would immediately conclude he is unsafe to drive; an autonomous car doing the same should prompt the same conclusion. It should always detect objects in its path and brake before hitting them. Even at just 5 mph, the fact that it still hit a stationary object is a huge issue. This is the "first rule" an autonomous car should obey: stop before hitting anything in your path. So either it didn't see it, or it didn't stop. Two huge red flags!
You’re an outlier, not the norm. People need to stop with the anecdotal evidence when it comes to statistics.
Okay
How many miles do you put on a daily basis?
Four times as many accidents as the average driver, or four times fewer?
Oh no! It's only much better than human drivers but not perfect. Let's shut it down! /s
What is your number of accidents per mile driven?
More
I assume you're very against Waymo then. Or any autonomous car, really. Or human drivers. Basically, all cars, period. Because all of these things hit stationary objects. That's ok. The rate just has to be low enough.
Does it have miles driven in the article? 500,000?
Here. It’s one every 57k miles, as opposed to one every 239k miles https://gizmodo.com/tesla-robotaxis-reportedly-crashing-at-a-rate-thats-4x-higher-than-humans-2000722989
That doesn’t make it false lol. Liability isn’t even worked out in these cases, especially if you can use your Model 3 as a taxi. Accidents are occurring at 4x the rate of humans. https://gizmodo.com/tesla-robotaxis-reportedly-crashing-at-a-rate-thats-4x-higher-than-humans-2000722989
Ah yes, because Tesla operates all vehicles in Austin, not just a handful of them.
How can they calculate 4x if they don't know how many miles have been driven?
One of the accidents caused a hospitalization, and it was initially misreported. Why don’t we talk about that?
Tesla’s rate is at least 4x worse than a human’s, even with a professional always focused on avoiding accidents. Waymo is 10x safer than humans.
No, the news is perfectly accurate and can be used to calculate their accident rate, which is at least 4x worse than a human’s. All this while having a professional in the car, always focused on avoiding accidents.
No
~800,000, so it’s a crash every 57k miles. Anyone who crashes that often should have their driving license revoked, because they’re a danger.
So do you crash every 57k miles like the robotaxi?
That’s not how statistics work. Go get an understanding of what 1 in x means and then we can have this conversation.
Are you another one that crashes once every 57k miles like the robotaxi?
If you think that crashing once every 57k miles is perfectly normal, they should burn your driving license, because you are a danger.
Tesla could say what happened but chooses not to. They are 100% hiding something.
Again that’s not how statistics work.
Maybe just consider them like student drivers learning to drive. Beginners pretty much collide with anything when they’re starting. Let’s see once this full autonomous driving is released, if it ever gets released.
And what we do know is many cases involved hitting stationary objects. Like 100% at fault.
I hate these data points. It’s roughly 4x the average over 800,000 miles, with 14 recorded accidents since June 2025. For comparison, Waymo reported 140 accidents since March 2025 across 6.5 million miles.

The challenge with this data is how they defined “accident”. The reporting includes everything from a minor tap to a bus crashing into a parked robotaxi. It also counts incidents like the vehicle lightly contacting a stationary object at under 2 mph. Because the reporting requirements are so strict, virtually every minor event gets logged. When you compare this to human-driver incidents, especially low-speed bumps or minor contact, those would likely never be reported. That alone can skew raw comparisons.

There’s also a perception issue. When a Tesla is involved in any incident, the headline is usually “Tesla involved in accident.” When it’s a different brand, it’s usually described as a “motor vehicle accident”, not “Ford Mustang involved” or “F-150 crash.” The brand becomes the story, in my opinion. And when another manufacturer uses similar driver-assist technology, it’s often described as “technology similar to Tesla,” which just continues to reinforce Tesla’s name in accident reporting. That framing influences how the data is presented and discussed.
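Those two rates can be sanity-checked with simple division. A minimal sketch, using only the figures quoted in the comment above (14 crashes over ~800,000 miles for the robotaxi fleet; 140 crashes over 6.5 million miles for Waymo); these are the commenter's numbers, not independently verified, and the two fleets' reporting definitions may differ:

```python
# Rough miles-per-crash check using the figures quoted in the comment above.
# These numbers come from the thread, not from any verified dataset.

def miles_per_crash(miles: float, crashes: int) -> float:
    """Average miles driven between reported crashes."""
    return miles / crashes

tesla_rate = miles_per_crash(800_000, 14)       # robotaxi figures from the comment
waymo_rate = miles_per_crash(6_500_000, 140)    # Waymo figures from the comment

print(f"Tesla robotaxi: one crash per ~{tesla_rate:,.0f} miles")
print(f"Waymo:          one crash per ~{waymo_rate:,.0f} miles")
```

Note that these raw quotients say nothing about severity or reporting thresholds, which is exactly the caveat the comment raises.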
The miles were comparable; what wasn’t comparable is that most human accidents (fender benders) aren’t reported. So they were comparing Robotaxi fender benders with serious human accidents.
Tesla’s rate isn’t 4x worse than humans’. Most human accidents (fender benders) aren’t reported, so we only have data on serious human accidents, while autonomous cars have to report all accidents.
There’s no way you really think that. Humans are dumb as hell and always distracted by their phone. FSD is a better driver than most people.
Classic bollox headline
All of the accidents are documented. What are they supposed to say that hasn’t already been said?
It’s a fact, it’s not what I think: https://electrek.co/2026/02/17/tesla-robotaxi-adds-5-more-crashes-austin-month-4x-worse-than-humans/
It is, according to Tesla’s own standard: https://electrek.co/2026/02/17/tesla-robotaxi-adds-5-more-crashes-austin-month-4x-worse-than-humans/
I guess not. I didn't hear about it.
Correct. Vehicles I've owned have been in multiple accidents over the course of my lifetime so far. But here are the relevant details: in 50% of the accidents, my vehicle was parked and unattended; in 25%, the vehicle was struck from behind while fully stopped at an intersection; the remainder were other situations caused by another driver. None of the incidents were my fault or due to an issue with my vehicle. But if I just said that I'd been in 8 accidents since I started driving a few decades ago... well, people just fill in the blanks. This is just shit journalism.
They don't have and can't provide enough details to make the stat accurate or meaningful. It's poor journalism and an article written as clickbait.
https://electrek.co/2026/02/17/tesla-robotaxi-adds-5-more-crashes-austin-month-4x-worse-than-humans/ The newly filed NHTSA data also reveals that Tesla quietly upgraded one earlier crash to include a hospitalization injury, something the company never disclosed publicly.
Only 14 crashes is pretty good. Their fleet is pretty massive.
That’s a dumb analogy. I haven’t hit anything in my decades of driving. I may have curbed a wheel, actually, but other than that, I take pride in my ability to drive well. I expect a robotaxi to make 0 mistakes. I 100% expect it to never curb a wheel. It’s a robot, not a human. Any “accident” that happens to the robotaxi and isn’t its fault is excusable, like being rear-ended. For everything else, the bar should be a <1% accident rate, and of those accidents, I expect some insane chain of events: a tire blows out, the car randomly hits a patch of black ice, and another tire blows out while it’s correcting, which causes it to hit the guard rail, or something nuts like that. Not curbing a wheel around a turn because it lost track of its own size for a second.
The perception issue is crazy. I remember in Feb 2024, shortly after I got my car, it was “recalled” because the warning-light font size was too small. At around the same time, I think Toyota and BMW had recalls due to serious fire risks. I think we can all guess which one got all the media coverage.
Well maybe you're just a magical fairy, but believe it or not, humans get into accidents all the time, including with stationary objects. So the bar isn't perfection. The bar is the human accident rate. Because if you don't allow self-driving cars when their accident rate is better than the human accident rate, then you're choosing to have more people die. You may think an accident rate of literally 0.00000000% is possible, but that's because you don't understand computers and you don't understand large scales. Absolute perfection is impossible. All we can do is asymptotically push closer and closer to that point over time.
Isn’t it only 40 cars? That’s 35%
>most human accidents (fender benders) aren’t reported and thus we only have data on serious human accidents. Meanwhile autonomous cars have to report all accidents

I’ll repeat the part of his comment you ignored, since you love being disingenuous.
In Austin. Driving there is hell
Nope, that's only in comparison to reported accidents. Many minor accidents normally go unreported, but they're required to be reported for autonomous cars. So 4x is highly inaccurate. If you show me how 4x was calculated, I can tell you exactly how it's wrong. The current accident rate for Robotaxi is roughly one accident per 50,000 miles. That seems pretty bad if you're comparing to reported human accidents, but once you realize that it's also counting incidents such as touching a curb at low speed, it's really not bad. I bet humans have very minor incidents like that more often than every 50,000 miles, but unfortunately there's no good data for that. For comparison, the Waymo accident rate is around one per 100,000 miles, so if you were being consistent, you'd conclude Waymo is less safe than humans too. But it's obviously not.
They really gotta split up designations when compiling data like this
This.
It’s calculated using Tesla’s own standard. The irony is that Tesla’s own numbers condemn it: Tesla’s Vehicle Safety Report claims the average American driver experiences a minor collision every 229,000 miles and a major collision every 699,000 miles. By Tesla’s own benchmark, its “Robotaxi” fleet is having minor collisions nearly 4 times more often than what the company says is normal for a regular human driver. And virtually every single one of those miles was driven with a trained safety monitor in the vehicle who could intervene at any moment, which means they likely prevented crashes that Tesla’s system wouldn’t have avoided on its own. https://electrek.co/2026/02/17/tesla-robotaxi-adds-5-more-crashes-austin-month-4x-worse-than-humans/
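A minimal sketch of how that multiple appears to fall out of the numbers quoted in this thread: Tesla's stated benchmark of one minor collision per 229,000 miles, against roughly 14 crashes over ~800,000 robotaxi fleet miles. Both figures are from the thread and the linked article, not verified here:

```python
# How the "~4x" multiple appears to be derived, using only figures quoted
# in the thread (Tesla's Vehicle Safety Report benchmark and the NHTSA
# filing totals). Not independently verified.

BENCHMARK_MILES_PER_MINOR = 229_000   # Tesla's claimed human minor-collision interval
FLEET_MILES = 800_000                 # approximate robotaxi fleet miles
FLEET_CRASHES = 14                    # reported incidents

fleet_miles_per_crash = FLEET_MILES / FLEET_CRASHES            # ~57,143 miles
multiple = BENCHMARK_MILES_PER_MINOR / fleet_miles_per_crash   # ~4.0

print(f"Robotaxi crashes ~{multiple:.1f}x as often as Tesla's own benchmark")
```

The multiple only holds if the two sides count comparable events, which is the main dispute in this thread.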
posts an electrek article 🤣🤣🤣
Yeah, I'd like to see the reasoning behind the claim that it has 4 times as many accidents as the regular driver. It sounds like clickbait.
And they are sourcing it from electrek so right off the bat it is sus.
Oh, I very well think it could be; I was just answering the question above. I don't think they have a large enough data set to make any claims at this point, personally.
Absolute perfection is impossible, but when I see FSD still make insanely dumb mistakes, like running a red light because the left green arrow came on, I know it’s not ready. I think accident statistics are a poor measurement; they should be evaluating internally based on disengagement events before trying to push driverless robotaxis. The issue is that Elon’s whole trillion-dollar pay package depends on beating everyone to market and getting enough people on board to cash out before everyone else catches up.

FSD was asymptotically plateauing until they redesigned around occupancy networks. Even Nvidia has now copied that strategy with its new platform that debuted in the Mercedes CLA, which is very close to how v13 performed initially. Either way, Tesla seems to be plateauing again. They can keep trying to quantize and prune the models to fit into HW4 and still be safely reliable, but each training run produces a different model with different regressions.

I just don’t think we’re going to see a safe robotaxi with HW4 levels of compute and model sizes, and I think Elon is going to guinea-pig the public hoping for the best instead of making sure it reliably doesn’t run red lights 99.999999% of the time. If you want a car with no steering wheel that only avoids running red lights 98% of the time, then by all means, put your life at risk. But I won’t get in that car.
What? Humans make even dumber mistakes all the time. Your problem is you don't understand large scales. You see many examples of mistakes on the internet and assume that means the mistakes actually occur frequently on an individual level, not realizing that they occurred collectively over the course of 8 billion miles. So those "many" mistakes actually aren't many at all. For example, 100 mistakes over 8 billion miles is a mistake rate of 0.000001% per mile. That's tiny. I also don't know where you got 98% for not running red lights from. I have 2,500 miles on FSD v14.2 so far and have had it try to run a red light exactly zero times. So in reality it's likely 99.9...% with some number of 9s. And like you said, with a certain number of 9s, that's obviously acceptable. > they redesigned with occupancy networks Lol you have no idea what you're talking about.
You must have a magical fairy FSD 14. Across my 3 Teslas running v14, they have all fucked up at least once per week. My Model X ate a chunk of tire the other day. So you’re just full of it and have no idea what you’re talking about.

Occupancy networks are what these models are built on, and that’s what allowed them to become so efficient. HydraNet and the bird’s-eye-view approach were compute-intensive, so they switched to taking all 8 camera streams and predicting occupied space rather than figuring out what the actual objects within that space are. It just determines whether something is there and whether it’s moving, then plans motion around whether the object occupying the space is moving into its path. It took occupancy networks to create a model small enough to train end to end and fit on HW3/4. This was all in AI Day 2022, when Elluswamy took over for Karpathy. Not much has really changed since they went end to end, just iterative models and pruning of bad auto-labeling.

I’m not convinced you can do this with a single model. Nvidia’s platform has multi-modal stacks, including a safety stack with hardcoded laws to abide by that can correct the end-to-end neural net when it randomly hallucinates a green light instead of red. You’re going to see a plateau on HW4 (I believe it’s already here, and I’m doubtful it will get better), but maybe with HW5 and an MoE architecture you’ll get closer to that 99.99% asymptote.
"crashes" 35% of incidents occurred when the vehicle wasn't even moving.
The numbers seem to work out to 4x worse - at least that's what these reports say
I'm interested in the number of accidents *caused* by the self-driving cars. Because the number of accidents they are *involved* in is far less useful in determining whether they should be rolled out quickly or slowly.
Sure, but it's still a telling data point that robotaxis are involved in accidents at 4x the rate of human drivers, whether they are crashing into things or getting crashed into
It *is* interesting. Is that measure based on miles driven?