
Fatal Tesla Autopilot crash triggers $345 million lawsuit and safety questions | Lawsuit claims Tesla ignored known Autopilot risks

chrisdh79 | 2025-08-01 12:06 | 382 views

Comments (35)
CompoteDeep2016 2025-08-01 12:12

How long will it take to get a final decision here? Years? Any jury should torch Elon Musk...

Key-Beginning-2201 2025-08-01 12:19

Also people are suing successfully to get FSD refunds.

Engunnear 2025-08-01 12:26

> any jury should torch elon musk...

Which is exactly why Tesla has always been willing to offer settlements in exchange for NDAs.

Chemical-Idea-1294 2025-08-01 12:30

No money brings her back. But maybe it protects other innocent people from the dangerous claims. It's one thing if you become a victim while you're the one driving; there you take the risk yourself. But an outsider has no choice in the matter. That responsibility lies with Tesla, and it is time to sanction it.

MhVRNewbie 2025-08-01 12:51

So a driver who should go to jail crawls around on the floor looking for his phone while driving, and then blames somebody else...

Ok-Elevator302 2025-08-01 12:52

Would Tesla still be liable if FSD disengaged milliseconds before the crash?

SolutionWarm6576 2025-08-01 12:57

👆 I think there have been 4 cases that were settled. Like Engunnear stated above, the other party has always signed an NDA. Same with the California DMV investigation into the misleading FSD claims: Tesla is saying you still have to supervise it, but then it's not really "full self driving." Even if they get off in these two investigations, they may have to drop the whole "FSD" marketing, because it isn't that. It's going to be interesting to see how both these cases play out.

KaySav1337 2025-08-01 13:14

He was looking for his phone while doing 60mph 🤣. Her death is on him.

[deleted] 2025-08-01 13:31

[deleted]

Chemical-Idea-1294 2025-08-01 13:35

But he did so because he trusted Tesla's marketing. The driver is also at fault, but without Tesla's promises he would likely have acted differently.

Kevin_Turvey 2025-08-01 13:39

"Judges are not fucking morons" is really not a phrase I would bet on, in the year 2025.

KaySav1337 2025-08-01 13:42

This content marketing? This is the 2019 marketing you’re talking about? “Tesla consistently emphasized that Autopilot was not a fully autonomous system and required a fully attentive driver who maintained control of the vehicle at all times”.

Ancient-Watch-1191 2025-08-01 13:51

When regulators allow new tech (for example ADAS Level 2), the implementation should be done in a way that reduces the number of accidents while the systems are active. The whole ordeal with Tesla Autopilot is twofold:

1. It's propagandized by the manufacturer as "Autopilot" and used as such by customers, while it is nothing more and nothing less than an ADAS L2 system.
2. The hardware and software implementation of this ADAS L2 system in Tesla cars is deeply flawed, as indicated by the excessive number of accidents triggered by the system itself.

The responsible US authority, the National Highway Traffic Safety Administration (NHTSA), issued a report in June 2022 pointing out frequent malfunctions of the Tesla ADAS system, which led to significantly more crashes (including crashes with casualties) in Teslas with the ADAS system active than in competitors' cars. That report lists the following crashes attributed to ADAS L2 system failures:

- Tesla: 273 crashes over 1.7 billion miles with the ADAS L2 system activated (≈160 crashes per billion miles)
- Ford Motor Company: 5 crashes over 0.127 billion miles (≈39 per billion miles)
- BMW North America: 3 crashes over 0.125 billion miles (24 per billion miles)

NHTSA then issued a Standing General Order requiring Tesla Inc. to continue reporting accidents involving all ADAS-L2-equipped Tesla cars. Its Office of Defects Investigation (ODI) summary covering Aug 2022 to Apr 2024 reported a further [956](https://www.nbcnews.com/tech/tech-news/feds-say-tesla-autopilot-linked-hundreds-collisions-critical-safety-ga-rcna149512) crashes involving failures of the ADAS L2 system.

Then at the end of 2024, The Verge reported some good news for Tesla: [Trump is probably going to kill the crash reporting rule that made Tesla look bad](https://www.theverge.com/2024/12/13/24320515/trump-tesla-crash-reporting-adas-nhtsa-sgo)
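The per-billion-mile rates quoted in the comment above are simple quotients of the crash counts and the ADAS-L2 mileage. A quick sketch to reproduce them (the counts and mileage figures are taken from the comment as quoted, not independently re-checked against the NHTSA report):

```python
# Crash counts and ADAS-L2 miles (in billions) as quoted in the comment above.
data = {
    "Tesla": (273, 1.7),
    "Ford Motor Company": (5, 0.127),
    "BMW North America": (3, 0.125),
}

for maker, (crashes, billion_miles) in data.items():
    rate = crashes / billion_miles  # crashes per billion ADAS-L2 miles
    print(f"{maker}: {rate:.1f} crashes per billion ADAS-L2 miles")
```

Running this gives roughly 161, 39, and 24 crashes per billion miles respectively, matching the comment's figures once rounded. Note these are raw rates; they don't control for road type, fleet age, or how each manufacturer counts ADAS-active miles.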

massageofacid 2025-08-01 13:59

That's why responsible carmakers call similar systems driver assistants, not misleading names like Autopilot or FSD.

KaySav1337 2025-08-01 14:04

Touché

Chemical-Idea-1294 2025-08-01 14:05

Then compare that with Elon's tweets. Tesla did mention the restrictions, but never emphasized them; they were only hidden somewhere in the sales brochures. The fact alone that Tesla had to be ordered to call it "FSD (Supervised)" instead of just "FSD," as they did before, is evidence enough.

Fun_Volume2150 2025-08-01 14:11

They tried to make that argument in the DMV trial, and the judge didn’t seem to be having any of it.

JRLDH 2025-08-01 14:22

The laughing emoji tells everything about you.

[deleted] 2025-08-01 14:40

Tesla “ignored” known autopilot risks? Tesla is CURRENTLY ignoring all the known autopilot risks.

Crutchduck 2025-08-01 15:04

Shock, imagine that: ignoring facts to make a profit... eh, we should probably just regulate them less.

Background-Resource5 2025-08-01 15:09

If the boss of Ford, GM, BMW, or Toyota said what Musk said, there would be hell to pay, and many more people would die. That is why the other car companies never make statements about safety that are not true. I worked in the auto industry; it is drilled into you that statements must be true.

wenchanger 2025-08-01 15:22

The "Tesla support" subreddit is full of people whose cars have veered off the road or crashed into other cars on their own... but these idiots refuse to consider that maybe the car is the issue; they're told to blame themselves.

StinkPickle4000 2025-08-01 15:54

Tesla FSD was trained by people who want a robot to drive their car…. Let that sink in

praguer56 2025-08-01 15:56

I'm interested in this because I paid $12,000 for this pipe dream and today, the new and improved version is $8,000. And I don't get all of the bells and whistles a new car gets.

Material_Quit7769 2025-08-01 16:04

I have more of a risk with your dumbass driving on the road. FSD is amazing.

Key-Beginning-2201 2025-08-01 16:07

Here is the info. I would contact the arbitration party: https://electrek.co/2025/07/07/tesla-forced-reimburse-full-self-driving-arbitration-failing-deliver/ There was an instance in the UK, also.

Background-Resource5 2025-08-01 19:15

https://www.theguardian.com/technology/2025/aug/01/tesla-fatal-autopilot-crash-verdict?CMP=Share_AndroidApp_Other The victims won!!

rhedfish 2025-08-01 19:43

Forget that Camp Lejeune crap, money is in Tesla crashes.

RagaToc 2025-08-01 20:10

Nope. The driver got sued as well. This is the victims suing Tesla because they think the company shares some of the blame here. And money-wise, Tesla has deeper pockets than a person.

RagaToc 2025-08-01 20:14

Thanks for posting this. It will be interesting to see whether the verdict holds up on appeal.

MhVRNewbie 2025-08-01 20:16

Ok, I understand that. But the verdict is still a joke.

GoodFaithConverser 2025-08-02 14:35

It might be neat, but I’ve seen too many videos with jerky decisions that would lead to death for me to trust that it’s safer than a human overall. Maybe it is, or will be, but I’m not trusting Musk’s claims about it for a nanosecond.

Internal_Basil1096 2025-08-02 17:46

Why are people assuming these vehicles are flawless and will keep them safe? And when did people start believing everything a salesman says? If Tesla came out and said you can take a nap on your way to work and your car will get you there safely, people would actually buy into it. Use your damn heads. Everything fails eventually, and businesses are built to find ways to entice people to buy their products. That does not mean everything they say is true. Reaching for your phone on the floor, with your foot on the accelerator, while approaching an intersection, because you thought your car would protect you, is just flat-out stupid. Use common sense.

Normal-Selection1537 2025-08-03 06:35

And why they make buyers sign arbitration agreements.

Complex_Dealer8081 2025-08-04 01:05

Driver’s fault for looking for a phone for 6 seconds and pressing the gas pedal while approaching a red light. The system worked as intended. Dumb ruling.
