
Tesla Spars in Court Over Autopilot Alert 2 Seconds Before Crash

SnooSprouts4376 | 2025-07-18 01:51 | 365 views

Comments (67)
SnooSprouts4376 2025-07-18 01:52

Unpaywalled: https://archive.ph/i9Sm9

InfluenceEastern9526 2025-07-18 02:10

LOL. 1.6 seconds to brake? I'm looking for the videos of this testimony. It will be more interesting than the Karen Read retrial.

jason12745 2025-07-18 02:29

> “I do not have any evidence in front of me that the word ‘beta’ is trying to communicate anything to drivers,” Cummings said. “What it is trying to do, in my professional opinion, is avoid legal liability.” Missy Cummings continues to kick ass.

BigMax 2025-07-18 02:30

And it says he hit the brakes .55 seconds before the crash. So he did react, in about a second, but that's not enough. Do they really think that you can hear "beep" and know *exactly* what to do in the moment? It's not just "beep! take control back", it was "beep! SLAM the brakes THIS very millisecond!!!" Do we really expect that kind of reaction time from a driver who has been told "sit back and relax, I'll let you know if you need to take over"?

sonicmerlin 2025-07-18 02:34

This is why I don’t use autopilot. I don’t trust Elon or Tesla to prioritize the safety of people he considers “NPCs”

EducationTodayOz 2025-07-18 03:02

more than enough if you engage elon's hive mind, mortal. not funny they are really in the poo with this

jmouw88 2025-07-18 03:13

Standard traffic/highway design reaction time was once two seconds, and has been moving towards three for most situations. That means roadways are designed (hills, curves, stop lights, etc.) in a way that gives the driver at least two seconds, and more likely three, before any action is required. So no, this is not an expected reaction time for a driver who is actually driving, let alone under autopilot. Also worth noting that the design deceleration rate is 11.2 ft/s/s, or about 7.6 mph per second. A warning .55 seconds before an obstacle would only allow for about 4 mph of deceleration, assuming instantaneous reaction time.
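A quick back-of-the-envelope check of the numbers in the comment above, assuming the standard design deceleration rate of 11.2 ft/s² and (unrealistically) zero reaction time:

```python
# Back-of-the-envelope check of the deceleration figures above.
# Assumes the design deceleration rate of 11.2 ft/s^2 and
# instantaneous (zero) reaction time -- both idealizations.

FT_PER_S_TO_MPH = 3600 / 5280  # 1 ft/s ~= 0.6818 mph

def speed_shed_mph(decel_ft_s2: float, seconds: float) -> float:
    """Speed reduction (mph) from braking at decel_ft_s2 for `seconds`."""
    return decel_ft_s2 * seconds * FT_PER_S_TO_MPH

# Design deceleration expressed in mph shed per second of braking
print(round(speed_shed_mph(11.2, 1.0), 1))   # ~7.6 mph per second

# Braking for only the 0.55 s between the warning and impact
print(round(speed_shed_mph(11.2, 0.55), 1))  # ~4.2 mph shed
```

So even with a perfect, instant response, a 0.55-second window only scrubs off about 4 mph, which matches the comment's point.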

spam__likely 2025-07-18 03:41

lol... you trust anything else?

BrainwashedHuman 2025-07-18 03:47

I don’t trust much, but especially not the guy who’s built an empire on “move fast and break things” for hardware and safety critical systems.

Independent-Design17 2025-07-18 03:51

If anything could've been done in the 2 seconds before the crash, why didn't the autopilot just do that, rather than switch itself off? If hitting the brakes and/or swerving would have worked, why didn't the autopilot just hit the brakes or swerve? It seems to me that the FSD allowed things to reach a state that no amount of braking or swerving would have avoided **someone** getting injured. So what was the driver meant to do in the final 2 seconds? Jump out of the car? Contemplate the trolley problem? Carry out some sort of manoeuvre that would have caused himself injury but would absolve Tesla of all blame? Is **that** the intent of the 2 seconds: to give enough time for a loyal Elon fan to deliberately off themselves for the sake of their god-wizard-king and preserve Tesla's good name? The logic escapes me.

IcyHowl4540 2025-07-18 04:19

Watch as Elon stans try to point to Elon as the reasonable one and the steely-eyed fighter pilot lady as the irrational one. I've got my popcorn bucket ready.

Maelstrom2022 2025-07-18 04:21

Exactly. My non-FSD truck from the same time period already auto-brakes if it detects you approaching an object too quickly.

y4udothistome 2025-07-18 04:42

Well said

y4udothistome 2025-07-18 04:43

Hitler was probably a nice person if you take the war out of it. White picket fence, you know.

frownface84 2025-07-18 04:52

Because if autopilot does it, and something goes wrong, such as losing traction and killing a bystander, then it's autopilot's fault. But if it hands back control to the driver at the last second (I meant figuratively, but it's almost literal in this case) then that's on the driver, apparently.

Brilliant-Site-354 2025-07-18 04:52

if you cant react in 2 seconds you shouldnt have a license tbh

NeighborhoodFull1948 2025-07-18 05:30

This is why Tesla won’t release Level 3 autonomous driving. The vehicle needs to give the driver up to 10 seconds to take over, and the system is still responsible and liable for up to 10 seconds after the driver takes over. This is what Mercedes does. It’s also why Mercedes Level 3 is so limited, it’s not about technical ability, it’s all about liability. Can’t imagine Elon taking on that type of liability with FSD, because then the default would be that Elon is fully responsible, unless proven otherwise.

snailnado 2025-07-18 05:30

Thanks! I was enjoying this debate (kind of rooting against tesla) until I got to read it, and I saw this part: Data recovered from the car’s computer shows that driver George McGee was pressing the accelerator to 17 miles (27.4 kilometers) per hour over the posted speed limit, leading him to override the vehicle’s adaptive cruise control before he went off the road. He hit the brakes just .55 seconds before impact, but it remains in dispute whether he saw or heard warnings from the Model S while he was reaching to the floorboard for his dropped cell phone. George! Were you pressing the accelerator while reaching to the floorboard for your phone? Seriously? We're gonna end up with warning signs on our fucking floorboards. WARNING! DANGEROUS! PLEASE READ BEFORE DRIVING. DO NOT ATTEMPT TO PICK UP YOUR CELL PHONE FROM THIS FLOORBOARD WHILE VEHICLE IS IN MOTION. Alright end rant.

Longjumping-Bedroom5 2025-07-18 05:36

Tesla has a lot of data so they should already know that people are bad at actively babysitting a car. People are overestimating their ability to take over at any time to save the lives of everyone in the car.

Facktat 2025-07-18 05:47

> something goes wrong, such as losing traction and killing a bystander How the fuck would this happen? AEB in non-self-driving cars already performs such braking, and losing traction while braking shouldn't be a thing anymore since Bosch solved this problem in the 60s (ABS).

nekosake2 2025-07-18 06:00

Because by switching off, they think they (Tesla) relinquish any and all legal responsibility. It's the driver's fault because the autopilot is off (switched off without the driver's consent, even).

frownface84 2025-07-18 06:14

It wasn't a real example. The point I'm making is that when it comes to autonomous vehicles, the car companies want zero accountability. Faced with a trolley problem where a self-driving car has to choose between running over person A and running over person B, the car company is going to get sued whichever way it chooses; so instead, if they throw control back to the driver at the last moment, they also pass on the legal liability.

Youngnathan2011 2025-07-18 06:26

Isn't that just the normal thing for any car with a smart cruise control?

thinkbox 2025-07-18 06:28

Should be top comment. But it won’t be because nobody here does anything but read the headline and jerk off.

Facktat 2025-07-18 07:06

What BS. How would the situation have to look for braking to kill more people? I can see why it wouldn't steer away, but the car not braking is just a bug and has nothing to do with avoiding legal liability. As I said, AEB already does that, is mandatory in the EU, and will be mandatory in the US starting September 2029. AEB is exactly that: full braking when danger is detected, but no steering.

zkareface 2025-07-18 07:09

It's common but not linked to that system. AEB is mandatory in EU by law since July 2024. It's been mandatory on commercial vehicles since 2013. Most new cars already had it at that point, but you can get them without smart cruise (because companies want you to pay extra for smart cruise).

Retox86 2025-07-18 07:16

Because if they had that, the car would phantom brake like crazy all the time, because they don't have real working sensors in their cars like everybody else, and that would make the feature useless.

Hi_Doctor_Nick_ 2025-07-18 07:38

So why didn’t the Tesla’s AEB activate?

Facktat 2025-07-18 07:43

Yeah, I am aware of that being the obvious real reason. Phantom braking would make Tesla look bad. I was arguing with the other comment about this being "to avoid legal liability". Braking whenever danger is detected would avoid legal liability, but looking good is more important to Musk.

Hadleys158 2025-07-18 08:00

The government needs to start making laws requiring self-driving or autonomous vehicles to have tamper-proof black boxes installed, for evidence collection in cases like this.

Few-Masterpiece3910 2025-07-18 08:08

the problem is, why is FSD able to be used in such a way? Why can someone use FSD and speed at the same time? Either the car is in (supervised) control or it is not. The crash shows why driver monitoring is required for a system that only works while supervised.

zkareface 2025-07-18 08:16

Seems the crash was in the US? Perhaps they don't add it or disable it for US cars?

Retox86 2025-07-18 08:17

Yea, it's all about looking good while in fact being dangerous. And AEB is already mandatory in Europe, but I'm not quite sure Tesla's AEB actually works well enough to be called AEB.

frownface84 2025-07-18 08:33

You missed my point completely. Car makers won't want autopilot engaged at all during any sort of accident. The decision to slam the brakes when an accident is about to occur might be the right one to make, but the automaker won't want to be liable for making that decision, so the default move is to pass the ball back to the driver to make the decision.

Facktat 2025-07-18 08:43

And you missed my point. It's not autopilot which should engage during any sort of accident but AEB. Turning autopilot off seconds before the collision won't help them in front of a jury.

deco19 2025-07-18 08:48

No doubt many just see that's she's a woman and he's a man and that was enough for them. Not like choosing between one conman and another with faux masculinity.

AbbaFuckingZabba 2025-07-18 08:54

Earlier versions of Autopilot were very prone to phantom braking. This created an issue because a phantom braking event is very dangerous and the liability lies with Tesla. Instead, it's much smarter from a liability perspective to have the car not auto-brake unless it is 100% sure there's a real obstacle. If the car does hit something, it's just a beta system and the driver wasn't paying attention.

numpyforyou 2025-07-18 11:49

Tesla is always in supervised mode under FSD. That means user controls take priority. In this case, the user's input was to speed and override FSD, which created a very dangerous scenario.

AdministrationTop772 2025-07-18 12:01

If that was their idea then they did not run it by a competent lawyer

the8bit 2025-07-18 12:03

Well in that I agree with the witness, it sure sounds like the product is designed to try and avoid liability. Cause it's called "fully self driving" but apparently it's also fully not liable for anything that happens

the8bit 2025-07-18 12:04

Yeah that is both the tricky part and also 99% of the value of autonomous driving. This is also why eg AI is not used to make medical decisions, but in that arena we decided to wait instead of just YOLOing it and saying "this will make diagnoses but also isn't responsible for the outcome of that"

mike7257 2025-07-18 12:10

This looks highly illegal to me. It's like your seatbelt releasing right before the crash so there's no lawsuit against the manufacturer. Time to sue Tesla over this shit.

PM_ME_UR_QUINES 2025-07-18 12:16

Directly from Tesla's official page on FSD (emphasis mine): "When enabled, your vehicle will drive you almost anywhere with your active supervision, *requiring minimal intervention.*" I guess *minimal intervention* refers to the 2 seconds you have to save your own life.

Engunnear 2025-07-18 12:39

> that would make [it obvious how useless] the feature [is] FTFY

Lotronex 2025-07-18 12:57

Lol, Teslas don't even have odometers they don't tamper with.

Engunnear 2025-07-18 13:04

So thinking about this critically, Tesla's statements don't hold water (shocking, I know). Everyone stipulates that the driver was trying to retrieve his phone from the footwell, so we'll take that as a true given. Tesla is saying that the driver had his foot on the accelerator, thus overriding Autopilot. Let's assume that's true for a moment. They also say that the front collision alert went off at 1.76 s before impact, and that the driver hit the brakes 0.55 s before impact. Does Tesla honestly expect the claim that a driver who was completely disengaged from controlling the vehicle registered the warning, understood what it was, and moved his foot from the accelerator to the brake in 1.21 s, to hold up to scrutiny? I'll wager that the 1.21 s is how long it took for his foot to lift off the accelerator. The brake lights illuminating when regenerative braking engaged was recorded as a braking event, and he never actually applied braking. So the "best case" for Tesla is that they omitted a functional driver monitor system because it cost money and made it apparent that the system was not capable of autonomy, and a woman is dead as a result. The alternative is that they outright fabricated the claim and supporting data, saying that the driver had his foot on the accelerator.
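The timeline above can be sanity-checked with the figures Engunnear cites (1.76 s from alert to impact, 0.55 s from braking event to impact). Whether the "braking event" was an actual pedal press is exactly the point in dispute:

```python
# Sanity check of the crash timeline described above, using the
# figures cited in the comment. Treating the 0.55 s "braking event"
# as an actual pedal press is the disputed assumption.

warning_before_impact_s = 1.76  # forward-collision alert fired
brake_before_impact_s = 0.55    # recorded braking event

# Time from the alert to the recorded brake event: the window in
# which the driver had to register the warning, understand it, and
# move his foot from accelerator to brake.
reaction_time_s = warning_before_impact_s - brake_before_impact_s
print(round(reaction_time_s, 2))  # 1.21 s

# For comparison: the design perception-reaction time assumed for an
# *attentive* driver in US roadway design is about 2.5 s.
design_reaction_time_s = 2.5
print(reaction_time_s < design_reaction_time_s)  # True
```

A 1.21-second alert-to-brake interval is well under what road design assumes even for a driver who is watching the road, which is the core of the comment's argument.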

jason12745 2025-07-18 13:28

They already testified they collected zero data on Autopilot until 2018, which absolutely cannot be true. A little data fabrication doesn’t seem improbable.

dtyamada 2025-07-18 13:31

>“Almost every time he commuted from his office to his condo, he would get a strikeout,” Moore said. When that happened, McGee would pull over, put the car in park, shift it back into drive and turn Autopilot back on, the witness said. This isn't the defence they think it is. You're saying it's that easy to defeat your safety measures. If he hadn't been able to restart it that easily, maybe the victim would still be alive.

loxiw 2025-07-18 14:06

Oh he did react in 2 seconds (actually 1.6)

Acceptable_Rice1139 2025-07-18 15:04

My 2021 basic model F150 does that, and it doesn't even have radar cruise. It works extremely well and saved me twice.

snailnado 2025-07-18 15:43

For real, we need a clean court case for that exact matter.

scally1017 2025-07-18 15:55

There is a distinction between FSD and standard ‘Autopilot’. This is a 2019 crash and the article speaks only of Autopilot so we can’t conflate it with FSD.

Fun_Volume2150 2025-07-18 20:10

Teslas no longer have radar, so AEB doesn't work reliably on them.

Fun_Volume2150 2025-07-18 20:14

I really prefer my dumb car, where I never have to worry about taking control because I'm always driving, instead of being a passenger.

Fun_Volume2150 2025-07-18 20:19

People and journalists, especially journalists, have always used the terms interchangeably.

norsktex 2025-07-18 21:08

To put the blame on the driver. No other reason.

etaoin314 2025-07-19 04:52

I get that... but don't throw the baby out with the bathwater. I find FSD frightening, but there are other models of automated driving. I have a Hyundai; the ADAS is not well liked afaik, but I really like the philosophy behind it. While it is far more limited than FSD, it drives ***with*** you, not for you. I have it on in the background but I still just drive as if it were not on. I only notice it when it kicks in to make it harder to do something stupid (like drift out of my lane). It does not add as much convenience as the Tesla system, but it acts as a safety net in case I get distracted, instead of trying to be an occasionally deadly robodriver. Computer-assisted driving can either focus on convenience or on promoting safe driving. Doing both well is really tough. To me, safety is the better end of the tradeoff, and that is the way Hyundai went vs Tesla going all in on convenience at the expense of safety.

Brave_Quantity_5261 2025-07-19 06:05

I think it switches off 2 seconds before the crash so that the data does not show that FSD was on at time of crash. It was designed that way. “Look no accident ever happened with FSD on!”

demonya99 2025-07-19 09:08

The logic is to evade legal liability. “At the time of the crash, and even before the crash, FSD wasn’t active. The driver was in full control of the vehicle.” This is the kind of factually correct but highly misleading statement made possible by the two-second disengagement rule, a legal loophole that allows lawyers to shift responsibility onto the driver, even when the system was in full control leading to the inevitable crash.

[deleted] 2025-07-19 10:17

>because by switching off, they think they (Tesla) relinquishes any and all legal responsibility Source for this?

nekosake2 2025-07-19 13:49

maybe read the article we're discussing? >The company’s (tesla's) lawyer, Joel Smith, pressed a key witness for the plaintiffs to agree that an audible alert 1.65 seconds before impact — when the car’s automated steering function aborted — would have been enough time for the driver to avoid or at least mitigate the accident.

One-Peace-8139 2025-07-19 16:19

The old versions of that webpage were way more bullshitty, you can see on the wayback machine

One-Peace-8139 2025-07-19 16:21

Him pressing the throttle pedal is data provided by Tesla, who has been proven to straight up lie about data. I doubt he was pressing the throttle as he bent down to get the phone.

sonicmerlin 2025-07-20 06:39

Bet it was Elon’s decision

sonicmerlin 2025-07-20 06:42

Yeah but I highly doubt that will work in court. It’s a pretty absurd argument and no jury would buy it IMO.

hilldog4lyfe 2025-07-20 21:44

Tesla has historically had very frequent general counsel departures. I remember reading years ago that the average tenure was 6 months.

judgeysquirrel 2025-07-21 10:47

I use ADAS most of the time. I still drive while the system is on. If I have an issue, medical or simply attention wise the ADAS acts as my backup. Autopilot should have been called "copilot". FSD should have been called "navigating copilot". If something isn't level 5, I'm driving.
