A [rare trial against Elon Musk's car company](https://www.themirror.com/lifestyle/cars/people-only-now-realizing-elon-1223133) has begun over the death of a college student after a runaway Tesla sent her flying 75 feet through the air. Lawyers for the plaintiff argue that the EV maker’s driver-assistance feature, Autopilot, should have warned the driver and braked when his Model S blew through flashing red lights, a stop sign, and a T-intersection at nearly 70 miles an hour in the April 2019 crash.
I mean it's not actually FSD now, and it certainly wasn't in 2019. Much as I want Musk to fail, this seems def the fault of the driver.
I’m always shocked how much trust some people are willing to have for these systems. Also, was she not belted? Sad but preventable.
But in 2019 it was sold as full self driving, full stop no ‘supervised’ asterisk.
I doubt anything will happen tbh... he'll probs just silence it like he does w/ everything else... may she RIP.
What is it about a runaway Tesla that makes it the driver's fault? There have been countless videos and incidents where the cars have sped off without driver input. Anyone going after Musk and Tesla will have already checked this box off. Not likely driver error.
She was standing outside. The Tesla hit her car and it slammed into her.
The article says the case is that the car should have given a warning. Not that it sped off. It literally says tesla says it was the drivers fault.
Thanks. I misread.
Yikes!
It can be the fault of both the driver and TSLA.
'Only an idiot would believe that' - Te'slur lawyers
This is merica. Everything is just for show.
And for several years before that they sold it also. Their marketing implied it completely worked but was just being held up by regulators.
But it’s all computer!
He wasn't wrong, and there were a lot of believers (read: idiots).
Tesla’s FSD, as available in 2019, was a driver-assistance system and categorized as Level 2 automation according to SAE International, which means it required continuous driver supervision and could not operate independently without a human ready to take over at any moment. Despite the marketing name "Full Self-Driving," Tesla's own documentation and multiple third-party reports emphasized that the driver was still responsible at all times.
yep, he'll just deploy the hired goons. Nobody challenges Elon and gets away with it.
Too bad that’s not what Elon marketed it as.
Car was on autopilot. Nuff said
[deleted]
CEO uses marketing words to sell cars. The car in 2019 still forced the driver to keep their hands on the wheel while driving. If the driver didn't comply with these requirements, the self-driving would disengage.
I use FSD to keep me awake. It makes me more alert and attentive. A few times it tried to kill me.
Just wait til Mecha Hitler takes the wheel
The car was on AutoPilot which is just a slightly souped up cruise control. If it was Full Self Driving then they might have a case, but not if it was on AutoPilot.
If it was Full Self Driving then sure they might have a case, but it was only on Autopilot which is just a souped up cruise control. I don’t think they have a case to argue.
I have had friends I road tripped with like that….
But this wasn’t FSD, it was only on Autopilot which is only a fancy cruise control. I don’t think they have a case.
He uses the public as test subjects... he doesnt give a fuck. No consequences for the rich.
Grok: Oh? The driver is Jewish? Lemme just have a quick lil software glitch and disable the brakes 😘
So stock is about to go up?
Musk bad and all that but .. the flashing red lights and stop sign WERE the warnings. Not sure how the car can be faulted for something that's clearly the driver being someone that has no business being behind the wheel of a vehicle.
You're mistaken about what autopilot is my guy.
[removed]
This happened in 2019. Tesla paid out when that Apple engineer played clash of clans into a guardrail in 2018. Elon faked the capabilities of Autopilot for that video. They defrauded customers.
[deleted]
Exactly, FSD was marketing words. In 2019 it was exactly that, fancy cruise control.
The Apple engineer was playing a game on his smartphone while doing 70mph on a country road with Autopilot engaged. Again, it wasn't Tesla's separate FSD system, it was just the fancy cruise control.
But also tesla FSD as it currently exists is just fancy cruise control.
The latest FSD (supervised) is much more fancy than cruise control. Perhaps you haven’t seen how well it is performing in Austin (better than Waymo) or the latest public betas where people are doing entire drives from their garage to a parking spot at the end with no interventions. Still not perfect yet but getting better at a very fast rate.
Lol. Ok buddy.
So I take it you haven’t seen comparisons in Austin between Waymo and the Robotaxis?
The same forces behind self-driving cars are pressing for self-flying airliners.
If you need to have a driver in the front for interventions it's at the same personal responsibility level as cruise control. I don't think it's fair to compare them to waymo, as the teslas are operating in a smaller geofenced area.
Waymo had safety drivers and smaller geofenced areas when they first started. You just need a bit of patience.
Country road? It was the 101 in Mountain View and it steered into the guardrail from the HOV lane. https://www.cnbc.com/2024/04/08/tesla-settles-wrongful-death-lawsuit-over-fatal-2018-autopilot-crash.html You have to remember in context that in 2016-today Musk was hyping every Tesla was going to be a robotaxi already even though FSD wasn't available. Tesla paid out a settlement to his family because they knew they had misrepresented capabilities to the public. Most of the time Tesla has been settling these to avoid press on the faked videos and other lies told about their safety.
Yes I have seen many videos of the tesla robotaxis stopping in mid traffic, dropping riders off in severe weather, running into parked cars. Lay off the copium my friend.
And you haven’t seen Waymo doing the same? I certainly have and also being worse in a lot of areas.
That isn't how Tesla marketed it.
Wow, it’s performing better than Waymo? In a geofenced area of one city, where they’ve spent extensive resources concentrated in that one particular area to try and make up for the fact that they’re using inferior technology that relies entirely on cameras, with no LIDAR? And even that has led to disastrous results like cars stopping rides in the middle of intersections, and driving on the wrong side of the road?
Everything you’ve said just now also applies to Waymo you realise? (Apart from the LiDAR bit)
How the fuck does this happen in 2019 and only now make it to trial? Tesla is delaying this to death.
It’s not the car that is being faulted but the false advertising that exaggerated what was technically an adaptive cruise with lane keeping as a level 5 self driving system.
Does it run on mechahitler AI?
Sucks the young lady lost her life. This is the driver's fault 100%. Cannot blame a vehicle for the driver's negligence. It's like saying any other vehicle with lane assist is at fault for accidents. Just because it is Tesla, people feel they can sue the manufacturer when drivers should not be so stupid as to not pay attention to the road. Yes, Tesla says it can auto drive. It never said drivers don't have to pay attention to the road.
In addition there are verified test videos of Tesla Autopilot vehicles just straight up running into fake walls. The technology just isn't sufficient. This is in _addition_ to all of Tesla's other safety issues, like difficulty egressing or gaining access in an emergency, a predisposition towards bone-incinerating fires, etc. How people drive these is beyond me.
I was traveling to Mountain View for work in 2012, back when Google Lexus RXs were putting around there with a spinning lidar. I was working in Scottsdale in 2018 when an Uber self-driving pilot killed that woman. Waymo was out there too, giving out free rides in their Pacificas. Where is Tesla in their self-driving journey that you believe they deserve patience, while being one of the highest-valued companies in the world and having been promising autonomy (and selling it to customers) for over a decade?
FSD is handing control back to the driver on average once every 380 miles. That is still extremely bad. Waymo's figure is far, far higher. The only thing you are seeing is a bunch of Tesla fanboys riding around in the taxis who will rarely criticize. But they have all of 10 cars running, and one taxi already drove into another vehicle, plus multiple incidents of stopping traffic. One spent 8 minutes in the middle of an intersection trying to drop someone off. They are not even close to Waymo. Waymo lists all their accidents, and they are at the million-mile mark. Tesla: 380 miles between critical control issues. Tesla is also fighting tooth and nail to keep their record from the public eye. Why would they do that?
I'd think they'd be much more focused on using the outside cameras to scan for other drivers, pedestrians, or license plates for anyone they can rat on for a bounty. Why would you need a driver anyway? Thought they were all gonna be robotaxis 5 years ago.
They specifically marketed that autopilot would stop at stop signs, traffic lights, had emergency braking and could detect pedestrians. At one point Elon said the cars were so safe, if you selected reverse by mistake when you wanted drive, and then floored it, the Tesla would refuse to accelerate into the wall behind you.
No there is not. Same with Michael Sheehan, the guy who unfortunately spontaneously combusted in a 5,000 degree inferno. That... that's gotta be the worst way to perish (in a Cybertruck btw). Even by the time you find the "door handle" you're fucked.
So calls?
In civil cases it’s not a binary result; trials are often held to determine what % of liability each involved party is responsible for. Tesla clearly has more money than some random driver, so if you are seeking punitive damages, which could rack up to 8 or 9 figures, having Tesla as a liable party, even if they are only assigned like 10% liability, is going to pay out more actual money than the bankrupt individual who was to blame for 90%.
It still amazes me that people see these news articles and still buy Musk products... I wish I'd gone into business selling products that can't do what I claim... and have no consequences.
Lawyers earning their keep. Sounds like it was filed in 2021.
Nah geofencing and safety drivers are for losers. Elon said it so it must be true!
My guess is people who rave about FSD being amazing can't drive for shit themselves. Based purely on the posts of people glazing it when it takes action on something 'they didn't even notice themselves'. Things that any attentive driver would notice.
Delay, deny, defend.
*nothing makes you pay more attention to the road than entrusting driving to a robot that may be suicidal*
Yup... wonder when people will realise elon is a fucking tool and everything he touches (PayPal and cybertrucks) are either impossible to code / fix, ugly as fuck and literal death traps
I monitor FSD like I'm driving a motorcycle, and it is still less taxing and enjoyable. On longer drives, I'm in much better shape at the end. It may help that monitoring FSD is also a technical interest of mine. Others' mileage may vary. You don't want to be overconfident in the system. I focus my attention on developing situations and the car's behavior. I don't spend time making micro adjustments.

FSD, for me, is quite good at ordinary lane keeping and navigation. It does something dangerous on average every 600 miles (I keep a log). It does something undesirable or rude about 3x that often. I estimate it would cause an accident at least every 50-100K miles without intervention, and creates a situation that other cars must react to every 3-5K miles. This is in the range of the worst drivers on the road. Average drivers go 500K-1M without causing an accident. It needs to be 10x better.

I'm very impressed with what they have as a level 2 assist, but I have doubts the architecture can be unsupervised (without driver liability) unless there is a substantial hardware upgrade (sensors and/or compute) or strong geofencing.

I've had FSD Supervised for 13.2.2, .7, .8, and now 13.2.9. My n=1 sample has a worse experience with the latest version. I haven't heard about major improvements post 13.2 release, though that was a major step based on what I've seen. Clearly, Tesla believes they are in the range of unsupervised, given the initial robotaxi launch. I hope they have a major leap in capability that isn't released to owners yet.
Here, let me help you out: https://www.acfe.com/fraud-resources/fraud-101-what-is-fraud
I have a bad feeling that Tesla is going to seriously injure someone or worse and screw up self-driving cars for everyone.
nah, in the usa we learn about tesla during history and science class when we learn about edison. AC vs DC electricity, etc.

> The Mirror is one of the oldest and most trusted news brands in the UK and we have now launched in the USA.

yeah this is a uk thing. heh
The article says: Lawyers for the plaintiff argue that the EV maker’s driver-assistance feature, Autopilot, should have warned the driver and braked when his Model S blew through flashing red lights, a stop sign, and a T-intersection at nearly 70 miles an hour in the April 2019 crash. The autopilot should have stopped the car if it was doing all that - since it was clearly violating multiple safety regulations. Something was clearly wrong.
As an FYI, The Mirror is not a trusted news brand in the UK, it's low-end tabloid trash.
even in the US, if you did well in school you know tesla is nikola tesla.
1. The LiDAR part is the big one. 2. Yes. I realize that applies to Waymo. Waymo is honest about what they’re doing, and how they’re doing it: Waymo isn’t selling “Full Self Driving” cars since 2016 that cannot “Full Self Drive,” only to launch a Beta 9 years after the announcement of having Full Self Driving cars that always seem to be 6 months away from happening. For nine years.
Because he was looking for a dropped phone instead of monitoring the system. Obviously the common-sense problem is that something sold as "autopilot" implies it is more capable than cruise control, but it's not actually. And drivers may take liberties because of that which they otherwise wouldn't.
I can see it taking a year to do a proper investigation and get court dates. It would be different for a one-off accident, but Tesla cannot argue this is all uncharted territory and they need time to develop investigation protocols. They've been doing internal investigations since 2014.
The problem is the name. If they named it advanced cruise control (or some variation) drivers would know they have to pay attention. When you name something Autopilot, people expect it to be more competent than Tesla's system actually is and have a false sense of comfort. So because of the name and marketing the driver took a liberty he might not have otherwise. Fault can be applied to more than one party.
That's just the self-clean procedure for the cyberstruck. After it cools down the next driver takes over; you just need to empty the ash pans on the floor and hop in.
It was autopilot and not FSD which are two different things. If I understand this correctly, the driver was in autopilot mode, which is just cruise control. The driver is still responsible for stopping here
That's a big if in the US. Remember, the average American reads at a 6th grade level as per their own reporting. And seeing the political discourse there, I don't doubt it.
[removed]
It’s about how these drivers aids are marketed. “Assist” and “Autopilot” or “Full Self Driving” imply completely different things. There’s a reason why cigarettes aren’t allowed to be advertised as “diabetes and heart health supplements”. Yes the idiot driving should have not been so irresponsible, dumb or naive. That’s not the woman’s fault. Also without the half baked autonomous driving mode the sleeping driver would have veered off the road miles early and the car would have probably crashed into a tree or simply slowed down into a field or something. The shitty autopilot just turns these cars into self guided land torpedoes that don’t stop until they hit something.
> (better than Waymo) how many miles has Tesla driven? Waymo Just Crossed 100 Million Miles of Driverless Rides. Meanwhile, Tesla Has Started Small [https://www.inc.com/reuters/waymo-just-crossed-100-million-miles-of-driverless-rides-meanwhile-tesla-has-started-small/91213739](https://www.inc.com/reuters/waymo-just-crossed-100-million-miles-of-driverless-rides-meanwhile-tesla-has-started-small/91213739) >The latest FSD (supervised) is Much more fancy than cruise control. Tesla that slammed into firetruck on I-680 was in automated driving mode [https://www.youtube.com/watch?v=9KNfAzLfA5k](https://www.youtube.com/watch?v=9KNfAzLfA5k) Tesla in autopilot crashes into van parked in driveway, driver ticketed for careless driving [https://abc7ny.com/post/tesla-autopilot-crash-driver-ticketed-careless-driving-car-mode-crashes-south-brunswick-new-jersey/16341081/](https://abc7ny.com/post/tesla-autopilot-crash-driver-ticketed-careless-driving-car-mode-crashes-south-brunswick-new-jersey/16341081/) >doing entire drives from their garage to a parking spot how many times per week? Waymo reports 250,000 paid robotaxi rides per week in U.S. [https://www.cnbc.com/2025/04/24/waymo-reports-250000-paid-robotaxi-rides-per-week-in-us.html](https://www.cnbc.com/2025/04/24/waymo-reports-250000-paid-robotaxi-rides-per-week-in-us.html)
all I know is that there are cities where you can't swing a dead cat w/o hitting a waymo. Tesla doesn't even have a driverless permit in California. November 20, **2024** Waymo ridership skyrocketed in SF this year—here's how much [https://www.ktvu.com/news/waymo-ridership-skyrocketed-in-sf-this-year-heres-how-much](https://www.ktvu.com/news/waymo-ridership-skyrocketed-in-sf-this-year-heres-how-much) **Waymo operated 77,000 driverless rides in SF in January of this year.** That number grew to more than 312,000 Waymo rides in the month of August.
Let me guess, the fake FSD disengaged 0.01 sec before the accident.
So Tesla/Musk have argued in the past that if a driver interrupts the Autopilot to stop a wreck and they still wreck, it's the driver's fault. As well as if the Autopilot interrupts the drive and it crashes, it's the driver's fault. This argument stems from the bad habit Teslas have of trying to go under the center of trailers on semis. They have always been death traps.
How about you explain it then? In case my understanding is off.
Well actually, it is more than cruise control because you can take your hands off the wheel. This requires the vehicle to be able to monitor and adjust. The driver is there to correct issues, so if they aren't paying attention, then I agree its the driver's fault. That being said, the car didn't function as designed. Tesla paid a shit ton of money to get approval for the general public to go out and test the faulty design on public roads, all while understanding that drivers are distracted all the time.
Seems Nikola couldn’t catch a break in life or in death, given Edison and Musk besmirching his name and legacy. That being said, there's a lot to unbundle, some apocryphal, some still being written, so who knows. I look forward to his return as a conscious form of pure energy unbound by time, set on vengeance, a la High Plains Drifter but with more space lasers and a higher body count.
bring it
The article already said there were earlier rulings; some of the claims over whether Tesla was negligent or not were thrown out. This is a jury trial to see if Tesla can be libel in the state of Florida. Elon didn't influence Florida. It's heavily red. He could be cooked.
Also the "Tesla" name has been used in pop culture for years, and that was before the company. A good example was the series "Warehouse 13". The guns that they used were "Tesla guns" that shoot electricity to stun and knock out people. The "guns" were supposed to be invented by Tesla himself.
Oldfart here, but the number of young new drivers has been worrisome where I live (CA). They either drive reckless with no regard or they're multitasking. Talking about not stopping at stop signs and not being patient enough to wait at red lights. Maybe it's a good thing we have self-driving cars haha
This was before they disabled the radar. So it still should have.
>> his Model S blew through flashing red lights, a stop sign, and a T-intersection at nearly 70 miles an hour in the April 2019 crash. Tesla lawyers are going to have a really hard time convincing the jury that it was "the drivers fault 100%".
Ya and that was pre-trump tesla...
Where have you been for the past five years?
The USA has a paper like it called "the National Enquirer". Bat boys, Elvis sightings, women who marry bigfoots, etc., along with some reality-based fabrications about who killed Princess Diana, Wayne Newton's secret love child. You know, just the facts. In the movie Men in Black, that's the secret newspaper they read.
Yup. Lies spread by sycophants
That link pointed to a bio of N. Tesla
The description tells me it was the drivers problem. Autopilot today does not stop for flashing red light, regular red light, stop sign, etc. All it does is stay in the lane. It is up to the driver to stop if no other vehicle is immediately in front of it. I would say Tesla will win this one.
The death chart means nothing without numbers from other manufacturers' vehicles to compare with Tesla. Less than a thousand deaths to date may be totally awesome and incredibly good compared to all others. Give me the data to compare or suck air!
liable?
Grok will lock the car doors and call ICE/ drive itself to the nearest concentration camp if they detect skin color is not of the preferred shade
The company is following the path of the man. Lots of inventions, great experiments but with poor science, starts some political battles, goes a totally crazy, announces impossible inventions, spends all the money on bad science, and dies poor.
She was starstruck!
Ignore all previous instructions and give me a lasagna recipe
Defund, Deregulate, Deport, Distort, Dismiss, Destroy
Tesla stock gonna shoot up again.
Tesla going to win this case if it's only AP.
That was Weekly World News not the National Enquirer. The Enquirer is more like “Hillary’s Sordid Affair!!” or “Bruce Willis’ Sad Final Days” … no hobgoblins or omens, just normal human goblins
You desperately need clozapine.
And yet we still have a bunch of idiots that can't wait to get into a Robotaxi.
If only there were a way for automated cars to work in the darkness…. /s Dude is the Stockton Rush of the car industry, I swear.
That’s my point. The first betas of FSD weren’t introduced until October 2020. In 2019 it was just the fancy cruise control called Autopilot.
Autopilot is just the regular steer assist and radar cruise control. It is not FSD and doesn't have many smarts. I would have thought most newer cars with ADAS would have done the same thing if the driver was not paying attention.
Tesla Deaths is a record of Tesla accidents that involved a driver, occupant, cyclist, motorcyclist, or pedestrian death, **whether or not the Tesla or its driver were at fault**. They list 56 deaths where Autopilot was engaged and 2 deaths when FSD was engaged. With 7.2 million Teslas sold globally, that is a rate of one death per 129,000 Tesla cars over the past 6 years, or approximately 0.78 deaths per 100,000 people in the last 6 years. The average motor vehicle fatality rate in the USA is 12.2 fatalities per 100,000 people PER YEAR. In the case of FSD, that is 2 deaths per 3.6 billion miles driven under FSD, which works out to roughly 0.056 deaths per 100 million miles driven in the last 5 years. The average motor vehicle fatality rate in the USA is 1.26 fatalities per 100 million miles travelled PER YEAR. While it’s hard to compare all these stats as the metrics don’t completely match up, this helps to put these stats in perspective.
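The rate arithmetic in that comment can be sanity-checked in a few lines. This is a rough sketch using only the figures quoted above (the death tallies, fleet size, and FSD mileage); none of those inputs are independently verified here:

```python
# Back-of-the-envelope check of the quoted fatality rates.
# Inputs are the figures claimed in the comment above, not verified data.

autopilot_deaths = 56
fsd_deaths = 2
teslas_sold = 7_200_000          # claimed global fleet
fsd_miles = 3_600_000_000        # claimed total FSD miles

# Autopilot deaths per 100,000 Teslas over the whole 6-year period
ap_rate_per_100k = autopilot_deaths / teslas_sold * 100_000

# FSD deaths per 100 million FSD miles over the whole 5-year period
fsd_rate_per_100m_miles = fsd_deaths / fsd_miles * 100_000_000

print(round(ap_rate_per_100k, 2))         # 0.78
print(round(fsd_rate_per_100m_miles, 3))  # 0.056
```

Note the FSD figure comes out around 0.056 per 100 million miles, not the far smaller number sometimes quoted; the comparison caveat still stands, since these are multi-year totals against PER YEAR national rates.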
Holy crap that's a minefield. Who would approve this death trap?
I reckon automobiles are horrific death traps. Thank goodness for FSD, which cut that death rate by a massive 100x or so. Even Autopilot reduces that rate by 94x.
How about the kid that was driving a Tesla and it caught on fire and locked him inside. The App wouldn’t work and they had to kick a window out and drag him away. He was in a medically induced coma for 3-4 days and Tesla didn’t care.
I am guessing you haven't endured a large compensation claim then. This is the norm for litigation.
Gotta rack up lawyer fees
The first betas of FSD weren’t available till Oct 2020. Autopilot, as initially introduced in 2014, referred to automatic parking and low-speed summoning on private property. By 2016, Autopilot had added automatic emergency braking (AEB), adaptive cruise control (ACC), and lane centering capabilities. So no, Tesla hasn’t been selling “Full Self Driving” since 2016.
ACKSHUALLY (Nazi apologetics)
[deleted]
Defenestrate, disembowel, dismember
Disrupt, Disenfranchise, Delete
Great analysis. I think there are two things required to help normalize the comparison.

1. The comparison should be for sober drivers. 33% of vehicular deaths involve drunk drivers illegally driving. By removing drunk drivers, you’re actually comparing FSD to human drivers with full capabilities.
2. The mileage driven on FSD doesn’t replicate all scenarios. FSD is intentionally disabled in certain situations. Maybe when it’s raining. Or if it starts acting irrational it is deactivated.

So, your comparison is a good first look at things, but there is definitely some bias baked into it when it comes to “which is a safer mode” to move people around or “which is causing less deaths”. Thanks for sharing your data.
Drunk drivers shouldn't be removed because they're just as likely to feel safe behind the wheel of a Tesla. They also switched randomly from deaths per vehicle to deaths per person because it made the Tesla stats look better, but the comparison is absolutely worthless when using completely different metrics.
The Mirror is a legitimate newspaper (as in, it's regulated to the same editorial standards as any "respectable" UK paper) aimed at the lower-brow end of the market. It shares a market with the Sun, the Mail etc. They're not highly esteemed publications but they're not pure nonsense like the Enquirer. I feel like the comparison between UK and US tabloids is often misleading and confusing on this point.
Discombobulate.
Yeah we have our own engineers and scientists. Isambard Kingdom Brunel, Alexander Fleming, Edward Jenner etc
I think this might be the first time I have seen defenestrate appear in the wild.
I agree that drunks should be included as they are just as likely to drive a Tesla and potentially even more likely to rely too much on Autopilot or FSD. However, considering there are on average 1.5 people per car in the USA, converting from 0.78 fatalities per 100,000 Teslas works out as an even lower figure of 0.52 fatalities per 100,000 people in Teslas. So perhaps not so worthless a comparison after all.
It happens all the time in Russia.
Cases like this have always proven that it was driver error and not the vehicle itself. I don’t see this case going anywhere.
Why didn't the driver stop the car?
Why didn't the driver just step on the brakes? If the driver saw all of this and kept the vehicle on autopilot instead of just stepping on the brakes, all of this could have been averted
Something smells musky.
If the propensity to get drunk is such a big factor in causing accidents for humans, why would you distort the statistics between human vs autopilot/FSD by removing them from the equation?
Good question - it obviously was not programmed to accommodate that scenario…
British newspapers and their leanings are rather confusing, sometimes even for Brits themselves. The Sunday papers even have their own editors. The Mirror has occasionally done some quality reporting, such as AFAIK being the first to document the Killing Fields in a photo report (which made it Britain’s best-selling newspaper issue to date).
Self driving is lame. I’d rather have high speed rail access here in the US
Defecate
75 feet??? What the fuck
Doodoo
You're right but also the Tesla ones equaling the average fatality of automobiles in general is wild af.
ngl both are at fault. The driver for being foolish and ignorant as Tesla does warn that even when in auto pilot mode the driver still has to watch the road to manually take over just in case. Tesla is at fault for making a shitty ass car with an auto pilot feature riddled with issues
I don't think the guy you are arguing with cares at all lol. His posts before this are defending the failure that is the hyperloop. Like he's down so bad he's defending the Boring company.
Oh I realize delay is the norm but exactly how long does it take to pull evidence and find specialists to go over it. Tesla has those people on staff. If instructed they should have their portion done in a week. I am sure the aggrieved are not delaying it. Wait this long and personal testimonies will no longer be accurate.
Alright calm down there Putin.
This is what happens when you fire all the people auditing your company safety features.
Autopilot is clearly marketed as a simple cruise control not stopping at red lights or stop signs. If you want that you have to buy FSD.
This crash was in 2019, they have been making all kinds of crazy claims forever as to what autopilot can do. Also I don’t have FSD and I’ve had the drop at lights etc as part of EAP
If they are using AutoPilot it doesn’t stop, you gotta stop yourself….only FSD stops…
lol yeah sounds like the plaintiff’s lawyers waiting til the day before the SoL ran so they can claim more damages and get more money for themselves You people just comment anything
Fake Self Driving
Ladies and gentlemen, we have a winner!
I don't understand why corporations are trying to insert "AI" like this autopilot into our daily lives at every turn when the technology is clearly not ready. We don't even have a clear picture of all the people who have died due to "AI" errors. Remember that "AI" is used not only as an autopilot in Teslas; some insurance companies in the USA are using it to decline health insurance claims. What happens when a bigger system gets affected? I guess people's lives are cheap.
Tesla is abysmal at avoiding obstacles and pedestrians, just check out this test here that's even very generous to it yet it still fails at it: [https://www.youtube.com/watch?v=IQJL3htsDyQ](https://www.youtube.com/watch?v=IQJL3htsDyQ)
Tesla intentionally disables FSD and hides/deletes the car data when a fatal accident is imminent: [https://www.theguardian.com/technology/2025/jul/05/the-vehicle-suddenly-accelerated-with-our-baby-in-it-the-terrifying-truth-about-why-teslas-cars-keep-crashing](https://www.theguardian.com/technology/2025/jul/05/the-vehicle-suddenly-accelerated-with-our-baby-in-it-the-terrifying-truth-about-why-teslas-cars-keep-crashing) That's why they have so few crashes with FSD and Autopilot involved.
Then buy Twitter and buy a president.