The company’s FSD tech doesn’t actually self-drive. Experts say that advising customers to switch it on when they're drifting between lanes is exactly the wrong move. Read the full article: [https://www.wired.com/story/story/tesla-urging-drowsy-drivers-to-use-full-self-driving-that-could-go-very-wrong/](https://www.wired.com/story/story/tesla-urging-drowsy-drivers-to-use-full-self-driving-that-could-go-very-wrong/)
How about you just don't drive a car when you're drowsy?
>Tesla update 2025.32.3 includes an undocumented change to Tesla’s drowsiness detection feature. The change was discovered by Tesla hacker Green, who found it in decompiled Tesla code. When your vehicle detects that the driver is drowsy, or notices multiple lane departure assistance warnings, it will display a pop-up on the screen prompting you to proactively engage FSD to help you keep an eye on the road.

"Are you sleepy? Use FSD!" - Tesla

"You crashed? That's not FSD's fault, you have to take over at a moment's notice!" - Also Tesla
This has to be a clickbait title. That would be a direct incitement to customers to disregard the law, which sets them up for easy state lawsuits.
I really hate this company and its deceptive tactics. I hope their stock crashes one day.
It does, yes, and that is what they are doing. They aren’t very clever.
It's not a question of if, it's a question of when.
We’ve all been waiting a long time
Well but they’re above the law so…
Haven’t they lost several suits?
My Model Y has saved my life a number of times while on Autopilot. I live in the country and always have to drive a long distance every time I see one of my family members in Melbourne. So I can easily fall asleep and end up in a tree at 110 kph! Which I did in 2018 in my friend's Volkswagen Amarok. I thank my lucky stars that I didn’t end up hitting someone else and killing them. So yes, I use Autopilot all the time now. It’s utterly fantastic to have.
Did they not learn anything from the lawsuits? It’s not the FSD system that slaps them with enormous fines. It’s how they marketed the system. This will be $500M per incident of a drowsy driver on FSD for the next few years. Morons.
You'll probably be featured on the news one day in a fireball unable to get out of your Tesla after FSD kills you... hopefully no one else.
Learn to stop and rest if you’re too tired to drive, before you kill someone.
None of them hurt.
And maybe get a CPAP, this is a symptom of poor sleep.
Drowsy drivers should pull off the road and rest, end of story.
Remember the day after the verdict came down in Florida, when fElon retweeted some idiot playing with her hair while FSD made her car move around?
Going to get someone killed and is deceptive, but the current NTSB and SEC don’t care.
Have you ever considered: *pulling over to rest when you're tired?* I know it's old-fashioned, but it has a 100% success rate.
It's okay. They've already tested the automatic post-crash data-deletion subroutine. It works.
That should stop it from happening from now on, thanks mate. Now do drunk driving!
FUCKING PULL OVER AND SLEEP If FSD has saved your life MULTIPLE times then for the love of God surrender your driver's license.
Honestly? I get this. The eye tracking is annoying to a fault. Dozing off in stop-and-go traffic, I could see it being much safer with than without. It will beep pretty much as soon as you close your eyes. Without it, you’d just drift off into oblivion lol.
Archived version https://archive.ph/ghyiq
Thank you for helping u/wiredmagazine stay in compliance with Rule 6.
They learned that they can afford very expensive legal representation, and they have an army of assholes ready to defend them from any amount of logic and reason.
Or a car so boring it makes you drowsy
In any sane country, this would put Tesla under massive liability because the correct answer is if you’re drowsy don’t goddamn drive.
I'm sure they'll drag them out with appeals, and it will be several years before these lawsuits affect TSLA's cash position. Plenty of time for TSLA's leadership to cash in on more stonk.
Except those lawsuits they settled or lost in court. The precedent has been established. The same argument will win almost certainly without some sort of political attack on the courts.
Imagine creating self driving that you're not supposed to trust, the idiocy inherent in that amazes me.
>My model Y has saved my life a number of times while on autopilot.

Guessing these people had a different experience. **Tesla Sued In Australia For Overpromising Range, Phantom Braking, Misleading FSD** [https://www.carscoops.com/2025/02/tesla-sued-in-australia-for-overpromising-range-phantom-braking-misleading-fsd/](https://www.carscoops.com/2025/02/tesla-sued-in-australia-for-overpromising-range-phantom-braking-misleading-fsd/)

>I thank my lucky stars that I didn’t end up hitting someone else and killing them.

Keep counting those stars. **Deadly Tesla Crash Raises Questions About Vision-Based Self-Driving Systems** [https://www.carscoops.com/2025/06/tesla-fatal-crash-arizona-fsd-vision-only-safety-investigation/](https://www.carscoops.com/2025/06/tesla-fatal-crash-arizona-fsd-vision-only-safety-investigation/) Video from the Tesla shows that the roadway leading up to the crash was obscured by direct sunlight on the horizon.

>So I can easily fall asleep

Which means if the car says take over, you're kind of fucked?
How is that legal?
Someone needs to take your license away before you kill someone.
Let's just encourage inattentive drivers to continue driving. And to use a system that is well documented, even by Tesla itself no less, as not being ready for full self-driving.
What would you say if GM encouraged you to use SuperCruise when you're drunk?
Things only become illegal when legislators pass a law making them so. If they figure nobody is stupid enough to do a thing, it's generally not illegal.
Completely unrelated to the subject of the thread.
Crash = another new car sale. Stock up on this news!
Sounds like they’re trying to upsell drivers to FSD /s with incredibly dangerous and misleading in-car messaging. I don’t think that the judge in the CA DMV case can use this in deciding the case, but maybe the DMV can use it to ask for a more drastic penalty.
I ended up turning off FSD a couple of updates ago because it became completely unreliable and borderline reckless. It would slam on the brakes at shadows on the highway, drift toward white lines when off-ramps or extra lanes opened, and constantly mismanage speed — either going way too fast or well under the limit. Even lane keep assist still struggles with these issues. Honestly, I wouldn’t trust it unattended; the risks are too high. What’s most frustrating is that earlier versions worked better, with only minor glitches, but recent updates have made it worse.
Let’s hope you lose your license soon. FSD is a flawed cruise control. We get it, you probably paid Tesla $14,000 for it so now you like to go around trying to convince yourself and others that it’s something great. You were robbed. Elon thanks you for your donation!
Die while you are sleeping. No suffering. Elon is a good man.
Y'all gonna hate me for this, but it makes total sense. You obviously shouldn't drive if you're tired, but if you're going to do it anyway, FSD will warn you if you close your eyes and pull over if you don't pay attention for too long.
"Some of you may die..." \- Elon Musk
Not just Tesla, also their fanboys
That info very rarely makes it to the news which is unfortunate
The new Naked Gun movie pretty much covers this scenario
How about not trusting the worst brand for deaths per billion km (over double the US national average)? Full Self Driving = full-time monitoring for fuck-ups, at the risk of your life.
It'll be when pumping isn't enough to hide total failure.
No regulation = companies don't need to give a fuck. Buy the politician, and all the problems go away in the USA.
"Why isn't a beer dispenser in the dashboard a factory option?"
Reminds me of the old joke 'my grandfather died peacefully in his sleep, unlike the screaming panic of the passengers in his car'
This is clearly not a company where safety is a priority.
This effort to induce demonstrably impaired drivers to unlawfully operate a motor vehicle should justify imposing personal criminal liability on Tesla's directors and executive officers for the inevitable consequences.
It's normal for workers to have accidents in the workplace sometimes.
No way. It's 98% perfect! It only crashes twice a month, just like every human does.
Tesla should be urging drowsy drivers not to operate the fucking car.
Seems pretty reasonable to me. If people are honest, we all drive when drowsy at times. But, now that I have FSD (it's pretty much the only way I drive my model Y), it's so much safer for everyone in those times where I have to drive an hour or so late at night and I'm not driving in high speed urban traffic. I don't sleep, but the ability to glance away for a couple seconds or just move around some helps keep me more than alert enough to monitor FSD.
Gateway-Frederick_Pohl.MP4
Dang, just no fucks given that their program isn’t capable of unattended driving. Yeah, that will not create lawsuits.