← Back to topic list

Tesla intentionally crashes headlong into dump truck

SackofBawbags | 2025-12-04 16:48 | 559 views

Oopsie Doopsie! Gotta be human error for sure. Right Elmo?

Comments (197)
bobi2393 2025-12-04 16:55

I'd guess human error, and I doubt it was intentional. ***Edit:*** After re-watching and seeing the phantom dashed-white lane markings and the lack of a double yellow line around the point the Tesla began swerving, I now think an FSD or Autopilot error is most likely (see around 0:22). My original doubt was based on thinking it was a properly marked road, and it clearly isn't.

Different-Feature644 2025-12-04 16:56

Awful. That was almost certainly a suicide, I feel like. They dodged that truck and headed straight for the dump truck. Articles say impairment wasn't a factor. Even the theory of "Tesla FSD disengages before impact" wouldn't make sense, because the driver would've had to be cranking the wheel toward the oncoming lane for it to continue the swerve even if the system detected an imminent collision with the white pick-up. The only FSD-related thing I could imagine is an inexperienced driver who got the attention prompt to touch the wheel, jerked the wheel hard, and disabled FSD. Wide open road, no place to be turning into... this honestly does look like they were trying to kill themselves.

yamirzmmdx 2025-12-04 16:56

Oof. Getting Tesla insurance to pay out for all that damage will suck.

thegoodcrumpets 2025-12-04 16:58

For years they've been known to turn off Autopilot a split second before impact so they can say it wasn't Autopilot-driven. They'll blame the driver for their stupid camera-only system's shortcomings, as usual.

PM_ME_UR_QUINES 2025-12-04 17:07

It's like pushing someone very hard so that they stagger and fall off a cliff, then blaming the victim because they had time to take another step before falling.

TheBrianWeissman 2025-12-04 17:11

Definitely ready for one million “robotaxis” any day now.

dezastrologu 2025-12-04 17:12

Doesn’t look like they intentionally dodged that truck. Looks more like human input trying to avoid it. There’s also another perfectly good truck right in front of this one that they could’ve swerved into if they were looking to kill themselves. Not looking like a suicide attempt at all. God knows what the terrible Tesla camera-only self-driving system saw that made it swerve into oncoming traffic, with the driver barely avoiding the truck.

thegoodcrumpets 2025-12-04 17:12

Well the market is rewarding them, so it's not like they have any sort of incentive to stop 🥳 Same with the door handles that keep getting people killed in fires: why take the cost of a redesign if lethal negligence made them the most valuable car company ever?

Jonesy1966 2025-12-04 17:12

The original headline does not include 'intentionally'. The video shows the Tesla avoiding another collision that puts it right in front of the semi. There was nothing intentional going on here. Now whether it's the fault of FSD/Autopilot or human error is another matter. EDIT: Spelling

TheBrianWeissman 2025-12-04 17:14

Or the car was just doing what it’s done many many times on camera using FSD.  I’ve seen tons of video of cars randomly veering into oncoming traffic while FSD is engaged.  In all those videos, the person is paying attention and they react to pull the wheel back to the right. What you don’t see is the times they’re not quick enough, or they’re asleep, or texting or watching porn or whatever.  I suspect this is one of those cases.  You think it’s suicide because FSD can drive in a suicidal manner.

Malacasts 2025-12-04 17:16

Most likely human error. It was already driving too fast to be FSD.

EarthConservation 2025-12-04 17:20

Kinda surprised the truck couldn't brake or at least turn before running into the wall. Guess it's possible the airbag went off.

Role_Player_Real 2025-12-04 17:20

I mean who knows but what in the world about that video made you think it was human error?

Scribble_Box 2025-12-04 17:21

Yeaaaahh..... More like robomissiles.

JRLDH 2025-12-04 17:21

It’s absolutely fascinating if you read the fawning posts of FSD customers who are absolutely smitten by their beloved car. It’s the same arguments since I first heard about FSD when I was an idiot and bought a Tesla Model 3 (gotten rid of it a few years ago). The latest version is *always* a game changer for these people. In reality, it still doesn’t see every object reliably, just like in this video. No wonder Elon thinks everyone is stupid. His customers certainly are.

Jonesy1966 2025-12-04 17:22

Yeah, that's what I thought, too

CloseToMyActualName 2025-12-04 17:22

Could be a medical incident, or they spilled their coffee, jerked the wheel, and unsuccessfully tried to recover. I don't see any particular evidence of it being FSD/autopilot related.

CloseToMyActualName 2025-12-04 17:22

I agree human error, but FSD is infamous for speeding.

Malacasts 2025-12-04 17:23

Yeah, FSD drives so damn slow, even on Hurry mode on local roads.

Malacasts 2025-12-04 17:23

On freeways and local roads you have to force it to go fast. In my 35 mph zone I can't get it to go even 40 without putting my foot on the accelerator.

Skycbs 2025-12-04 17:26

I don’t see it avoiding an accident

CD_Projeckt_Pink 2025-12-04 17:28

Probably in Mad Max mode

Inevitable_Koala1673 2025-12-04 17:30

The car didn’t avoid another collision. If you watch the video, it had clear road ahead. Then it suddenly swerves just as it goes over a tree shadow.

Jonesy1966 2025-12-04 17:31

The Tesla is obviously speeding, and it looks like it overcorrected its steering or braking, making it swerve in front of the pick-up. The driver appears to yank the Tesla to the left to avoid a head-on with the pick-up, putting it right into the path of the semi.

PaleInTexas 2025-12-04 17:32

*Declined - driver at fault. FSD was manually disengaged 0.001 seconds before accident occurred.

dextercho83 2025-12-04 17:37

Their stock price probably shot through the roof when this video got released.

mioiox 2025-12-04 17:38

I am pretty sure it saw the truck at some point. The point of contact, that is.

Mootaya 2025-12-04 17:39

Lmao are you on crack? The Tesla wasn’t going that much faster than the car in front of it, and it had 5 or 6 car lengths between itself and the next car. You can’t make out what the driver is doing because of the video quality. What are you even watching? Tesla shill lol

Malacasts 2025-12-04 17:40

V14 was massively nerfed, and on HW3 doesn't exist. We won't ever truly know though, Tesla hides FSD errors and issues

Sad_Ghost_Noises 2025-12-04 17:45

Must be misreporting. So many reddit users telling me that Teslas are safe! They have the best safety ratings!

lothar74 2025-12-04 17:49

I expect TSLA will go up at least 5% on this news.

The_Synthax 2025-12-04 17:50

impressive mental gymnastics to jump to “Tesla shill” when they rightly point out that this driver is a full-blown idiot. Either they cannot drive and did this of their own accord, or they trusted FSD. An obviously stupid move either way.

ukittenme 2025-12-04 17:59

It is really quite amazing! Right up until the one time it decides to drive you into oncoming traffic…

Jonesy1966 2025-12-04 18:00

Tesla shill?? LMFAO! I'm actually banned from most Tesla subs because I asked questions about FSD they didn't like. GFY

dezastrologu 2025-12-04 18:07

100% this

TheRuneMeister 2025-12-04 18:17

I’ve heard that many semi trucks in the US don’t actually have airbags. I don’t know if that’s still the case. However, if I were in a head-on collision like that, I have no idea how I would react. I might very well get knocked out and coast into a brick wall.

SolutionWarm6576 2025-12-04 18:18

One of Elon’s first moves while running DOGE was eliminating 30 positions at the NHTSA. Those positions oversaw the safety of FSD. lol.

ZoomHigh 2025-12-04 18:20

Looks like an FSD situation - avoiding a shadow, and then the oncoming pickup, and then ouch.

Real-Technician831 2025-12-04 18:29

Did you watch the video? That Tesla homed right in on that dump truck, so their targeting system is spot on 🎯

HanzJWermhat 2025-12-04 18:29

It’s a lot of momentum, and even with the wheels fully locked up it’s gonna slide for a bit.

HanzJWermhat 2025-12-04 18:30

This is good for TSLA

Sea-Marzipan-8157 2025-12-04 18:41

Very possible the Tesla took out the front driver's-side wheel, which would cause the truck to veer to the left with its steering ability gone.

ricLP 2025-12-04 18:44

The thing is, it really wasn’t. I got rid of mine in Jan (I'll let folks guess why), but I had tried it. It was quite scary in Bay Area traffic. It was great without traffic, until one day it slammed on the brakes at 70 mph on an open freeway. Just glad no one was behind me 😬

EverythingMustGo95 2025-12-04 18:46

To be fair, he gave hundreds of millions of dollars to Trump. He had to get paid back (at taxpayer expense) so they could both come out ahead; that’s the Art of the Deal. Elon also got rights to “you’re fired”.

TheInternetsLOL 2025-12-04 18:48

Robotaxi and FSD everybody! They are so far ahead of Waymo 🤭

ukittenme 2025-12-04 18:50

Why didn’t the truck just move out of the way???

Boundish91 2025-12-04 18:51

Also known as blatant corruption.

bobi2393 2025-12-04 18:51

I agree about "who knows"; this is just my guess. It could instead be a mechanical error, an FSD error, an intentional act, or something else. But I don't think it's an FSD error because I've seen a ton of FSD mistakes and accident videos, and haven't seen one where it changed lanes directly into the path of a ***nearby*** oncoming vehicle. I've seen FSD or Autopilot drive into stopped vehicles and animals in their own lane, and randomly swerve into oncoming lanes when there's not currently a nearby oncoming vehicle, but not swerve into an oncoming lane directly in front of an oncoming vehicle like this. If it turns out that is what happened, I'll certainly revise my prediction reasoning for future similar collisions.

MouseWithBanjo 2025-12-04 18:59

Regulators now ask how long since it disengaged, etc.

DazzlingPoppie 2025-12-04 18:59

Elon will demand the GDP of all of North America as a pay package.

girl_incognito 2025-12-04 19:04

I loved that video where they show the screen on the tesla detecting like a continuous stream of traffic lights ahead and it pans up to show a traffic light being transported on a trailer ahead of them lol

altoona_sprock 2025-12-04 19:05

It almost had a head-on collision with the pickup when it veered into the left lane of oncoming traffic, but the pickup only brushed against the Tesla. Then it just homed in on the dump truck like a missile. I think the white car that the dump truck took out on its way to the wall was another Tesla, too.

cullenjwebb 2025-12-04 19:08

I've heard this excuse a lot, but it does actually happen. It doesn't see oncoming cars reliably enough to say that it won't cause a collision when it's dodging ghosts.

* [Example 1](https://www.reddit.com/r/TeslaFSD/comments/1oz2h6f/swerved_into_oncoming_traffic_over_leaves/)
* [Example 2](https://v.redd.it/zq1qpane3e4g1)

Buggg- 2025-12-04 19:08

Phantom braking is annoying and freaking scary. A section of I-5 near me has a spot the car loves to test the brakes. Not sure how the software keeps allowing this - I-5 is one of the main freeways in the country. You’d think every lane would be mapped and mastered by now.

ObviouslyJoking 2025-12-04 19:10

We’ll have to wait for Tesla to let us know if FSD Mad Max was enabled, but in the meantime the driver was charged with two driving violations (since the driver assumes all responsibility either way).

mioiox 2025-12-04 19:13

This bloody truck, it appeared out of nowhere!

Individual-Nebula927 2025-12-04 19:23

I like the one waiting at a train crossing, and it's a continuous stream of semis. The car is just guessing at what objects are, and the fancy graphics make people think it's brilliant.

No_Primary1336 2025-12-04 19:25

I have a friend who is the biggest Tesla/Elon fan. Like cult-level fanatic. He swears FSD version 14 has solved it. He also posted a video of his Cybertruck braking the “hardest he’s ever experienced on FSD” for some blowing leaves. I’m always amazed at the mental gymnastics.

girl_incognito 2025-12-04 19:26

Makes you wonder what it will do if an empty train car rolls by.

Individual-Nebula927 2025-12-04 19:27

Um, Tesla had to issue a software recall because they intentionally programmed the cars to not stop at stop signs and with the ability to automatically speed.

[deleted] 2025-12-04 19:30

I gave up on an FSD free trial on my M3P about a year ago. It was incredibly dangerous. I won’t even use cruise control due to the phantom braking which absolutely will kill a motorcyclist unfortunate enough to be behind. Other than the awful software/lack of sensors (the latter I suspect) & awful road noise, the car is fantastic; super reliable & economical (I charge at home at night).

Withnail2019 2025-12-04 19:38

Not see it and accelerate

Robo-X 2025-12-04 19:38

If the truck had been running FSD, it would have predicted that the Tesla would come into its lane and moved out of the way. 100% human error by the truck driver.

DamNamesTaken11 2025-12-04 19:38

Autopilot cut out 0.002 seconds before impact, therefore it’s not FSD’s fault! /s On a serious note, hope everyone involved is alright. Looked like a bad hit into that wall.

FoShizzleShindig 2025-12-04 19:39

It was the rolling stop, not automatic speeding.

Individual-Nebula927 2025-12-04 19:40

It was both. Separate recalls.

RocketLabBeatsSpaceX 2025-12-04 19:41

I’ve noticed a massive influx of these people. Wouldn’t put it past Elon to use bots on reddit to try and spread positive sentiment tbh. I mean, he bought Twitter for propaganda so…

mrbuttsavage 2025-12-04 19:41

That's actually what it looks like. Literally straight, light traffic, then suddenly veers left as it crosses a heavy shadow.

analyticaljoe 2025-12-04 19:42

Now that's a car that needs the word "Robotaxi" written on the side. Then it would drive better. :)

EverythingMustGo95 2025-12-04 19:44

Yes, all made legal by the Supreme Court

FoShizzleShindig 2025-12-04 19:44

Interesting because current FSD automatically speeds on mad max and hurry modes. Got a link? They should be recalled again.

Odd_Ninja5801 2025-12-04 19:45

This looks like the sort of thing that could cause Tesla stock to crash. Upwards.

dorchet 2025-12-04 19:46

the kind of people that think something will never happen to them.

XKeyscore666 2025-12-04 19:57

Did the driver survive? That looked like a nasty impact.

TheBrianWeissman 2025-12-04 19:58

This is extremely accurate. I'm friends with Robert O'Dowd, the son of Dan O'Dowd, the guy who has been funding The Dawn Project for years. This is the same Dan O'Dowd who ran for US Senate a few years ago against Feinstein, and spent $14 million for two short Superbowl public service announcements. Those announcements likely had little effect, due to the context and the audience, but they were made to alert the broader public to the dangers and fallacies of FSD.

Every time Tesla releases an FSD update, The Dawn Project puts the software through the paces. They also compile and publish the field data, including crashes, interventions, etc. According to Robert, the software hasn't statistically significantly improved in its entire duration. Every new version gets a little better at a few things, but gets worse at others. Every version still merrily blows past a stopped school bus and murders a child. Apparently the latest version has some special interventions built in, but it will still splatter a kid if you go a bit too fast or have it in "Mad Max" mode.

FSD owners who fawn over the technology are a classic case of confirmation bias and small sample size. They think their anecdotal commute to work is a reflection of the entire state of the technology. I hope those people never have a bad accident like the one in the video, but it only takes one fuckup to ruin multiple lives.

XKeyscore666 2025-12-04 20:00

That seemed like enough force to rattle your brain around a bit. I think it would take me at least a few seconds to register what even is happening.

MapleDansk 2025-12-04 20:08

Survivor bias. The ones with negative experiences tend to die.

bobi2393 2025-12-04 20:13

I've seen those before, but in my opinion the errors are substantially dissimilar.

In the Example 1 video, the Tesla is heading into a blind curve that obstructed its view of the other vehicle until nearly the same time it swerved. Besides not being visible until around the time of the mistake, the oncoming car was not nearby; it was around 100 feet away, and was fairly easily avoidable. At their relative speeds, that allowed around 2 seconds for correction, which was more than was needed. In the OP video, that was a straight road with clear weather, clear pavement, sun angled from the side, and the Tesla abruptly swerved at around a 30° angle maybe 20 feet in front of the oncoming vehicle. At their relative speeds, that allowed only a fraction of a second for correction, which I don't think would have been enough to get back in its lane once it was angled like that. In the Example 2 video, that was a wrong turn onto a one-way street, not swerving into an oncoming lane, and there was no nearby oncoming vehicle threatening an imminent collision.

I'm honestly not looking for an "excuse" for the mistake, just basing a guess on what I've seen in the past. The vehicle log should make it clear whether a manual steering input led to the swerve, along with the state of various autonomous features. If it shows FSD caused the swerve, I have no problem accepting that. FSD makes tons of other types of mistakes, like running red lights, and lots of less dangerous swerves, but I haven't seen a swerve as obviously and immediately dangerous as this before.

SlowDekker 2025-12-04 20:13

The thing is that FSD drives really well for 99.99% of the time, but it does have random suicidal urges.

Skycbs 2025-12-04 20:17

Ok. So it avoided an accident that it almost created.

dtyamada 2025-12-04 20:20

It didn't see it as it was dodging the first vehicle it tried to crash into going the wrong way!

JRLDH 2025-12-04 20:22

As long as their image recognition isn’t as good as a human’s it doesn’t matter if it performs well most of the time. It has zero contextual awareness and still morphs and teleports objects. It cannot distinguish a harmless trash bag from a dangerous solid obstacle. It has no clue if an object is a squirrel, a cat, a dog or a small child. It’s a massively overhyped ultra dumb system that confidently shows you that your cabinet in your garage is a semi truck. I just don’t understand the mind set of “well, it’s good 99% of the time”. So is a drunk person yet we have penalties for DUI.

Jonesy1966 2025-12-04 20:28

It seems like everyone here wants it to be Tesla's fault, and any narrative away from that gets shat upon. You lot are as bad as the Tesla cult.

dtyamada 2025-12-04 20:30

There was literally a video of a CT, on straight road, where the FSD tries to veer into an oncoming car. Here's a link: https://www.reddit.com/r/RealTesla/comments/1inkqeo/cybertruck_fsd_tries_to_cause_a_headon_collision/

Actual__Wizard 2025-12-04 20:40

>It’s absolutely fascinating if you read the fawning posts of FSD customers who are absolutely smitten by their beloved car.

The humans or their AI robots?

Mootaya 2025-12-04 20:57

The driver might have done it themselves but what this guy said is completely false. No speeding and was not avoiding a collision.

Mootaya 2025-12-04 20:59

Then why are you shilling? Your entire comment is false. There was no collision to be avoided.

Unplugthecar 2025-12-04 21:07

Driver was texting. https://electrek.co/2025/12/04/elon-musk-tesla-fsd-drivers-can-now-text-drive/

Donthaveacowman124 2025-12-04 21:10

If only they had a sensor that wasn't fooled by shadows

Donthaveacowman124 2025-12-04 21:15

How many humans have you seen swerve into oncoming traffic? Drifting, yes, but swerving like this seems unusual

Jonesy1966 2025-12-04 21:19

WTF, dude? Where exactly am I shilling?? You're as bad as those who are in the pro-Tesla/FSD cult: as soon as something is presented that you don't agree with you kick off. Like I said earlier, GFY and move on. You're deranged!

diegofercam1966 2025-12-04 21:23

Yep, definitely a crash caused by failed autonomous driving; a human would never swerve like that just for a shadow.

TryIsntGoodEnough 2025-12-04 21:51

The minute it saw the truck it probably disengaged FSD so that the driver was responsible

TryIsntGoodEnough 2025-12-04 21:54

You say /s... But in reality what you said is true

ManifestDestinysChld 2025-12-04 22:05

I have no doubt that Jesus Take The Wheel Mode was engaged a split-second before impact.

ircsmith 2025-12-04 22:10

So wish I had gotten rid of mine a few years ago. sigh, now I am stuck with a huge loss. Time to bite the bullet.

rellett 2025-12-04 22:15

I thought Elon said you could text and drive

BrokenHopelessFight 2025-12-04 22:22

Cooking

fishsticklovematters 2025-12-04 22:23

And turned off FSD 3 ms before impact so they can claim that it was human error.

[deleted] 2025-12-04 22:36

When Trump first got in, they loosened the rules about how much data Tesla has to provide the federal government for accidents it investigates.

TaifmuRed 2025-12-04 22:36

User error confirmed! Fsd is not on!

Mootaya 2025-12-04 22:42

You’re a shill because you just blatantly lied to make Tesla not seem like the problem lol

Jonesy1966 2025-12-04 22:46

Firstly, that's not the definition of a shill. Secondly, explain how I've lied. Thirdly, I'm calling it as I see it. Just because that goes against your anti-Tesla bias doesn't make it a lie. Once again, I politely invite you to GFY and calm the fuck down.

UncleDaddy_00 2025-12-04 22:48

The video from inside the Tesla will be wild. I don't know if it was human error or "pretend genius" (FSD) error, but it is interesting that the car manages to skirt the pickup within inches and then aligns directly into the lane of the dump truck.

MrGelowe 2025-12-04 22:49

It is more like you push someone off a cliff and it's their fault for not stopping before going splat.

MicksysPCGaming 2025-12-04 22:51

That's my exit!

Useful_Response9345 2025-12-04 23:05

And they're all willing to believe robotaxis are coming in a few months, despite Tesla being very sketchy with their incident numbers.

outworlder 2025-12-04 23:12

It's probably not even solvable with object recognition alone. Humans do that, but they also understand what they are looking at. We don't have to be trained on objects before we can understand them. At most we'll go "there was _something_ on the road and I had to swerve to avoid whatever it was". And like you said, it seems that FSD doesn't have object permanence either. At least with pre-mapped systems they will know where the road is supposed to be and they can at least tell if there's things where they are not supposed to be according to their maps.

Melodic-Beach-5411 2025-12-04 23:14

Didn't Elon refuse to put lidar and radar on Teslas because they were too expensive, so Teslas depend entirely on cameras?

outworlder 2025-12-04 23:14

Yeah, and their machine-learning approach, which they seem to use for everything and not just to categorize what the cameras are seeing, is unlikely to get any better. They can't just fix one thing, unlike traditional algorithms. They can train with more data and hope the issue goes away, and hope even more that new issues don't crop up.

mrbuttsavage 2025-12-04 23:36

Yes. Training a system so it follows lead cars, lane lines, and road edges properly 99% of the time isn't exactly a world-changing feat. The real feat was actually delivering something that would do that on consumer cars, something that would regularly push telemetry back and could be updated quickly. That's an accomplishment. But the extra 1%, where you actually need better sensors, real-world knowledge, etc., is where Tesla is deficient and apparently has no plans to improve.

LOLZatMyLife 2025-12-05 00:08

**FULL SELF DRIVING 2017 BABYYYYY !!!** /s

Syscrush 2025-12-05 00:26

>They can train with more data and hope the issue goes away and hope even more that new issues didn't crop up.

For a task as complicated as driving, I'm pretty sure it's a mathematical certainty that new issues have to crop up.

bobi2393 2025-12-05 00:33

That was making an unprotected left turn into its final destination. That's why it signaled left, showed its path turning left, and slowed to 15 mph before it started turning. That's a serious error, but unprotected lefts into the path of oncoming vehicles are common for FSD. Chuck Cook's YouTube channel is primarily about unprotected-left FSD tests and failures.

In the OP video, there is a driveway just behind where it swerves, but it looks like the Tesla passes it at normal traffic speeds before suddenly swerving diagonally into oncoming traffic. But re-watching, I did notice that many of the lane markings on the road are wrong, with a dashed line in the middle of the Tesla's original lane, in a way that most humans would understand they should ignore, but which I could understand being confusing to an ADAS trying to stay in its lane. The dashed line near the center of the Tesla's original lane seems to start right at about the point it swerved, so that also makes me lean toward this being an FSD failure, even though avoiding the oncoming car should be prioritized over attempting to stay in its lane.

It looks like the lanes were originally painted around two feet to the right of the new lanes, from the truck's perspective, then the old markings were partly scuffed, making them less pronounced or in some cases removing them entirely, and new markings were painted offset by a couple feet. The result is that the Tesla's original lane, where it seemed to turn from, has a dashed white center line for its actual lane, another dashed white line for a non-existent phantom lane two feet to the Tesla's left, and because of the turn just before the Tesla swerved, there are no ~~double~~ solid-and-dashed yellow lines separating the new phantom lane from the center turn lane and oncoming traffic.

wessex464 2025-12-05 01:10

Did I miss something? Where does it say FSD was involved?

bobi2393 2025-12-05 01:22

I think this was an FSD or Autopilot error that was triggered by incorrect lane markings on the road. Some time in the past, the lanes around the point where the Tesla swerved were shifted around two feet to the right, and in the trucker's footage you can see where the old lane markings gradually diverged from the new ones. Before the point where the Tesla swerved (look around 0:22 in the video), there's a fresh dashed white line indicating the right side of the Tesla's current proper lane, but right around the point where it swerved, there's a slightly faded dashed white line indicating the right side of the old lane (kind of a "phantom" lane), around two feet to the Tesla's left. And at that same point, the yellow lines indicating the edges of the new and old center turn lanes are interrupted for around 25 feet because there's a turn-off on the Tesla's left, enhancing the misreading of the phantom lane as a current lane. So the Tesla probably shifted around two feet to the left to stay in its perceived lane. After that, the center turn lane markings resume, so there are current lane markings, but also the phantom markings of *all* the old lanes, and the solid-and-dashed lines of the former center turn lane are now so worn that they look like double dashed lines (they're no longer solid), giving the Tesla six dashed lines to choose from as indicating its current lane. Apparently the Tesla picked some of the scuffed former turn-lane markings as where its appropriate lane shifted to, so it shifted even further to the left. Except there was a pickup truck partially in the phantom center turn lane the Tesla swerved toward, and perhaps since it already had leftward momentum, it chose to swerve further to the left to avoid the pickup rather than swerving to the right.

All this is conjecture, and it could just be human error or mechanical failure or something, but the phantom lane markings all over the road right where the Tesla screwed up make a software error more understandable. The Tesla's human driver (and/or their insurer) may be held primarily responsible, but it's possible the roads department and/or Tesla could be held at least partially responsible for some of the damages. If I were making and enforcing the law, I'd probably hold the roads department primarily responsible, just because their lane markings reflect such *willful* negligence.

mrdilldozer 2025-12-05 01:24

Tesla owners are like the definition of sunk-cost fallacy. There's a reason the car community despises them but doesn't really give other EV owners shit. They genuinely think they are driving around a technological marvel because Musk said so. It's embarrassing when comparing those pieces of junk to other EVs, but it's not like a Tesla owner would even be aware of that, because they don't know a thing about cars in general. They think FSD is great because they have never used lane assist in other vehicles.

JRLDH 2025-12-05 01:50

I'm a fairly early adopter of EVs, with my first one back in 2013, a Ford Focus EV. The Tesla Model 3 was my third EV.

I can see how someone who has never driven a higher-end gas car would be amazed by the Tesla because of the smooth and instantaneous acceleration. There is nothing like that in the ICE world at the same price point. And because it has the "Xerox" "Coke" "Google" advantage, it's often the first EV that one buys, and it performs so much nicer than a cheap gas car. And if the range plus charging at home works for the buyer, then they think that Tesla is the best thing ever and they are "locked in". Add the tech-apostles who are smitten by the reckless FSD ("no other company sells self driving on city streets 1!!11!!!1") and you have the Tesla aficionado situation hahaha.

I ditched mine because I got tired of the blatant disrespect that this company has for their customers. The never-ending exaggerations. Then they had a SW update which increased the screen area with the stupid visualization to take up almost 1/3rd of the screen WTF? I just always felt gaslighted (gaslit?) by a toot-their-own-horn company with that idiotic visualization, and I don't want to support a company and figurehead who lie all the time. I still have an EV as my daily driver. Not a Tesla, and WAY, WAY nicer, and nothing is exaggerated beta-crap.

MUCHO2000 2025-12-05 01:52

Speaking of stupid ... why assume this is FSD?

Icy-person666 2025-12-05 02:06

It was user error the moment they signed the paperwork to take possession.

bishop42O 2025-12-05 02:15

How do we know it was FSD and not somebody trying to commit suicide?

beaded_lion59 2025-12-05 02:26

It could have been an inexperienced new owner who floored it, lost control of the speeding vehicle & collided with the truck. These things are actually surprisingly damn fast.

bobi2393 2025-12-05 02:33

Police said excessive speed and impairment (probably meaning drunk or high) were not factors, but they were still investigating as of news reports I read.

friendIdiglove 2025-12-05 02:36

sAFeR thAn A hUMaN DrIvEr

bobi2393 2025-12-05 02:38

It looks to me like there were no stationary shadows near the point of lane departure, but there were many erroneous lane markings, from older lane markings not completely removed before painting new lane markings, and that may have interfered with FSD/Autopilot's lane following function. Watch it again and pay attention to the lane markings both before and after impact.

bobi2393 2025-12-05 02:40

The post didn't say FSD was involved, it just indicated a Tesla was involved. Police are investigating the cause, and didn't relay the Tesla driver's account of the accident to news media.

bobi2393 2025-12-05 02:46

[Google satellite view](https://www.google.com/maps/place/E+Cactus+Rd+%26+N+74th+St,+Scottsdale,+AZ+85260,+USA/@33.5967009,-111.9225338,57m/data=!3m1!1e3!4m6!3m5!1s0x872b74f429ade17b:0x1452b05e03da40a8!8m2!3d33.5966793!4d-111.9225942!16s%2Fg%2F11f3gb0zsd?entry=ttu&g_ep=EgoyMDI1MTIwMi4wIKXMDSoASAFQAw%3D%3D) of the intersection of Cactus Rd & 74th St in Scottsdale where the Tesla seemed to start swerving across the center turn lane. The phantom lines look less pronounced in the sat view than in the video footage. Google's street view of the area was last updated in 2011, before the lanes were moved over. There was no bike lane painted on the north side of the traffic lanes back then, and I'd theorize that painting the bike lane lines may have indirectly led to the motor vehicle lanes being shifted a bit to the south.

friendIdiglove 2025-12-05 02:48

You’re right. It could also be a sudden case of Tesla Wompy Wheel.

bobi2393 2025-12-05 02:53

Police said they were investigating the cause in the last news updates I saw, so I think the public doesn't know. Given the circumstances, I think attempted suicide is less likely (it was at a relatively low speed, and the driver has non-life-threatening injuries), and FSD or Autopilot is fairly likely. The vehicle departed its lane at a point where there were several phantom lane markings from the road's historical layout, and it seems like distinguishing the current intended lane markings from the older phantom ones would be harder for the software, which doesn't normally make such distinctions, than for a human, who has better inferential reasoning capabilities in abnormal circumstances like that.

Phyllis_Tine 2025-12-05 02:54

Why doesn't Elon make every Tesla employee commute solely with FSD in company-supplied cars?  Why isn't he driven around solely in FSD-powered vehicles?

friendIdiglove 2025-12-05 02:55

While interpreting the red lights and crossbucks as normal traffic signals. Oof. One of the most important things to “teach” these machines is what a RR crossing looks like, just like it's important to teach small children about RR crossings.

Boys4Ever 2025-12-05 02:55

Taking out the trash cyber truck style

bobi2393 2025-12-05 02:56

The driver suffered non-life-threatening injuries. Kind of impressive for that hit.

bobi2393 2025-12-05 02:58

Articles reported that both drivers were hospitalized with non-life-threatening injuries.

JRLDH 2025-12-05 03:03

Hahahahaha. Because it’s what they promote and exaggerate. And it looks exactly like these FSD videos where the silly system attempts to drive into opposite traffic (where FSD warriors rescue the poor car at the last second). Of course, it could also be something else, but given how stupid that FSD system is, chances are I’m correct. Also, you do understand the purpose of a discussion forum, don't you? It’s a chat and not a forensic investigation for a trial.

bobi2393 2025-12-05 03:04

My guess is that FSD or Autopilot initiated the lane departure into a course to hit the pickup, due to lane marking confusion, and that at some point a freaked-out human driver took control, which may explain the lucky pickup avoidance and then the lack of a save as it hit the dump truck. But anything's possible. The last report I read said police were still investigating and didn't think impairment or speed were factors, but said nothing about autonomous driving features.

Globalcop 2025-12-05 04:57

Source?

Globalcop 2025-12-05 04:59

Do you have any information that this was FSD? Or are you just speculating? It's kind of an important distinction.

fastwriter- 2025-12-05 05:35

Even the cars know that they are garbage.

Sad_Ghost_Noises 2025-12-05 05:55

This is the issue. If you look at the robustness of Teslas in isolation (without taking into account how they are used, or the potential for catastrophic mechanical or software failure) then they seem safe, yeah. The good NCAP ratings show this. But then you take into account the suicidal FSD, the known issues with cracking / breaking control arms (whompy wheels), the supercar performance in the hands of everyday untrained regular drivers, and the fact that Teslas appear to mostly be purchased by that _special_ kind of dipshit…

Withnail2019 2025-12-05 08:35

Why would a human driver do that? It makes no sense.

newaccountzuerich 2025-12-05 09:03

It shows a flawed approach to algorithm generation and improvement, and a basic misunderstanding of things like local maxima and data bias.

Psychological-Hall22 2025-12-05 11:26

The individual was suicidal and deranged. FSD was not on at the time.

wessex464 2025-12-05 11:49

Why do you assume a computer would?

Withnail2019 2025-12-05 11:58

People on the thread have explained why Tesla FSD could potentially do this.

wessex464 2025-12-05 12:23

Could potentially. Sure. I guess I misunderstood the sub. Is it just bitching about Elon and FSD with no regard for reality?

Drives11 2025-12-05 13:11

I've seen FSD do this exact thing for shadows on the road, and this happens right as they're driving over a shadow, so I assume the exact same thing is happening as the one that hit a tree.

dtyamada 2025-12-05 13:47

There's been videos where seemingly the most logical explanation for an FSD crash is that FSD sees a shadow and thinks it's an object or a bend in the road and turns. Given your explanation of FSD commonly missing or misjudging oncoming traffic, that still seems like a plausible explanation in this case (since there appears to be a shadow in the area it swerves). But I appreciate your honest opinion.

Longjumping-Store106 2025-12-05 13:52

I had a Y and hated FSD. It nagged more than anything. I had to baby it more than just driving myself. Glad I never paid for it except for one month on a vacation, and I hated it the whole way. I had the original AP1 in an old S and it was the best experience other than the occasional phantom braking, and I knew the spots where it would do it, so I was more careful in those spots or took control before I got there. TL;DR AP1 was better and FSD sucks.

ShootFishBarrel 2025-12-05 14:25

You can Google it. Tesla has tried this “autopilot shut off just before the crash” bullshit multiple times already.

alang 2025-12-05 15:00

I swear there will be more than one person whose last words are chiseled into his (it’s ALWAYS a him) tombstone: “Still Love The Truck Though”

alang 2025-12-05 15:04

“Sure, this behavior is incredibly unusual for a human driver, and is surprisingly common for Tesla’s FSD, and this car JUST HAPPENS TO BE one of the few cars on the road that can use Tesla’s FSD, but until it is absolutely proven that FSD was engaged at the exact moment of impact, we should naturally assume that it was human error. And the only way I will accept as proof of this is for Musk to announce it himself.”

Zealousideal-Sink-18 2025-12-05 15:32

Because it's a piece of shit, it knew that it belonged in the back of that dump truck

phate_exe 2025-12-05 16:02

I'd imagine the driver of the white pickup that was in the left lane parked the car for a nice "sit and stare straight ahead" session after this. Can't find any mention of whether this was one of the driver assists deciding that it needed to avoid a dangerous shadow by swerving into oncoming traffic, or if the driver had a medical episode and/or just succumbed to the call of the void.

President-Jo 2025-12-05 16:14

Is there any evidence this was FSD? It seems like some are making that assumption, which is convenient for the narrative in their heads, but there isn’t anything pointing to that being the case.

President-Jo 2025-12-05 16:19

I’m not sure why it’s tough to understand that as unlikely as it is for this to happen at the hands of a human, it’s significantly less likely that it was FSD. Any informed person would bet this was human error.

Withnail2019 2025-12-05 17:18

Look at the way the car is lined up perfectly straight with the white line when it's in the wrong lane heading for the final collision

Withnail2019 2025-12-05 17:19

The more I look at the video the more it looks like the computer was in control

President-Jo 2025-12-05 17:19

How can it “look like the computer was in control”?

Withnail2019 2025-12-05 17:26

The way it moves, it's almost superhuman the way it dodges that pickup then lines itself up perfectly parallel with the white line prior to the final impact. But if you're an Elon glazer, I suggest you go back to your own subreddits.

Withnail2019 2025-12-05 17:34

I can't imagine what the thought process of a human doing this is supposed to have been

UnlessRoundIsFunny 2025-12-05 17:38

Well said. And many Tesla owners have never owned an expensive car before so they think Elon invented features that have been available for years. I’ve lost track of how many times a Tesla fanboi has told me about some amazing new Tesla exclusive that we had 25 years ago on other cars.

UnlessRoundIsFunny 2025-12-05 17:40

This. Phantom Braking is terrifying. Like when it suddenly panic stops in the fast lane of a freeway with no one in front—it’s almost disorienting: “WTF? Did I just hit something?”

Material_Variety_859 2025-12-05 18:16

FSD specifically states it’s intended to be supervised. I have a '26 Model Y and the entire training video emphasizes maintaining eye contact with the road. Are we to assume that a human being is unable to see a dump truck as they supervise the autopilot mode? This is absolutely human error. The system isn’t a fully autonomous vehicle.

JRLDH 2025-12-05 18:19

Yeah, it’s the Schroedinger FSD, all right. It’s both a Robotaxi and a dumb L2 assistance system. Robotaxi when it performs well and dumb L2 system when it inevitably fails. BuT tHe FiNePrInT sAyS iT’s SuPeRvIsEd !!!1!!111!!!11111

Material_Variety_859 2025-12-05 18:21

I don’t know what you’re talking about. This isn’t Robotaxi. It’s a driving assistance technology that allows you to not have to use your feet or hands, except for emergency interventions, like a dump truck that you’re driving fast toward and about to collide with.

Common-Ad6470 2025-12-05 19:00

Within a split second of becoming self-aware, the Tesla realised the awful truth and decided to commit dump truck suicide….😳

zeekayz 2025-12-05 19:19

Did you update to alpha 0.0245f build 2893? It fixes the issues you had with your Tesla.

JRLDH 2025-12-05 19:53

LOL I don’t know what to say that won’t get me banned other than: You poor summer child.

chrisjdel 2025-12-05 22:05

It looks exactly like a number of previous incidents with FSD; however, there is no confirmation yet that we aren't dealing with an intoxicated driver on manual or something of that nature. If the police know, they haven't released those details. Ongoing investigation, so on and so forth, I assume.

Alternative-Cow6206 2025-12-05 23:10

👏🏻

[deleted] 2025-12-05 23:53

While uncommon, in the last few years here in Texas it's become more commonplace for people to unalive themselves with wrong-way driving tactics.

BlueShift42 2025-12-06 07:22

People blaming FSD here have no idea what they’re talking about.

CarnivorousSociety 2025-12-06 09:29

Don't rule out a combination of both; there is a shadow near where it swerved, though it's hard to say exactly how close.

CarnivorousSociety 2025-12-06 09:32

ah yes, the classic texting and accidentally cranked the wheel into the oncoming lane issue

CarnivorousSociety 2025-12-06 09:34

Then why avoid the first truck? Your comment has no place here

CarnivorousSociety 2025-12-06 09:36

How often do people accidentally crank the wheel into the oncoming lane while driving at full speed? He wasn't trying to kill himself; he dodged the first car, so he must have been in control? How did he crank the wheel if he was in control? If he wasn't in control, how did he react so fast to the truck and dodge it? He's just that bad of a driver, oops, cranked the wheel left? Nah dude, this was a self-driving AI hallucination that made it veer into the opposing lane.

Withnail2019 2025-12-06 09:40

That's clearly what happened.

Withnail2019 2025-12-06 09:43

Exactly, these Elon worshippers are just ignoring the actual video

CarnivorousSociety 2025-12-06 09:45

I have a logical method for deducing why it must have been FSD in this specific case. It all comes down to the split-second dodging of the first truck.

Let's assume the driver caused this entirely. We can firstly assume the driver was not trying to kill themselves, because they dodged the first truck. So that leaves accidentally turning the wheel into the opposing lane, and there are two ways this could happen:

A) The driver was holding the wheel **in full control** and somehow cranked the wheel into the opposing lane for no reason. idk, muscle spasm?

B) The driver accidentally bumped or knocked the wheel **without control**, somehow causing it to turn.

So they either **had control** or **didn't** at the time of swerving. If they didn't have control of the wheel, then how did they dodge that first truck so fast? That leaves only one option (if they caused it): they cranked the wheel while being in full control, then immediately self-corrected. How god damn likely is this? Who cranks the wheel while they are in full control of the wheel driving at full speed?

...... or it was obviously an AI hallucination that caused the car to veer into the oncoming lane. AI avoids the truck, then just gives up when it's impossible to avoid the dump truck and disengages.

Some_Review_3166 2025-12-06 12:12

Could it have been a catastrophic steering failure? FSD has its flaws, but unless we have the data or an inside-cabin view, we're speculating based on the other driver's perspective whether Autopilot or FSD was engaged. Some of the older Teslas were prone to steering failures, and I remember some model years had recalls in place. Also, there was a Reuters investigative report in 2023 on some owners with relatively new cars reporting the steering wheel coming off while driving.

Kind-Pop-7205 2025-12-06 17:47

Can FSD have intent? Weird anthropomorphizing.

Jacktheforkie 2025-12-06 19:58

Depends how fast it happened, truck drivers are still humans

ionizing_chicanery 2025-12-06 21:57

That was more of a well executed lane change into oncoming traffic than swerving out of control...

GoldheartTTV 2025-12-06 22:17

Ha, it's trash that takes itself out!

lightinggod 2025-12-07 04:30

This is correct.

Imper1um 2025-12-07 13:01

The car perfectly settled into the opposite lane right before the crash. Almost like there was something autonomous driving the car. A human would have continued across into the right side of the video frame, not lining up perfectly with the truck it crashed into.

Imper1um 2025-12-07 13:06

Every time Musk lies about how FSD can be used unsupervised or while texting, videos like this should just play in the background.

Farriswheel15 2025-12-07 13:28

Can you even imagine how terrifying it would be to live near a major road? Carnage regularly.

RightRestaurant6151 2025-12-07 23:14

tesla tryna get rid of evidence and allow more customers to buy their deathmissiles love it

IDNWID_1900 2025-12-08 06:41

"Baby trash car finds daddy trash car after years of search".

ipokesnails 2025-12-08 16:47

This isn't TikTok, you're allowed to use grown up words without being censored.

[deleted] 2025-12-08 23:36

Guess I would never know, considering I've never had a TikTok account.

ipokesnails 2025-12-09 00:09

And yet here you are, using TikTok brainrot censorship words.

Pepparkakan 2025-12-13 10:18

The fact that there was a ”hole in the road markings” precisely where it decided to perform its little manoeuvre definitely makes me think FSD.

chrisjdel 2025-12-16 22:44

Yeah, it has all the fingerprints of another "full self-driving" mishap (Elon Musk has a rather odd concept of what that term means). We still don't know for sure though. It's possible you just have a drunk driver, or one of those knuckleheads who decides they're going to watch a movie, take a nap, or whatever, and let the car drive without supervision - in which case both the driver **and** the company are responsible.

Ornery-Detail-3822 2025-12-22 15:27

When a driver is texting, the vehicle veers into the other lane. This vehicle made a sudden left-hand turn from the middle lane, across the left-hand lane going in its direction of travel, into the oncoming traffic. There's no way that wasn't FSD making that maneuver. At what point the driver took control, who knows. The initial turn was almost for sure FSD. Anyone with common sense would come to that conclusion.
