Tesla influencers tried Elon Musk’s coast-to-coast self-driving, crashed before 60 miles

chrisdh79 | 2025-09-22 10:41 | 1636 views

Comments (173)
Suspicious-Call2084 2025-09-22 10:47

Well that’s awkward.

Ok-Bill3318 2025-09-22 10:48

And this is why FSD that “you need to supervise” is fucking useless.

Pot_noodle_miner 2025-09-22 11:01

Credit to them for still posting the footage

BuckChintheRealtor 2025-09-22 11:05

Wait, but Leon said in 2016 that "[a driverless Tesla will travel from L.A. to NYC by 2017](https://share.google/4JkR4AaE5DkXF5tI1)"? Including charging! That was exactly 9 years and 2 days ago and Tesla hasn't even "solved" wireless charging. To be fair, Musk **did** manage to open the burger joint he first promised back in 2018.

PoilTheSnail 2025-09-22 11:07

It's a Fucking Shit Device

And_Im_Chien_Po 2025-09-22 11:09

I implore you all to watch the crash with audio, I actually laughed out loud

Downtown_Rock_3164 2025-09-22 11:10

Honestly, I turned mine off a couple updates ago. It just became too unreliable—swinging way over or under the speed limit, drifting toward the right-side lane markings for offramps then correcting wildly, and even reacting to shadows from passing cars or trees. More than once it slammed on the brakes as if there were an obstacle when nothing was there.

leandrompm 2025-09-22 11:13

🎵 sad trombone 🎵

Bulky_Specialist9645 2025-09-22 11:14

Lidar would have easily seen that. Musk is a clown 🤡

neonapple 2025-09-22 11:17

What’s more stressful? Driving leisurely yourself or hovering your palms over the steering wheel, keeping your heel torqued up over the brake, and looking at the screen to continuously make sure it isn’t about to do something stupid?

Ok-Bill3318 2025-09-22 11:19

Exactly. Anyone who has supervised a learner driver knows this. Except there are cues a human learner is out of their depth. FSD will confidently drive into or over shit without warning.

encomlab 2025-09-22 11:22

Both people saw the object 8 seconds prior to impact lol

CetisLupedis 2025-09-22 11:23

>Tesla influencers

Well, that's the final determination that we've failed as a society.

BuckChintheRealtor 2025-09-22 11:31

5 to 10 times more views after 2 days than most of his other videos after a month. Engagement going through the roof. Lots of viewers who never heard of his channel. Probably a couple of new subscribers. Free pr everywhere, including this sub. Pretty sure Bearded Tesla is stoked he hit that thing.

Engunnear 2025-09-22 11:31

*Articulated* seeing the object eight seconds prior to impact. They had to have seen it before that to formulate their response.

Dangerous-Flamingo92 2025-09-22 11:36

I've used FSD all the time for the past 2 years. I will say, 2 years ago it was not very good. Recently, I have been traveling around a 240-mile round trip using FSD. There are some times that I disengage because I am not sure how it's going to react; overall, I think the system works awesome, and I normally never have any real issues with it. It seems to get better and better with updates. Of course it's not perfect yet, but it's definitely heading in the right direction.

hilldog4lyfe 2025-09-22 11:37

lmfao

And_Im_Chien_Po 2025-09-22 11:39

People are roasting them, but their eyes may have fooled them into thinking the ladder was flatter than it looked. And they didn't realize how large it was until it was too late. That, combined with how FSD really does change your behavior, i.e. you let FSD do its thing and you assume it's gonna correct itself, so you don't really know if you should take over. All of this, combined with watching them go flying, just makes this one of the funniest clips ever.

aresev6 2025-09-22 11:42

ten elderly grandiose telephone one chop divide mighty retire oatmeal *This post was mass deleted and anonymized with [Redact](https://redact.dev/home)*

Low_Shirt2726 2025-09-22 11:43

Well said. Before my mS got totaled I had gotten to where I drove myself anywhere but on highways. Just easier mentally to go on and do things myself

TheSilverSeraph 2025-09-22 11:44

Again, to be fair, that burger joint is a complete mess and is now on life support [https://www.jalopnik.com/1938650/tesla-diner-drops-most-menu-options-cuts-hours/](https://www.jalopnik.com/1938650/tesla-diner-drops-most-menu-options-cuts-hours/)

Dangerous-Flamingo92 2025-09-22 12:00

Then don't use it. We all have a choice. You only want to complain and that's ok. If you didn't complain, most won't comment.

Phyllis_Tine 2025-09-22 12:00

How do you feel that you still have to monitor FSD when it is engaged? Isn't the point of FSD that you can give up control to the car and software?

I_Am_AI_Bot 2025-09-22 12:01

A piece of cake for lidar.

luv2block 2025-09-22 12:02

It's not Tesla's fault. It turns out that driving is a very complex and difficult task. It's why the human brain requires high levels of attention and focus and why people shouldn't talk on their phones while driving. Tesla could not have known this. They thought it was comparable to say, an iRobot bumping around the house. Everyone needs to cut Elon some slack. He'll figure this out and they'll have it ready in say, I don't know, about a year.

BuckChintheRealtor 2025-09-22 12:06

Imagine saying that TSLA's valuation could reach [20 to 30 trillion](https://share.google/puQl3s2rwe8HZKH8Q) yet not being able to properly run a fast food restaurant.

That-Makes-Sense 2025-09-22 12:07

Restaurants are such a low margin, high competition business. I guess it was worth a shot, but it looks like the writing is on the wall. Elon needs to focus on cars and batteries.

OneTotal466 2025-09-22 12:11

🤣🤣🤣

Think-Committee-4394 2025-09-22 12:16

The failure is simple: robotic systems are good in sealed environments with known hazards. They fail to cope with open environments with changing & random hazards. Humans have evolved to run multiple levels of environmental observation, tracking data on conscious & unconscious levels that we can't even program a robot to check, because we don't realise all we do!

Irishspringtime 2025-09-22 12:18

Yeah. I got downvoted galore for saying that using FSD was exactly the same, in my opinion, as driving. My hands are still "on the wheel" as far as I'm concerned because I'm always looking out for something to go wrong. I don't wish anyone harm, but I can't wait to see unsupervised Teslas dead on the shoulder of the highways and passengers calling an Uber to come get them.

jaimi_wanders 2025-09-22 12:21

Not even being able to hire someone else to run a fast food restaurant in one of the biggest cities in the country, either…

I_Am_AI_Bot 2025-09-22 12:23

Why should everyone wait for Elon to solve it when the existing lidar solutions (Waymo, Apollo Go, etc.) can easily handle things like this?

rbirming3 2025-09-22 12:23

I use it every day. Going on 6 months. I stay attentive and let FSD drive the car. Might have an issue every 100 miles of driving and never a problem on the highway. It's not perfect but it's pretty good. Better than any of the lane assists on any other car. I have a '26 Model Y Launch. Might be the HW of the car using FSD.

luv2block 2025-09-22 12:24

I was joking. Everything with Elon is always "one year away".

[deleted] 2025-09-22 12:25

As long as Tesla changes the name and owners assume the fiscal responsibility of driving a faulty system that constantly changes. Have at it.

MrGelowe 2025-09-22 12:26

> and they'll have **it ready in** say, I don't know, **about a year.**

This is essentially /s.

[deleted] 2025-09-22 12:28

The experiment wasn't to see if they could react. It was to drive coast to coast on FSD. This is failure two.

rbirming3 2025-09-22 12:29

Reduces the driving workload but does not eliminate it all. Supervised is Supervised. Stay attentive and be prepared to take over if required. Realize the technology limitations and use it as you want. Been using FSD on two different cars over 5 years. It gets better every year.

Engunnear 2025-09-22 12:33

>There are some times that I disengage because I am not sure how it's going to react...

And you just nailed the problem with using an AI black box in a safety critical system. You can never be sure of how it's going to react, because it isn't necessarily going to react the same way every time. Without that ability, it can never be part of a viable product.

Lorax91 2025-09-22 12:34

>My hands are still "on the wheel" as far as I'm concerned

...and as required by Tesla's operating instructions, plus some local driving laws and common sense. Anyone using FSD (Supervised) without their hands on the steering wheel is endangering themselves and everyone around them on the roads.

Lacrewpandora 2025-09-22 12:37

> the Model Y had a broken sway bar bracket and damaged suspension components

To be fair, the car could have left the factory that way.

Lacrewpandora 2025-09-22 12:38

>We all have a choice.

Bullshit. Branch Elonians endanger the rest of us every day with this junk.

richardbaxter 2025-09-22 12:39

FSD you need to supervise is D

sanjosanjo 2025-09-22 12:43

To me it seems more difficult than driving. In addition to processing all the normal stuff outside, the "human supervisor" has the added unknown of trying to predict what the car might do.

Ancient_Box_2349 2025-09-22 12:45

Tinfluencers

Engunnear 2025-09-22 12:46

Oh, I'm sure they give up control, whether they'll admit to it or not.

mishap1 2025-09-22 12:56

I wonder if that would cover the deductible + raised premiums b/c he probably totaled his car for it.

SuperLeverage 2025-09-22 12:57

You make a lot of excuses, but when you see something, you slow down, and if necessary, go around it. You do not plow into it at full speed.

TK-ULTRA 2025-09-22 13:00

Imagine if your human driver had 'issues' every 100 miles, you would find a different driver after the first problem!

goranlepuz 2025-09-22 13:00

Even if you're driving a Tesla? Not in my experience! 😉

Lauzz91 2025-09-22 13:03

I swear I fell down a ledge, hit my head and woke up in an alternate dimension.

>In 2016, Elon Musk infamously said that Tesla would complete a fully self-driving coast-to-coast drive between Los Angeles and New York by the end of 2017.

>The idea was to livestream or film a full unedited drive coast-to-coast with the vehicle driving itself at all times.

>We are in 2025 and Tesla never made that drive.

>Despite the many missed autonomous driving goals, many Tesla shareholders believe that the company is on the verge of delivering unsupervised self-driving following the rollout of its ‘Robotaxi’ fleet in Austin, which requires supervision from Tesla employees inside the vehicles, and improvements to its “Full Self-Driving” (FSD) systems inside consumer vehicles, which is still only a level 2 driver assist system that requires driver attention at all times as per Tesla.

Electrek is now saying stuff like this? What happened, Fred? Never got his Roadster from all those referral points, did he?

SuperLeverage 2025-09-22 13:05

Full steam ahead!!!!

PoliticsIsDepressing 2025-09-22 13:05

Regardless, if I see something in the road I try my best to drive around it. If your theory was correct, there’d be a line of cars pulled off that just hit the ladder.

[deleted] 2025-09-22 13:15

Predictability really is important; that's when most human on human crashes happen: when someone does something unexpected, and then you have this entire unscripted melodrama about how things will unfold with fast moving objects in a limited space and unscripted reactions.

[deleted] 2025-09-22 13:19

They also admitted 100% fault to insurance as well.

randyranderson- 2025-09-22 13:20

So you use it as a good cruise control?

Smartimess 2025-09-22 13:26

Watch the video. The camera-based FSD simply did not register the obstacle when the drivers did. And "the Tesla is better than any other car"? That's objectively false, as proven by many car magazines, especially in Europe. Tesla is always behind carmakers like Mercedes, Kia or BYD, because camera-only FSD is shit.

dtyamada 2025-09-22 13:32

Debris on the road, that's an edge case!!! That'll like almost never happen. /s

henlochimken 2025-09-22 13:34

The problem is the rest of us *don't* get to choose when you put our kids' lives in danger because your fucking space car doesn't know how to stop for school buses.

ascaria 2025-09-22 13:43

Wait until they learn about fog, snow, heavy rain, night - or even pedestrians and cyclists!

ascaria 2025-09-22 13:44

Muskfans have been hating on Fred for years because they believe he’s anti-Tesla.

ascaria 2025-09-22 13:44

Later this year.

Lacrewpandora 2025-09-22 13:46

>Might have an issue every 100 miles of driving

I fail to see the point of FSD, if it doesn't...well: actually work.

mishap1 2025-09-22 13:48

It's the same as the Cybertruck "crossing" the Rubicon Trail video. Even if it's a complete debacle, their identity/income source is Teslaworld so not posting it is objectively worse for them. If not for content that they try to monetize, they're just sad people who are all on Elon's nuts for some reason.

neliz 2025-09-22 13:49

judging by how they ran on the shoulder of a highway, I can't compliment them on an overabundance of intelligence, hence why they posted this.

neliz 2025-09-22 13:50

> Tesla hasn't even "solved" wireless charging. He meant the snake chargers, that would auto-deploy when you pulled up to a tesla charger.

neliz 2025-09-22 13:52

because the cop told them that, in order to claim insurance they'd have to have a police report, and the officer said they were at fault.

vassadar 2025-09-22 13:54

A traditional radar sensor would be enough.

Lacrewpandora 2025-09-22 13:55

We all get feedback loops with the videos recommended to us, etc., so it might seem like he's getting a ton of views. But I looked at a 6-day-old video of his: 9.6k views and 85 comments, vs "Part 1" of the coast-to-coast drive at 7.1k views and 69 comments.

I guess "Part 2" is where he hit pay dirt, with 50k views and 446 comments, but I wouldn't call that "viral". His subsequent videos have already died down to under 10k views, so this was a fleeting moment for his channel.

brintoul 2025-09-22 13:57

Thanks for this. That was pretty damn funny.

Kamen_rider_B 2025-09-22 14:01

Crashed before 60 miles = Rise 4% in stock.

paulstanners 2025-09-22 14:05

100% this. I've tried the FSD trials.... it's incredibly stressful. Much more so than driving myself.

Engunnear 2025-09-22 14:17

>I stay attentive

No, you don't. I'm sure you think you do, but these two jackasses were "paying attention" and watched the car drive right into an obstacle that they themselves had noted with more than adequate time to react. System monitoring is not even a little bit the same as driving.

rbirming3 2025-09-22 14:18

Compare the Mercedes Drive Pilot to Tesla FSD. There are few opportunities to experience Level 3, as it's mostly lane assist. I'm not saying FSD is perfect. It's not, but it's pretty good, and yes, better today than Mercedes Drive Pilot. Kia isn't even close.

rbirming3 2025-09-22 14:19

And I agree. Camera is not the best solution for FSD for several reasons

rbirming3 2025-09-22 14:24

That's not paying attention.

torokunai 2025-09-22 14:24

yeah I put 2000 miles on it this June in a week. It was fine, but the OP scenario is my #1 fear since I know it doesn't dodge potholes etc. yet.

rbirming3 2025-09-22 14:27

Simple solution: don't pay for it and don't use it. Saves a lot of time complaining about it.

capois_lamort 2025-09-22 14:29

That's why the stock is up ONLY 4% today./s

Dangerous-Flamingo92 2025-09-22 14:29

If it's so useless, don't use it. Tell me another car with technology as advanced as that. Yes, it is still a work in progress. They have come a long way since the beginning. It will take some time, but there's no other car company that has the technology like Tesla. You may not like him or his products but truth is truth

Ellidos 2025-09-22 14:36

The cherry on top is that owners pay a subscription for FSD and own all the liability.

Whosez 2025-09-22 14:39

Ride across country with hands off wheel - take a Greyhound bus. Ride across country with hands on the wheel - take an ICE or hybrid and not the Tesla.

toomuchhp 2025-09-22 14:41

I only use autopilot now, it’s easier to just let it handle the easy stuff

MochingPet 2025-09-22 14:55

And this is why, 5 or 10 years ago, Electrek was a useless publication, when it was only Elon Elon Elon.

sloppyjaloppy5 2025-09-22 14:59

If you actually watch the video… the driver is at fault here. They thought it was “roadkill” and wanted to drive over it. Even then, they could have switched lanes once they saw what it was… They were being idiots.

hashswag00 2025-09-22 14:59

Faux Self Driving

sloppyjaloppy5 2025-09-22 15:00

Exactly. They were being idiots..

AdKey5735 2025-09-22 15:07

Car ran over road debris. Car continued down the road for half a mile in its own lane until the driver pulled the car over. Damage was done to the suspension. A quick fix at a muffler shop had them going again, but it appears the charging system was affected, so they are taking the car to a Tesla service center for a complete examination. End of story. Geesh!

And as usual, people always assume an unfortunate accident would have been avoided if they were in control, but there's no reason to believe that the driver would have done anything different. And if he did, he could have tried to avoid the debris, run directly over it unlike FSD, cut a tire and run off the road. The bottom line is that the car continued safely down its lane after running over the object. Not all encounters with road debris end so well. If it had been a plastic bag blowing out from the side of the road, would you want the car to swerve violently to avoid it? ...No.

LightMission4937 2025-09-22 15:20

Full Self Derrr Fi-sting

Engunnear 2025-09-22 15:20

If I were the insurance company, I'd refuse payment on the grounds of gross negligence on his part.

Bravadette 2025-09-22 15:29

Diamondhand bagholders

Bravadette 2025-09-22 15:41

Thankfully it's just a (mobilized surveillance utility) car and not a spaceship from Frederik Pohl's Gateway.

Imaginary_Manner_556 2025-09-22 15:42

Well that wouldn’t have been FSD. The entire point of their trip.

razor_train 2025-09-22 15:49

I'll never understand this pseudo-FSD thing. I actually enjoy the act of driving, so I can't grasp the notion of "hey, the car is driving itself, but not really, so you can just sit there but be ready to take control at any random moment." Why not just drive normally, if the computer could randomly drive you off a cliff?

nlaak 2025-09-22 15:50

> people are roasting them but their eyes may have fooled them into thinking the ladder was flatter than it looked.

They're roasting them because only an idiot runs over *anything*. I don't run over obvious cardboard, if there's any possibility I can safely miss it.

>And they didn't realize how large it was until it was too late.

Which is why you don't wait to react. Brake hard and look for a place to direct the vehicle. Don't spend seconds analyzing what it is; assume it's dangerous and react accordingly.

>That, combined with how fsd really does change your behavior i.e. you let fsd do its thing and you assume it's gonna correct itself so you don't really know if you should take over.

Great, so put more trust in a fundamentally broken autonomous system, rather than trusting your own judgement to be careful.

MrHumph999 2025-09-22 15:59

FSD has really gotten better with HW4 cars. No I'm not a fanboi as I have been kicked out of all other Tesla subs. But you still have to watch for things on the road. These guys are idiots.

dtyamada 2025-09-22 16:23

Snow, they don't get snow in Texas!

Trades46 2025-09-22 16:24

Musk is a fraudster that has too much money & power. End of story.

BidAccomplished4641 2025-09-22 16:25

None of the other road users agreed to be beta testers...

BidAccomplished4641 2025-09-22 16:29

Why? They're an AI and robotics company LOL

malignantz 2025-09-22 16:30

I wonder how many Twitter influencers are familiar with the snake charger.

BidAccomplished4641 2025-09-22 16:33

Well, it drove itself. Promise fulfilled!

GrandElectronic9471 2025-09-22 16:35

Absolute Flim Flam. I'll give them props for posting the video anyway.

Sp1keSp1egel 2025-09-22 16:35

**[A Driverless Tesla Will Travel From L.A. to NYC by 2017, Says Musk](https://www.nbcnews.com/news/amp/ncna670206)** (2016) > By the **end of next year**, said Musk, Tesla would demonstrate a **fully autonomous** drive from, say, “a home in **L.A.**, to **Times Square** ... without the need for a **single touch**, including the charging.”

KnucklesMcGee 2025-09-22 16:36

>The hard part has just started. And there’s no telling how long it will take to get there. If someone is telling you that they know, they are lying. I don’t know. ***My best estimate is approximately 2-3 years and a new hardware suite.***

Oh, Fred. <shrug>

KnucklesMcGee 2025-09-22 16:37

Rented a Nissan Rogue with lane assist, and have to say I prefer to drive the car myself than deal with the lane nagging and wheel twitching under my hands. Think I'll wait for something a bit beyond level 2, thanks very much.

KnucklesMcGee 2025-09-22 16:38

Thanks for risking the lives of the drivers around you!

KPplumbingBob 2025-09-22 16:41

Do you use this braindead logic for anything else or is it only when it comes to Tesla? Why would we not talk about something that might affect all of us?

C0Y0T3Z 2025-09-22 16:48

Anyone in this group actually ever own a Tesla? Never seen this group before and wondering what all the hate is about. They are one of the top 3 most purchased cars in the US and likely the safest automobiles on the road. I don't get it. Is this just more jealousy?

newleafkratom 2025-09-22 16:49

And the stock goes up.

HiggsBosonHL 2025-09-22 16:53

No, supervised autonomous driving is one of the necessary steps towards progressing to actual FSD. Tesla calling it FSD, sure that is useless. But this concept itself is not.

V3T_L0L 2025-09-22 16:53

If you actually read the article you'd know the point was to use FSD only, due to Musk's promises in 2016/2017. Running this test 8 years later, FSD only, was to test how BS his claims were. If it were so easily detectable, the car should've easily maneuvered around it. Yes, they could've manually avoided it, but that screws up the whole premise.

That-Whereas3367 2025-09-22 16:54

Up 4%.

sloppyjaloppy5 2025-09-22 17:02

Fair point. Fine. lol

AndSoISaysToTheGuy 2025-09-22 17:08

Road debris, ftw....

bobi2393 2025-09-22 17:17

They lied to the officer: both said they thought it was a piece of paper in the road. So unless the insurer sees the video, where they spot the debris way in advance, say they think it's an animal, but choose to run it over anyway, they should be good.

alaorath 2025-09-22 17:30

(Jeremy Clarkson: Oh no, anyway)

Engunnear 2025-09-22 17:36

Oh, so they filed a false report with the police, recorded it, and posted it on YouTube? For fuck's sake, they just keep getting dumber.

V3T_L0L 2025-09-22 17:37

Appreciate you considering a different view!

readit145 2025-09-22 17:42

I’m convinced the only people that aren’t stressed by it are people that are just getting their licenses. Half the positive reviews online can’t be real people. Unless someone posts their vin and receipt with their “i love the car” post I don’t believe them. Tesla has tons of employees that they ask to write puffery reviews under multiple accounts.

Crepuscular_Tex 2025-09-22 17:43

Just inject some tax dollar funding from SpaceX rerouted through "totally real" people's PayPal and DogeCoin accounts, and oh yeah...

rajrdajr 2025-09-22 17:43

The biggest problem Tesla has with “Full Self Driving (Beta)” is its name! If they simply gave it the honest name “Tesla Advanced Driver Assistance”, 99% of their marketing and regulatory problems would disappear. Their leader and marketing team are just not willing to do the right thing here.

Crepuscular_Tex 2025-09-22 17:45

Musk garglers

bobi2393 2025-09-22 17:49

They lied to the officer verbally, when asked if they considered moving out of the way, but I think they declined to have a police report written at that time. The officer said he'd be happy to write one, but suggested contacting their insurance company first and seeing what they wanted. Either way, it could violate California Vehicle Code §31 - False Information to an Officer. They might slip out of it by doubling down on the lie: claiming they did think it was paper, and only said they thought it was an animal out loud, each keeping their personal paper theory to themselves.

y4udothistome 2025-09-22 17:50

He started this when he was 44 years old. He's now 54, eligible for an AARP senior discount. He'll be eligible for Social Security before this is working.

jeanpaulsarde 2025-09-22 18:04

That's great, that means they have a real chance to make it in Panama

mishap1 2025-09-22 18:09

They're Elon acolytes in California. I'm guessing they're on Tesla insurance which is spending $1.21 on top of every $1 they collect in premiums. If there was ever an insurer with an incentive/data to drop coverage for gross negligence, it'd be Tesla.

MediumHeat2883 2025-09-22 18:21

What kind of person becomes a Tesla influencer?

ARAR1 2025-09-22 18:23

They know it doesn't work. It's just to push blame off them.

UnlessRoundIsFunny 2025-09-22 18:55

Autopilot often feels like supervising a student driver who lacks an instinct for self-preservation. Stressful.

TheBlackUnicorn 2025-09-22 19:22

It's also an oxymoron. "Full Self-Driving (supervised)" is like saying "Free Beer! ($9.95)"

TheBlackUnicorn 2025-09-22 19:39

Well, we do know that a human driver without "FSD" could easily strike this object since we have evidence that one did (the trucker who hit it after them). It's hard to tell from the video since I feel like wide angle videos tend to flatten things, but it looked to me like it really blended in with the road and could have perhaps been a bit of an optical illusion, like the shape of the ladder was weird (a really flat rhombus), and since we seldom see things that shape it could be hard to make sense of what it was. Still a massive fail for Tesla, but I think it's more embarrassing for the claims of "FSD" than it is for these two guys specifically.

mrbuttsavage 2025-09-22 19:50

The guy's name is "BeardedTeslaGuy". People who make a beard or driving a Tesla their personality are already insufferable, let alone together!

Familiar_Gazelle_467 2025-09-22 20:34

Musk propped it up with a billion dollar buy - basically can't go tits up!

pjc50 2025-09-22 21:41

Probably not, but remember when someone intentionally crashed a plane in a fake accident for "content"?

REOreddit 2025-09-22 22:26

Almost 10 years ago (2016) Tesla released a video of a Tesla car driving autonomously with a caption that said more or less that the only reason why there was a human driver in the car was because regulations required it. It's 2025 and there have been cars from other companies allowed to operate with ZERO human employees inside the cars for a few years now. Tesla still can't do that, not because they are not allowed, but because their tech is not capable of doing that. Let that sink in.

iftlatlw 2025-09-22 22:28

Other than the Nazi thing, this self-drive is a pipe dream and Tesla are no longer leaders.

Dmoan 2025-09-22 23:25

If this doesn’t work on highways, how can it work in city driving?

FlipZip69 2025-09-22 23:34

It is easy to post a video where it completed a trip successfully. But I suspect for every video where it failed, there are 100 that people simply did not post. After all, who wants to post their failures?

devedander 2025-09-23 01:04

Break and gas in case of phantom breaking

Youngnathan2011 2025-09-23 01:59

It doesn’t

Youngnathan2011 2025-09-23 02:03

If you have issues every 100 miles with it, that shits nowhere near being ready for unsupervised use. It’ll likely never be.

Youngnathan2011 2025-09-23 02:04

Well I know here in Australia now I’m an unwilling participant to its testing, just from the fact I have to share the road with Teslas.

Youngnathan2011 2025-09-23 02:06

Man, how much did they spend on that thing just for it to be a failure?

savuporo 2025-09-23 02:12

certainly not a road kill there

AdCharming4240 2025-09-23 02:52

Not on life support. Tesla sycophants still going like weirdos to enjoy a weirdo experience while bothering the neighbors who have to live with the 24hr monstrosity. So genius to combine old style diner experience with futuristic (failing) tech. Really novel, no one ever thought of that.

AdCharming4240 2025-09-23 03:01

Honestly pathetic for Musk to keep bragging about his technology. Guy is smart, but his ego is his detriment. When will TSLA fold like the house of cards it is? At this point it's mostly lies and speculation bolstering its valuation.

Yuli-Ban 2025-09-23 04:33

It's been noted for years now that Level 2+, Level 3, and to some limited extent imperfect Level 4 autonomy is actually the most dangerous type of autonomy. Precisely because it's "smart" enough to disarm you. It *can* drive itself in limited extents, and in some niche cases even end to end. It makes you think it's more capable than it actually is. And as a result, you wind up more on edge trying to prepare yourself to take control, or you trust it too much and fail to realize when you're in danger.

Level 1 stuff like cruise control was perfect because it didn't require *not* driving. You were still in full control of the vehicle; the vehicle simply automated certain aspects related to basic functions to make it easier and *less* brain taxing. Lane assist was also good because it could help keep you from veering into traffic or losing track of the road.

Level 4+ and 5 will be cool for the opposite reason: you genuinely *don't* have to drive, since it might as well be a rolling embodied AGI, like an actual digital chauffeur. It's like worrying about controlling a tram or train cart as a passenger.

Level 3, though, all the middle levels, is just conceptually awful. It's way too taxing on the brain, relies way too much on fooling you, and is too easy to trust with awful risks. It's actually very similar to the current state of AI as well, with how LLMs are just good enough to convince you they're intelligent but are so brittle that even wording the same prompt slightly differently can result in wildly different levels of quality. But people unaware of these limitations get disarmed and think it's actually intelligent, or they get frustrated but keep insanely trying to make it work because it *can* sometimes work.

Yuli-Ban 2025-09-23 04:42

It's more the antsy "do I, do I not" of level 2+ and level 3 autonomy, because it's like a Potemkin village version of autonomy (how many times have I used the term Potemkin village to describe contemporary AI these days?). It *can* drive itself... but in niche conditions, and if even a single thing breaks, you have to take over. And the ephemerality of life means you can never know *when* that break will happen. But because it can appear to drive itself so well often enough, you get disarmed and relax and lower your guard. And then *bam,* now you're paying both insurance premiums and doctor's bills.

Ok-Bill3318 2025-09-23 04:49

OK. It's useless to the end user. Training the thing to create a usable product is Tesla's problem, not the customer's.

saro13 2025-09-23 06:29

There are science experiments of linked hamster wheels that show that the hamsters in control of the wheel have far less stress than the hamsters that don’t have control, despite undergoing the same exercises and stressors.

KMS_HYDRA 2025-09-23 07:19

That's like getting hit by a wave in the ocean, chance of one in a million!

KommunistiHiiri 2025-09-23 07:50

Brother didn't even read the post title.

Withnail2019 2025-09-23 14:18

Right. FSD is absolutely pointless.

StuntID 2025-09-23 14:41

OMG! What's more stressful, driving yourself or using adaptive cruise control? Yup, it's the former. Self-driving isn't what's advertised, but it's not the nightmare you imagine, as it's mostly just ACC. The branding of the product and its price are the problem, not the comfort it adds to highway driving.

>What’s more stressful? Driving leisurely yourself or hovering your palms over the steering wheel, keeping your heel torqued up over the brake, and looking at the screen to continuously make sure it isn’t about to do something stupid?

Sheesh

Wicker1913 2025-09-23 21:07

When will Tesla fanboys wake up to Elon chronically over-promising and under-delivering!?!

Witchgrass 2025-09-24 00:48

Brake*, braking*. Unless you're talking about the brakes breaking

ExcitingMeet2443 2025-09-24 01:05

Tesla's self driving abilities are superior to a ~~human~~ stupid driver's.

Remote_Society6021 2025-09-24 01:08

What the hell is a potemkin village anyways?

HellsOtherPpl 2025-09-24 10:14

Snake charmers? Snake oil charmers? Snake oil salesmen?

neliz 2025-09-24 10:15

I think it will be Indian guys guiding the charging cable to your car with a flute

HellsOtherPpl 2025-09-24 10:31

😆

SeldenNeck 2025-09-25 04:04

Insurance companies probably use AI lawyers, who are about equal to self-driving cars. AI will make some people a lot more efficient, and that will displace a lot of workers, but it won't completely drive people out of the market.

CastorX 2025-09-25 06:37

Automotive engineer here… I really doubt that a simple automotive radar setup would have detected this. There might be solutions, yeah. BUT the ADAS radar systems used nowadays generally ignore objects that have zero velocity (are stationary) and don't have the dimensions of another vehicle. They do this to prevent phantom braking. Yes, I know it sounds counterintuitive, but this is how it actually works. The other thing is that they just omit (don't even detect) objects directly on or near the surface of the road, because that would also lead to a lot of phantom braking. The metal object in the video could have been detected more reliably by a LIDAR. In fact I'm 99% sure that only a camera + LIDAR combo could have reliably detected this as a dangerous obstacle.
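To make the stationary-object rejection concrete, here's a toy sketch in Python. All the names and thresholds are hypothetical and purely illustrative (real values are supplier-specific and tuned per platform), but it shows why a stationary, small-radar-cross-section return like road debris gets dropped before the planner ever sees it:

```python
from dataclasses import dataclass

# Hypothetical thresholds, purely illustrative -- real ADAS stacks
# use supplier-specific, per-platform tuning.
MIN_RCS_M2 = 1.0            # discard returns smaller than roughly motorcycle-sized
STATIONARY_SPEED_MPS = 0.5  # ground speed below this counts as "stationary"

@dataclass
class RadarTrack:
    range_m: float        # distance to object
    rel_speed_mps: float  # relative speed (negative = closing)
    rcs_m2: float         # radar cross-section estimate
    azimuth_deg: float    # horizontal angle only -- no height information

def is_relevant(track: RadarTrack, ego_speed_mps: float) -> bool:
    """Mimic the filtering described above: a track whose ground
    speed is ~0 and whose RCS doesn't look vehicle-sized is
    discarded to avoid phantom braking."""
    ground_speed = ego_speed_mps + track.rel_speed_mps
    stationary = abs(ground_speed) < STATIONARY_SPEED_MPS
    vehicle_sized = track.rcs_m2 >= MIN_RCS_M2
    return not stationary or vehicle_sized

# Metal debris lying on the road: stationary, small RCS -> filtered out,
# even though the ego vehicle is closing on it at highway speed.
debris = RadarTrack(range_m=60.0, rel_speed_mps=-30.0, rcs_m2=0.3, azimuth_deg=0.0)
print(is_relevant(debris, ego_speed_mps=30.0))  # False
```

The counterintuitive part is visible in the last line: the debris is approaching fast in the radar's frame, but because its *ground* speed is zero and it's small, the filter treats it like a manhole cover or a bridge joint and throws it away.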

vassadar 2025-09-25 08:01

Thank you. So a radar sensor won't detect a wall unless the car keeps moving toward it, right?

CastorX 2025-09-25 09:50

What a sensor can detect and what the SW makes of that information are two different things. It also depends on the type of radar. What you mention might be the case for a "Doppler"-only (possibly continuous-wave) radar, but that alone isn't really used in ADAS systems because of its limitations. Anyway, there is a HUGE difference between a wall and an object lying on the road: a wall has a large area and extends upwards.

Tesla could have trained their object classification models to recognize the kind of stuff in the video, and any company could make a radar that detects objects like this one. (I'll explain in a bit more detail in the hope someone reads it.) In both cases this would have led to a lot of phantom braking, because many harmless situations look like this one to the sensors. In the radar case, a large portion of the radar signal would be reflected by any small object AND even by the road itself: bumps, speed bumps, potholes, even leaves!

It's important to mention that a radar doesn't give you a high-resolution 2D image as a result. It gives you a lot of data, but it's already pre-processed by the radar's HW. What you get is something like: information about 5 objects in front of the vehicle, each with relative speed, size (radar cross-section), distance, and angle (usually JUST the horizontal angle!!). E.g.: object at 50 m (±5 m), velocity −20 m/s, area 5 (whatever unit they use), at angle 0 degrees (let's say in front of the car). That's it. That's exactly why it's always combined with at least one on-board camera that can provide richer information: shape, vertical position, and inputs for optical-flow calculations and object classification.

On the other hand, using a camera ONLY is also a sh\*\*ty solution, because it can't directly measure anything: it can't measure exact distances, and it can't measure relative speeds on its own. Even a dual-camera (stereo) setup has its weaknesses: rain, sunshine, low brightness, too-high brightness, brightness changes (entering and exiting tunnels), dirt on the camera, resolution limits, motion blur, camera sensor noise, and so on.

So the conclusion is that you can build camera-based and radar-based systems that detect walls (the perfect scenario for radar), and systems that detect the metal thingy in the video. However, the problem with the metal part is that it's lying on the road. A radar-based system can be tuned to be sensitive enough and to shoot the beam at the needed angle, but that would lead to a lot (and I mean a LOT) of phantom braking, rendering it unusable in many, many cases (potholes, mountain roads, ramps, ...). It's the same situation as with military radars, where there's a minimum angle/altitude below which they're still usable; you can imagine the object in the video would "fly under the radar."

A purely camera-based system could be tuned and trained to recognize this, but as I mentioned, that too would surely lead to phantom braking. In my personal opinion, in this special case it might actually work SOMEWHAT better than a radar solution (under weather conditions like in the video). A LIDAR-based solution (which obviously has cameras and radars too) would work best: LIDAR supplies exact geometry data about the vehicle's surroundings, and combining that with the information from the cameras and radars would probably be the most reliable. BUT one must not forget that even a LIDAR has limited resolution: the smaller the object, and the further away it is, the harder it is to detect.
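That last point about LIDAR resolution can be sketched with simple geometry. The function below is a back-of-the-envelope estimate (uniform 0.2° vertical channel spacing is an assumption; real scan patterns are non-uniform) of how many vertical LIDAR channels would land on a low object of a given height at a given distance:

```python
import math

def lidar_returns_on_object(height_m: float, distance_m: float,
                            vertical_res_deg: float = 0.2) -> int:
    """Rough count of vertical LIDAR channels intersecting an object
    of the given height at the given distance. Illustrative only --
    assumes uniform angular spacing, which real sensors don't have."""
    # Vertical angle the object subtends from the sensor, in degrees
    subtended_deg = math.degrees(math.atan2(height_m, distance_m))
    return int(subtended_deg / vertical_res_deg)

# A ~20 cm tall piece of debris at increasing distances:
for d in (10, 30, 60, 100):
    print(d, "m ->", lidar_returns_on_object(0.2, d), "returns")
```

At 10 m the object catches a handful of channels, but by 60 m it subtends less than one channel spacing, so even a LIDAR may get zero clean returns on it until it's uncomfortably close, which is exactly the "smaller and further = harder" caveat above.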

vassadar 2025-09-25 10:11

Thank you very much. I hit a gold mine of a response.

CastorX 2025-09-25 10:19

Lol. Sure. I'm glad it helped.

EverythingMustGo95 2025-09-25 14:43

CA takes that seriously. My MIL’s doctor went to jail for 6 months for lying to a police officer. (He said his buddy was with him while a crime happened, turned out hospital records showed he was doing a surgery at that moment. Whoops. Couldn’t press charges because of his lie so cops were pissed)

ADDSquirell69 2025-09-26 03:15

The car was completely disabled because of that? My Volkswagen Rabbit in high school ran over a ladder and was fine.

Yuli-Ban 2025-09-26 07:11

https://en.wikipedia.org/wiki/Potemkin_village

>In politics and economics, a Potemkin village is a construction (literal or figurative) the purpose of which is to provide an external façade to a situation, to make people believe that the situation is better than it actually is.

So when I call LLMs a Potemkin village version of AGI, what I'm saying is that they're dressed up like general-purpose AIs that can do anything and everything. They have the scaffolding, they have the front of it; you can ask ChatGPT to do a variety of things that even a bundle of AIs ten years ago couldn't hope to do a hundredth as well. But with the slightest push, the entire thing falls apart.

They can generate essays and even whole novellas... until you actually read them and realize the text is obliterated by snowclones, appositive phrases, nominative absolute phrases ("noun did [X], a [Y]," or "[X] said [Y], his tone [Z]"), negation phrases ("it's not X, it's Y"), etc., and never seems to say much of anything, despite trying its damnedest with sophistic, pseudo-profound "quote-bait" dialog.

You generate an AI slop image, and it looks amazing until you zoom in and see that the fingers are bent wrong, the eyes are fuzzy and the iris seems to be bleeding its color, polyester fabric seems to be made out of latex, and also there's a dog shaped like Cthulhu in the background.

You try generating AI music, and you hear an eargasmic lick and want more of that. Good luck: it's never going to generate what's in your head, no matter how many times you reroll it or refine the description.

All this and more fosters the sense that the AIs we have now aren't just broken but outright frustrating, because it's like we *can* automate some tasks... but arbitrarily not others. The automation works to a limited extent, but breaks very quickly. It's supposed to augment human capability, but it works just well enough that people don't realize how limited it really is and rely on it end to end (see "vibe coding"). And then everything breaks down, like the façade of a building falling in on itself after someone leans on it.

Remote_Society6021 2025-09-26 08:46

It's a shame because it has been this way since I joined the AI hype cycle literally two fucking years ago.

LordHeretic 2025-09-26 12:43

I'll never be Nazi-friendly enough to give money to fucking Tesla. That's all on you, capitalists.
