
Schrödinger's FSD: it both works and it doesn't

RipWhenDamageTaken | 2025-11-15 23:07 | 220 views

I bought my Tesla in 2019 and sold it in 2022, but I never stopped following FSD news. I've noticed a recurring pattern: FSD both works and doesn't.

1. FSD is claimed to be ready for unsupervised use, but every time it makes a mistake, the user/driver is the one who gets blamed.

2. Every time a new version comes out, the community claims it is a massive improvement over the last, and that it is practically perfect. And yet the next one comes out and the same story repeats. Again and again, and we're at version 14.

3. Tesla claims that it is an AI company, and that automation is how it will make money. For example, Tesla claims that when Optimus is ready, sales will be through the roof. But at the same time, FSD sales are incredibly bad.

Is FSD ready? Or not?

Comments (120)
DryAssumption 2025-11-15 23:15

Exactly how I feel about generative AI. One minute... "wow, this is genius", next minute... "a seven-year-old wouldn't have made that mistake."

daveo18 2025-11-15 23:15

FSD works fine until it doesn’t.

Full_Breadfruit_5685 2025-11-15 23:18

Only works when the cat is driving.

razorirr 2025-11-15 23:19

I just drove it from Detroit through Chicago to Milwaukee. 0 hands on wheel, 0 takeovers on the road, 400 miles. The only manual things I had to do were plug it in at the two charging stops, and move over one stall after it parked at a handicap charger. I'd never seen one of those before.

SC_W33DKILL3R 2025-11-15 23:19

Tesla has always known FSD doesn't work; there is a documentary about it coming out with ex-employees testifying. They were told the project and Elmo's promises were meant to increase the company's value. A decent company would not have released software with defects that got people killed.

zeeper25 2025-11-15 23:26

I just did an unsupervised test drive, FSD is pretty advanced at this point, it drives very well, but yes, I was waiting to take over, there just wasn’t a reason to do so. Their free autopilot didn’t work as well as HDA2 in my Ioniq 5

hibikir_40k 2025-11-15 23:29

I tried it when they gave free previews, but there were key locations around my house where it shat the bed consistently: The "put me in danger" kind, because it couldn't handle some lanes, or a left turn with impeded views. It's been a while since they did this though, so I can't speak of the latest versions.

y4udothistome 2025-11-15 23:33

Not

Namerunaunyaroo 2025-11-15 23:37

Yes, it does have a Sesame Street feel about it.

SuicidalPand-a 2025-11-15 23:40

If you try it to find out which, you die.

EverythingMustGo95 2025-11-15 23:41

Toonces does a great job. Until he doesn’t. (Corrected cat’s name, swore at spellchecker)

chicagopwj 2025-11-15 23:44

I drive 70% of that route 24 times a year in my Y. Please explain to everyone here how it 1) handled tolls? 2) how it handled all the construction lanes? 3) how it handled the exchange between Ohio and Chicago? You know why you won't: because it simply didn't happen. This is utter nonsense and pure fantasy…

razorirr 2025-11-15 23:57

1) The route didn't hit any toll sections until it made it to Illinois, which is all open-road tolling, so there were no toll booths to interact with. 2) Construction was fine. The car shifted each time right as it was passing the orange merge signs. 3) Did you forget about Indiana's existence? There's no "exchange between Ohio and Chicago", or are you just some troll who doesn't know what you're talking about? You now know why you won't respond back. Or if you do, it's just going to be personal attacks, as you have nothing valid.

Pure_Excuse6051 2025-11-16 00:03

It's like the fisherman at the market telling his customers each year that this fish is the best he's ever had. For that to be true, the fish must have been fucking dreadful 10 years ago.

mustangfan12 2025-11-16 00:10

FSD doesn't work because it's camera-only. You need lidar and radar, because a camera can only generate a 2D image. There are also times when light hits the camera at the wrong angle, or debris gets on it, etc. Cameras are only about 95 percent accurate; you need lidar and radar to fill the gap.

AlteredEggo 2025-11-16 00:25

You will know when Tesla is serious about their self-driving capabilities when they are ready to take financial responsibility like Mercedes does. Until then, it's just marketing hype. That usually shuts up the sycophants and marketing accounts.

RipWhenDamageTaken 2025-11-16 00:29

Yea, this is exactly what I'm talking about. FSD is flawless, yet there will be a much better version coming soon. FSD is flawless, yet Tesla struggles to sell FSD subscriptions? FSD is flawless, yet the user/driver still has to take full responsibility in all respects?

If you're struggling to comprehend the problem, let me help with an example. Say I tell you that my son is extremely smart and capable and can get the task done, but you have to monitor him literally all the time, and if any mistake arises, that's on you. Would that make any sense?

Dimathiel49 2025-11-16 00:31

More Heisenberg I think. Everything about it is uncertain.

RipWhenDamageTaken 2025-11-16 00:33

“This calculator is extremely smart but please double check its output every time please”

Engunnear 2025-11-16 00:34

FSD makes a lot more sense when you make the distinction between “driving” and “moving”. You could get an eight year old to make a car move, maybe even at high speeds, and possibly for a substantial distance. You’d never trust that child to drive, though. FSD is very similar - it makes the car move in ways that resemble competence to those who are easily awed, but no sane, intelligent person would ever allow it to drive for itself.

fishsticklovematters 2025-11-16 00:41

Toonces\* [https://www.youtube.com/watch?v=5fvsItXYgzk](https://www.youtube.com/watch?v=5fvsItXYgzk)

fishsticklovematters 2025-11-16 00:42

It phantom braked at the exact same spot every. time. A busy intersection. I would turn it off through there to avoid problems.

razorirr 2025-11-16 00:42

Do you have any links where Tesla has called itself flawless? That word is used here so much y'all are Mandela-effecting yourselves. Petition your government to allow them to go without attention monitoring; it's not legal in my part of the country. Yet when I point this out, y'all will immediately say it's not ready and shouldn't be allowed. You can't simultaneously complain that the user is blamed while not allowing Tesla to be blamed.

RipWhenDamageTaken 2025-11-16 01:00

That last paragraph doesn’t make any sense. I feel like I lost brain cells trying to comprehend that. I’m not saying that tesla claimed that FSD is flawless. I’m saying that YOU claimed FSD is flawless in your comment. I’m not going to sit here and spell out everything. Read what I wrote again. Jesus Christ. Way to miss a very simple point.

MUCHO2000 2025-11-16 01:01

At least FSD works well the vast majority of the time. They don't even have a demo of Optimus yet but still claim they will be rolling out in 2026

Calm_Historian9729 2025-11-16 01:02

FSD is listed as Level 2 autonomous, so treat it like an advanced cruise control. That means it's a driver-assistance feature, not a total-control-of-the-car feature. Legally the human is always responsible and liable even if the robot made the mistake, and this will not change anytime soon.

mukansamonkey 2025-11-16 01:06

AI: investing billions of dollars into vastly complex software for the purpose of making computers worse at math.

ponewood 2025-11-16 01:16

When is this coming out and where? It’s like my favorite movie already 🍿

SC_W33DKILL3R 2025-11-16 01:18

https://variety.com/2025/film/global/elon-musk-unveiled-documentary-trailer-whistleblowers-1236574785/

ponewood 2025-11-16 01:19

TY

SocialJusticeAndroid 2025-11-16 01:26

Who are all these people who are going to buy Optimus? The same masses of people around the world who are disgusted with 🤡 and refuse to buy Tesla cars? We’re all just going to be ok buying Tesla robots? And for how much, what are they supposed to cost anyway? (I just looked it up. They’ll be priced like cars, perhaps upwards of $30K 🤡 has suggested. I’ll remind everyone that 🤡 also said the ill-fated wankpanzer was supposed to cost $39K.) Also did 🤡 really name his robot after the leader of the Autobots? This would all more aptly be named after a Decepticon.

zitrored 2025-11-16 01:33

It's my hypothesis on why AWS, Azure, and others are running into production problems. They are gung ho on this promise and often find themselves failing miserably.

SocialJusticeAndroid 2025-11-16 01:33

They need to use a Heisenberg Compensator.

[deleted] 2025-11-16 01:35

[removed]

SocialJusticeAndroid 2025-11-16 01:42

They do?!? 😂 It's like those people who keep giving firm dates for the rapture. Tesla shareholders are equivalent to a televangelist's dupes. Although not quite as bad, since some of them can at least take advantage of their fellow dupes until the eventual mother of all rug pulls commences.

frackthestupids 2025-11-16 01:55

Damnit Donut!

[deleted] 2025-11-16 01:59

Unfortunately you would need all users to stop using FSD to make a true Schrödinger's FSD happen. I think Heisenberg's Uncertain FSD is a better fit here, as FSD's true status cannot be determined while users/observers interfere with its state. I think Elmo will be happy with this excuse.

moridinamael 2025-11-16 02:03

I use it about half the time that I'm driving nowadays. To me the concept of a "mistake" is undefined, and thus nobody can agree on anything. If it does something I might not have done, like taking an unprotected left when an oncoming car feels a little too close, I let FSD do its thing, while some people might slam on the brake. Later they'll say "it almost killed me!" when it would have been totally fine. (It gauges distance better than the human eye, so it sometimes does stuff that feels dangerous but isn't.) Sometimes it makes goofy decisions in heavy downtown traffic. Not dangerous, just attempts to save time that I as a human know will actually take more time. Since I am not actually driving, I don't care that much. It's suboptimal, but is it a "mistake"?

FlyingArdilla 2025-11-16 02:11

Does that go before or after the turbo encabulator?

nlaak 2025-11-16 02:43

> I just drove it from detroit through chicago to milwalkee. Your anecdote is exactly that, and doesn't invalidate the anecdotes of hundreds of people who say the opposite. You should go watch the video of the guys who decided to use FSD to go cross-country from (IIRC) LA to somewhere on the east coast and ended up hitting a ladder in the road in the first hour. The consensus at the time was that the damage probably totaled the car (pierced the battery compartment and destroyed several other things). Hilariously, they were all pro-FSD and sat like imbeciles watching the road ahead through the screen, saw the ladder, and waited for it to do something, while it did nothing.

McLeod3577 2025-11-16 02:44

FSD works right up until the point of impact, deactivating just in time to avoid liability.

MUCHO2000 2025-11-16 02:46

The stock has increased in value over the last 6 months based on the idea that a H1B visa engineering team will solve the complex problems facing developing an autonomous intelligent humanoid robot. This is not to take a jab at those workers. It's just a description of what is literally occurring.

TominatorXX 2025-11-16 02:57

I want to see this movie

RipWhenDamageTaken 2025-11-16 03:10

Only a small percentage of the world population has $35k lying around for an unreliable maid. If people were willing to pay money for convenience, FSD would already be selling like hot cakes. No one can afford it, and no one would finance it either.

RipWhenDamageTaken 2025-11-16 03:14

Therein lies the problem: it cannot take responsibility for itself, while it claims to have capability.

daveo18 2025-11-16 03:19

Correct. Tesla fans used to claim this was FUD, now [proven to be true](https://electrek.co/2025/03/17/tesla-fans-exposes-shadiness-defend-autopilot-crash/)

moridinamael 2025-11-16 03:41

I would say it has the capability. What it doesn’t have is the ability to make the human driver feel perfectly comfortable. It is almost certainly a safer driver than me — if only because it is always paying attention, and my mind often wanders.

mrbuttsavage 2025-11-16 04:12

You'd have to be absolutely nuts to go eyes free with FSD if you are the one responsible for it crashing. You could kill someone that way, including yourself, and be the one responsible for it.

FlipZip69 2025-11-16 04:17

Musk once said something honest (go figure) when he referred to the difficulty of increasing reliability from, for example, 99.9% to 99.99%, then to 99.999%, and so on. Basically it is the "march of nines." Each additional '9' represents an order-of-magnitude increase in safety and requires solving the exponentially rarer "long tail" problems in the data. Right now FSD is at about 99.3% (one incident every 380 miles). At their current rate, they are years away from getting to 99.9% (one incident every 1,000 miles). To be fully FSD, Tesla needs to be at 99.9999%, or one incident every million miles. It gets incrementally harder to add each 9.
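Taking the per-mile framing literally, the "march of nines" is easy to sketch. This is a rough illustration using the figures quoted in this comment, which are the commenter's estimates, not official Tesla data:

```python
import math

def nines(miles_per_incident: float) -> float:
    # One incident every m miles means a per-mile success rate of
    # 1 - 1/m, i.e. -log10(1/m) = log10(m) "nines" of reliability.
    return math.log10(miles_per_incident)

# Figures quoted above (commenter's estimates, not official data):
for label, m in [("1 per 380 mi", 380),
                 ("1 per 1,000 mi", 1_000),
                 ("1 per 1,000,000 mi", 1_000_000)]:
    print(f"{label}: {1 - 1/m:.4%} reliable, {nines(m):.2f} nines")
```

Each extra nine requires 10x the miles between incidents, which is why progress looks slower the better the system gets. (As an aside, one incident per 380 miles works out to about 99.74% per mile, a touch above the 99.3% quoted.)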

torokunai 2025-11-16 04:55

CGPT helped me get a Scheduled Task installed with GMSA account last week on Windows ... I greatly prefer iterating with that vs. trying to piece together help from old forums.

torokunai 2025-11-16 05:00

Elon pulled a two-fer of promising a service rollout by June and also availability to 'half the population' by December. The June milestone was a real de minimis effort, and I don't think he's going to come close to his December promise, yet the stock is still 20% ($250B) over where he made those promises.

Big-Dudu-77 2025-11-16 05:03

What do you mean, is FSD ready? People who have it use it. Some people like it, some people don't. I got my first Tesla recently and it came with 3 months of FSD. I didn't think I would need it, but after I started using it, I used it till the very last day. I am cheap and don't want to pay for an unnecessary service, but I can see paying for it if I have a long trip planned. As for version 14, there are lots of posts of people not liking it due to the AI reacting to shadows or leaves. That was the main complaint, but I never experienced that. What I did experience is the AI having a hard time deciding whether it should switch lanes in slow traffic. I just manually took control. Is it ready for unsupervised? I would say no. It's very close though.

torokunai 2025-11-16 05:03

oddly enough, Tesla has put off pothole avoidance and parking lots to last, when I would have tackled these first if I were doing an ADAS.

MUCHO2000 2025-11-16 05:37

I truly thought we would see 500 before 400 the way they were pumping the stock. I was wrong. The whole thing is bizarre. I've said this before but if Elon was serious he would have bought a whole lot more than 1 billion. It's an unimaginable amount of money but it's like someone who has a 10 million dollar net worth investing 29k.

Lacrewpandora 2025-11-16 05:52

>parked in a handicap charger.  Id never seen one of those before Those are for Cybertruck owners.

Zippytang 2025-11-16 06:26

Who exactly is going to be buying these half baked Optimus robots?

Normal-Selection1537 2025-11-16 07:33

They still can't make it work well enough for that in Musk's Las Vegas sewers where they control all the traffic. It's a dead end solution that'll never work well enough.

Common-Violinist-305 2025-11-16 07:55

it is not ready

onwatershipdown 2025-11-16 08:26

Tesla is retro-futuristic… they’re bringing back gruesome 1950’s-style traffic deaths.

Significant-Branch22 2025-11-16 08:36

As of right now it should be illegal to go eyes free in a Tesla, you should be pulled over by the police if you’re caught doing it and you should lose your license

FlagFootballSaint 2025-11-16 09:01

Millions, I tell you: MILLIONS!!!! /s

Blargh_Rargh 2025-11-16 09:11

To fulfill his contract bonuses, Musk will just sell them to his other companies such as SpaceX, like he is doing right now with all the unsold CyberTrucks.

AndyTheSane 2025-11-16 09:26

You can go eyes only.. that just requires human level visual processing and intelligence. Which is well beyond any AI system so far..

microtherion 2025-11-16 10:15

If one is financed, does that make it Optimus Subprime?

SocialJusticeAndroid 2025-11-16 10:34

🥁 😃He’ll be here all week folks!

analyticaljoe 2025-11-16 11:29

It's all just definitional. I have paid for FSD -- I did so more than 8 years ago now -- and still own the car, which is otherwise a fine car. My definition is "attention". Between the Paint It Black video and Elon talking about the car driving itself from LA to NYC, the company was promoting the ability to sit in the car and not pay attention, with the exception of those pesky regulators.

> FSD is claimed to be ready for unsupervised usage, but every time it makes a mistake, the user/driver is the one that is blamed.

I have this argument with FSD owners every once in a while, and they all end the same way. The person tells me how awesome FSD is, how they never have to intervene anymore, and how it's driving them from work to the grocery to home. And I say: "Then you should start reading a book or doing email." And that's the end of it, because it pushes them right up against the reality that neither Tesla, nor they, think it is safe enough to ignore.

I think Tesla has, thankfully, finally gotten more honest by sticking the word "supervised" on the end of FSD. On the other hand, they've started lying more by sticking the word "robotaxi" onto cars where someone is getting paid to sit in the car and ensure you get there safely. If there are two things you can count on Elon for, they are fascism and lying.

But that's the standard: can you ignore it and do something else? Because unless you have repetitive stress injuries and monitoring is physically easier for you, any argument of "I find it less mentally draining" just means you are not doing it well enough. If you have to mentally track N things to drive safely, then you have to track N+1 things to monitor your Tesla: all the previous N, plus the car itself. And even then, no monitor can undo a brake check.

MarchMurky8649 2025-11-16 13:15

"Nov. 16 premiere"

rutanfan12 2025-11-16 14:03

Cybertruck owners

BringBackUsenet 2025-11-16 14:35

The same wankers that always piss away money on expensive trophies for themselves.

BringBackUsenet 2025-11-16 14:36

I've found AI can be helpful, until it's not, which is more often than not.

BringBackUsenet 2025-11-16 14:37

Which is why Tesla's management needs to go down for negligent homicide.

BringBackUsenet 2025-11-16 14:38

Red Asphalt VI, The Tesla Edition.

BringBackUsenet 2025-11-16 14:40

You could kill someone anyway. Those 4 robotaxis in Austin that crashed were all "supervised".

BringBackUsenet 2025-11-16 14:41

Have you seen the videos of people sleeping while their Teslas drive? Then the cops have to chase the car for miles trying to wake up the "driver".

BringBackUsenet 2025-11-16 14:42

It's Full Reckless Driving.

BringBackUsenet 2025-11-16 14:43

If they say $30k, then figure at least $100k for the bottom tier model.

BringBackUsenet 2025-11-16 14:45

I'd prefer the 8-yo.

BringBackUsenet 2025-11-16 14:50

> At least FSD works well the vast majority of the time.

It works just well enough for people to have a false sense of security about using it. It also doesn't help the situation to call it "self driving". They should be honest like other carmakers and call it something like "driver assist".

> They don't even have a demo of Optimus yet but still claim they will be rolling out in 2026

"Next year." Yeah. I love you. The check's in the mail. I promise not to c..........

BringBackUsenet 2025-11-16 14:50

The Greater Fool Theory

rajrdajr 2025-11-16 14:50

> It's a dead end solution that'll never work well enough. The only-cameras decision pushed out completion by at least a decade under current rules. Waymo is doing the heavy lifting in developing the self-driving market and they use video, LiDAR, ultrasonic range, and radar on high precision maps. Elmo’s bromance with POTUS 47 failed to save the EV tax credit, but maybe round 2 is to push for the NHTSA to redefine the rules for self driving cars. Perhaps they would only require them to be safer than human drivers?

BringBackUsenet 2025-11-16 14:51

No, the stock increases because it's a bubble with a life of its own. TSLA is completely detached from Tesla.

BringBackUsenet 2025-11-16 14:52

That alone is why I would never use it, even if it did work.

TominatorXX 2025-11-16 15:20

Is it streaming anywhere?

Bill92677 2025-11-16 15:27

Like hype is a new concept. Come on.

Engunnear 2025-11-16 17:17

I would have said four year old, but they can’t reach the pedals.

DryAssumption 2025-11-16 17:43

I barely ever hear hallucinations get mentioned amongst the AI hype, yet several years in they’re still a major problem that doesn’t seem to be going away soon

BringBackUsenet 2025-11-16 18:21

They can still steer a Big Wheel.

CaptainMegaNads 2025-11-16 20:07

It is, in fact, getting worse. Open AI should hurry their IPO.

razorirr 2025-11-16 21:16

I mean, since you took me saying "it didn't make any mistakes in a 400-mile drive" as "flawless", that's on you. Flawless would be "it doesn't make mistakes". You seem blind to this difference, or are just jumping to a conclusion about what I said to suit your needs. So of course you didn't understand the last paragraph, as you are missing core concepts of English as a whole here.

razorirr 2025-11-16 21:18

Ohhh look, no response back, called it :)

razorirr 2025-11-16 21:19

All 30 seconds early

razorirr 2025-11-16 21:21

Crash reporting to NHTSA requires the deactivation to be more than 30 seconds prior to the crash for an ADAS event to not require reporting. So you can argue "OK, then they just don't report", and I'll leave that up to you to prove, but your claim as it stands is FUD: even if FSD were perfectly snapping off half a second before impact, the crash would still be countable.

razorirr 2025-11-16 21:27

Humans aren't 1:1,000,000; we are 1:250,000. Also, what is your definition of incident? My 1:250,000 is reported crashes. In no way are Teslas crashing 650x more than humans; they would be uninsurable.

beren12 2025-11-16 23:57

Which drivers? Drunk ones? High ones?

beren12 2025-11-16 23:57

I refuse to call him private doughnut!

beren12 2025-11-16 23:58

Pretty sure they only count it within 5secs according to the fsd sub.

beren12 2025-11-16 23:59

And 10k/yr subscription cost

beren12 2025-11-17 00:00

But that includes drunk, high, sleepy, non-roadworthy beaters, etc.

Competitive_Mall_968 2025-11-17 01:33

Tbh, you sound like a person Optimus and other robots will easily replace. Companies will buy them. 30k is nothing to replace an expensive, lazy, whining, redditing, poor-performing, toxic dude who would rather have another job but isn't fit for anything more complicated, whose effective work time is not even close to his paid time. Compare that with something that costs 30k, works 365/24/7, and, maybe most important, doesn't piss off management every day. If Optimus gets close to 30-40% of human performance per hour, this thing will be the most wanted item on the planet. Even if it is not quite as effective as a human, you can just get two. Even three, if you start to factor in everything, including your manager's mental health.

FourLeggedJedi 2025-11-17 02:24

2 weeks.

NilsTillander 2025-11-17 05:45

Point 2 is the most amazing. So many rounds of "the new version is amazing". Makes you wonder how bad it was, and still is.

razorirr 2025-11-17 12:36

And? When comparing FSD vs. humans, you don't get to factor those out, since they are on the road. While Teslas will eventually get old enough to be "non-roadworthy beaters", a computer won't get drunk or high, so part of the benefit of having computers drive, and eventually removing the steering wheel, is that it removes the threat of "that guy driving in front of me is hammered".

razorirr 2025-11-17 12:50

Level 2 ADAS: "Entities named in the General Order must report a crash if Level 2 ADAS was in use at any time within 30 seconds of the crash and the crash involved a vulnerable road user being struck or resulted in a fatality, an air bag deployment, or any individual being transported to a hospital for medical treatment." Direct copy from NHTSA's page here: https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting

Tesla has reported 2,439 incidents, which is basically all of them. This is going to be a combination of AP and FSD, though, as both are considered Level 2 (Level 1 is "car controls exactly 1 thing"). IMO Tesla is the only one actually doing proper reporting, as a huge number of other cars have lane keep now, yet if you look at the figures, GM for example only reported 70, and Ford 29. I get that the argument could be made that Tesla might be worse, but 100x worse would make them uninsurable. I think it's that NHTSA constantly investigates only them, so they have their reporting on lock while the others don't.

beren12 2025-11-17 12:51

Well, that's my point. Drowsy driving is estimated to be the cause of almost 20% of fatal accidents, and 100k to 325k accidents total every year. About 30% of all fatal accidents involve drunk driving. Call me when FSD drives better than a regular human, not just better than a drunk or sleepy one.

razorirr 2025-11-17 12:57

Sure. Let me know when this sub will take actual numbers, instead of saying FSD crashes every 380 miles while humans crash every 1,000,000, when 1:380 is definitely wrong (my car would have crashed 65 times last year) and a simple Google shows the guy's estimate rates humans 4x better than we actually are.

beren12 2025-11-17 13:12

Well, that's the thing: if you hadn't stopped it, it probably would've crashed that often. It's like that cross-country trip where people tried to trust it, didn't intervene, and it rammed into a piece of debris on the road. Surprisingly, though, most people aren't willing to trust their life to the machine. If you are, never override it, and let us know how it works out.

beren12 2025-11-17 14:41

Funny you mention that. Some insurance companies are exceptionally expensive to insure a Tesla with. And if you read the Tesla subs, a lot of the drivers are doing very dangerous things with FSD, which is probably why it is so much worse in the accident numbers. People admit to reading emails, watching movies, not paying attention, and they use FSD everywhere; some people claim over 95% of their driving has it enabled. I would only use lane keep on the highway.

razorirr 2025-11-17 16:06

Progressive and state farm were both cheaper for my model 3 than my challenger. The plaid cost more but i expect a 100k car to run more than a 40k car.

razorirr 2025-11-17 16:27

Then by your logic I crashed twice this weekend, when I actually crashed 0 times with 0 takeovers. The only corrections I had to make were when it backed into supercharger stalls, which it did successfully 2 times, and with minor issues the other 2 times. The issues were that once it picked a stall with blue lines and I'm not handicapped, and the other time, of the two open spaces it could pick from, the car next to the first spot was a Mach-E, and those use the spot next to its charger, so I didn't have a cable to use.

beren12 2025-11-17 16:28

Reread what I typed. I didn’t say every intervention would be a crash.

razorirr 2025-11-17 16:37

But FlipZip is. Which is where the numbers came from, and why I said initially that it's an unfair comparison. They are trying to compare required interventions per mile to human crashes per mile. Apple, meet Orange.

TurboUltiman 2025-11-17 21:09

I don’t know what it was like in 2022 but I have a 2024 and the fsd is pretty amazing. Use it for about 90% of my driving with no issues.

theviolatr 2025-11-18 01:39

So Elon said that "soon" one can text and drive. Well, actually, I can do that now in my car... and if I crash, guess who is responsible??? Ultimately this comes down to liability. Until Tesla takes legal responsibility for anything, it will just be smoke and mirrors. Hence the incessant talk about robots lately, to distract.

FlipZip69 2025-11-18 04:13

Let us break this down. First, humans crash at almost 5 times the rate in bad weather as in good weather. In good weather they crash about once every 1.2 million miles. So that is the only number you can compare FSD to, as FSD only works in decent weather. Secondly, even in good weather, FSD reverts control back to the driver on average once every 380 miles. That figure is a year old, but the rate is not climbing fast, so let's be generous and say it is once every 500 miles. Can you imagine how bad a driver would have to be to let go of the steering wheel on average once every 500 miles? And that in good weather. So tell me, what stat do you use?
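For what it's worth, the gap those two figures imply is a one-line computation. Both inputs are this commenter's claims taken at face value, and note they compare two different event types (human crashes vs. FSD disengagements), which is itself disputed upthread:

```python
# Commenter-claimed figures, not verified data.
human_miles_per_crash = 1_200_000      # good weather only, claimed
fsd_miles_per_disengagement = 500      # generous estimate, claimed

# How many times farther a human goes between crashes than FSD
# goes between disengagements, under this comment's framing.
gap = human_miles_per_crash / fsd_miles_per_disengagement
print(f"gap: {gap:.0f}x")
```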

razorirr 2025-11-18 04:36

Do you want to break it to my car that it's not actually running FSD while driving me around in rainstorms and snowstorms, then? I also wonder who the hell is getting all these reversions you keep claiming. Is that a number Tesla released first-hand, or is that Dan O'Dowd and friends? Since I got my Plaid 49,000 miles ago, the only forced takeovers I've gotten were me accidentally gassing it to over 85, or friends thinking they could peek at their phones and me laughing at them as it kicked them out quickly.

FlipZip69 2025-11-18 04:42

That was the stat Tesla released before they started hiding those stats. More to the point, the robotaxi has had 4 full-on accidents in 4 months of trials in Texas. Trials in good weather only, with all of 12 cars on the road, and... with a safety driver. Can you imagine the number of accidents if they did not have a safety driver? Luckily the accidents were minor and there was only one personal injury. But that is in no way indicative of human-level driving. These were purely FSD issues, not something unique to being a taxi.

razorirr 2025-11-18 11:20

I'd go with 8. I got to that since Waymo has 2,500 cars, and under the NHTSA ADAS/ADS crash reporting data, which shows Tesla's 4 crashes in the last 12 calendar months, Waymo had 1,331. I'd not call 1 crash per every 2 cars over the last year indicative of human driving either. So pull them all, or pull none.

FlipZip69 2025-11-19 01:35

Waymo drives 250,000 trips per week with thousands of cars. Tesla does about 1,000 per week with 12 cars. More so, unlike Tesla, which hides the crash details entirely, Waymo posts exactly what happened. And of the accidents you listed, only about 4 percent were Waymo's fault, or about 53. At Tesla's rate, if they drove the same number of trips as Waymo, they would have 1,000 accidents in the same period. There is a reason Tesla hides every accident. They are not even close, and you can go to Waymo's site to look at absolutely every accident. But Tesla hides it, and actually took the Texas government to court to keep their accident details hidden. Why is Tesla so afraid to release that information?
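The scaling in this comment can be checked arithmetically. All inputs below are the commenter's claims, not verified figures:

```python
# All figures as claimed in the comment above, not verified.
tesla_trips_per_week = 1_000
waymo_trips_per_week = 250_000
tesla_crashes = 4                      # claimed, over the trial period
waymo_crashes = 1_331                  # claimed, last 12 months
waymo_at_fault_share = 0.04            # claimed, ~4 percent

scale = waymo_trips_per_week / tesla_trips_per_week      # 250x volume
projected_tesla = tesla_crashes * scale
waymo_at_fault = waymo_crashes * waymo_at_fault_share
print(f"Tesla scaled to Waymo's volume: ~{projected_tesla:.0f} crashes")
print(f"Waymo at-fault crashes: ~{waymo_at_fault:.0f}")
```

So the "1,000 accidents" and "about 53" numbers do follow from the claimed inputs; whether those inputs are right is the actual dispute.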

razorirr 2025-11-19 03:45

It's interesting that one in every two Waymo cars gets crashed into per year. Is one in every two yellow cabs in NYC getting in an accident per year? If not, that shows Waymo is worse than humans and, by everyone in this sub's argument, should be taken off the road.
