Tesla Self-Driving System Will Be Investigated by Safety Agency

HumarockGuy | 2024-10-18 13:58 | 199 views

Comments (193)
Ok_Investigator_5137 2024-10-18 14:12

Technically, they should be looking at the other automakers. Volkswagen has a huge problem with its automatic driving, and GMC runs headlights on both high beam and low beam all the time. There are lots of other flaws from other automakers that aren't being checked. Tesla is by far one of the best systems I have seen and actually used, provided you pay attention to the road. Even when you aren't paying attention, it's quite impressive how well it stops.

iwannabethecyberguy 2024-10-18 14:26

The system isn’t perfect, but I still think it drives safer than most other people on the road.

YouKidsGetOffMyYard 2024-10-18 14:26

Yep, I believe Tesla's cars have been investigated 14 times. Other car manufacturers get investigated as well, just not as often. Tesla leads in FSD, though, and has by far the most self-driving cars on the road and the most owners actually using self-driving. So take it as you will.

feurie 2024-10-18 14:30

How is that “technically”?

timestudies4meandu 2024-10-18 14:36

Manual drivers' crashes get no investigation? That's not normal.

adeadfetus 2024-10-18 14:37

How many people have VW and GMC killed?

wentwj 2024-10-18 14:38

What do you mean, manual drivers' crashes aren't investigated?

wentwj 2024-10-18 14:39

It seems obvious they'd get investigated more. Not only do they have more usage, and therefore more importance, but they keep saying things like they'll release a car without a steering wheel in 2026 that everyone can buy, based on this system.

Fresh-Chemical1688 2024-10-18 14:45

That's not enough, though. Any system that will ever be allowed as self-driving technology in a well-regulated country has to be far better than the average driver. If you've got millions of cars on the road that drive the exact same way and all share the same weak spots, the danger gets multiplied. Take this investigation as an example: if all the cars on the road were Teslas with FSD, and they all had the same problem with visibility in these conditions, you would see far more accidents with that setup. Today, when one driver struggles with bad visibility, the driver of the other car can sometimes salvage the mistake before something bad happens, because they are better at dealing with low visibility. If neither driver can handle the situation, there will be a crash, and that would be the default situation if every car on the road were autonomous with the same blind spots.
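
The correlated-failure worry can be made concrete with a toy probability model (my own illustrative numbers, not real crash statistics):

```python
# A two-party crash needs BOTH drivers to mishandle the situation.
# Diverse human drivers fail roughly independently; identical
# software shares one weak spot, so both "fail" together.
p_fail = 0.10  # chance one party can't handle a low-visibility event

p_crash_diverse = p_fail * p_fail  # independent failures
p_crash_identical = p_fail         # perfectly correlated failures

print(round(p_crash_identical / p_crash_diverse, 1))  # -> 10.0
```

As the reply points out, real situations are never exactly identical, so the true correlation sits somewhere between these two extremes.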

AJHenderson 2024-10-18 14:49

Except that, in that case, they have accidents less often than the average driver. Minor differences mean no two situations are exactly the same: there isn't going to be some freak weather event that kills every single person driving a Tesla simultaneously. The vision system occasionally doesn't have enough information to avoid an accident despite its best efforts, but that occurs less frequently than it does with humans. That doesn't necessarily mean it shouldn't still be investigated to make sure that's the case, but the NHTSA has a bad track record of reducing functionality if there is any risk in an automated system, even when the risk of humans doing the same task is higher.

timestudies4meandu 2024-10-18 14:50

Manual driver crashes keep happening so often, but they keep letting manual drivers drive. That's not normal.

acornManor 2024-10-18 14:50

Pure troll comment. Unfortunately, people have killed themselves through inattentiveness, perhaps over-relying on the system to keep them safe and ignoring the need to actively remain in control of the vehicle and monitor what it's doing. No doubt the system does a much better job today of keeping people from paying too little attention; in some cases it even goes a bit overboard in that regard.

[deleted] 2024-10-18 14:50

Yeah, but they aren't advertising those as full self-driving; they're just driver assists. Tesla wants to take the driver out of the equation with their robotaxi, where riders won't even have a steering wheel or brakes to control the vehicle. That has to be fully regulated, otherwise it's going to get messy.

wentwj 2024-10-18 14:51

What are you talking about? There's an entire system for revoking people's licenses, etc. What are you suggesting they do?

timestudies4meandu 2024-10-18 14:52

They must stop manual driver crashes, because right now everyone thinks they're normal.

Joatboy 2024-10-18 14:52

Who's liable for a failure of the automated system? The driver or the manufacturer?

wentwj 2024-10-18 14:53

Manual driver crashes are investigated and people lose the ability to drive if deemed unsafe. What are you suggesting be changed?

timestudies4meandu 2024-10-18 14:54

Change the people who lead crash investigations.

AJHenderson 2024-10-18 14:56

For Tesla's system, the driver. They are clearly specified as ADAS systems not autonomous systems currently. The driver is responsible for the operation of the vehicle. I can, however, tell you I've been in situations where my Tesla had a much better understanding of what was going on around me in adverse weather than I did and was able to draw my attention to things before I would have otherwise seen them.

wentwj 2024-10-18 14:57

There’s an entire police and court system. Are you suggesting to remake that system? Or are you just fundamentally not understanding that you would need to evaluate human driver crashes differently from automated driver crashes?

timestudies4meandu 2024-10-18 14:59

No more manual drivers. It must stop.

Impressive_Sleep_801 2024-10-18 14:59

4 incidents out of 2.4M cars with FSD sold since 2016. That's a helluva record and a testament that the technology is a success.

wentwj 2024-10-18 15:01

So your serious suggestion is that everyone's driver's license should be immediately taken away, all cars, buses, and semis turned in, and no one allowed to manually operate a motor vehicle?

soapinmouth 2024-10-18 15:04

How many times can the NHTSA investigate FSD? At this point it's just a go-to job-justification exercise for them to keep spending tax dollars. Nothing has changed since the last time.

cwhiterun 2024-10-18 15:13

The insurance company.

Pantone382c 2024-10-18 15:17

After all that work, there are a few more nags and a few minor changes. I think it proves the system is safe overall, or they would have shut it down.

BrainWatt_252 2024-10-18 15:19

That's one of the worst articles I've read from the NYT in a long time. It doesn't even offer any information about how Tesla's driver-alert system works.

Fresh-Chemical1688 2024-10-18 15:24

Yeah, of course there won't be events that wipe everyone out. But still, an automated system will always be under more scrutiny than human drivers are, and will therefore be expected to be safe to an extreme degree, if only because you can demand that of a single automated system in a way you can't demand it of millions of different drivers. You can argue whether that's the wrong approach or not; I think it's the right one, but in most cases human drivers should be under way more scrutiny as well. The way people drive here in Germany is sometimes madness. Or think about how my grandpa still drove his car at the end of his life... there should be far more tests of whether you are capable, not just a one-time permit that's good for life. And honestly, as a European it's strange to see what Tesla FSD is allowed to do without going through the regulatory approval process.

arathos2k 2024-10-18 15:32

I'm fine with supervised FSD, and it's great for what it does today, but it's unclear to me how unsupervised FSD will work without lidar, for the reasons mentioned in the article. Many times while commuting I'll get a message like "Right camera obscured, limited Autopilot."

[deleted] 2024-10-18 15:35

[removed]

gnoxy 2024-10-18 15:39

The problem with lidar + radar + whatever other sensor on top of vision is: who is your source of truth? When these systems disagree about what they see, which one do you trust?
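
One classic (and purely illustrative) answer is redundancy voting: with three or more independent range estimates, take the median, so a single faulty sensor can't drag the fused value far off. The function below is my own sketch, not any automaker's actual fusion logic.

```python
def fused_range(readings):
    """Median of per-sensor distance estimates (meters).

    With 3+ sensors, a median vote tolerates one wildly wrong
    reading without having to decide which sensor to 'trust'.
    """
    s = sorted(readings)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

# Camera badly underestimates range in fog; radar and lidar agree.
print(fused_range([8.0, 42.5, 41.9]))  # -> 41.9
```

Real stacks are far more sophisticated (probabilistic filters, per-sensor confidence weighting), but the point stands: disagreement is handled statistically rather than by crowning one sensor the source of truth.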

jtmonkey 2024-10-18 15:45

Don’t they release the data? https://www.tesla.com/VehicleSafetyReport

altimas 2024-10-18 15:45

What you're missing is that autonomous systems get better, quickly, across the whole fleet. Investigations like this are fine; they help improve safety for everyone.

TheWay0fLife 2024-10-18 15:48

It's not just Tesla, it's all of Elon's companies. This is lawfare: the SEC looking into the purchase of X, the FCC refusing to qualify Starlink for a program to provide internet for rural America, the FAA slowing down approval of SpaceX launches, the Coastal Commission not allowing SpaceX to launch in CA.

cookingboy 2024-10-18 15:51

That’s a wild statement. I feel reasonably safe when I let a stranger drive me anywhere, which is why I use Uber, Lyft, taxi, etc. I would *not* feel safe if a Tesla “robotaxi” showed up today with the current version of FSD, without a steering wheel, to pick me up from the airport. We are nowhere close to compare FSD to human drivers yet.

DIY_Colorado_Guy 2024-10-18 15:54

As a European, you should know that the government regulatory requirements you have for every petty little thing are why your continent as a whole relies on US technology (generally speaking). It's nearly impossible to progress when you're tied down by miles of red tape.

cookingboy 2024-10-18 15:55

The vast majority of those 2.4M cars did not sell with the FSD package, and there are far more than 4 incidents while AP was engaged. Either way, it's hard to call something a success when it literally doesn't do what its name says it does.

JustPath3874 2024-10-18 15:58

Looking forward to DOGE with Elon helping reduce some of these political investigations. Get out and vote people!

TheKobayashiMoron 2024-10-18 15:59

It's definitely more *cautious* than most people on the road. It just makes mistakes at times. I think the more realistic statement is that FSD *with an attentive driver supervising it* is safer than most people on the road. My car does 99% of the driving but we aren't in 'sit in the back seat without a driver' territory yet. Not by a long shot.

ZetaPower 2024-10-18 16:02

Again…. “SUPERVISED”

arathos2k 2024-10-18 16:04

Riding a Waymo, it seems they have this figured out pretty well. Even though I have no control there I feel super safe in that car. I have supervised FSD and have had to correct/cancel it three times in the past year for potentially unsafe driving.

HumarockGuy 2024-10-18 16:06

4 incidents? I can name 4 fatalities off the top of my head and that doesn’t even include plowing into stationary emergency vehicles.

arathos2k 2024-10-18 16:07

100% agree. I love it for what it is when I use it, but I'm definitely aware and attentive for safety reasons. If it's the same tech in the robotaxi, though, I would not feel comfortable with that. On the other hand, I take Waymos all the time and feel very comfortable in those cars.

thunderslugging 2024-10-18 16:08

I have Waymo in my city and I've yet to see a single accident from them. I think these systems are better drivers than humans.

[deleted] 2024-10-18 16:09

A real question for fellow FSD owners: would you feel comfortable if FSD drove you to work today, blindfolded? I know some fans claim they have barely any interventions, but I find that hard to believe. I'll at least step on the accelerator, stop it from trying to go into the bus lane (or from switching lanes nonstop, especially before a particularly busy turn), or interrupt because it doesn't understand when it's appropriate to let someone merge ahead. Going to work I usually have 4-5 "light" interventions (speeding up or canceling a lane change) and 2-4 "heavy" interventions (braking suddenly, grabbing the wheel to fix a missed turn). This doesn't include turning into driveways or out of parking lots, which I always do manually. I'm curious how many people are actually at "supervised full self-driving" at this stage.

Mundane-Tennis2885 2024-10-18 16:13

I've had wild, unsafe experiences in Ubers, including a crash: the driver was distracted and hit the car in front. I've disengaged FSD because it was being dumb and overly cautious, but I haven't yet had to disengage in an actually unsafe, critical situation.

[deleted] 2024-10-18 16:15

[removed]

cup1d_stunt 2024-10-18 16:18

Why do you feel the need to make this political and about Elon?

cup1d_stunt 2024-10-18 16:23

What is the FULL in FSD supposed to mean and who is supervising anything in those robotaxis Elon promised to be released in 2026? Let them investigate it, how is anything wrong with that? Since when do we blindly trust companies and their marketing when it comes to safety of people?

[deleted] 2024-10-18 16:26

I also have about the same number of interventions on my 40-mile commute through SoCal. It's wild that people think it's safer than most drivers.

Impressive_Sleep_801 2024-10-18 16:28

Why? The car drives itself fully autonomously under supervision. What else would you call it? And why this frustrated feeling of betrayal over something that is getting better every month? It's not a trivial task, and being in this position at all is already rather remarkable. I'd love to see your updated infotainment system...

[deleted] 2024-10-18 16:31

https://youtu.be/FGFoV7NPR9k?si=YEzk4CZbFZ6V6HHt

[deleted] 2024-10-18 16:32

🤣

_Smashbrother_ 2024-10-18 16:34

That's because those Uber drivers are on their best behavior, since you can rate them. The average person's driving is nowhere near as good. Now imagine the below-average drivers (there are plenty of crash videos online). FSD is better than those.

cookingboy 2024-10-18 16:36

You are exactly correct, because FSD at the end of the day is a powerful ADAS. And human + ADAS is most likely the safest combo we have today.

farfromelite 2024-10-18 16:37

It's not lawfare. As much as America likes freedom, there's laws and regulation _for a reason_. That reason is usually public safety. Tesla is making a bold claim of unproven technology, and to keep the public safe, it's wise to be a little bit cautious.

cookingboy 2024-10-18 16:40

Yeah, because if not for ratings those Uber drivers would just be crashing left and right for shits and giggles /s

> plenty of crash videos online

Yet you know what video I *can't* find online? FSD safely driving on a public road without a human behind the wheel.

Life_Connection420 2024-10-18 16:41

Just another example of a government that is trying to suppress genius due to politics.

farfromelite 2024-10-18 16:41

If it's advertised as "self driving," then people will use it to self-drive. I don't know what kind of human you are, but obviously you don't drive with ice on the windscreen; some people do, even though it's illegal. Same with self-driving: if it claims to be self-driving, then it's a shitty self-driving system if it can't detect bird shit on a sensor. Basically, it's not really at the level of self-driving that would allow full autonomy, and Musk is hyping it up again. It's basically his thing. It's why Tesla is valued far ahead of any other EV stock at this point.

_Smashbrother_ 2024-10-18 16:42

That's because FSD is still supervised. Dude, I use FSD to get to work and back every day, about a 70-mile round trip, and I rarely ever have to intervene. So I'm very familiar with its capabilities. I would rather have FSD drive me than those shitty human drivers.

farfromelite 2024-10-18 16:44

That's basically what billionaires do: they are able to bend the regulatory framework in their favour in a way that ordinary people can't. One law for the kings and untouchables, and another for the rest of us. You know the difference between a millionaire and a billionaire? A billion. You're not one of them.

cookingboy 2024-10-18 16:44

> fully autonomously under supervision

That's an oxymoron. It's not full self-driving unless it can do it without a person behind the wheel, at all times.

> How else would you rather call it

"Supervised self-driving" is a start.

> frustrated feeling of betrayal

Because of Elon's marketing and promises, my coworker leased a P90D in 2016 and spent thousands on FSD, and by the time he returned the car in 2019 he had gotten none of what he paid for. And that was 5 years ago. If he had leased another Tesla with FSD in 2019, the same story would have happened again.

farfromelite 2024-10-18 16:45

The number of posts I've read just this week about FSD driving dangerously is wild. If it claims to be fully autonomous, it had better be a lot better than that.

cookingboy 2024-10-18 16:46

> FSD is still supervised

Translation: FSD isn't full self-driving.

> I rarely ever have to intervene

Until you can say "I *never* have to intervene," FSD isn't FSD.

LurkerWithAnAccount 2024-10-18 16:52

Chris took his first Waymo, and watching the drive through its eyes, I agree there were instances in this particular ride where a driver behind the wheel might have intervened. In the back seat that's just not possible, so you have to roll with it. The Waymo on this ride exhibited many of the same quirks as FSD, including a jittering detected car on the screen, a wiggling path line, and an abrupt brake/steering input for no apparent reason. https://youtu.be/Fjh3hD-Zlsc?si=XaXXcfu5cmVL_1NW

TheKobayashiMoron 2024-10-18 17:02

And you shouldn't, because it isn't there yet. They have the best Level 2 ADAS on the market but it's a big leap from that to Level 4 autonomy. It'll be interesting to see how this plays out.

_Smashbrother_ 2024-10-18 17:05

And I'd still trust FSD over the shitty drivers.

Impressive_Sleep_801 2024-10-18 17:15

In 2016 he would not have been buying full self-driving. Terms and conditions are something you need to read before purchasing anything that expensive anyway. He was buying into a beta product; he should have been aware of the risk and, if uncomfortable, opted out. It's not as if someone forced him to buy it. As for "Supervised Self Driving": that doesn't differentiate it enough from Autopilot. The car technically drives itself fully autonomously; the "(Supervised)" feels like a friendlier, law-compliant way to tell drivers they need to pay attention.

Baul 2024-10-18 17:19

> If it claims to be fully autonomous, it had better be a lot better than that. It doesn't. That's what the word "supervised" means.

old-new-programmer 2024-10-18 17:19

I trust FSD about 98%, so no, I would never ride blindfolded. I drive defensively, and even when FSD is on I keep the same mentality and am almost always very aware of my surroundings, so any time I've needed to intervene I have, and it's been fine. That said, most of my "interventions" happen because it's trying to do something dumb. Like you said, it will try to merge into an express lane over a solid double white line, which gets you a nice $150 fine, and it wants to do it repeatedly. There's also a bug in my Google Maps routing that always wants to take the highway near my house, even though the highway isn't actually near my house. So there are bugs, but when it's in traffic and actually making decisions, it's pretty damn good. Even in hairy situations like leaving a concert, it really does amazingly. I do wish it had the extra sensors, just for better accuracy, but I got the car during the 0.99% special and I've never had a new car before, so I'm still loving it.

subliver 2024-10-18 17:20

The question is how many miles have been driven with FSD in the cars that have it, not how many cars have it.

[deleted] 2024-10-18 17:28

[deleted]

LtoRtoLtoR 2024-10-18 17:35

Hear me out; I'm trying to understand the hypothesis that vision-only is the way to go. A vision system can only see what it can see, so it can't "see" through a snowstorm, fog, etc. A human driver, likewise, can only see what they can see, so they won't do better than a vision-only system. The one solution to that problem is to slow down, so you have time to see and therefore to act. So it can be argued that in those circumstances, FSD was probably driving too fast for the conditions. The other solution is radar, which doesn't need to "see." I guess we can say that if two competing systems end up on the market, radar-based ones will be the only ones able to drive at full speed in low-visibility situations.
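
That "slow down until you can stop within what you can see" rule is easy to put into numbers. This is my own back-of-the-envelope sketch with illustrative parameter values (deceleration, reaction time), not anything from an actual FSD stack:

```python
import math

def max_safe_speed(sight_m, decel=7.0, reaction_s=1.5):
    """Highest speed (m/s) at which you can still stop within your
    sight distance, given sight = v*reaction + v^2 / (2*decel).
    Solves the quadratic v^2/(2a) + t*v - sight = 0 for v > 0."""
    t, a = reaction_s, decel
    return a * (-t + math.sqrt(t * t + 2 * sight_m / a))

# 50 m of visibility in fog -> roughly 18 m/s (~65 km/h) tops.
print(round(max_safe_speed(50.0), 1))
```

A radar that "sees" 150 m through the same fog would allow `max_safe_speed(150.0)`, about 36 m/s (~130 km/h), which is exactly the full-speed-in-low-visibility point above.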

[deleted] 2024-10-18 17:41

You can't find that because (a) Tesla does not claim that FSD (Supervised) is autonomous, and (b) Tesla drivers generally read their manual and don't use their cars in unintended ways. People like you are the ones pretending Tesla has an autonomous vehicle on the roads today, not Tesla.

[deleted] 2024-10-18 17:43

If I never intervened once in my car, it would still be vastly safer than most drivers I encounter on the road today. My interventions nowadays are for efficiency reasons, like when the car is being too cautious, or when it's going to miss an exit and I don't want it to safely reroute onto a longer path.

Mediocre-Message4260 2024-10-18 17:43

Dumb question. Why would I feel comfortable misusing the technology? It's not UNsupervised FSD.

[deleted] 2024-10-18 17:43

Why do people ask questions with very easy answers? The driver is. No different than cars with advanced cruise control today.

cookingboy 2024-10-18 17:46

I'm not the one in this thread saying FSD can now drive better than humans. If it still requires human supervision, then by definition it isn't superior to a human.

[deleted] 2024-10-18 17:47

If you make a purchase of tens of thousands of dollars based on what someone says on Twitter and regret it, you deserve to get scammed.

cookingboy 2024-10-18 17:52

> you deserve to get scammed Victim blame aside, so you agree that Elon was scamming people then? At least we are on the same page.

adeadfetus 2024-10-18 17:56

I'm not talking about the people who weren't paying attention, I'm talking about the bystanders. This investigation is largely being spurred by one specific incident that involved a pedestrian that was killed.

RazingsIsNotHomeNow 2024-10-18 18:13

Well Elon claims unsupervised next year lol

Content_Bar_6605 2024-10-18 18:16

No way. Even plain Autopilot, not just FSD, phantom-brakes all the time. It's scary as hell.

WillzyxTheZypod 2024-10-18 18:23

Likely the one that chooses the safer course of action. It’s the same way our brains react if we hear, but can’t see, an emergency siren, for example. Or when food looks fine but doesn’t smell quite right. As another commenter noted, Waymo seems to have figured it out. SpaceX has also figured it out. Its rockets are autonomous. And I’d imagine they use a whole host of different types of sensors to lift off, remain on course, and land, all without human intervention.

[deleted] 2024-10-18 18:33

[deleted]

TheAce0 2024-10-18 18:42

> it drives safer than most other people on the road. I'm not sure where you're driving, but where I live, most people don't [do shit like this](https://youtube.com/shorts/u707P3pJoIQ) when driving.

vinotauro 2024-10-18 18:42

Model 3 Highland owner here. Any time I've ever used FSD (for fun, or to show a friend or family member the technology), it does something flat-out wrong or with no confidence: switching lanes when it doesn't need to, cutting other cars off, driving slower than shit in the fast lane, randomly disengaging FSD altogether because it can't see road lines or signs. I'll only use Autopilot on the freeway, or maybe just Autopark if I'm really lazy.

NatKingSwole19 2024-10-18 19:04

Not a chance

Rare-Illustrator4443 2024-10-18 19:04

No. I’m someone who has had to make very few interventions, but it only takes a single mistake for a crash.

lioncat55 2024-10-18 19:34

Have you ever been in a car with a teen driving? Unless you can roll back time, there's no way to know whether those "heavy" interventions were really needed. People miss their exits (at least good drivers do; bad drivers never miss their exit), and plenty of times riding in an Uber or Lyft I would have done things differently. I haven't used FSD a crazy amount, but I think I would trust it for my drive from home to work (about 15 miles).

GoSh4rks 2024-10-18 19:42

If you gave me like $1m and it was only once, yeah I'd take that bet. But that's all it is, a bet.

Igotnonamebruh42 2024-10-18 19:45

It *drives safer than most humans*? FSD plus a supervising human? Maybe. FSD alone, without supervision? Hard no.

acornManor 2024-10-18 19:58

I've not heard of that incident - will have to look into what happened...

ZetaPower 2024-10-18 19:59

That's simple. As long as "FSD" has not been released as such and Tesla has not assumed liability, YOU are the driver and YOU are responsible. The system is just an ADAS, SAE Level 2, until legally and formally stated otherwise, with plenty of warnings, clear to every user. It's not 2026 yet. Robotaxi MUST be full self-driving, since it's supposed to be driverless. Elon has always delivered, just never on time, so we'll see.

cup1d_stunt 2024-10-18 20:02

Are you aware of the broad concept of robotaxis?

ulmersapiens 2024-10-18 20:10

I would be ecstatic with Level 3. I just want to read or do email and take over in ~30s if it needs me.

majesticjg 2024-10-18 20:15

Let me put it like this: If I touched nothing, would it be super-efficient? No, it wouldn't, but I don't think it would get me into an accident. I think of it as a dim-witted but generally-competent uber driver. It doesn't make the same decisions I would make, but I don't think it would get anyone hurt or killed. It might annoy other drivers a bit, though.

Fun_Muscle9399 2024-10-18 20:15

No. While I use it every day, I know when I need to be prepared to intervene. There are certain roads and situations it doesn’t handle well yet. It’s certainly a useful feature, but I very much still want to supervise it. Sometimes I may have no interventions on a drive and sometimes I may have 5-6 in 20 minutes.

Poles_Pole_Vaults 2024-10-18 20:16

I would say "safer than a human" comes into play for *most* scenarios. FSD and the software struggle in fringe scenarios that don't happen often, and that's where I'd be scared of it making a mistake, e.g. someone running a red light, split-second decisions, etc. I haven't had to experience it, but I'd be concerned about what incredibly defensive driving and inclement-weather driving look like to FSD.

Blaze4G 2024-10-18 20:36

So you would trust it while blindfolded / unable to intervene?

BikebutnotBeast 2024-10-18 20:36

Yes once it's the unsupervised version.

ZetaPower 2024-10-18 20:55

Sure, I have been for years. Read the (no longer online) master plan. You seem unable to differentiate the current software, with its (in)abilities and its legal and formal status, from a future promise.

cup1d_stunt 2024-10-18 21:04

How are YOU as a driver liable/responsible if YOU have no control over the car in the robotaxi? Haha, OK, you just edited your previous post. That's very convenient for you.

J2048b 2024-10-18 21:05

I think Waymo is the only one that truly works... Tesla is being propped up by a snake-oil salesman, a charlatan... This is never going to work, and it costs people money to help train their AI...

lioncat55 2024-10-18 21:08

I'll be doing a road trip with City Driving in a week and I can give a more definitive answer.

ZetaPower 2024-10-18 21:13

Never edited anything. Maybe you finally read the entire post? Just keep comparing the current Level 2 FSD (Supervised) to a future Level 5(?) robotaxi...

katieberry 2024-10-18 21:14

Absolutely not! Just yesterday it almost rammed a cyclist crossing the road, inside a crosswalk, at 40mph. (What road design decisions lead to there being an unsignalled crossing on such a large road I don’t know, but that’s not the cyclist’s fault.) On my daily commute there are also: two off-ramps that it absolutely cannot handle, a barrier which abruptly shifts the road two feet left that with FSD is always a near-miss at best, a left turn at a large intersection where it veers out of the lines (potentially into opposing traffic if anyone is turning left the other way), a lane change it can’t make in time, a highly congested lane merge where it behaves dangerously erratically, and a complex intersection where it doesn’t understand the lighting layout and will both try to go when the lights are red and abruptly stop when the lights are green. This is in the San Francisco Bay Area, the place where it was often claimed to work best.

adrr 2024-10-18 21:32

But it isn't even approved as an ADAS in countries where manufacturers have to prove their ADAS solution is as safe as a human driver. There's no FSD in the EU or China, but you can use other ADAS solutions like Blue Cruise.

djwurm 2024-10-18 21:32

FSD cannot reliably get out of my neighborhood. It constantly tries to exit onto the main road and then turn left, but it hesitates and freaks out, and I have to take over. At the next main intersection, when I have to turn left, it tries to hit the median because it cuts the turn way too far inside. On the highway it's way better, but it still acts like a tourist driving the city for the first time: it doesn't know how to handle traffic when you need to get off in, say, 2 miles and the right lanes are slow. It gets over to go faster, but then it's in a situation where it can't get back to the right to take the exit.

adrr 2024-10-18 21:34

That's why they use three sensor types: 5D radar (which is in our Teslas but not used), lidar, and vision.

cup1d_stunt 2024-10-18 21:40

Weird thing to lie about. That last paragraph was definitely not there; your post never mentioned robotaxis. Also, on desktop it says you edited it. How stupid do you think people are?

ZetaPower 2024-10-18 21:41

Not as stupid as you.

cup1d_stunt 2024-10-18 21:46

Wow, I admire the quick-wittedness and intelligence you demonstrated with that answer. Anyway, the point still stands: IF the currently mislabeled "FSD" wants a chance to be in cars, especially ones that don't have any controls for drivers (robotaxis), then I am fine with the highest level of scrutiny. Why not?

cookingboy 2024-10-18 21:57

Thank you for being reasonable. It's absolutely insane that people claim FSD is safer than a human when they have no problem getting into an Uber with their kids, but how many of them would get into a Tesla robotaxi today with their family?

thebiglebowskiisfine 2024-10-18 22:01

HW4 here: solid as a rock. I only intervene when I want to take over; outside of that I have really no issues with it. V13 is scheduled to drop before the end of October, so we'll see the progress.

Sweaty-Mood127 2024-10-18 22:03

Dude, it literally says "edited" above your post.

L1amaL1ord 2024-10-18 22:06

I'd imagine lidar sensors get dirty just like cameras, and lidar+camera solutions still need cameras to read signs, blinkers, and low-reflectivity objects. What all the lidar systems do have, and where FSD might be lacking, is high-definition maps. In my experience, most of where FSD messes up is because of weird lanes or road structures, and HD maps would probably fix that. If you think about how a human drives through familiar roads, it's basically with cameras and an HD map (familiarity with the road). Storing and maintaining an HD map is a nightmare, though, so I'm sure Tesla doesn't want to do that, in order to scale. Maybe a middle ground between Google-style maps and HD maps could be a good compromise.

JasonQG 2024-10-18 22:08

Who is claiming FSD unsupervised is safer than a human?

[deleted] 2024-10-18 22:16

Would I be comfortable letting it drive me around blindfolded? Nah. *But* I am one of those people with barely any interventions. It's not that I'd *expect* to need to take over along the way so much as that I'd want the option. There are also some really basic things it just doesn't handle at all and that *always* require an intervention, like speed bumps. There aren't any on my regular routes, but they do exist.

ebikeratwork 2024-10-18 22:25

I have HW3: for me, 12.3.6 was close, with only minor interventions (too fast over a speed bump or similar), and it did a good job driving me to/from work using all the right lanes (mostly, or it was able to correct when it didn't). 12.5.4.1 is a regression (though better than 12.5.4) and I don't trust it: it gets into wrong lanes all the time or changes lanes in a dangerous manner, e.g. switching lanes a few feet before a line of cars to get closer to the traffic light because that lane has fewer cars.

cookingboy 2024-10-18 22:39

People in this thread lol: https://www.reddit.com/r/teslamotors/s/lzPaZu2bZC

JasonQG 2024-10-18 22:45

I hope they meant with a human supervising. I mean, I guess they have to have meant that, because that’s the only thing that exists. I think sometimes people say things that sound insane, but it’s really just bad wording

cookingboy 2024-10-18 22:51

But if that’s what they meant then it’s also a meaningless statement. It’s like saying “adaptive cruise control is safer with human supervision than human alone”, which is the whole point of safety technology like this.

goodatburningtoast 2024-10-18 22:51

Fucking finally

goodatburningtoast 2024-10-18 22:52

So 2035

JasonQG 2024-10-18 23:01

People argue all the time that FSD+human is unsafe, so I don’t think it’s meaningless to say that

GroovyQschoolboy 2024-10-18 23:09

Insane thing to say lmao

[deleted] 2024-10-18 23:12

[deleted]

MDPROBIFE 2024-10-18 23:33

What version you on?

UncleGrimm 2024-10-19 00:03

I’m pretty sure it would get me into an accident. Since 12.5 I’ve had 3 critical interventions where the car tries to leave a stop-sign, or a turning lane, right in front of oncoming traffic.

haarschmuck 2024-10-19 00:05

LiDAR, because it works nothing like a camera. A camera can only see the photons hitting its sensor, and everything from there has to be calculated from pixel values, which introduces many more opportunities for interference based on lighting conditions on the roadway. LiDAR uses rotating lasers: a pulse strikes an object, and the time it takes to return gives a distance, which is extremely accurate. Like radar, the signal either returns a value or it doesn't.
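As a rough back-of-the-envelope sketch of that time-of-flight calculation (illustrative numbers only, not any real sensor's API):

```python
# Time-of-flight ranging: a laser pulse travels to the object and back,
# so distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Convert a measured round-trip pulse time to a distance in meters."""
    return C * round_trip_s / 2.0

# A return after ~667 nanoseconds corresponds to an object ~100 m away.
print(round(tof_distance_m(667e-9)))  # → 100
```

Note this is a direct geometric measurement; there's no lighting-dependent pixel math in the loop, which is the point being made above.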

toecramper 2024-10-19 00:28

This is autopilot not FSD

willybestbuy86 2024-10-19 00:29

Nope and I'm a fan of the system

89bBomUNiZhLkdXDpCwt 2024-10-19 00:51

I live ~five minutes from work and there’s no fucking way I’d allow my car to drive itself with me blindfolded.

NoLimitSoldier31 2024-10-19 01:02

Honest answer, no. But only because there are two spots where it gets confused. In my experience it's pretty damn good, and I only have HW3.5 (I think), definitely not the latest. They're getting damn good.

74orangebeetle 2024-10-19 01:06

I wouldn't call it safer than most drivers...I'd say it's safer than the worst drivers (who tend to be the ones you notice the most). The percentage might depend on your area. Even being able to use turn signals puts it above a good chunk of drivers (who can't or won't do that).

[deleted] 2024-10-19 01:08

Yes, like 99% of all other CEOs out there. They make promises that are unrealistic and are just marketing. I'm amazed there are people in the world that don't yet understand that, especially with a purchase that should take more than two minutes' worth of research.

[deleted] 2024-10-19 01:10

That's like saying computers aren't better than humans at doing rote arithmetic because they require a human to operate.

cookingboy 2024-10-19 01:56

Computers don’t need human supervision to do arithmetic. WTF are you even talking about lol.

[deleted] 2024-10-19 01:59

12.5.4.1 HW3

bartturner 2024-10-19 02:09

I would not get far. FSD gets hung up about a quarter of a mile after leaving home. Our subdivision's main drag is divided, with a tall berm between the lanes. Humans drive to the middle and wait where you can see; FSD is not yet able to do the same. It is a small area between the lanes.

whiteknives 2024-10-19 02:15

>Would you feel comfortable if FSD drove you to work today with you blindfolded today?

No, but that is not a scenario FSD should be used for today. I absolutely would feel comfortable if every other driver on the road was supervising while FSD was in control, intervening only when required, because FSD doesn't doomscroll Instagram from behind the wheel or get distracted by text messages.

Heidenreich12 2024-10-19 02:50

Nah, but I also don’t use cruise control either. I just see it as a better cruise control.

realityr 2024-10-19 03:23

I can't wait for the government to require multi-sensor systems in all self-driving cars, instead of just relying on cameras like Tesla does.

digitalluck 2024-10-19 03:34

Yeah, not at all. I was on a road trip today, and during sunset the sun was causing my car to phantom brake so many times whenever a car came remotely near the edge of its lane. As soon as we hit a shadow, FSD functioned like normal. I wouldn’t trust FSD on its own at all.

SecretBG 2024-10-19 04:00

Does phantom braking still happen even when FSD isn’t engaged?

Wacktool 2024-10-19 04:06

Hell no.

digitalluck 2024-10-19 04:25

Not when it isn’t engaged, no. I’ve only ever experienced it when FSD was engaged. It’s not common, but now that the sun is setting around the time I’m driving home I’ve been experiencing it a lot more.

gnoxy 2024-10-19 10:52

If we assume you are right, why do Waymo cars have cameras on them? If you are going to trust one sensor regardless of any other input you have, why bother with the rest?

gnoxy 2024-10-19 10:54

Waymo has been locked in the same geofenced areas for years now. Get Waymo in situations it has not seen.

HumanLike 2024-10-19 11:09

I have zero interventions, just phantom brakes that would be annoying if I were blindfolded but not dangerous. I think calling this version supervised is spot on. Also saw a video of a Waymo stuck at a broken red light yesterday; my Tesla made it through a broken red light with flying colors earlier in the year.

LeatherClassroom524 2024-10-19 11:14

I love using FSD but it’s a long way from blindfold, at least in my city.

Etrinjx-Void 2024-10-19 11:59

Nope. Not because it couldn't do it, but because it has a tendency to occasionally act erratic (phantom braking at 2 lines, confusing lane changes that might make sense, etc.). Otherwise the car is good enough that I can actually ignore it for decent periods of time. It's like having a student driver at the wheel, honestly, and you're the instructor ready to take over here and there.

WillzyxTheZypod 2024-10-19 12:16

It’s a matter of common sense, really. We all know the system struggles in bright sunlight and inclement weather—we’ve all seen the warnings in our own cars. You need something that can see through the weather. Humans have multiple senses for a reason. Airplanes have multiple sensors for a reason. SpaceX rockets have multiple sensors for a reason.

WillzyxTheZypod 2024-10-19 12:29

Because there’s a benefit in using multiple types of sensors.

[deleted] 2024-10-19 12:52

Dude, at this point these guys are just throwing stuff at the wall, knowing it won't stick, just to bring down Tesla's stock price. I mean, how many investigations like this do you see against Ford and GM when something legit is found?

[deleted] 2024-10-19 12:53

No, not at the moment, which is the point. Tesla literally tells people to keep their hands on the wheel and eyes on the road while FSD is on.

[deleted] 2024-10-19 12:53

They've been investigated multiple times, and guess what, they always find either something very minor or nothing.

rwrife 2024-10-19 13:05

On my 36-mile daily round trip (through Seattle), the most I do is force it into the exit lane early. I have mine set to chill with minimal lane changes (which I have to manually enable on every drive).

stefan41 2024-10-19 13:05

I know. Wild that the govt allows autonomous driving systems today that only have 2 sensors on a mast, using reflectors to sense to the rear or sides…

theblackened21 2024-10-19 13:25

My commute is roughly 15 miles of interstate and .25 miles of two lane road. Yes, I would absolutely allow FSD to drive it blindfolded.

DrPeppehr 2024-10-19 13:56

Definitely not, because it's not even claimed to be there yet. It's getting there though, and it's almost perfect; I've noticed it now very rarely makes any uncomfortable mistakes. The mistake it keeps making, at least near my house, is taking too long to get over for an exit, which isn't a problem when there's no traffic, but I'm worried that with a bunch of traffic it'll probably just miss the exit and take the next one.

zlex 2024-10-19 14:28

Nah sorry that’s total bullshit. Before it became obvious that FSD was garbage the advertising claimed that the only thing holding back functionality was legal and regulatory.

[deleted] 2024-10-19 15:54

[deleted]

JoJoPizzaG 2024-10-19 16:06

It's definitely a lot better than the bottom 20% of drivers.

Dstrongest 2024-10-19 21:05

That's a hit-or-miss question, because it depends on how much free time I have available to account for the wrong exits, the weird lane merges, and the occasional moment where it just says "take over." However, it drove basically the entire 35 miles to work one day last week.

Dstrongest 2024-10-19 21:06

Is that what that was? I'll be more observant next time it happens.

Dstrongest 2024-10-19 21:08

Well, it's more attentive than most drivers; not sure I'd say safer yet. But the fact that it doesn't check its phone, put on makeup, or do any of the other distracting things people seem to do means it has the potential to be safer.

Dstrongest 2024-10-19 21:25

I find that during rush hour, when too many cars are around and it's trying to watch all of them, it gets slow to react and really twitchy. However, when I turn on the "no unnecessary lane changes" switch and then tell it when to change lanes by putting on my blinker, only when I want it to, it makes for a happier commute. I also actively increase and decrease my speed limit setting with the scroll wheel, based on the traffic and braking patterns I'm seeing down the road, and I swap back and forth between chill and aggressive follow settings based on how fast traffic is moving. After reading this back 🤦 my car will be taking Prozac in a few more drives.

I_TittyFuck_Doves 2024-10-19 22:17

Thank you. I swear some people in this sub just open wide to put Elon’s balls in the back of their throat

ClumpOfCheese 2024-10-19 22:26

I think FSD is pretty good for what it is. I think the best question to ask is whether you would rather have every other driver on the road using it.

FSD is not good enough to replace me as a driver; it does too many embarrassing things and makes me look like an idiot. Could it drive me home blindfolded? Almost, but it wouldn't be able to leave my work's parking garage, and it refuses to park in my driveway. The in-between part of the drive has issues: the car slows down too much for no reason, it doesn't accelerate fast enough when traffic starts moving again, it makes stupid lane changes, and it gets stuck behind bad drivers going too slow, and then I'm trapped and can't get out. The car also has major issues with speed limit signs and will often see 55 mph truck signs and get stuck going 60 mph in a 65 zone where everyone is going 75+ if there isn't traffic.

So if it was to take me from work to home, it could probably do it, but would I want to be seen in my personal car with how bad it makes me look? No. Would I pay $99 a month for this feature after the trial ends? No; I don't see any value in it because I have to monitor it so much and prevent so many bad decisions. Autopilot is good enough. Another issue I just remembered: you can't tell it to stay out of express lanes, so I have to disable lane changes when I'm near them. At most, I'd maybe pay $20 a month for it. And the other issue I have is that if FSD gets in an accident or causes damage to my car, I'm the one who has to deal with that problem, not Tesla.

EnjoyMyDownvote 2024-10-20 00:20

If someone offered me $100,000 to let FSD drive me to work blindfolded, I would take that. Granted, I'm in California, and I hear FSD is better in CA (no idea how true that is).

Ljhughes8 2024-10-20 07:44

FSD in my area is better than 85 percent of people.

flompwillow 2024-10-20 11:02

Huh, I’d say it would beat many Uber drivers I’ve had, but be a little below average. Barely.

lucidus_somniorum 2024-10-20 12:46

Meanwhile, pedestrians are hit by human-driven cars all the time. Next up: the weather.

gnoxy 2024-10-20 13:04

> weather

I don't do common sense; I prefer expert sense. So let's say there is a blizzard out: black ice + feet of snow + whiteout. There is no human or robot that can drive in those conditions. I mean, we try, but it's only through luck that we make it someplace. I'm going to this extreme to point out that some weather will never be drivable. To be frank, I cannot drive with the sun in my eyes either. I live in an area with roads on east-west hills where the sun sits directly in front of you at road height; you cannot view the horizon at all, and there are no sunglasses that will help you here. I would love to see how lidar performs in that situation.

lookin4points 2024-10-20 13:15

I might say better than 50% of the drivers when you add in speeders, texters, eaters, drunk/high drivers, tired, etc.

The1Prodigy1 2024-10-20 15:47

Zero chance. It almost turned into a truck today because... actually, I couldn't even tell why. It has also almost missed 2 different exits since the last release. It's more of a supervised driving aid on long road trips than anything.

The1Prodigy1 2024-10-20 15:51

Lol, except FSD stands for Full Self-Driving...

The1Prodigy1 2024-10-20 15:54

What? Lol. 1 - It's not full self-driving, so people like me actually avoid those accidents. 2 - FSD in its current form is a marketing scam. 3 - It's been how many years since they said it would be unsupervised, and it's still not.

jvoss9 2024-10-20 15:57

I actually would. It's about 45 miles. Maybe on V11 I wouldn't have, because it didn't handle construction and makeshift lanes well, but its current state is very good IMO. I do make small adjustments to fix speed and move out of the left lane, so a blindfold would likely make other drivers less happy, but I have no doubt I would be safe and make it without issue. The last time I had to do a real intervention on FSD was Beta 11 back in October 2022; every time I "intervene" now is just to improve the drive, not because the system is a safety concern.

Baul 2024-10-20 16:20

Life can be confusing if you purposefully omit certain words. Have a day.

Senior_Protection494 2024-10-20 22:31

For one thing, FSD would disengage if it doesn’t see your eyes. So there’s that.

CapB1083 2024-10-20 23:33

I just received another 30 day trial of fsd on my 21 m3. I reluctantly have to say, it is worse than the trial earlier in the year. I was hoping for a slight improvement, but have been disappointed over the last two days.

couldbemage 2024-10-21 04:50

As if there weren't already a ton of people sleeping while using FSD. You don't even need the steering wheel weight anymore; just put on sunglasses and nap.

DaffyDuck 2024-10-21 14:19

I think that’s really the key. Tesla needs to put skin in the game and operate a bunch of robotaxis effectively before many will trust it unsupervised.

joggle1 2024-10-21 16:40

Absolutely not. FSD has finally gotten useful for my commute starting with the 12.5 release (prior to that, I only enabled FSD occasionally as I had to intervene so often that basic autopilot was better). But even with it working far better than before, it still has fundamental limitations. It does not detect potholes. It does not try to dodge road debris or dead animals (unless it's large enough for the car to detect). And it still drives dangerously at times. It *can* handle construction areas, but in my experience it sometimes gets much too close to traffic cones for comfort.

joggle1 2024-10-21 16:48

It depends on the road, IMO. For example, there's a hilly road I drive on with very limited visibility when cresting hills. There are signs stating to slow down, but almost nobody does. The car on FSD will always slow down. That's certainly the safer behavior in that case, as there could be a traffic accident or a deer blocking your lane on the other side of the hill that you may not be able to react in time to avoid hitting if cruising at over 60 mph. But I wouldn't go as far as to say that FSD is *generally* safer than human drivers at this point. For a counterexample to my previous one, there's a winding, unmarked road going up the hills into my neighborhood. On FSD, my car will drive pretty much dead center on the road when going around tight turns. That's definitely unsafe, as it's too far toward oncoming traffic.

SPACEYMOOTANT 2024-10-22 16:12

If you're going to use FSD, you have to understand you're not driving, and you cannot expect the AI to drive exactly as you would. You have to bite the bullet and let it do its job, even the embarrassing moves. But most people panic and take over even when they don't need to. I've watched countless FSD videos on YT and I see a damn good system, way better than anything else out there.

lioncat55 2024-10-30 22:16

Just did a road trip from SoCal to Vegas and back, along with some surface street driving. I'd definitely be a little nervous, but I'd trust it blindfolded going from my house to work.

lady-lurker 2024-12-15 03:49

I’m honestly too nervous to use the FSD on the road but I absolutely love the self parking functionality. I wish that was possible to purchase separately
