
Tesla’s own data confirms Autopilot safety regressed in 2025

Zorkmid123 | 2025-07-23 16:51 | 201 views

Comments (25)
curiousitymdg 2025-07-23 17:51

Confirmed from experience with my Model 3. It disengages unexpectedly and slows down around highway curves that are safe at speed. And, though unrelated, I still detest the implementation of the auto wipers.

MarchMurky8649 2025-07-23 18:55

Have they found the limits of what can be done with a cameras-only, end-to-end neural-network approach? Or is it something else, such as Musk becoming more detached from reality, or decent staff being driven away by his politics and management style? Probably all of the above.

[deleted] 2025-07-23 20:09

>Have they found the limits of what can be done with a cameras only

The problem is there's no limit to what can happen outside the car.

Digg-Sucks 2025-07-23 20:10

This entire report is pure spin from a company that's made spin its business model. If even their own cherry-picked, self-reported data can't show improvement, imagine how bad the real numbers must be. Here's why this so-called "safety data" is garbage:

- It's not Autopilot vs. humans. It's Autopilot + human supervision vs. humans. Totally different scenario.
- Crashes where the airbags or seat-belt restraints don't deploy are excluded.
- Crashes where Autopilot is disengaged right before impact conveniently don't count.
- The data is self-reported by Tesla with no independent verification.
- Autopilot is used mostly on highways - already the safest driving environment - yet Tesla compares it to all human-driven miles, including city streets and rural roads.

It's smoke and mirrors, not science.
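The last point above (highway-only miles compared against all human miles) is a base-rate problem, and it's easy to see with toy numbers. All of the rates and mileage splits below are invented for illustration; they are not Tesla's or NHTSA's actual figures:

```python
# Toy illustration of the road-mix problem. All numbers are made up.
# Suppose highways are inherently ~3x safer per mile than city streets.
CRASHES_PER_M_MILES = {"highway": 0.5, "city": 1.5}

# A system driven almost exclusively on highways...
autopilot_miles = {"highway": 1.0, "city": 0.0}  # millions of miles
# ...compared against humans driving a typical mix of roads.
human_miles = {"highway": 0.4, "city": 0.6}

def crash_rate(miles):
    """Expected crashes per million miles for a given road mix."""
    crashes = sum(CRASHES_PER_M_MILES[road] * m for road, m in miles.items())
    total = sum(miles.values())
    return crashes / total

# Per-road safety is identical by construction, yet the highway-only
# fleet looks ~2x safer purely because of where it drives.
print(crash_rate(autopilot_miles))  # 0.5
print(crash_rate(human_miles))      # 1.1
```

An apples-to-apples comparison would weight both sides by the same road mix; without that, the headline ratio mostly measures where the system is engaged, not how well it drives.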

SC_W33DKILL3R 2025-07-23 20:20

Didn't they remove a lot of human-written code to move to a more AI-based solution? I believe I read this was around the time FSD started having trouble with roundabouts.

Apprehensive-Box-8 2025-07-23 21:41

The issue (I think) really is that Musk is trying to outperform the human brain at basically the same tasks the brain performs on a daily basis, instead of trying to find an alternative route that is better suited to today's technology.

See, while I was driving today I realized that what my brain does is kind of identical to what Tesla Vision is doing. I look at a specific situation with my eyes, and my brain then compares those few milliseconds of information with what I have stored in my memory and decides - for example - whether or not to pull out at an intersection. The thing is: human eyes are (usually) superior to cameras at picking things up, and the brain has wayyyy more processing power than any in-car computer will have for the foreseeable future to process that input and compute it into a decision.

Humans, though, tend to make errors - from being stressed, not concentrating, or having too little experience. This is where computers can help and outperform us. But they need different concepts to do that - like different input sensors and other ways of decision-making. Elon seems to be obsessed with building superhuman robots that perform all human tasks in the same way as humans, only better, and I frankly don't think we'll have the technology to achieve that anytime soon.

MarchMurky8649 2025-07-23 23:30

As I understand it, yes. So now, if they have a problem - e.g. even though they know a light is red, sometimes the cars get bored waiting and run the light - they have lost the ability to write some code to the effect of IF LIGHT RED, WAIT UNTIL LIGHT GREEN, or whatever.
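The kind of hand-written safety rule described above is trivial to express and trivial to verify. The sketch below is purely illustrative - the function name, states, and structure are invented for the example, not anything from Tesla's actual software:

```python
# Illustrative only: a hard-coded rule layer of the kind an
# end-to-end neural-network stack gives up. All names are hypothetical.
def traffic_light_override(light_state: str, planned_action: str) -> str:
    """Veto any 'go' decision while the light is red."""
    if light_state == "red" and planned_action == "go":
        return "wait"  # IF LIGHT RED, WAIT UNTIL LIGHT GREEN
    return planned_action

# A rule like this can be checked exhaustively over its inputs -
# something that is much harder to guarantee for a learned policy.
print(traffic_light_override("red", "go"))    # wait
print(traffic_light_override("green", "go"))  # go
```

The point isn't that real driving stacks are this simple; it's that an explicit rule layer gives you a small, auditable guarantee, whereas a pure end-to-end network can only be nudged statistically via training data.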

MarchMurky8649 2025-07-23 23:37

[Our brains have been evolving for millions of years, since before we were mammals, to be optimised for keeping us alive.](https://www.reddit.com/r/RealTesla/comments/1ljxbxo/comment/mzop1p5/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button) Musk has made the mistake of thinking that, just because a human brain can learn to drive in a few hours, he can create a safe enough autonomous driver, just by throwing data at a neural network.

[deleted] 2025-07-24 00:11

That should push the stock up another 15%.

Computers_and_cats 2025-07-24 00:22

Every time Tesla tweaks their software they make something worse. They fix some phantom braking issues only to create other ones. My '23 Y's most common phantom braking triggers at the moment are heat mirages, passing people making a left turn who have their own left-turn lane, and something to do with the sun being at the wrong angle at the wrong time. Maybe if Tesla could find a CEO that wasn't boofing drugs and hiring unqualified children their software would be worth a crap.

Computers_and_cats 2025-07-24 00:23

Oh god this is both amusing and terrifying. Can't imagine a car being bored at a red light.

MarchMurky8649 2025-07-24 00:56

[It seems to happen quite often.](https://www.reddit.com/r/TeslaFSD/comments/1li6ric/run_red_light/)

Computers_and_cats 2025-07-24 01:33

Wow

netscorer1 2025-07-24 02:54

One thing that this report does not capture is the number of accidents where the Tesla driver or Autopilot was at fault. There are countless situations where another driver causes the accident; it would still be reported, and all the Tesla doom-boys would cry wolf, saying it's yet more proof that Autopilot is garbage. Just like the recent deadly Tesla crash at an intersection that killed a young woman and severely injured her boyfriend. It was posted all over this forum many times as proof of the evil Autopilot - except the report from the court proceedings clearly showed that it was 100% the driver's fault and Autopilot could not have prevented the crash.

Hixie 2025-07-24 05:46

When I look at that graph my interpretation is just that the data is really noisy and nothing has really changed over the past 6 quarters.

Done_beat2 2025-07-24 06:03

LiDAR

[deleted] 2025-07-24 06:53

Maybe you need a new one?

[deleted] 2025-07-24 06:56

Elon is trying to make money. He doesn’t care if the tech is good, it just has to be good enough to make him money.

IcyHowl4540 2025-07-24 07:57

^-- this. Also, username casts random shade at Digg. A+ experience all around.

wongl888 2025-07-24 14:37

Just like the human drivers the model trained on. 🤣

curiousitymdg 2025-07-24 15:11

Hahahah. No.

opsers 2025-07-24 20:14

Driving on one-lane winding highways I had to take control multiple times because FSD was either riding on or crossing over the double line. There was one point where it tried to do it going around a blind corner and I immediately took control, and that was the end of it for me on that road. I don't know if they're trying to make the turns smoother, but that is simply not safe. There are so many other issues too, but I will say I rarely encounter phantom braking on routes I used to get it frequently... so I guess there's that.

KaleLate4894 2025-07-24 21:23

It’s the AI. It’s learned it can’t do this!

veganparrot 2025-07-28 06:46

Human drivers also try to explicitly follow the rules of the road. It's not all training-based (except maybe in a philosophical sense). Like, you are constantly checking your own decisions against the rules of the road wherever you live, and then reasoning around that in combination with lived experience. It helps in rare scenarios too, such as: "there's a Wile E. Coyote wall in front of me." Even if that should never happen, the brain can discern it through reasoning and decide not to go full speed ahead.

DrXaos 2025-07-29 08:01

It appears to be looking at cross-traffic behavior and deciding "the light will change imminently" even when it won't. The way to fix it is to add negatively weighted scenarios exhibiting the bad behavior to the training data, probably synthetically generated, in combination with equivalent scenarios where it takes the correct action.
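One simple way to realise the "negatively weighted scenarios" idea above is per-example loss weighting: examples labelled as known-bad behaviour get extra weight so training pushes the model away from them harder. This is a minimal sketch of the general technique, not Tesla's pipeline; the weights and batch values are invented:

```python
# Minimal sketch of per-example loss weighting. Examples flagged as
# known-bad behaviour (e.g. synthetic red-light-running counter-examples)
# are up-weighted relative to ordinary driving data. Numbers are invented.
def weighted_loss(examples):
    """Weighted mean of per-example losses, up-weighting bad scenarios."""
    total, weight_sum = 0.0, 0.0
    for loss, is_bad_behaviour in examples:
        w = 5.0 if is_bad_behaviour else 1.0  # hypothetical weights
        total += w * loss
        weight_sum += w
    return total / weight_sum

# One flagged counter-example dominates the batch's training signal.
batch = [(0.2, False), (0.1, False), (0.9, True)]  # (loss, bad-behaviour?)
print(round(weighted_loss(batch), 3))  # 0.686
```

In a real training loop the same idea appears as a per-sample weight tensor multiplied into the loss before reduction; pairing each up-weighted failure case with an equivalent correct-action example, as the comment suggests, keeps the data distribution balanced.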
