
'This is mission critical': Inside Tesla's battle to get Full Self-Driving approved in Europe

dtyamada | 2025-09-04 17:07 | 82 views

Comments (64)
tjtj4444 2025-09-04 17:22

yeah, good luck managing GPSR then. That is the EU product safety regulation which, at a high level, states that all products sold in the EU must be safe, regardless of whether more specific regulations exist. No way they can show FSD is safe (since it isn't).

Status_Ad_4405 2025-09-04 17:32

Yes, bureaucracy ... That annoying thing that keeps companies from killing people

I-Pacer 2025-09-04 17:38

Fuck off Muskkk. Nobody wants your Nazi cars regardless of what snake oil and/or shitty software you have in them.

Sockoflegend 2025-09-04 17:54

I am a big believer that self driving cars can be safer than human drivers eventually. Musk trying to push out the technology before it is safe is going to set that back decades. Putting out a product that is insufficient will irreparably damage public opinion.

AceMcLoud27 2025-09-04 18:04

He had me at sub micron precision.

Exciting_Turn_9559 2025-09-04 18:11

Why would we want to approve something that they have lied about since the beginning?

Krieg 2025-09-04 18:21

Tesla FSD on the Autobahn? Nein

bootstrapping_lad 2025-09-04 18:25

Yep. It will get there and everyone will be shocked that humans drove themselves before. But it's not going to come from rushing Elon's camera-only vaporware through.

Lacrewpandora 2025-09-04 18:49

Heck, let me help TSLA out here. Each country will have its own unique requirements, but in general its just a short application form, and as long as the ~~cabbie~~ ***safety driver*** has a valid license, all you have to do is pay a fee.

fossilnews 2025-09-04 18:49

Just put a driver behind the wheel, like Austin. Mission accomplished.

relentlessoldman 2025-09-04 18:56

Cool they can get sued there too

OrdinaryPollution339 2025-09-04 19:05

It will be very disappointing if any EU countries allow FSD to go to market. The only reason it's for sale in the US is that there hasn't been any functioning regulatory environment for at least a decade (and DOGE killed the dregs that were left). If Tesla wants to sell their SAE level 2 (! I stand corrected) driver assistance they should, at a minimum, have to change the intentionally misleading and dangerous name. Edit: spelling and SAE L3 to L2.

Top_Junket2991 2025-09-04 19:06

Supporters say: well, FSD works 99.99% of the time. It's that 0.01% that can cause an accident every 10k miles.
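That "0.01%" compounds faster than it sounds. A minimal back-of-envelope sketch (my own illustrative numbers, assuming independent per-mile failures):

```python
def p_at_least_one_fault(p_per_mile: float, miles: float) -> float:
    """Probability of at least one fault over `miles`, assuming each mile
    fails independently with probability `p_per_mile`."""
    return 1.0 - (1.0 - p_per_mile) ** miles

# A "99.99% reliable" system (p = 0.0001 per mile) driven for 10,000 miles:
print(round(p_at_least_one_fault(0.0001, 10_000), 3))  # 0.632
```

So even at four-nines per-mile reliability, a driver covering 10k miles has roughly a 63% chance of seeing at least one fault.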

SpudsRacer 2025-09-04 19:10

CyberTrucks will be giving anonymous taxi rides through the streets of Berlin any day now...

IShouldNotPost 2025-09-04 19:13

They use kilometers in Europe, so they’ll consider 10 megameters instead of 10k miles

oregon_coastal 2025-09-04 19:19

I thought it worked out to 54 litres?

ExcitingMeet2443 2025-09-04 20:10

> It has posted videos of its cars navigating narrow Roman streets, busy German boulevards, and Paris's Arc de Triomphe

Illegally, if real? Or is it AI-generated bullshit? Because *there is NO WAY FSD could successfully navigate the Arc de Triomphe*

TheBlackUnicorn 2025-09-04 20:53

> Supporters say: well FSD works 99.99% of the time.

99.99% sounds bonkers. I don't know what the actual numbers are, but I've been in a Tesla on "FSD" and it can't make it more than a couple of city blocks. Time is a weird metric; I feel like we'd usually measure by distance between interventions. If we were going on time, I'd say maybe 30-40% of the time in the city, maybe 80-90% on the highway.

JamesLahey08 2025-09-04 20:56

Are there a few key people that made auto pilot for planes? Like have them be advisors.

Quercus_ 2025-09-04 21:00

The percentage of time that self-driving does things right is completely irrelevant. The only thing that's relevant is how often it does something significantly wrong, across all driving scenarios it will be exposed to.

Fun_Volume2150 2025-09-04 21:02

First they would have to demonstrate level 3.

WoolshirtedWolf 2025-09-04 21:03

It also gives Tesla an out if Europe gives in to internal and external pressure to get the OK. If anyone gets killed, Tesla will throw it back on the regulatory system that passed the car. No way in hell I'd trust this car company. I hope they toss the car out on its rear bumper.

iftlatlw 2025-09-04 21:29

These vehicles are harmful to traffic flow, an insurance nightmare, and hazardous for their passengers and other road users. I would vote against self-driving vehicles of any kind on public roads.

iftlatlw 2025-09-04 21:30

And those wrongdoings include hesitation which will dramatically impact traffic flows. They are just a useless technology in this environment.

lurksAtDogs 2025-09-04 21:35

> "Keep in mind that this is mission critical for our leadership," a Tesla employee wrote in an email to the RDW last November, urging the agency to approve Tesla's testing permit by the end of the month.

Dutch out-of-office reply: I'm out of the office. I will respond next year. Dealing with lots of international customers, I strongly prefer the Euro email style of "I'll read this in a few months" vs. the Americans, where everyone is one day away from some stupid deadline.

[deleted] 2025-09-04 21:47

Computer vision caps out at 97% accuracy according to a doc video I just watched. That means three out of 100 times it gets something wrong.

Public-Antelope8781 2025-09-04 21:52

No, just like dollars. 10 dollar are 305 cent.

Alert_Breakfast5538 2025-09-04 22:03

Absolutely zero chance they could navigate UK streets.

No_Safety_6803 2025-09-04 22:08

You want people to adopt “FSD”? Accept liability if it causes an accident. Until musk does that it’s just assisted driving.

Row-Maleficent 2025-09-04 23:15

We don't all drive on the same side of the road! Every country has different rules for taxis with some quite powerful lobby groups even within cities... I remember trying to order a taxi in Milan at a hotel concierge... He said, are you a millionaire? No chance that Tesla can navigate this... Ask Uber!

ComicsEtAl 2025-09-05 00:04

Imagine if they applied that attitude to getting it to work?

Fun_Volume2150 2025-09-05 01:07

I learned how to drive roundabouts at the Place d'Etoile, and fully agree with this statement.

dtyamada 2025-09-05 01:14

I remember from one of my trips to the UK, a roundabout surrounded by roundabouts. No way it figures that out.

dtyamada 2025-09-05 01:16

Tesla is famous for hitting deadlines!

dtyamada 2025-09-05 01:18

They could manually program it to succeed as opposed to general FSD. They've faked other videos before.

Same-Dig3622 2025-09-05 04:31

MuskRAT couldn't even get past Texas, now he wants to try to scam Yurope? LOL

dezastrologu 2025-09-05 05:51

fortunately the EU still has its bearings somewhat. fElon's bullshit camera-based driving will never get approved

pilgermann 2025-09-05 07:38

It's beside the point anyway. Just because self-driving can conceptually be safer doesn't mean Tesla's implementation is good.

neliz 2025-09-05 07:48

It's 235 miles in city limits, slightly higher outside.

ExcitingMeet2443 2025-09-05 08:48

They would have to "manually program" every other driver using that [roundabout](https://youtu.be/-2RCPpdmSVg?si=LoizZeg4qNKIJmjM) too...

gwenver 2025-09-05 08:54

FSD has issues on US grid system cities with roads half a mile wide. It would shit itself trying to drive around most of Europe.

Dommccabe 2025-09-05 08:59

So... who can we bribe to let us make this killing machine kill people??

[deleted] 2025-09-05 10:17

How many fahrenheit is that? 9000?

Advanced_Ad8002 2025-09-05 12:01

So you also didn't bother to read the article! Tesla's not even going for level 3 in Europe. They've only just started / are about to start *testing* their FSD crap for level 2+ (ADAS). (No, FSD is not available in Europe, not even as a beta.) Something they should have done *years* ago. But they were too lazy and stupid to bother.

SolutionWarm6576 2025-09-05 12:32

People care about their own personal experience with “FSD”. If it works well for the individual, then that’s their reality. Unfortunately, they don’t care if it fails and causes harm and even death for others. Seems like that’s the way of the world now. Hyper individualism.

dtyamada 2025-09-05 13:22

I meant just for the video, they don't care if it actually works. It's just about optics.

dtyamada 2025-09-05 13:22

*Fingers crossed*

[deleted] 2025-09-05 14:22

This is the case with most engineering. It is easy to do most of the cases, but the edge cases get you.

earth-calling-karma 2025-09-05 14:23

That's 3/16ths of a foot pound in dollar money.

[deleted] 2025-09-05 14:27

Yes, the average American drives 12k-14k miles a year, but the average American will be in 3-4 accidents in their lifetime (not necessarily at fault). Tesla would be averaging one to two a year (at fault).
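Working through those figures makes the comparison concrete. A quick sketch using the commenter's own numbers plus an assumed ~60-year driving span (all illustrative assumptions, not verified data):

```python
# Rough arithmetic under the commenter's figures: ~13,000 miles/year
# (midpoint of 12k-14k), ~60 driving years, 3-4 lifetime accidents.
annual_miles = 13_000
driving_years = 60
lifetime_accidents = 3.5  # midpoint of the claimed 3-4

miles_per_accident = annual_miles * driving_years / lifetime_accidents
print(f"~1 human accident per {miles_per_accident:,.0f} miles")  # ~1 per 222,857 miles
```

On those assumptions a human averages roughly one accident per ~220k miles, which is the baseline any "one to two at-fault incidents per year" system would have to beat.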

[deleted] 2025-09-05 14:33

In the bay area, it's so easy to tell who is using FSD. They drive like a teenager learning to drive but having some episodic micro-seizures every now and then.

Public-Antelope8781 2025-09-05 15:21

In land dollar or sea dollar?

karl-pops-alot 2025-09-05 16:48

We don't want it.

ExcitingMeet2443 2025-09-05 17:26

> Hyper individualism

**American** hyper individualism

ExcitingMeet2443 2025-09-05 17:29

> Tesla is famous for hitting deadlines

And trees, barriers, emergency vehicles; and the brakes, randomly.

OrdinaryPollution339 2025-09-05 20:16

I read the article. And just skimmed it again to confirm. It doesn't mention SAE classes. FSD is L3, no? Autopilot is L2? Edit: just looked it up and FSD is SAE L2 officially. Crazy that Austin allowed it to operate (briefly) with a driver in the passenger seat when L2 requires a driver. It does illustrate my point about a non-functional regulatory environment, though.

Advanced_Ad8002 2025-09-05 20:19

FSD is still only level 2 (or arguably the new 'level 2+'). FSD may disengage at any time without prior warning. The driver has to be able to take over at any time, immediately, and is always legally responsible. Autopilot is also only level 2.

OrdinaryPollution339 2025-09-05 20:38

Yeah, it's mind-boggling to me that both (AP & FSD) are L2. I am a long-time Tesla/Musk skeptic and pay attention, and I am still confused. It's no wonder the general public thinks Tesla has self-driving cars.

I occasionally take a Waymo and I barely trust it not to kill me (just slightly more than I trust the typical cabbie or Uber driver lol). I don't like sharing the road with tech-bros in Teslas "beta testing" L2 software with no supervision / training / safety plan*. Again, this should be illegal. Hopefully in Europe it will continue to be.

\* Between everyone being on their phone, zero law enforcement, and giant trucks and SUVs everywhere, driving in the US is already approaching Mad Max levels, so Teslas are just one more source of anxiety / danger to give a wide berth to on the road.

atpplk 2025-09-06 06:28

If FSD can barely get approved in the US it is millenia away from getting approved in Europe.

atpplk 2025-09-06 06:29

Well, not that I want to defend Tesla, but you're probably seeing wrong things too when you drive. That doesn't mean you're not able to aggregate signals and perform error cancellation.

variaati0 2025-09-06 07:26

"This is mission critical for our CEO." Dutch (probably): that is a you problem, not an us problem. Now, where are the test results we asked for... and the independent testing agency verification to go along with them? Our mission critical is safety, and the thing you are asking about is safety critical... so... the... test... results.

dagelijksestijl 2025-09-06 19:18

> afterward, a Tesla employee wrote in an email that they made "good points," and Musk "gets it now better."
>
> In emails, Tesla employees appeared frustrated with the permits the company has been asked to obtain, the tests it has been required to conduct, and scheduling and communications issues with RDW employees.
>
> They said an accelerated testing timeline was crucial to its success and asked the agency to "be more reasonable in their requests," claiming a testing snafu cost "valuable time." They also argued that European drivers were missing out on new technology.

It turns out that we don't want public beta testing on public roads.

dagelijksestijl 2025-09-06 19:22

It would probably get beaten by a modern Peugeot driver on the double mini-roundabout, let alone the Magic Roundabout

[deleted] 2025-09-08 13:45

Aggregate what signals? Teslas are vision-only. The real players in the space use three technologies: vision, lidar, and radar. Humans also have multiple senses. It's absolutely chilling that Tesla finds a 3% error rate in their only technology acceptable.
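The redundancy point can be made concrete: with three independent sensor modalities and a simple majority vote, a 3% per-sensor error rate drops by more than an order of magnitude. A minimal sketch, assuming independent errors (a strong assumption; real sensor failures are often correlated):

```python
def majority_error(p: float) -> float:
    """Error rate of a 2-of-3 majority vote over three independent sensors,
    each individually wrong with probability p. The vote is wrong only when
    at least two sensors err at once: 3*p^2*(1-p) + p^3."""
    return 3 * p**2 * (1 - p) + p**3

# Three independent modalities, each 3% wrong:
print(round(majority_error(0.03), 5))  # 0.00265
```

So three fused 97%-accurate sensors would be wrong about 0.26% of the time, versus 3% for any single one; a single-modality system gets no such cancellation.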
