
Elon says "LiDAR also does not work well in snow, rain or dust due to reflection scatter. That’s why Waymos stop working in any heavy precipitation."

komocode_ | 2025-08-26 18:39 | 431 views

Comments (439)

WelpSigh 2025-08-26 22:39

A self driving system can use both cameras and lidar, allowing it to see in conditions that are strong for one sensor and weak for another. But at this point he's just being stubborn.

NetJnkie 2025-08-26 22:42

Wait until he tries his FSD w/ cameras in the rain or snow.

bobkuehne 2025-08-26 22:42

Works great. Drive it every day, all seasons.

NetJnkie 2025-08-26 22:44

Uh..no...no it doesn't. Rain? It'll start complaining and then stop working. "Full Self-Driving May Be Degraded" blah blah blah

joshshua 2025-08-26 22:45

Why not run vehicles with both for a while to understand the relative strengths and weaknesses? Maybe a hybrid model is the most effective and both Waymo and Tesla are doing it wrong.

AManHere 2025-08-26 22:45

I don't know about that. Heavy rain - fine most of the time but I still get moments when autopilot says the conditions are too extreme. With snow - even worse.

VeryRealHuman23 2025-08-26 22:48

Well first it will put the wipers to FULL FUCKING SEND when it drizzles and then eventually scream at you in the pouring rain that the cameras are obstructed.

WelpSigh 2025-08-26 22:49

Waymo cars have 29 cameras. They already have a hybrid system.

joshshua 2025-08-26 22:52

Cool, thanks for replying 👍

ChymChymX 2025-08-26 22:54

He's stubborn because he can make the vehicles with great margins as-is, whereas Waymos are ridiculously expensive per vehicle. Also, the FSD software in theory can work anywhere, does not require pre-mapping (also another upfront lidar expense for every new market), and he sells it as a SaaS product and will likely license it in the future. IF they are able to pull it off, continue to expand market SAFELY, and remove the safety passenger monitors over time, it will indeed beat lidar-based taxis from a business bottom line perspective. Edit: for those downvoting--I said IF in caps. I'm just explaining their thesis, not defending it. Bet against it if you want, easy enough to do.

WelpSigh 2025-08-26 22:54

You're welcome 👍

Starky_Love 2025-08-26 22:55

Tesla Vision won't work half the year in Florida

22marks 2025-08-26 22:55

I think LiDAR is a red herring, and they should be looking at FLIR and infrared. We know the infrastructure is designed for vision. Supplementing that vision with wavelengths that the human eye can't see is incredibly helpful. Cameras in a predictable forward motion create 3D structure from motion quite well using parallax. LiDAR is best used for static mapping. Visible plus NIR/thermal plus traditional HD radar is generally lighter, cheaper, and more robust. EDIT: Tesla autonomy stack competitor Mobileye (who partnered with Tesla on HW1 before a falling out) officially announced it would terminate its internal development of next-generation Frequency‑Modulated Continuous‑Wave (FMCW) LiDAR for autonomous and highly automated driving systems. The decision stemmed from substantial advancements in their EyeQ6-based computer vision capabilities. Major developments are in Imaging Radar, which can be called HD or 4D radar. Here's a sample from one of the Imaging Radar companies: https://www.youtube.com/watch?v=3LV46bF3tiM. Radar is much more weather-resistant (in fog, snow, and rain) than LiDAR while also being cheaper and lighter. It's a better complement to vision and I believe we'll see that in the mass market.
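The "3D structure from motion using parallax" point above reduces to the classic pinhole relation: depth is focal length times baseline divided by the pixel disparity of a tracked feature. A minimal sketch, with made-up numbers (not any real camera's calibration):

```python
# Minimal sketch of depth-from-parallax, the idea behind the "structure
# from motion" claim above. For one camera moving a known distance (the
# baseline), the depth of a tracked feature follows from how far it
# shifts in the image. All numbers are illustrative.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole-stereo relation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("feature must shift between views to triangulate")
    return focal_px * baseline_m / disparity_px

# A feature that shifts 20 px between frames taken 0.5 m apart,
# seen through a 1000 px focal length, sits about 25 m away.
print(depth_from_disparity(1000.0, 0.5, 20.0))  # 25.0
```

Note the weakness this illustrates too: disparity shrinks with distance, so small pixel errors translate into large depth errors for far-away objects.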

sessamekesh 2025-08-26 22:56

I love what Tesla is trying with their self driving tech and am truly rooting for them. So many lives could be saved, and so many lives could be made better. It will truly change the world when (not if) they pull it off. But every time Elon opens his mouth about the whole camera/lidar thing it feels like he's a stubborn kid doubling down on a snap decision he made 15 years ago.  We get it. Human eyes are just cameras, you shouldn't need more than that. Cool.  Depth perception from binocular vision isn't effective past a dozen meters or so though and you don't see anyone intentionally driving with one eye shut.  I'm not even saying Tesla needs to adapt. But the constant smack talking of _good technology_ just erodes my faith that FSD is anything more than a carrot to dangle in front of investors.

fooknprawn 2025-08-26 22:58

Canadian here. Yeah well autopilot doesn't work in heavy snow or slush because they refuse to add weather mitigation to all the cameras. Heaters and sprayers on the back and side cameras would go a long way. Heck, if I lose a camera or it decides it doesn't want to engage, I don't even get basic cruise control. Oh and before I get hate: my 22 Model Y has radar and ultrasonic sensors but Tesla decided to deactivate all of those in favor of cameras only. 🤬🤬🤬

scully19 2025-08-26 23:05

Heavy Snow is bad. Worse is driving in slush that gets sprayed and leaves dirt on the window. Covers the camera because they are all in one place and you're done. Need to be constantly spraying washer fluid.

yeah__good_okay 2025-08-26 23:06

I would never use it in the snow. I tried, once, and it did not go well.

[deleted] 2025-08-26 23:07

[deleted]

GrundleTrunk 2025-08-26 23:08

If a self driving system can operate on cameras and discard lidar data due to inclement weather, then it should be fine on cameras in optimal conditions as well. The reality is either cameras are sufficient or they aren't. Maybe they aren't, but so far the march of progress appears to indicate otherwise.

gttom 2025-08-26 23:12

If the car doesn't have lidar, it's effectively always ignoring it and could crash in the same scenario

[deleted] 2025-08-26 23:14

[deleted]

physicshammer 2025-08-26 23:15

you know what they do see though, parked firetrucks.

JamieTimee 2025-08-26 23:15

That sounds like an issue which can be solved. Cameras alone have their limitations and no amount of software can make up for missing hardware.

The_Lawler 2025-08-26 23:15

I had to drive for three straight hours because the rainfall was too much for FSD. This was on a 10 hour trip where the rain was seriously bad for 30 minutes. Yet FSD was unusable for nearly 3 hours. I can’t even imagine what would happen if a robotaxi stopped functioning for 3 hours.

[deleted] 2025-08-26 23:17

[deleted]

Valuesauce 2025-08-26 23:17

There's a decision to be made there. What do you trust? If your vision says it's fine, but your laser says you will hit something, how do you know which to believe? If you are the AI driving, you now need to decide which to believe and act on it. If your response is "just use the vision," then my question would be: why do you need the lidar? That's exactly why Tesla chose just the one input, the one that's harder to properly use overall but is less likely to have a false positive. It's also cheaper.

1988rx7T2 2025-08-26 23:18

That’s not actually how it works. If one sees something and the other doesn’t, you have to basically pick which sensor to trust.

thebiglebowskiisfine 2025-08-26 23:18

I worked in industrial automation when more advanced sensors (proximity, laser, vision, etc) began replacing physical limit switches to control machine operation and guarantee worker safety. Musk is correct in his approach. Our engineers would never double up on inputs that could conflict. Bad things happen when you take this approach.

redline83 2025-08-26 23:20

In theory, I am a professional racing car driver.

WelpSigh 2025-08-26 23:22

Machine learning works exactly the same with any combination of sensor packages. It already works for a quarter of a million Waymo rides per week.

wish_you_a_nice_day 2025-08-26 23:23

But guess what. Camera only gets blinded by the sun

cookingboy 2025-08-26 23:26

I posted this comment when you posted to the investor sub and I'll just post it again here: Anyone who's done a senior design project in EE or a similar major would be able to tell you what Elon has been saying about the difficulty of "sensor fusion" is just absurd. In AVs, LIDARs are used for localization (measuring location and movement vector of objects), which is not what cameras should be used for at all. Sensor fusing cameras and LIDAR for localization is about as stupid as trying to sensor fuse cameras and a microphone when all you are trying to do is build a sound recorder. The whole "it's not a good idea to fuse LIDAR data with camera data" is only remotely a problem because of the idiotic decision to use cameras as the principal sensor for localization. Imagine instead of using a microphone, Elon is trying to build a voice recorder by using advanced cameras to detect surface vibration of objects by "seeing" sound, and then tells people they shouldn't use microphones because sensor fusion is hard. If that sounds fucking stupid, it's because it is. The rest of the industry laughed out loud at this idea 9 years ago and they aren't laughing anymore, not because Elon was proven right, but because everyone just feels bad for Tesla engineers at this point. For everything else, there are things LIDAR is better at and there are things cameras are better at. But overall, the more sensors you have the better. Just look at what they have in the F-35: state-of-the-art radar and electro-optical sensors.

sessamekesh 2025-08-26 23:26

Exactly! And if the results are good, I don't care if the means look a bit weird from the outside looking in. I'm not opposed to sticking to camera-only tech, I'm sure there's a laundry list of great business and tech reasons I'm not aware of on top of the ones that I am to stay the course. But I do want to push back hard on what Elon keeps saying, which in this analogy is "keeping both eyes open is stupid anyways, sometimes you get a distracting eyelash in your eye and keeping both open doubles the risk." It's weird and just reeks of insecurity, which is odd for someone leading FSD tech by a country mile.

SirBill01 2025-08-26 23:26

You can also just use cameras which can see better in any conditions than LIDAR.

JFreader 2025-08-26 23:27

So have degraded sensing sometimes or all the time.

SirBill01 2025-08-26 23:29

Yeah, but in anything that is too heavy for cameras, the LIDAR failed altogether long ago.

cookingboy 2025-08-26 23:29

> so it's not really needed

The goal of FSD is to produce the best driving system we can, given the technology we have at our disposal, so it can be *safer* than human drivers. It is not to do the minimum possible to match the safety margin of a driver with zero depth perception, just to prove a point for an egotistic non-technical CEO to win his internet argument.

jabroni4545 2025-08-26 23:30

It can use both, but it won't rely only on lidar if the cameras are obstructed.

jabroni4545 2025-08-26 23:32

Can't prove waymos are better in snow since they've been avoiding snowy areas up until now.

dellfanboy 2025-08-26 23:33

This comment needs to be stickied. Non engineers will not understand. Thanks for bringing logic to the conversation.

GodwynDi 2025-08-26 23:34

Big if.

stylz168 2025-08-26 23:35

Let me know when the auto wipers have been fixed. 2023 MYLR with FSD 12.4 and it’s utter trash.

whirlwind87 2025-08-26 23:39

When is the trade-off of more data worth it, though? Industrial automation is mostly in manufacturing, often the most repetitive parts. How often are you dealing with weather, or with traffic that can have far more unexpected, quick-change conditions?

randommeme 2025-08-26 23:39

You write software to make this decision based on all available inputs. You can even learn this, take your robot out to a known map and collect data while it’s snowing, use the known map to label the data. Sensor fusion is not some new or insurmountable problem.
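The "use the known map to label the data" idea above is essentially auto-labeling: drive a pre-surveyed route in bad weather, then attach ground truth looked up from the map to each noisy sensor frame instead of hand-annotating. A toy sketch, where the map keys, frame shapes, and labels are all made up for illustration:

```python
# Toy auto-labeling pipeline in the spirit of the comment above: frames
# collected in snow get their training label from a map surveyed in good
# weather, keyed by vehicle pose. Structures here are illustrative.

def auto_label(frames, known_map):
    """Pair each (pose, raw_reading) frame with ground truth from the map."""
    return [
        {"reading": reading, "label": known_map[pose]}
        for pose, reading in frames
        if pose in known_map  # frames off the surveyed route can't be labeled
    ]

known_map = {(0, 0): "clear", (0, 1): "guardrail"}     # surveyed in good weather
frames = [((0, 0), [0.2, 0.1]), ((0, 1), [0.9, 0.7])]  # noisy snow-day readings
print(auto_label(frames, known_map))
```

The design point being made in the thread: the labels come for free from the survey, so "learning what snow looks like to each sensor" doesn't require manual annotation.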

ic6man 2025-08-26 23:40

Oh. Is that why my AutoPilot shuts off in a drizzle due to “Poor Weather Detected”. Or is it that it’s been in perpetual Beta ever since it was released?

GodwynDi 2025-08-26 23:42

If the human eye is sufficient, cameras are. It's almost entirely a software issue now. But whether or not it's actually a solvable issue is beyond me.

NetJnkie 2025-08-26 23:42

Fine. But my point is Elon’s solution isn’t any better at all.

FoShizzleShindig 2025-08-26 23:47

The F-35 has to guide missiles and bombs and detect aircraft beyond visual range. An apt comparison would be FPV drones which are using just cameras and AI for target acquisition.

dabuttmonkee 2025-08-26 23:49

My Tesla will stop working if the sun is too bright let alone if there’s heavy rain or fog.

oyputuhs 2025-08-26 23:50

Humans have other senses

zxn11 2025-08-26 23:50

False negative would be worse than a false positive though yeah?

Fullmetalx117 2025-08-26 23:50

If not new then how come Tesla’s vision system has caught up so quickly to established systems by Waymo and the like? It kind of gives some credibility to musk’s first principle approach here (which I typically cringe at)

GodwynDi 2025-08-26 23:51

Drive by intuition, or maybe sense of smell

Mr_Kitty_Cat 2025-08-26 23:52

Honestly just put lidar on the cars and don’t have them active. Shuts everybody up

[deleted] 2025-08-26 23:53

[deleted]

oyputuhs 2025-08-26 23:53

Sound to hear the environment, touch to feel the road conditions, and smell can help in certain circumstances (burning rubber)

Cimexus 2025-08-26 23:54

Not many that are used in driving though. Hearing to an extent.

LionTigerWings 2025-08-26 23:54

Rich coming from the guy that makes a system that sometimes stops working when the sun is shining at just the right angle… and also sometimes doesn’t work in the rain or snow.

Aaco0638 2025-08-26 23:55

Thank you, ffs what’s with people wanting the bare minimum especially when it comes to heavy self driving machinery. No i don’t want these things to be like humans i want them to be BETTER than humans.

1988rx7T2 2025-08-26 23:56

He literally said he likes Lidar in the dragon space craft. There’s no trash talking. He’s describing a serious technical hurdle that they feel they’ve overcome by devoting the resources into making cameras work better rather than in trying to make multiple types of sensors, each with their own advantages and disadvantages, work together.

oyputuhs 2025-08-26 23:56

Hearing and touch (road conditions)for sure. We can use only vision but obviously we would perform worse. Hell even smell helps.

1988rx7T2 2025-08-26 23:57

It’s a fair point that they don’t have camera cleaning hardware on the sides and in the back.

FunnyProcedure8522 2025-08-26 23:57

That's not how it works. You still don't grasp the idea of resolving conflicting data points. What's your software going to do? Learn with what? Are you saying you are going to run around the area all winter, hoping for different snow amounts in various places in the city, recording them 24 hours a day, 7 days a week, so you can map it to work next year? Do you know how stupid that sounds?

1988rx7T2 2025-08-26 23:57

Tesla did. They had a radar for years, and have development vehicles with r&D grade Lidar.

[deleted] 2025-08-26 23:58

[deleted]

1988rx7T2 2025-08-26 23:59

The robotaxis have a new windshield wiper mode to rapidly clean the forward cameras. The side and rear cameras don’t have cleaners but have been redesigned to reduce the impact of rain. That doesn’t help salt and mud though.

puppetmasteria 2025-08-26 23:59

I need it for parking. Two times have I hit something when the sensors said I was far away.

gogators1000 2025-08-27 00:00

How would lidar help with hearing or touch?

Beastrick 2025-08-27 00:00

Have they caught up? Last time I checked only system driving without any safety drivers still is Waymo.

FunnyProcedure8522 2025-08-27 00:00

Of course it works, and if it's downpouring rain, what do you think Waymos do? Pull over to the side, or drive straight into a flood zone? And yes, FSD has been running in winter in all sorts of terrain and environments. Waymos can't match that.

thespieler11 2025-08-27 00:01

My autopilot powers through regardless of that message

WelpSigh 2025-08-27 00:01

This is effectively how ML works all the time. There are millions of contradicting data points even with one sensor type over many navigations, and ML finds correlations to navigate it.

gibsonblues 2025-08-27 00:01

There are other reasons as well why Lidar is not needed. Do YOU use Lidar to drive? Vision is all that is needed. Want to add thousands of dollars to each of millions of cars to sell to people? Waymos cost $150k. Teslas need to be for sale.

1988rx7T2 2025-08-27 00:01

So you’re proposing that they develop entire new neural nets to judge which sensor to be the primary sensor in any given situation, even though hardware 4 is already hitting its limits? It’s not feasible, and you have zero evidence that Waymo isn’t manually tuning the sensor judgment with heuristics.

LionTigerWings 2025-08-27 00:01

I have it on my 2024 hw4 from time to time.

oyputuhs 2025-08-27 00:02

No, but the point is that we don’t only drive with vision. So, the argument that only one type of sensor is adequate is not strong. I’m not saying the discussion about cost or complexity is bad. But I hate the argument, “Well, we use vision, so that’s all we need”.

FunnyProcedure8522 2025-08-27 00:03

Radar and ultrasonic won’t help in your case. If it’s heavy snow LiDAR will do even worse. It can’t deal with reflections. What you have is old HW3 and maybe not even on FSD. You need to wait for v14 on HW4 FSD for real test.

needaname1234 2025-08-27 00:05

Fsd seems to work in an incredible amount of rain actually. It's been a while since I had it in snow though.

lylesback2 2025-08-27 00:08

Touché. Well played.

FlappySocks 2025-08-27 00:08

So imagine you're driving down the highway at 70mph, and a plastic bag blows across your path. The camera says: it's a plastic bag, that's harmless, drive on. LiDAR says: there is an unidentified object in our path, and we are about to hit it at 70mph. BRAKE!

FunnyProcedure8522 2025-08-27 00:08

Let us know when you decided to upgrade your software. That’s like a year+ old. Haven’t been an issue since 12.6.

tacobell_shitstain 2025-08-27 00:08

The human eye isn't always sufficient though. There are plenty of times where glare, inclement weather, or just suboptimal lighting conditions result in humans not being able to accurately see objects or judge distances. Cameras are also subject to these issues. I'm not saying lidar is the solution or even part of the solution, but human anatomy should not be the benchmark for driving capability. We're extremely good at crashing into shit on a regular basis.

StartledPelican 2025-08-27 00:09

>Oh and before I get hate: my 22 Model Y has radar and ultrasonic sensors but Tesla decided to deactivate all of those

Your '22 Model Y does not have radar. Your USS still works. Source: I own a '22 Model Y.

>Tesla began removing the forward-facing radar sensor from Model Y vehicles built in May 2021 and later, transitioning to a camera-only system known as Tesla Vision

Source on radar: https://www.caranddriver.com/news/a36542541/tesla-model-3-model-y-pure-vision/

Source on USS: Go check your settings for parking sensors. You can choose between USS and Vision.

katherinesilens 2025-08-27 00:09

Tesla Vision is good and impressive for what it gets sensor-wise, but I would not say it has caught up to Waymo and others. Waymo absolutely performs better and more consistently. Even Robotaxi is heavily supervised and still off freeways, so Tesla's risk assessment either agrees or has low comparative confidence. Going vision-only is also not a "first principles" approach. I don't know where that phrasing came from but it's definitely misapplied here. I know Elon likes to toss that around but it doesn't mean what he thinks it means. A first principles approach to rain detection, for example, would be putting your hand out of the window and observing raindrops. The principle involved is that humans can observe rain, and this is a first principle because it is obvious enough to not require proof. A convention-first approach to rain detection would be to use a glass-mounted sensor. What our cars have is MV attempting to identify raindrops through cameras--an approach that saves money through reuse but doesn't increase quality or reliability, as we know during rain. What's being called "first principles" there is really just cost-cutting simplification. If a mechanism doesn't work as well, it doesn't work as well, no matter what means was used to arrive at it or what it cost. FSD and our rain-sensing systems may be good by certain metrics, but they simply do not have the full performance band of systems augmented with radar, lidar, etc.

N2_Deox 2025-08-27 00:09

FSD handles fairly heavy rain. It will incrementally limit max speed as rain gets heavier, but it does well regardless.

kittysworld 2025-08-27 00:10

They can make it unavailable during bad weather, as least for now. There will be human driving taxis eager to get the business.

[deleted] 2025-08-27 00:10

[deleted]

stylz168 2025-08-27 00:10

My mistake it is 12.6.4, though stuck with HW3 unfortunately so pretty sure 13 will never make it.

gibsonblues 2025-08-27 00:10

Waymos still don't drive on the freeways. My Model Y took me from Dallas to Colorado Springs. It's not just about which technology is better. Waymos cost $150k. How would Tesla sell millions of those? And it wouldn't work on the freeway? That's stupid. Do humans need radar and lidar to drive? No. All that is needed is to see what humans see, and a brain. Teslas do that, and at a price that can scale.

WelpSigh 2025-08-27 00:12

Ok seems like the waymo approach is working though, so if it is "manual tuning" that appears to be a good approach vs Tesla which you are telling me can't get better without another hardware upgrade

cookingboy 2025-08-27 00:12

For some reason I couldn’t copy paste from the site, but this is a relevant part from the paper: https://imgur.com/a/KJ34kxp I didn’t have time to read the whole paper, but at a glance I didn’t see where camera data is used for distance and velocity measurement. The part you talked about is very different from what I was talking about, because in that case sensor fusion makes sense. For example: LIDAR: detects there is an object at x direction and y distance. Camera: detects a stop sign at x direction. Fused data: detects a stop sign at x direction and y distance. HD Map: there is a record of stop sign at xyz coordinate, and it matches. AFAIK, Waymo does not rely on cameras for distance and velocity measurement, since even the human eyes are notoriously unreliable for that, and radar/lidar is literally made for that purpose.
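The fusion example in the comment above (lidar supplies direction and distance, the camera supplies the classification, and the two are matched up) can be sketched in a few lines. Everything here, from the bearing-matching threshold to the data shapes, is an illustrative stand-in, not Waymo's actual pipeline:

```python
# Toy version of the "stop sign at x direction and y distance" fusion
# described above: match a camera classification to a lidar range return
# by bearing, yielding one labeled, ranged detection per object.

def fuse(lidar_hits, camera_dets, max_bearing_diff_deg=2.0):
    """lidar_hits: (bearing_deg, distance_m); camera_dets: (label, bearing_deg)."""
    fused = []
    for label, cam_bearing in camera_dets:
        for lid_bearing, dist_m in lidar_hits:
            # Associate detections that point the same way, within tolerance.
            if abs(cam_bearing - lid_bearing) <= max_bearing_diff_deg:
                fused.append((label, lid_bearing, dist_m))
    return fused

lidar = [(10.0, 42.5), (-30.0, 8.1)]                    # (bearing deg, distance m)
camera = [("stop_sign", 10.5), ("pedestrian", -29.0)]   # (label, bearing deg)
print(fuse(lidar, camera))
# [('stop_sign', 10.0, 42.5), ('pedestrian', -30.0, 8.1)]
```

This is the complementary case the commenter calls "sensor fusion done right": each sensor contributes the quantity it is actually good at, rather than both estimating the same quantity and fighting over it.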

[deleted] 2025-08-27 00:12

[deleted]

[deleted] 2025-08-27 00:12

[deleted]

[deleted] 2025-08-27 00:13

[deleted]

WelpSigh 2025-08-27 00:14

I am aware of the concept of sensor fusion. But again, Waymo does a quarter million paid rides per week while dealing with sensor fusion so it can't be impossible.

McD-Szechuan 2025-08-27 00:14

Worked well in snow for me

[deleted] 2025-08-27 00:16

[deleted]

sykemol 2025-08-27 00:18

>continue to expand market That part of your post jumped out at me. Tesla has no AV ride hailing market to expand. In theory, yes if all those things come true, the business case works out. But there are lots of other possible outcomes too. One is that he brings AV ride hailing to market, but is years later than one or more competitors. Another similar outcome is that all the majors have already licensed or developed their own AV tech before Tesla's is ready. And by the time Tesla's tech is ready, perhaps Waymo and others no longer require geofencing. Another is that at one time his vehicle margins were great, but they've gotten pretty mid in recent quarters. His margin advantage is eroding away pretty rapidly. Basically, in your post there are a number of assumptions about how Tesla's future tech could improve. Those are likely good assumptions. But you should also assume that Waymo's tech will improve as well. I think Tesla's window to hit a home run in this case isn't closed, but it is closing.

[deleted] 2025-08-27 00:19

I concur. I was in a heavy downpour on a road trip back in July and FSD worked flawlessly. (Can’t say the same about the stupid autowipers though. Would much rather have a $2 rain sensor than LIDAR)

wilydolt 2025-08-27 00:19

Has Elon never driven in a Tesla?

FunnyProcedure8522 2025-08-27 00:20

No, you don't have millions of contradicting data sources in ML training. Come on now, what exactly are you training for? Randomness of the data set? It's all about probabilities. You can't rely on the result if the sources of data are generating conflicting data. Which one do you choose? Say one is 51% and the other is 49%. The machine will always go with 51% because it believes data from one source is better than the other. Then what's the point of having a second sensor?

WelpSigh 2025-08-27 00:21

But camera-only *also* does not work in the wrong conditions. For that matter, I can't drive in the wrong conditions. Much of this also depends on the risk profile: Waymo actually can drive in a lot of conditions that they choose not to do so. For example, it *can* drive on interstates, but they aren't as confident in it, so they don't.

nametaken_thisonetoo 2025-08-27 00:23

Must be true if he said it.

StickFigureFan 2025-08-27 00:26

If in doubt, stop

WelpSigh 2025-08-27 00:29

The third alternative is the one that is common in ML: you assign weights to inputs based on your confidence in them, which can be determined from real world data collection. It's a very human approach. ML is entirely about figuring out contradicting input. Sometimes it's dark out. Sometimes pedestrians do one thing and sometimes they do another. Traffic lights have different timing. Light and weather conditions differ constantly, which affects the input the algorithms are getting. That's really what they excel at.
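The "assign weights to inputs based on your confidence in them" approach described above has a textbook closed form: weight each measurement inversely to its variance (static Kalman-style fusion). A minimal sketch; the noise figures are illustrative, not real sensor specs:

```python
# Sketch of confidence-weighted fusion: combine two noisy estimates of
# the same quantity by weighting each inversely to its variance. The
# result is both closer to the trusted sensor and more certain than
# either sensor alone.

def fuse_estimates(z1: float, var1: float, z2: float, var2: float):
    """Return (fused_value, fused_variance) for two independent measurements."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Camera says 50 m but is noisy at range (variance 9); lidar says 47 m
# (variance 1). The fused estimate lands near the lidar value.
est, var = fuse_estimates(50.0, 9.0, 47.0, 1.0)
print(round(est, 2), round(var, 2))  # 47.3 0.9
```

The key property: the fused variance is smaller than either input's, which is the mathematical version of "two sensors beat one" that the thread keeps arguing about. In bad weather you would inflate the degraded sensor's variance, and the fusion smoothly leans on the other one.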

cookingboy 2025-08-27 00:30

This was in my original comment:

> In AVs LIDARs are used for localization (**measuring location and movement vector of objects**)

I specifically called that part out. And nowhere did it say Waymo uses cameras for position and velocity measurement. What Waymo is doing is sensor fusion done right. Cameras are much better at detecting, for example, orientation, and terrible at measuring distance and velocity, but combined with other sensors, it now has detailed info on all those fields. In fact, by reading the paper, you should be convinced that Tesla's approach is deeply flawed.

thecrispyleaf 2025-08-27 00:31

You mean like every overhead bridge shadow during the day my model 3 phantom brakes at? 2022 with cam only.

Vegetable_Pirate_702 2025-08-27 00:32

Lidar is stupid, it has the same strengths and weaknesses as visible light. Why is nobody talking about radar? HD radar would fill in vision's weak spots.

Hiddencamper 2025-08-27 00:33

…. Why did we get rid of radar? He was the one who said radar was superior for rain and fog

[deleted] 2025-08-27 00:33

[deleted]

mflboys 2025-08-27 00:36

Wow! You solved it! You should apply to Tesla, I’m sure they’d give you a senior engineer role since you know how to significantly improve one of the most advanced self-driving systems in the world

FunnyProcedure8522 2025-08-27 00:40

You misunderstood how ML and NN work. You cannot have random data points from data sources that contradict each other in training. All training data needs to be cleaned. Please explain how you assign weights to random contradictory data? This is not possible. People oversimplify what needs to go into training, thinking it's just a black box: feed in all the data, including the junk contradicting data, and hope the model will figure it out. It works for LLMs because outputting something close enough is OK, not for FSD, where your choice is the complete opposite: go or no-go.

gltovar 2025-08-27 00:40

Here is the thing: when you are writing software / creating AI models at this caliber, you are on a razor's edge as to whether you even need both sensor suites. To back up my claim, there is this MASSIVE real-world recreation test done in China recently: [https://carnewschina.com/2025/07/24/chinas-massive-adas-test-36-cars-15-hazard-scenarios-216-crashes/](https://carnewschina.com/2025/07/24/chinas-massive-adas-test-36-cars-15-hazard-scenarios-216-crashes/) (youtube video at the bottom, which is very high quality). In this test they shut down an actual section of highway to recreate real-world edge cases. You'll see that among available consumer model cars, Tesla's vision-only system was head and shoulders above the competition, many of which have lidar equipped. The funny thing about this test, when watching the video: it felt like they didn't really want to spotlight foreign car brands as much as domestic brands, so the Tesla performance almost felt like a footnote compared to the narrative they wanted to convey. But this huge test certainly didn't include extreme weather conditions (though their channel does feature small-scale extreme weather tests), and non-consumer vehicles were not showcased. If I were to guess, I'm sure the Waymo would perform just as well if not better than the Tesla. Honestly I think an independent worldwide olympics for this kind of car tech would be very interesting; it would do a lot to shake off biases on all sides and really drive home the state of the driving art.

WelpSigh 2025-08-27 00:42

Why are you acting like this is impossible when literally Waymo rides are combining them, right now, with radar + lidar + cameras? It's obviously possible to do, it's the same concept. There are many, many variables that are combined to figure out the correct path forward. That is literally the entire thing that makes ML good.

Snoo93079 2025-08-27 00:42

If it was easy we'd already be doing it. That's why the logic in sensor fusion is so important.

Tall-Jellyfish-4158 2025-08-27 00:42

You mean like the time the Tesla thought an overturned semi was the sky and crashed right into it. Cameras only is dumb and there's a reason why every other manufacturer uses BOTH.

[deleted] 2025-08-27 00:43

[deleted]

Toastybunzz 2025-08-27 00:43

Does it though? The only time AP ever turned off for me in the rain was after it hit standing water and hydroplaned. Does it limit your max speed or give you a warning that visibility is low? Yes but it still works.

spiderweb91 2025-08-27 00:43

That is fair. Tesla has nothing to learn and has figured out everything there is to self-driving.

soldieroscar 2025-08-27 00:45

I called and they won't refund me the $6,400 they charged me for FSD back in 2019. 6 years later and still no product that I paid for, and my car can't accept the new hardware.

FlappySocks 2025-08-27 00:46

You miss the point entirely.

mflboys 2025-08-27 00:46

They sure as hell don’t have anything to learn from anyone in this thread. You’d be laughed out of their office if you walked in there and claimed to have any insight into their system. And no, watching 2 YouTube videos on ML and LIDAR doesn’t count as credibility.

pcurve 2025-08-27 00:47

Yeah but lidar works in pitch black... and Waymo has multiple radars

Malcompliant 2025-08-27 00:51

Under what conditions is Lidar measurably better than cameras?

WelpSigh 2025-08-27 00:53

Lidar is an active sensor, which emits light. It therefore works better in settings with less than ideal light. That's why, as Elon mentions, he uses it at SpaceX.

Unhappy-Plastic2017 2025-08-27 00:56

Trust the laser beam

Malcompliant 2025-08-27 01:00

Cars also emit light, using headlights and taillights. Which is why FSD works fine on dark rural roads with no lighting and no reflective markings.

WelpSigh 2025-08-27 01:02

But lidar can very clearly see, for example, deer on the side of the road that the headlights are not illuminating. And they deal with shadows better.

WenMunSun 2025-08-27 01:05

Teslas are able to listen too, like for ambulance and police sirens. But you’re missing the point- which is humans don’t need LiDAR to drive.

Flashy-Bus1663 2025-08-27 01:06

Doesn't removing all the contrasting data result in the model overfitting? As I understand it, you need the training data to match real-world examples. In a toy system or a very small system, you might get away with removing all the contrasting data, but if you want something to run in the real world, you need to handle those edge cases. The two sensors provide conflicting data; that's usually why, in deployments of these types of systems, there are redundant systems to help deconflict these inputs. If one set of sensors says there's a wall right in front of you and another set says it's open road, I would hope that most self-driving systems decide to disengage and stop driving.

Malcompliant 2025-08-27 01:06

Tesla's headlights do illuminate the sides of the road enough for the side cameras to see. Humans might not notice it through low dynamic range human eyes :)

WenMunSun 2025-08-27 01:08

Yeah I don’t get this theory. If this were the case why not just say so? Why would he not just say LiDAR costs too much - a statement he has actually made in the past and which he later amended by saying even if LiDAR were free he wouldn’t use it. I just don’t see the point in lying about a totally valid economic decision if that were the case.

allahakbau 2025-08-27 01:08

Also a camera resolution issue.

spiderweb91 2025-08-27 01:10

What makes you so sure? I work in one of the world's top tech companies with some investments in a similar space and I know many people qualified to have intelligent opinions on this topic on Reddit. By extension no large company in the world can do any wrong and none of them need to listen to anyone outside of their field of experts. History has proven time and again that that's not the case.

allahakbau 2025-08-27 01:10

The CEO of Xpeng and his tech teams seem to agree. Their head of tech mentioned something about lidar not being able to tell what type of object it's looking at, so as cameras and hardware get better, lidar becomes kind of useless. There's no way for lidar to tell a plastic bag from a brick, but cameras can. Cameras also see further.

oyputuhs 2025-08-27 01:13

I’m not missing the point. You’re missing the point lol. I’m not saying an autonomous car needs to have the same sensing as a human. I’m not even saying a vision only system can’t get you to a functional autonomous vehicle. I’m saying it’s reasonable to argue that a suite of sensors could perform better in more scenarios than just cameras alone. It’s also pretty dumb to compare humans to autonomous vehicles, especially since we want cars that drive much more safely than we do. Could that happen with just vision, mics, and compute? Maybe, but why not try multiple approaches?

FlugMe 2025-08-27 01:17

This fallback has caused accidents and deaths. [https://x.com/kenklippenstein/status/1612848872061128704](https://x.com/kenklippenstein/status/1612848872061128704)

DoomBot5 2025-08-27 01:18

> The one that’s harder to properly use overall but is less likely to have a false positive

Tell that to their rain-sensing logic. It still can't distinguish between rain and the shadow of a branch on a sunny day.

sfo2 2025-08-27 01:19

Among all arguments in favor of cameras only, the “humans have eyes therefore cameras are sufficient” is probably the laziest and worst.

edit_why_downvotes 2025-08-27 01:22

Most car accidents are "accidents" because of someone shitting the bed, and a majority of automobile fatalities have alcohol involved. NOT "my peepers were obstructed"

burgonies 2025-08-27 01:30

Waymo is all over Atlanta and it seems to rain here 5 days a week

TYMSTYME 2025-08-27 01:34

Yeah it’s so easy just write the damn software Elon! Sheesh

Jakoneitor 2025-08-27 01:34

Tesla also stops working in any heavy precipitation lol

wighty 2025-08-27 01:37

> Humans might not notice it through low dynamic range human eyes

Ummm... dynamic range is probably not what you mean to say here. Human eyes have better dynamic range (+20 EV vs. around +10-15 EV for cameras). Have you ever created an HDR image by stitching, or maybe just played around with your cell phone camera's HDR setting, and noticed the differences?

GodwynDi 2025-08-27 01:40

This is what it feels like getting old.

ic6man 2025-08-27 01:42

NoA turns off. Autopilot limits the speed.

oyputuhs 2025-08-27 01:44

I’m sure a sufficiently advanced dog-like alien species could drive by smell

Igotnonamebruh42 2025-08-27 01:47

But Waymo also uses vision, not just lidar. They utilize both lidar and vision; I guess they have better sensor fusion.

WindyNightmare 2025-08-27 01:47

He said my FSD driving Model 3 would be worth like $200k one day so there should be plenty of wiggle room for high tech sensors.

EaZyMellow 2025-08-27 01:49

Waymo seems to still have a problem doing that though. Remember, Waymo uses what Tesla does, and MORE. They still struggle with precipitation. Your environment can change beyond what data was uploaded to the system prior to leaving, AND it can rain. Tesla’s camera-only isn’t saying that it’s insurmountable or impossible to use multiple sensors with & against each other, it’s simply a problem they don’t want to try and solve. No matter what system you end up using though, if it has cameras (which it always will) you’ll need to do exactly what Tesla is currently doing. And more.

HuyFongFood 2025-08-27 01:52

Shouldn’t that have been handled by proper software development? Oh wait, he kicked the founders out of the company and has spent the time since then being a complete numpty, mucking about in things he doesn’t actually fully understand. There’s a reason PayPal developers would let him “play code” to “fix” things and then revert his changes the next day and get on with their own work. If he would shut up and just let the engineers and developers work through entire development cycles, he might not have produced that POS Cybertruck, officially the worst EV truck on the market. Teslastans are hilarious with the amount of smoke they’ll blow up each other’s asses to try and justify getting screwed by Elon.

HuyFongFood 2025-08-27 01:54

You just won’t accept that you’re wrong and continue arguing instead of just admitting it and walking away.

adrr 2025-08-27 01:56

Here's an example of Waymos that stop working in the rain. https://www.youtube.com/watch?v=Bm1A3aaQnh0

StickFigureFan 2025-08-27 01:57

The problem there was abruptly stopping (and not having a place to pull over)

ChymChymX 2025-08-27 02:00

He did over 6 years ago: “Lidar is a fool’s errand,” Elon Musk said. “Anyone relying on lidar is doomed. Doomed! [They are] expensive sensors that are unnecessary. It’s like having a whole bunch of expensive appendices. Like, one appendix is bad, well now you have a whole bunch of them, it’s ridiculous, you’ll see.” He is sticking to that belief (again, right or wrong).

IHSFB 2025-08-27 02:14

Most people are not computer vision engineers. Commenters here are just regurgitating established talking points about Tesla and FSD.

apocxp 2025-08-27 02:15

My software is updated. Still an issue. On HW3, so that must be it.

dsyzdek 2025-08-27 02:18

Indeed. This is why trains (railroads) have fancy signaling systems — it’s impossible to stop in sight distance at any reasonable speed. So we have signals miles ahead that progressively slow us down to prepare to stop.

what_about_zissou 2025-08-27 02:19

The worst is when it caps your speed to 65 due to "poor weather" and it's hardly raining.

[deleted] 2025-08-27 02:19

Fake news. Mine has never turned off in any weather

Brothernod 2025-08-27 02:22

A false positive: vision sees a bag, lidar sees an object; you listen to lidar and slam on the brakes. A false negative: vision sees a shadow, you don’t have lidar; you listen to vision and hit a person at night.
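The asymmetry in the comment above can be framed as a toy expected-cost calculation. All numbers and names here are invented for illustration; this is the textbook Bayes-risk framing, not any real stack's logic:

```python
# Toy expected-cost view of the brake/don't-brake trade-off.
# Costs are invented: phantom braking is annoying, a collision is catastrophic.
COST_FALSE_BRAKE = 1.0
COST_COLLISION = 1000.0

def should_brake(p_object):
    """Brake whenever the expected cost of a collision exceeds the
    expected cost of an unnecessary brake."""
    return p_object * COST_COLLISION > (1 - p_object) * COST_FALSE_BRAKE

# Under these costs, even a 1% belief in a solid object justifies braking,
# which is why a second sensor that can rule an object in or out is valuable.
brake = should_brake(0.01)
```

The point of the sketch: with asymmetric costs, the decision threshold sits far below 50% confidence, so a sensor disagreement (bag vs. object, shadow vs. person) swings the outcome dramatically.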

cgreentx 2025-08-27 02:29

FSD and summon in my Model 3 are completely unusable in heavy rain. You can say that LiDAR doesn’t work in heavy rain, but cameras are utter garbage in it as well. So dumb

Beneficial-Bite-8005 2025-08-27 02:35

To the same degree?

sfo2 2025-08-27 02:38

Sure. I’m just always surprised it’s not self evident that humans maybe aren’t the best benchmark. Because we, ya know, suck ass at driving.

bumskins 2025-08-27 02:42

Yep, getting stuck in intersections all the time.

wilso850 2025-08-27 02:42

Do cameras not have the same issue when they get covered in precipitation or dust?

[deleted] 2025-08-27 02:43

[deleted]

Meats10 2025-08-27 02:45

This is so dumb because people use taxis more when it's raining. That would be like closing your hotel during peak travel season.

Beneficial-Bite-8005 2025-08-27 02:46

That’s the fundamental claim you’re making. You’re saying that camera alone is just as good as camera + lidar, but you just admitted you know nothing about it 🤣

bear-tree 2025-08-27 02:48

I'm not sure if this is well understood, but it should be. It's a neural net. Nobody is "writing the software."

tonydtonyd 2025-08-27 02:51

Once every few million miles isn’t bad. At worst we see once a week, which is like 2 million miles.

ffiarpg 2025-08-27 02:52

How so? You are comparing the status quo driver (humans) to a new type of driver (computers) and comparing them in a specific domain, sensor input. The status quo driver can drive with only vision. It stands to reason that only vision would be sufficient for the other type of driver if it can meet and exceed other areas like decision making (hard) and output (easy). It can also be true that camera + lidar is superior to camera. But that isn't the claim.

Rubix321 2025-08-27 02:52

To be fair, my FSD regularly refuses to activate in heavy precipitation... even in light mists or if the road is wet.

[deleted] 2025-08-27 02:52

[deleted]

tonydtonyd 2025-08-27 02:53

Should have used much higher quality radars…

sfo2 2025-08-27 02:55

Humans fucking suck at driving. The status quo is awful. I want my self driving car to understand the world around it better than a human ever could, make better and faster decisions than a human ever could, react faster than a human ever could, etc. Humans are a terrible benchmark.

tonydtonyd 2025-08-27 02:55

That looks like the fire truck hit the pulled over Waymo while trying to skim by, don’t think that’s a good example.

[deleted] 2025-08-27 02:56

[deleted]

tonydtonyd 2025-08-27 02:57

Waymos don’t drive on freeways with external customers, but they have been giving rides to employees (who are not safety drivers; these are senior engineers and C-level).

Winterwind17 2025-08-27 02:58

Classic know nothing salesman trying to appear technical.

[deleted] 2025-08-27 02:58

[deleted]

gorgeousphatseal 2025-08-27 03:02

Ok, meanwhile the only time my autopilot shut off was a torrential downpour.

tonydtonyd 2025-08-27 03:02

That’s a split road, meaning the firetruck was going against traffic, which it can obviously do. But I don’t think it’s logical to assume the Waymo would have driven into the firetruck, given that they stop for them in every other video (sometimes poorly).

malou_pitawawa 2025-08-27 03:03

I agree. As long as it can’t send windshield washer fluid automatically to the front camera, it will never work 100% of the time in winter…

tonydtonyd 2025-08-27 03:03

Cameras + high quality radar is clearly the best of both worlds without having to deal with lidar. The resolution from Waymo’s radars is pretty insane for radar IMO.

ffiarpg 2025-08-27 03:03

I agree we can do better but humans are *currently* the best driver, they are the benchmark by definition.

Igotnonamebruh42 2025-08-27 03:03

Getting stuck once every few million miles in a fully autonomous Waymo, or getting stuck once every thousand (or even every few hundred) miles in a Tesla: which one would you like?

hoppeeness 2025-08-27 03:04

I think there is a good enough here that everyone overlooks. If you can make the car 10x safer than humans in most conditions and scale it to 1 million vehicles or 20x safer but only scale to 100k in the same time…what is better for society?

[deleted] 2025-08-27 03:09

[deleted]

tonydtonyd 2025-08-27 03:11

I never said Waymo couldn’t do better, just that from the video you posted it looks a lot more like a firetruck on a narrow street clipping a car that is trying to pull over for it. But OK, get defensive.

yallmad4 2025-08-27 03:12

Lmao tell me you don't know anything about sensor designs without *telling me* you don't know anything about sensor design.

CaterpillarSad2945 2025-08-27 03:14

Why do airplanes have two airspeed sensors? How do you even decide which one to trust? It would be safer if we removed the redundancy! (Have to add the /s if it’s not clear to you.) If you still don’t get it, look up the Boeing 737 MAX crashes.

[deleted] 2025-08-27 03:15

[deleted]

[deleted] 2025-08-27 03:16

[deleted]

[deleted] 2025-08-27 03:17

Not remotely accurate. Enter MVIS

tonydtonyd 2025-08-27 03:28

Again, I never said Waymo is perfect around firetrucks. I agree with you that they should perform better, yet we’re talking about a few handfuls of cases across well over 100 million miles with customers. I myself have spent dozens of hours in Waymos and had them deal with firetrucks just fine. Meanwhile my wife’s ‘24 Model 3 still needs babysitting on FSD when I drive it. I think my issue is you seem so dead set on something *being wrong* in your book, yet it’s working 24/7 with only the occasional hiccup.

aloys1us 2025-08-27 03:31

Works better in dark and low light and sun blinding situations though…

[deleted] 2025-08-27 03:35

[deleted]

aloys1us 2025-08-27 03:39

Road lines maybe. Not 3D objects though

[deleted] 2025-08-27 03:42

[deleted]

[deleted] 2025-08-27 03:42

If either sensor says I’m going to hit something I’m going to stop the car. That’s why you have two different sensors with different strengths.

pw154 2025-08-27 03:43

> But Waymo also use vision, not just lidar. They’ve started to utilize both lidar and vision, I guess they have better sensor fusion

Waymo's approach is not designed to scale to every city and country, because of its dependency on HD mapping and fixed geofencing. Great for geofenced robotaxis, shit for mass consumption. If Tesla used Waymo's approach, FSD would be much more limited in scope than it is now, and not sustainable/scalable for Tesla.

aloys1us 2025-08-27 03:45

Yeah but, lidar will detect ‘something’ there in the dark.

[deleted] 2025-08-27 03:48

[deleted]

aloys1us 2025-08-27 03:49

Yes. But it has advantages to have both. That’s all I’m saying

[deleted] 2025-08-27 03:51

[deleted]

aloys1us 2025-08-27 03:52

Cameras don’t work with no light or when they’re blinded by the sun. Lidar doesn’t care about those scenarios. Understand? lol. So yes, lidar works better in those situations, because cameras don’t work at all.

pw154 2025-08-27 03:52

> Why are you acting like this is impossible when literally Waymo rides are combining them, right now, with radar + lidar + cameras? It's obviously possible to do, it's the same concept

Waymo is heavily geofenced and relies greatly on HD mapping, which is not nearly as scalable as Tesla's vision-only approach. It's an entirely different model that is not commercially viable for a consumer-grade vehicle, at all. If the implementation were technically and commercially viable, you'd see many other consumer-grade ADAS systems incorporating lidar that blow Tesla's FSD out of the water. Guess what? There are zero in the North American and European markets. Even the Chinese ones with lidar + vision do not outperform vision-only FSD.

[deleted] 2025-08-27 03:53

[deleted]

pw154 2025-08-27 03:57

> Have they caught up? Last time I checked only system driving without any safety drivers still is Waymo.

Put a Waymo outside of its geofenced, HD-mapped zone and see how it does without a safety driver. FYI, Waymo had safety drivers from 2015-2020, whereas Tesla just started their robotaxi service now. Considering this, I'll give Tesla slack on their use of safety drivers.

pw154 2025-08-27 04:01

> Tesla Vision is good and impressive for what it gets sensor-wise, but I would not say it has caught up to Waymo and others. Waymo absolutely performs better and more consistently.

What others? FSD is the most advanced ADAS that is commercially available at the moment. Waymo is good in a geofence, but put a Waymo outside of its geofenced, HD-mapped zone and see how it does.

ItsAGoodDay 2025-08-27 04:04

You think Tesla isn’t already sending a ton of cars on the road mapping lidar data? Think again

dirtyvu 2025-08-27 04:07

Stop perpetuating the myth that the rgb cameras in the tesla are equivalent to human eyes.

pw154 2025-08-27 04:13

> 2023 MYLR with FSD 12.4 and it’s utter trash.

I have a 2024 MYP; a friend drives a 2023 MYLR. FSD 13 is leaps and bounds better than 12. I wouldn't trust FSD in his car, whereas I use mine almost daily.

Ok_Excitement725 2025-08-27 04:16

I mean he’s the CEO, he’s made his decision. Sure it may be a stupid one long term but, whatever. His company his decisions.

stylz168 2025-08-27 04:19

Yeah I’m unfortunately stuck with HW3, but it works well enough for me on highways and such.

TheMartian2k14 2025-08-27 04:19

Ok.. use your same logic with only one sensor input. How can you trust a reading when it might be missing information ie. vision-only failing to detect an object due to glare/heavy snow/precipitation? How is that superior to multiple inputs which need to be tested and weighed?

pw154 2025-08-27 04:20

> You think Tesla isn’t already sending a ton of cars on the road mapping lidar data? Think again

Nope, they're not mapping. They're calibrating the vision inputs by comparing them with lidar data, which is not even close to what Waymo is doing.

deten 2025-08-27 04:28

You don't just take away a sense because you're going to 100% trust one that is faulty. You have two senses; make the best choice possible.

picflute 2025-08-27 04:36

Who cares? He needs to deliver a system to customers that works. How the competition does it is irrelevant when the first mover to get it "right" captures the market. The moment someone can reliably pull their phone out for an entire drive and stay disconnected from what is occurring in front of them they will capture the working class that has to stay plugged in at all times.

Igotnonamebruh42 2025-08-27 04:54

True, but you can also opt to use vision as the main input and lidar as an assist input for autonomous driving. It’s not like you have to choose one; you can do both, as long as you have extremely well-calibrated sensor fusion. As far as I can tell, Tesla’s vision approach can only ever be as good as the best human driver, since the system is learning human behavior purely from vision input. To me that’s not what it’s supposed to be; it should far exceed human driving capability, such as driving in severe weather.

cookingboy 2025-08-27 05:32

> The paper showed your overconfidence of what you said earlier is deeply flawed.

The paper *literally* shows the importance of having a wide variety of sensors for the problems they need to solve for FSD, regardless of the argument we're having over the precise definition of localization. Tesla, no, Elon himself, thinks he alone is smarter than the rest of the industry combined. We've seen him act this way in many areas, and none of his claims have held up in reality.

pw154 2025-08-27 05:34

> As far as I can tell Teslas vision approach can only do as good as the best human driver since the system itself is learning human behavior just from vision input, to me that’s not what it supposed to be, it should be far exceeding the human driving capability, such as driving in severe weather.

Considering FSD can process input from 8 cameras simultaneously and adjust its path up to 20x per second with millisecond precision, while humans rely on a single viewpoint, limited attention, and much slower reaction times, I'll wager that it already does better than the best human driver in the majority of circumstances.

FencingNerd 2025-08-27 05:53

The big problem with lidar is range. Realistically, you need lidar to see out at least 1 km; otherwise you're only using it for short range anyway. The problem is that a lidar module with that kind of range is prohibitively expensive (like $M per unit); it's basically an R&D project. Inexpensive lidar operates at around 1 µm and has a range of around 200 m, so for longer range you rely solely on the camera.

[deleted] 2025-08-27 06:00

[deleted]

JohnHue 2025-08-27 06:08

Lack of information doesn't make for a better decision process.

latca 2025-08-27 06:12

Radar could have worked well in low visibility except it sits disabled in my car now.

FaudelCastro 2025-08-27 06:17

This doesn't make sense. For one, sensor fusion is a solved issue (for e.g., fighter jets using radar and IRST). Second, Teslas also have multiple cameras and you can have conflicting information from different sensors of the same tech.

aft3rthought 2025-08-27 06:27

There’s an entire discipline for this with thousands of experts and decades of research. It’s called sensor fusion. https://en.wikipedia.org/wiki/Sensor_fusion
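As a toy illustration of what the discipline buys you: the textbook static case fuses two independent, noisy range estimates by inverse-variance weighting, so the noisier sensor gets proportionally less say. The numbers below are made up; this is a sketch of the principle, not anyone's production stack:

```python
def fuse(z_cam, var_cam, z_lidar, var_lidar):
    """Inverse-variance weighted fusion of two independent range estimates.
    The fused variance is always smaller than either input's alone, which
    is the basic reason fusion (done right) can't be worse than one sensor."""
    w_cam = 1.0 / var_cam
    w_lidar = 1.0 / var_lidar
    z = (w_cam * z_cam + w_lidar * z_lidar) / (w_cam + w_lidar)
    var = 1.0 / (w_cam + w_lidar)
    return z, var

# Camera estimates the obstacle at 52 m but is noisy in rain (variance 4);
# lidar reads 50 m with tighter noise (variance 1). Fusion lands near lidar.
z, var = fuse(52.0, 4.0, 50.0, 1.0)
```

A full system uses a Kalman filter (this update step, plus a motion model over time), but the weighting idea is the same.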

kroghsen 2025-08-27 06:39

I mean, he is not wrong. And in his defence, cameras are - for obvious reasons - the best equivalent to the sensors humans use when they drive. However, I fail to understand why you can’t just use both?

drgmaster909 2025-08-27 06:47

> senior design project in EE or similar major would be able to tell you what Elon has been saying about the difficulty of “sensor fusion” is just absurd

The EE I talked to stated the opposite. He even taught me a new phrase just today: the Curse of Dimensionality. Do your EEs go to another school? Perhaps one in Canada?

drgmaster909 2025-08-27 06:56

I'm very bullish on Vision-only FSD. My experience with it has been amazing so far and the issues I've had are decision-making issues, not "I didn't see that (but lidar would've)" issues. That said I think it's a false dichotomy. Vision + Lidar don't need equal relevancy or priority in the system. They don't even both need to be _actively_ feeding data into the system. Vision _could_ be _completely_ in control with Lidar passively running in the background, and if Vision ever has low confidence in something ("is that a tar line or a pothole?") _then_ could it reach out to Lidar to pull in extra data to make a more informed decision. If Lidar never comes to Tesla I'd still believe in FSD (as a current daily user). If it does, I'd be willing to trust that too.

itsauser667 2025-08-27 06:58

Do you think you could drive a car remotely, just using a monitor and wheel with no haptic feedback, as you could actually sitting in the car?

hauser1234 2025-08-27 07:02

It’s not your laser alone saying you will hit something, and it’s not your vision telling you by itself either. It’s a component that looks at both and decides whether an object on the road exists or not. It can weigh multiple factors and, if still unsure, choose the safer action.

[deleted] 2025-08-27 07:06

[deleted]

[deleted] 2025-08-27 07:08

[deleted]

hawktron 2025-08-27 07:12

To replace human drivers it only has to be as good as human drivers.

nobod78 2025-08-27 07:16

And it is faster than mapping that's what you mean?

kroghsen 2025-08-27 07:23

Well, that is a matter of software. The question here is about hardware and what the limitations are. Camera and Lidar are two different hardware approaches. Phantom braking is a software problem, not a hardware problem.

CorkChop 2025-08-27 07:30

Every time it rains in Florida, my Y stops FSD because of “weather.” What is his point?

Fire69 2025-08-27 07:31

Don't forget there's only a limited number of Waymos right now. Imagine a future where every car is a driverless Waymo: you'd have lots of cars stuck in intersections all the time.

nobod78 2025-08-27 07:31

Bad faith, he clearly adds what he means: "localization (measuring location and movement vector of objects)". You should stop arguing, we all understand he wasn't talking about where the vehicle is on the map.

drgmaster909 2025-08-27 07:35

Agreed, but there are better examples. Again, I'm bullish on vision-only, but I find the idea that lidar is unworkable to be nonsense only in the sense that vision can _choose_ to use it _when_ it has low confidence in some assessment, and otherwise be the sole source of truth.

As it is, lidar doesn't see much that vision doesn't already see, and vision has much more context to correctly interpret something like a bag floating across the highway (where lidar can only scream "OMG SOLID OBJECT!"). Most of FSD's quirks are poor _decision-making_, not lack of data. But sometimes it could make _better decisions_ if, when it detects it's seeing an optical illusion or can't get enough light on something at night, it queried lidar's point cloud for clarity.

And again, if lidar never came to Tesla I wouldn't shed a tear. I'd hop in a Robotaxi right now; I'm just about ready to engage FSD on my own car and hop in the back seat for the lolz as soon as I'm able. I have no reason to defend lidar. But there are ways it _could_ be used that only add to the system. Vision never has 100% confidence in what it's seeing; if some ambiguity dropped its confidence in a tracked object to 40% or lower, only then would it ping lidar, "what do you see at coordinates X/Y/Z?", to boost its own confidence back up to 80%+.
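The "vision-first, lidar-on-demand" idea above can be sketched in a few lines. Everything here is hypothetical: the `Track` class, the thresholds, and the hit-counting rule are invented for illustration and don't reflect any real FSD or Waymo code:

```python
from dataclasses import dataclass
import math

CONFIDENCE_FLOOR = 0.4   # below this, vision asks lidar for a second opinion
MIN_HITS = 5             # lidar returns needed to confirm a solid object

@dataclass
class Track:
    label: str
    confidence: float    # vision's belief in its own classification
    position: tuple      # (x, y) in metres, vehicle frame

def classify(track, lidar_cloud):
    """Vision decides alone above the confidence floor; below it, count
    lidar returns near the tracked position before committing."""
    if track.confidence >= CONFIDENCE_FLOOR:
        return track.label  # vision is the sole source of truth
    hits = sum(1 for p in lidar_cloud
               if math.dist(p, track.position) < 0.5)
    return "solid_object" if hits >= MIN_HITS else "ignore"

# Ambiguous night-time detection: vision is only 30% sure, but lidar has
# a dense cluster of returns at the same spot, so the object is confirmed.
cloud = [(10.0 + 0.01 * i, 2.0) for i in range(8)]
result = classify(Track("unknown", 0.3, (10.0, 2.0)), cloud)
```

This captures the design choice being argued for: lidar runs passively and is only consulted when vision's confidence drops, so it adds information without overriding vision in the common case.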

[deleted] 2025-08-27 07:43

[deleted]

gltovar 2025-08-27 07:47

Seeing as they aren't public about what they're doing, my guess is that they're using that lidar data to benchmark the accuracy of their vision-only approach: a kind of datum, or alternative source of truth, to compare against. That way, if there are deviations in the areas they test, they know what to capture and generate (they can build any road system, at any time of day and in any weather, using Unreal Engine) to train their model on.

gltovar 2025-08-27 07:51

Two things to keep in mind: this is (trying to be) a peak human driver that can't be distracted, which is where the majority of human accidents happen. And a standard video camera can often see better in fog and snow than human eyes; I used to review my car cams from time to time, and even a decade ago there were captures that exceeded my real-life visibility. If visibility gets so bad that computer vision can't navigate, then we're approaching weather that should curb all driving, at least until a majority of cars have sensor suites advanced enough to drive in terrible weather.

vk_phoenix 2025-08-27 07:52

Yeah with you sitting behind the wheel with a signed agreement to cover all liabilities and damages lol

kroghsen 2025-08-27 07:55

Well, sure - if your argument is that bad hardware gives bad results, you are correct. That is not what this is about, though; that just means get a better lidar. A low-resolution or low-frame-rate camera also yields poor performance. Lidars do not brake cars. Lidars are sensors, like cameras. This is exactly why sensor fusion is a good solution to apply here: both sensors have limitations. What you are describing is obviously a software issue, whether you accept it or not, given that the sensors are not low quality, of course.

cookingboy 2025-08-27 07:59

> The paper proves what you said earlier wrong.

No, it does not. Nowhere in the paper does it show they use cameras for velocity and distance. That has always been my entire point: using cameras for those measurements *is* stupid. Remember, even human eyes are absolutely awful at judging them. Can you look at the car in front of you and tell me exactly how many meters away it is and what velocity it's traveling at? You can't, right? That's what Elon has been trying to do with computer vision.

[deleted] 2025-08-27 08:12

[deleted]

Scandibrovians 2025-08-27 08:14

You seem to know a thing or two about this stuff, so can I ask: why do you think Tesla went with only one front camera? I'm a computer engineer myself and have tinkered with some robotics, and I'd think a single camera is abysmal for long-distance measuring. I can see the idea with the side and back cameras; they only have to pay attention to things that are typically quite close. But the front one sometimes has to gauge several hundred meters ahead. Wouldn't it have been far better, as a minimum, to go with two cameras?

[deleted] 2025-08-27 08:16

[deleted]

jervoise 2025-08-27 08:30

But they clearly have been mapping for Austin?

UnDosTresPescao 2025-08-27 08:30

The other day I was on Autopilot going down the highway when I had a Florida moment and it instantly started pouring. Autopilot immediately started screaming about low visibility and shut off. I wondered what would have happened if I had been in a robotaxi; would it have shut down in the middle of the highway?

kroghsen 2025-08-27 08:36

Okay, you obviously take issue with what I say and refuse to give up your line of reasoning. That's fine. I am not telling you to put the most expensive sensors on the market on there, obviously, and you do not need to. Lidars for these kinds of systems are not some great mystery; a lot of Tesla's competitors use them. I had assumptions I see you don't share: a reasonable-quality sensor is a must, of course. You can filter out a lot of noise from sensors, more than most people think. I know, because it is exactly my area of expertise to make control systems work with the sensors that are economically feasible. That means writing the software that estimates the state of the surroundings, which is more than merely a measurement. Given any reasonable sensor, this is reasonably a matter of software. But if you think I mean that *any* lidar sensor can be made to work with the right software, then to be clear: that is not the case.

wade822 2025-08-27 08:39

It's a physical issue. The longer the wavelength, the lower the resolution.

[deleted] 2025-08-27 09:03

[deleted]

kroghsen 2025-08-27 09:07

You can take from what I say what you will, of course. As bad faith as you are, I would not expect any less. I would just add that if you knew anything about this, you would know that sensor fusion can never be worse than either of the two sensors alone; if it is, it is always a software problem. I will not waste my time explaining that to you.

ILikeWhiteGirlz 2025-08-27 09:08

Turn off lidar in inclement weather and rely on cameras. When the weather isn't inclement, use both, with a bias towards cameras.

[deleted] 2025-08-27 09:10

[deleted]

ILikeWhiteGirlz 2025-08-27 09:10

LIDAR would suffer same issues if covered in snow.

Evilsushione 2025-08-27 09:16

Airplanes use multiple systems to determine position; they usually have an odd number of sensors and basically believe what the consensus tells them. It's more complicated than that, but that's the basic idea.
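The odd-number-of-sensors idea reduces to a median vote in the simplest case. A minimal sketch (the readings are invented; real avionics voting is far more involved):

```python
def vote(readings):
    """Median of an odd number of redundant sensor readings: one wildly
    wrong reading is outvoted by the other two, with no need to know in
    advance which sensor failed."""
    ordered = sorted(readings)
    return ordered[len(ordered) // 2]

# Two airspeed probes agree; the third has iced over and reads absurdly low.
airspeed = vote([251.0, 40.0, 249.0])  # -> 249.0
```

The design choice worth noting: with only two sensors (as on the 737 MAX's AoA input to MCAS), a disagreement tells you *something* failed but not *which*, which is exactly the failure mode the /s comment above is pointing at.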

fooknprawn 2025-08-27 09:17

My point is not that they should be using LIDAR but their insistence on using cameras only without weather mitigation

Ambitious5uppository 2025-08-27 09:22

Cameras are MORE heavily affected by weather, not less, and also suffer from phantom braking MORE in good weather. Lidar-fitted cars also have radar, cameras and ultrasonics. Radar can see through fog and rain. Lidar can see through all but extreme weather. Cameras struggle in poor weather or bright sun. The best solution is a combination system, which is what all other brands are doing (Volvo is using 1 lidar, 8 cameras, 16 ultrasonics and 5 radars). Elon is just lying, because he made a bad decision when radars became hard to source during COVID, and he’s too proud to go back on that decision.

ILikeWhiteGirlz 2025-08-27 09:22

Front cam has heater on it, bumper and rear cam on HW4 Cybertruck has washer and hood to keep stuff off. I’m sure HW5 will have more.

cookingboy 2025-08-27 09:22

According to you and Elon - neither of whom has a technical background in this field - and he has been proven wrong consistently, failing to deliver on his promises for 10 years in a row.

Ambitious5uppository 2025-08-27 09:24

All conditions. There isn't a condition it isn't superior in. He's lying to you. Any severe weather that impacts the lasers enough to make them lose confidence is already too bad for the cameras to work in. He's taking something that is true - that in severe weather they lose confidence - and trying to make you think this is an issue cameras don't have. Incidentally, cars on the market with lidar also use cameras and, importantly, radars too. Radar can see through the bad weather to help clear up the info for the other sources.

cookingboy 2025-08-27 09:27

>Because I have access to FSD and have used it thousands of times. So you would be comfortable sitting in the backseat, turning on FSD, falling asleep, and letting it drive you and your family around in all weathers and all conditions at all times? Because that's the minimal safety requirement for a human Uber driver. Can FSD do that?

cookingboy 2025-08-27 09:28

https://www.smartcitiesdive.com/news/waymo-robotaxis-winter-weather-boston-nyc-dc/756917/

Ambitious5uppository 2025-08-27 09:30

That may have been an issue for Tesla with just cameras and radar. (Though this isn't why they removed them, they just became hard to source during covid and instead of slowing production like everyone else, they just claimed they didn't need it). But most cars aren't using just two things. Volvos have 1 lidar, 8 cameras, 16 ultrasonics and 5 radars. If only the camera is concerned about the tree shadow and the lidar and radar aren't, the camera is wrong. If only the radar is concerned about the metal barrier on the tight bend you're approaching, but the lidar and camera can see you're going to turn, then the radar can be ignored. Two sensors, yeah, more difficult. Multiple sensors, no. And now that every major manufacturer has put in orders for lidar sensors, to fit them over the coming year, the cost argument doesn't exist anymore either.

Lilacsoftlips 2025-08-27 09:47

Because Waymo is at the opposite end of the spectrum on caution when it comes to supporting customers with their technology. Tesla has been willing to put its customers at more risk to sell a product. They are still afraid of the legal ramifications of removing the beta tag.

[deleted] 2025-08-27 10:16

[deleted]

cookingboy 2025-08-27 10:17

No he didn’t. His engineers did, and he took credit. He has never done any real technical work, but he forced his name onto papers and gave himself the title of “chief engineer” or something like that. The guy is not Tony Stark. People like Tony Stark do not exist in real life. He has a 7/10 understanding of a lot of things, which is enough to fool the average Joe into thinking he has a 10/10 understanding, but real experts know he’s BSing the moment he opens his mouth.

[deleted] 2025-08-27 10:19

[deleted]

[deleted] 2025-08-27 10:24

[deleted]

[deleted] 2025-08-27 10:27

[removed]

TheRuneMeister 2025-08-27 10:32

I don’t think anyone is arguing that Tesla’s system isn’t good, or even the best. The argument is whether other sensors could be useful. I remember Tesla being proud of things they could do with radar, like seeing two cars ahead. Whether or not this worked or was actually useful is a different debate, but there is no doubt that multiple data sources are an advantage. Now, if Teslas had a single camera and the argument was that adding sensors is hard, then perhaps it would make sense, but Teslas already have multiple sensors. They are called cameras. The idea that the decision will be made by one sensor or another is a red herring. That’s not how you use multiple sensors - or multiple cameras - in a system such as Tesla’s. One camera doesn’t ‘make a decision’. You use them to map out the world around the car. Sure, sensors (including different cameras) can provide inputs that the system interprets differently. With enough data, these inconsistencies slowly get weeded out, though it’s probably never going to be perfect - just like humans with all their senses. When we get to the point where cameras by themselves are better than the human eye, then fine. Until then, cameras can use a helping hand.

stinkytoe42 2025-08-27 11:01

In safety engineering and sensor fusion engineering, there's this thing called the "Swiss cheese" model. Slices of Swiss cheese, the kind you buy at the grocery store, have holes in them that let light through. But these holes don't line up with each other, so if you lay two slices on top of one another, less light gets through. More slices, fewer holes. It's the same with different sensors (visible-light camera, IR camera, lidar, radar, etc.). They all have failure modes where one sensor fails but the others are still functional. So if you combine them, there's a much better likelihood of still having an accurate model of the real world. It's never perfect, and there are times when all of them fail (none will work through a piece of construction paper, for example), but there's a synergistic effect in combining them. Any two are a great improvement over a single one. There's still the argument of cost. But at the price point Teslas are sold at, and given that there are vehicles in the fleet with multiple sensor types, I believe this is mostly Elon being obstinate.
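The Swiss cheese effect is easy to put numbers on. With purely made-up, hypothetical miss rates (and assuming the failure modes really are independent), the chance that every slice's hole lines up is the product of the individual rates:

```python
# Back-of-envelope Swiss cheese arithmetic. These per-event miss rates are
# invented for illustration only - real sensor failure rates are not public
# and failures are never perfectly independent.
p_camera_miss = 0.05
p_lidar_miss  = 0.03
p_radar_miss  = 0.10

# Probability every sensor misses the same hazard at once:
p_all_miss = p_camera_miss * p_lidar_miss * p_radar_miss
# 0.05 * 0.03 * 0.10 = 0.00015, i.e. 1.5 in 10,000 events,
# versus 1 in 20 for the camera alone.
```

Correlated failures (e.g. everything blinded by the same mud splash) weaken this, which is why the model is a heuristic, not a guarantee - but the multiplicative gain is why redundancy is standard in safety engineering.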

[deleted] 2025-08-27 11:02

[deleted]

gltovar 2025-08-27 11:05

It is important to note Tesla USED to be proud of the things they were doing with radar. A more modern take was touched on when they started down the ‘remove the radar’ path, noted in their video here: [https://www.youtube.com/live/j0z4FweCy4M?t=1h33m37s](https://www.youtube.com/live/j0z4FweCy4M?t=1h33m37s) This video and the subsequent years’ videos took place when they were still working with partial human rule sets in trying to solve the autonomous driving problem. It is wild to see all the systems they were trying to build to cover every edge case, and for anyone driving with v10 and v11 of FSD those were certainly rough times compared to how stable HW3 v12 and beyond have been (from personal experience). So part of the vision approach was the fact that, at the time Tesla was trying to solve the problem, consumer lidar sensor suites just didn’t exist in a cost-effective and/or aesthetically pleasing form. Add to that the component shortages of 2020, and it highlights more reasons why they are taking this path. But another thing they have talked about, and which users famously like to spotlight, is phantom braking. When radar was trusted more than vision, some HARD phantom braking issues were encountered. I remember reading articles about driving towards overpasses causing edge cases for this tech. Edge cases like this, along with other factors, are why they decided to go all in on vision. So I don’t doubt that if Teslas today had access to lidar along with vision, and they trained their network with both sets of data, they would achieve stellar results. But I think it is important to note that vision-only Teslas are achieving results that outperform some of their peers that do have lidar, which suggests the extra sensor data might be irrelevant. Add to that: which system do you trust when you have a conflict? Time will tell when others catch up, or if Tesla ever adds lidar to future vehicles.
Finally, I mentioned in another post that cameras by themselves are actually better than the human eye. When I have reviewed car cam footage from my drives (Tesla cams and decade-older non-Tesla car cams), I have been shocked to see that the video captures far more information/distance than I could see during very foggy and very snowy drives. I will say my HW3 cameras do occasionally seem to get sun-blind if I have a dirty windshield. Not sure how big a problem this is with HW4 cameras.

bungle 2025-08-27 11:06

Airplanes use three (or a larger odd number of) sensors for the same measurement so they can vote. Waymo’s argument is that the price of sensors will become a non-issue.

TheRuneMeister 2025-08-27 11:37

I think you are missing my point entirely. There are NOT multiple systems to be ‘trusted’. It is a red herring. I’m not arguing about what to do with the data, but about how you collect it. Tesla’s abilities are not tied to pictures of the real world. It is not actual ‘vision’. Instead, it creates a model of its surroundings. Sure, cameras are a great way to collect data for this, but that doesn’t mean accuracy cannot be improved with other sensors.

Valuesauce 2025-08-27 11:45

So if we are turning off lidar and biasing towards cameras, then why do we even have the lidar? If, in doubt, you will turn off lidar, what purpose is it actually serving? The whole point I’m making here is that more types of sensors means the decision about which to trust, or what to turn off and on, is now part of the problem. If we still mostly agree that vision is “king” in all these scenarios, why not just optimize the king and reduce the complexity of the problem, especially if the king is going to provide 95% of the control or more even if we do add more sensors? Why have backups that can cause confusion when incorrect, if the main sensor is still preferred in every situation unless it’s completely worthless - in which case you should probably just stop, like a human who can’t see would.

greyscales 2025-08-27 12:23

Tesla is doing tons of sensor fusion: Sensor fusion between multiple cameras, between cameras and GPS, between cameras and map data, between cameras and wheel speed sensors, ... Elon is just being stubborn.

greyscales 2025-08-27 12:24

Industrial robots that only rely on a single sensor for human safety?

pw154 2025-08-27 12:42

> And it is faster than mapping that's what you mean? Yes, faster and scalable. They're not mapping; they're verifying/training the system with lidar to improve the accuracy of the training data. I.e., if the vision NN says “the curb is here,” lidar tells you the actual curb position down to centimeters. That helps Tesla train the NN to get better depth perception without needing lidar in production.

pw154 2025-08-27 12:45

> But they clearly have been mapping for Austin? Not like Waymo. They're using it to verify the performance of vision-only perception against lidar sensor data, ensuring the NN can handle edge cases without relying on lidar in production.

Terrh 2025-08-27 12:57

Ok but when is someone going to make that car? The system in my Tesla is laughably worse at driving compared to a competent human in many situations. It's arguably better in a small handful. This is a system that Tesla previously stated was far safer than humans are.

Terrh 2025-08-27 12:59

My Autopilot has never, ever done that... in fact I've taken over in horrible weather because I was shocked it was still trying to drive the car in weather that was clearly no longer safe for the speed it was attempting to still go. And I'm willing to push a lot further than most drivers are - this was weather that the majority of drivers would likely either pull over completely for or drive less than 40MPH in. AP was still zooming along at 80.

Accomplished_Sky_899 2025-08-27 13:01

This is all a very good point. At the end of the day they have potentially found a way for Vision to still be “Better than a human” and it’s most definitely less expensive. Those are the two reasons.

macewank 2025-08-27 13:20

My experience with FSD in "fairly heavy rain" is that it doesn't let you enable it because one or more of the cameras has obstructed views.

hoppeeness 2025-08-27 13:37

Not sure when…not really applicable to the discussion.

thebiglebowskiisfine 2025-08-27 13:43

Industrial robots are usually physically enclosed, with light curtains monitoring parts entry/exit points (in the US).

SchalaZeal01 2025-08-27 13:57

We suck ass at driving for many more reasons than eyesight. Like trying to text while driving, or feeling like you're in Mad Max and trying to zig zag on the highway at 100 mph, while drunk.

SchalaZeal01 2025-08-27 14:00

The computer isn't distracted, incompetent (people who think road lines are a suggestion) or excitable (trying to race anyone, or ramming someone who cut you off); it's not texting, not talking on the phone, not having a loud conversation with the front-seat passenger. Those are much higher contributors to collisions than eyesight.

[deleted] 2025-08-27 14:13

>What do you trust? radar, lidar, vision. pick ***TWO*** of them. "Best 2 of 3" - second AND third opinions. [https://en.wikipedia.org/wiki/Triple\_modular\_redundancy](https://en.wikipedia.org/wiki/Triple_modular_redundancy) In [computing](https://en.wikipedia.org/wiki/Computing), **triple modular redundancy**, sometimes called **triple-mode redundancy** (**TMR**), is a [fault-tolerant](https://en.wikipedia.org/wiki/Fault-tolerant) form of N-modular redundancy, in which three systems perform a process and the result is processed by a majority-voting system to produce a single output. If any one of the three systems fails, the other two systems can correct and mask the fault. >Tesla chose just the one input. You ***KNOW*** your eyes start to suck in monsoon rains... but lidar AND radar say there's a sofa in the road. What's the speed of those approaching headlights in the oncoming lane? Is it safe to swerve into oncoming traffic to avoid a collision? Humans do not continually evaluate evasive maneuvers. **Video: Watch Waymos avoid disaster in new dashcam videos** [https://www.kron4.com/news/bay-area/video-watch-waymos-avoid-disaster-in-new-dashcam-videos/](https://www.kron4.com/news/bay-area/video-watch-waymos-avoid-disaster-in-new-dashcam-videos/) Lidar can see the deer in the dark through the trees. The same applies to a kid approaching the intersection on a bike behind the bushes. Eyes do not see that.
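The majority-voting system TMR describes is literally a 2-of-3 gate. A toy sketch, not any vehicle's actual logic:

```python
# Classic triple-modular-redundancy majority vote: three independent
# modules each report a boolean (e.g. "obstacle detected"), and agreement
# by any two masks a single faulty module.
def tmr_vote(a: bool, b: bool, c: bool) -> bool:
    return (a and b) or (a and c) or (b and c)

# Radar is blinded by clutter, but camera and lidar both see the obstacle:
tmr_vote(True, False, True)  # -> True
```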

vsl 2025-08-27 14:18

Nah, wipers are utter trash on refreshed model 3 with hw4 too.

Annual_Wear5195 2025-08-27 14:23

Much fewer than if every car is a driverless Tesla.

soapinmouth 2025-08-27 14:25

In what situation does lidar have better visibility than cameras? We're talking about light in both cases. Radar can see in some situations where cameras don't do well; not so much LIDAR.

soapinmouth 2025-08-27 14:26

Autopilot is like five-year-old software, and even then it's incredibly basic, lightweight lane assist. It's not a product with ongoing support or updates.

tuh_ren_ton 2025-08-27 14:26

The vision system's main output is a confidence score

Annual_Wear5195 2025-08-27 14:26

So.... It has issues in the blinding sun. You know, for a "perfect" system that *clearly* requires no other sensor inputs to function, there's certainly a lot of limitations.

Mountain_rage 2025-08-27 14:29

Lol, this imaginary car does not exist, but if Tesla ever builds it, just imagine.... Except half the population won't want anything to do with it, and that number grows as his meddling comes to light.

Annual_Wear5195 2025-08-27 14:30

A person cannot be a competent CEO and a competent tech leader. Those are _entirely_ different skill sets that have little to no overlap. And that's with one company. It is physically impossible for him to do that with multiple. If you don't understand this basic logic, then you have no hope.

Mountain_rage 2025-08-27 14:30

Didn't Elon famously make a statement that you need as much training data as possible to cover all scenarios? Why couldn't the AI use both sensors' data in its analysis? It's basically how so many models interact in all the AI we see today.

sfo2 2025-08-27 14:39

Well I’m convinced, then. Since poor visibility only causes some, but not all accidents, we can safely say that human vision is totally adequate for driving and thus there is no reason to pursue any other kind of sensor besides cameras.

Havok7x 2025-08-27 14:39

It's within reason, given the time, to figure out which system performs better. The issue is we don't have good insight into the differences as consumers. On the academic side, movement in computer vision has been fairly stagnant because it takes so much compute; it was fairly easy for me to achieve SOTA because of that stagnation. We don't know what these differences are because none of the large companies are working on even footing. In the end it was a money decision, not a performance one. Even with a competition like DARPA used to run in the desert, it still won't tell you whether cameras only or cameras plus lidar is better unless many, many companies enter.

Annual_Wear5195 2025-08-27 14:41

Waterloo and several other Canadian universities beat many of the top American engineering schools, so someone from Canada is likely more knowledgeable about this than the random EE you talked to.

sfo2 2025-08-27 14:43

I don’t know how to disagree harder with this. It needs to be way, way, way better than human drivers to be in any way an adequate replacement.

hawktron 2025-08-27 14:44

Why? If it’s as safe as humans, what’s the difference, except that with a human you have to pay more just to have a driver? It will open up affordable, fast transport to millions of people.

ILikeWhiteGirlz 2025-08-27 14:47

True, this is what Elon thinks, and I agree. But maybe we have a chance to be superhuman, like Doc Ock, and that is better - or at least helps until we can optimize vision to even equal human ability, which right now it doesn’t.

jtrader77 2025-08-27 14:51

It’s not ideal to use cameras and lidar at the same time. Indecision is less safe.

surloc_dalnor 2025-08-27 15:06

Yes

Igotnonamebruh42 2025-08-27 15:22

I think you ignored the fact that the human brain is far superior to the computer Tesla uses to process the vision input. It’s true that Tesla has the advantage of way more vision inputs than a human, but when it comes to making decisions, especially in edge cases, Tesla still cannot outcompete the best human drivers.

Bigtanuki 2025-08-27 15:59

Yeah. And my 2017 MS fender cams throw "camera blocked" errors when I drive up the VERY dark country road I live on. Based on searches, that's not unusual. We're talking about clear moonless nights, no fog or rain. Elon's hubris hasn't allowed him to back down from the position that cameras only can do the job. Because of free supercharging I'll drive the wheels off of this car but I likely won't buy anything else from Tesla.

[deleted] 2025-08-27 16:07

[deleted]

DoomBot5 2025-08-27 16:13

Last I checked, a $2 rain sensor can tell the difference between rain and a shadow. Tesla clearly can't.

drgmaster909 2025-08-27 16:14

https://www.reddit.com/r/AskCanada/comments/1kob9zk/where_did_the_canadian_girlfriend_joke_come_from/

MidWesternClipper 2025-08-27 16:57

Why do cars even have rain sensors? The only proper use of such a thing is to automatically raise the windows when parked.

DoomBot5 2025-08-27 17:01

Auto wipers. It's a great feature that Tesla absolutely fails at due to refusing to add a $2 sensor to the vehicle.

pw154 2025-08-27 17:19

> I think you ignored the fact that the human brain is far superior to the computer Tesla uses to process the vision input. It’s true that Tesla has the advantage of way more vision inputs than a human, but when it comes to making decisions, especially in edge cases, Tesla still cannot outcompete the best human drivers. Hence why I said "in the majority of circumstances". In edge cases, of course it can't. Which is why it's not an L5 ADAS but an L2 ADAS - essentially a human driver augmented with 8 cameras, which is still statistically safer than driving without FSD.

mailslot 2025-08-27 17:22

It can use a lot more than that. The term is sensor fusion and it involves using every usable sensor as input. You wouldn’t trust the cameras for speed estimation alone, you’d use GPS + dead reckoning as well. LiDAR, camera, radar, and ultrasonic all have strengths in various conditions. They should all be used.
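The GPS + dead reckoning pairing mentioned above can be sketched as a simple complementary filter - an illustrative toy, not any production implementation: the wheel-speed/heading prediction is smooth but drifts, GPS is noisy but drift-free, so each fix pulls the dead-reckoned position partway back.

```python
import math

# Toy dead-reckoning step with an optional GPS correction. The gain of 0.2
# is an arbitrary illustrative choice (a Kalman filter would compute it
# from the noise statistics instead).
def step(x, y, speed_mps, heading_rad, dt, gps=None, gain=0.2):
    # Dead reckoning: propagate position from speed and heading.
    x += speed_mps * math.cos(heading_rad) * dt
    y += speed_mps * math.sin(heading_rad) * dt
    if gps is not None:  # blend in the GPS fix
        gx, gy = gps
        x += gain * (gx - x)
        y += gain * (gy - y)
    return x, y

x, y = step(0.0, 0.0, 10.0, 0.0, 1.0)                 # coast 1 s east at 10 m/s
x, y = step(x, y, 10.0, 0.0, 1.0, gps=(19.0, 0.5))    # GPS disagrees slightly
```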

darkknight302 2025-08-27 17:50

Tesla’s rain sensor is garbage. It will randomly turn the wipers on if there’s a speck of water on the windshield. When you need it on in the rain, it barely works, then decides to go into overdrive and wipe like crazy in light rain.

[deleted] 2025-08-27 17:56

Musk says a lot of things; most of his stats and "facts" are pulled from his ass.

[deleted] 2025-08-27 18:10

[deleted]

CarlCarl3 2025-08-27 18:13

Checked in here for the first time in months and forgot how much this sub sucks since it went full hivemind. Someone should set up a Polymarket bet on whether Tesla or Waymo will achieve mass robotaxi first. I'd love to bet against all of you.

Winterwind17 2025-08-27 19:09

I don’t know how someone so uneducated gets the confidence to call others sheep. Fascinating.

Malcompliant 2025-08-27 19:40

But this isn't true either. Lidar is not particularly good compared to cameras at detecting non-reflective items like black tires, even on a clear night.

[deleted] 2025-08-27 19:43

[deleted]

whitemiketyson 2025-08-27 19:44

What does he say about my car's autopilot not working because it's too dark?

Downtown_Eye_572 2025-08-27 19:55

Apparently he doesn’t know what Bayesian fusion and Kalman filters are.
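For reference, the measurement-update step of a 1D Kalman filter - the standard building block of Bayesian sensor fusion - fits in a few lines. This is the textbook scalar form, purely illustrative:

```python
# Scalar Kalman measurement update: the gain weighs prior uncertainty
# against measurement noise, and the posterior variance always shrinks.
def kalman_update(x, p, z, r):
    """x, p: prior mean/variance; z, r: measurement and its noise variance."""
    k = p / (p + r)        # Kalman gain in [0, 1]
    x = x + k * (z - x)    # move the estimate toward the measurement
    p = (1.0 - k) * p      # posterior variance < prior variance
    return x, p

# Hypothetical numbers: fuse a precise lidar range (z=10.0, r=0.1)
# into an uncertain camera-based prior (x=10.5, p=1.0).
x, p = kalman_update(x=10.5, p=1.0, z=10.0, r=0.1)
```

Each additional sensor reading is just another call to the update, which is why adding a sensor to a properly written filter can only tighten the estimate.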

twinbee 2025-08-27 19:56

> He has never done any real technical work but he forced his name onto papers and gave himself the title of “chief engineer” or something like that. Nah, he knew nothing about rockets at one point, and then after much study came up to speed so much that he impressed a world-leading rocket engineer.

CarlCarl3 2025-08-27 19:57

okay, a fleet of 1 million fully autonomous vehicles giving rides, and not geofenced to specific areas within a state.

twinbee 2025-08-27 19:59

> You actually believe he’s a rocket scientist? Why didn't Boeing, Blue Origin or Lockheed Martin, with tons more money than SpaceX, accomplish a FRACTION of what SpaceX did?

cookingboy 2025-08-27 19:59

I can guarantee you he has never impressed even an aerospace engineering undergrad student. Being polite in order to not piss off an egotistic billionaire is not exactly a certification of approval. At most it’s one of those “oh you know surprisingly a lot for a person in a non-technical background, impressive” way.

twinbee 2025-08-27 20:11

Probably posted this link to you before, so apologies in that case, but for others: From this [thread of sources](https://www.reddit.com/r/SpaceXLounge/comments/k1e0ta/evidence_that_musk_is_the_chief_engineer_of_spacex/), numerous people express their admiration of Elon's engineering expertise. Here's one such quote: > "*Elon is both the Chief Executive Officer and Chief Technology Officer of SpaceX, so of course he does more than just some very technical work. He is integrally involved in the actual design and engineering of the rocket, and at least touches every other aspect of the business (but I would say the former takes up much more of his mental real estate). Elon is an engineer at heart, and that's where and how he works best.*"

Melodic-Ebb-7781 2025-08-27 20:12

Hahaha, is this satire? Arguing against having more information because if the sources conflict you'd need to prioritize? Call the president and ask him to axe the CIA - otherwise their reports might conflict with the FBI's.

No_Lie_8954 2025-08-27 20:27

If I drive in the dark, and vision says it is fine but the lidar screams that I will hit something, I would absolutely like my car to at least slow down.

IndividualWestern263 2025-08-27 21:11

What a stupid take

VintageSin 2025-08-27 21:30

You use heuristics and take the most heavily weighted system that is correct at the time of decision... Is this even a real question you're proposing? If an AI can't weigh two separate competing systems and decide which one is more useful to its scenario, then the systems alone wouldn't be able to operate either. If it's a vision-only system, it has three cameras with differing weights, values and opinions on the same scenario. If you add LIDAR, you're simply adding more options to the scenario. And yes, that means more processing power is required, but getting it right requires more processing power too. I've said it in a few threads already related to full self driving: the technology from all parties, Tesla included, is at such a level of completeness that you're only going to get marginal increases to match or better human-like responses. And as with all AI, the more data it has to crunch, the better its responses will be. Limiting it to a single system and being stubborn is antithetical to advancing the technology.

CleverNickName-69 2025-08-27 21:36

If your decision wasn't time sensitive that would obviously be true. More information would always be better. But in a situation where a delayed decision is just as bad as a wrong decision, too much unreliable information can make the system worse, not better. I'm not saying "cameras only" is right, just that "more information" isn't always better.

VintageSin 2025-08-27 21:36

The human eye isn't sufficient... Firstly, there is an entire brain behind it doing a ton of heavy lifting. Secondly, we have other senses that are pretty friggen useful for driving - the sense of touch, for example, and how we can feel what happens to the car while navigating. And thirdly, we don't rely on a singular system to drive a vehicle.

VintageSin 2025-08-27 21:39

Humans don't use a singular system to drive either. You can't detach the sense of feel and sound from a human. Even dead people can hear things while driving to some extent. Yes obviously sight is extremely important. But if we're say 90% there with fsd, we should be using as many sensors as possible to get the marginal gains to 100%. Not saying LIDAR specifically.

GodwynDi 2025-08-27 21:41

Brain is the software, which was my point. Secondly, the Tesla does have plenty of other sensors as well. The discussion was primarily on the visual navigation part of driving, whether lidar or camera.

VintageSin 2025-08-27 21:54

And we both know LIDAR isn't visual... Secondly, the software is a poor analog for a brain; software is basically the nervous system reading signals. AI neural networks are not anywhere close to general AI at this time.

Ambitious5uppository 2025-08-27 21:59

Black tyres... Yet it can detect virtually everything around them and fill in the blanks. Tell me, how often do you see a tyre without it being connected to something? And even if, in theory, a tyre comes at you on its own after flying off a lorry, the camera and radar would detect it - thereby proving the point again. No one system is better, but lidar alone is better than a camera alone. Though nobody other than Tesla is fitting just one sensor type, and nobody else is claiming it's too hard to get them to work together.

Moist-Scientist32 2025-08-27 22:10

There is no rain sensor…

barthrh 2025-08-27 22:29

It works the same way you do. You see, smell, hear, feel. You reconcile all of these inputs into a decision. Because you have multiple sensors, when it’s dark you still have input, and so on. Similarly, if you’re driving a car with parking sensors and one goes off, you look and decide if there is a threat. If you can’t see you probably trust the sensor. More input is better than less input.

NewRefrigerator4 2025-08-27 22:49

Doesn’t work if there isn’t any ambient light around either … like a backroad with no street lights. It’s a little ridiculous.

toumei64 2025-08-27 22:50

Cameras don't either, that's why they need to admit that removing radar was just a cost-cutting cheaping-out measure and bring it back. My 2017 S with EAP on radar used to drive 75 MPH in blinding rain at night.

GodwynDi 2025-08-27 22:57

You just reworded my succinct statement with dozens of words as if you had found something profound.

Quiet-Resolution-140 2025-08-27 23:04

What if your camera says it’s fine and it’s not and there’s no secondary source to tell the car that?

[deleted] 2025-08-27 23:13

[deleted]

BillyD70 2025-08-27 23:16

Exactly this. Use the publicly acknowledged system as prime source for data and the super secret squirrel second-tier (n-tier) system(s) for future proofing tech.

BillyD70 2025-08-27 23:22

Once “smart” cars start talking to each other (sharing travel data - geolocation, proximity to other vehicles, speed of travel, etc.), the AI models will get MUCH better.

toumei64 2025-08-27 23:26

Yielded no value? They didn't actually bother to do anything with it so I think "yielded no value" isn't really an accurate assessment. It's also my understanding that Phoenix radar hasn't been cut and is still being installed in S and X, they just still haven't been doing anything (that we know of) with it.

gltovar 2025-08-27 23:45

Fine - in the way most people understand lidar sensor usage, I have to focus on that point. But you surely understand how abstractly these machine learning models 'interpret' data. I concede that these systems in tandem likely offer more nuance in forming the 'algorithmic solutions', and I am not intimately familiar with what lidar's edge cases look like in a 'noisy' situation. My point is that, at the armchair-general level of understanding, people will say lidar can only help. My point is that in execution, if you built a system sophisticated enough to take in lidar and vision and output perfect driving without geofencing or special HD mapping, then that system is likely able to handle the task without lidar as well. It's my opinion that Tesla is demonstrating this, both in my first-hand experience using their product and in that massive IRL test case I linked in my original comment. Let's say that using vision alone you achieve a driver 5x better than a real human per mile - I think that is a massive win. Now, as with everything, once we are normalized to that kind of success, maybe there is an upper limit to vision-only performance, and to hit 10x or 20x of human driving skill you need an extended sensor suite. It comes down to how success in this space is defined. I think it is unrealistic to expect absolute perfection out of the gate. Honestly, given how terrible drivers we already are, a system merely on par with us is a massive win for the tech, let alone 1x-2x better. So I think it is important to define what success looks like, so we can agree on what we, as humans, are trying to achieve. It remains to be seen whether, broadly speaking, vision only can get to, say, 3x better than human drivers per mile, in a car that a normal person can buy and use in any situation an average driver can handle.

gltovar 2025-08-27 23:51

Yeah, I wish Tesla still did their AI days: [https://www.youtube.com/watch?v=ODSJsviD\_SU](https://www.youtube.com/watch?v=ODSJsviD_SU) They offered some glimpse into the technical process, even if there is a massive bias in the information since it comes from the company's mouth.

ItsAGoodDay 2025-08-28 00:13

Say it with me… how do they calibrate the vision inputs? By MAPPING the surrounding area WITH LIDAR and feeding all of that data into their vision model. Lmao, you’re so dense - trying to say Tesla is better than Waymo because they don’t rely on lidar mapping data, and then turning around and saying they use the lidar data they capture as input to their NN training data.

pw154 2025-08-28 00:47

> Lmao you’re so dense. Trying to say Tesla is better than Waymo because they don’t rely on lidar mapping data and then turn around and say they use the LiDAR data they capture for input in their NN training data.  Nothing to do with mapping mate, Waymo uses lidar to create high definition maps of their geofence and Tesla uses lidar to verify that the camera data captured is valid. Tesla and Waymo use two entirely different methodologies for autonomy. Waymo may be "better" within its mapped geofence but try to take one outside of its mapped bounds and see what happens. Tesla on the other hand can FSD everywhere. If you want some education about how the system actually works I recommend starting here: https://x.com/pbeisel/status/1831315999037210900?lang=en

Schnitzhole 2025-08-28 00:50

My 2026 model Y does just that. Plus seeing forward is kinda useful

Schnitzhole 2025-08-28 00:52

I have no idea why they don’t have IR sensors and emitters. That seems basic at this point.

OG_Particularmatter 2025-08-28 01:11

Agreed. It's such a proven technology that can "see" further than cameras.

SirWilson919 2025-08-28 01:54

Right now, Waymo uses about 10x the energy to operate its computer compared to Tesla. What happens when HW5 steps Tesla's driving computer up from 150W to 800W and also improves efficiency? 99% of interventions are caused by dumb AI decisions, not by sensor limitations. Waymo makes mistakes all the time because more sensors don't make your AI less dumb.

adorablefuzzykitten 2025-08-28 02:35

I expect frat houses to do more aggressive testing of capabilities than I have read about from any EV company or government entity.

itsmontoya 2025-08-28 02:50

My Tesla constantly slams on the brakes for absolutely nothing. I'd love a "dumb" cruise control

Wafflesnobbert 2025-08-28 03:00

He must have never tried to use his own FSD in even SLIGHTLY bad weather. Mine refuses to work if it's raining, foggy, snowy, etc. What a tool.

hoppeeness 2025-08-28 03:02

Also not the point. Half the population is 4 billion people.

Armchairplum 2025-08-28 03:09

I'd see it as a weighted input. We get alerts when camera visibility isn't good or a camera is obstructed. Why not then shift weight to the lidar component, and have it prefer camera vision otherwise? Recently I learned that basic cruise control won't function if the cameras are obstructed. Granted, I have an older Tesla, but it was interesting to note. I was driving in the morning, when the sun easily overwhelms the front cameras. I was not using Autopilot and the car doesn't have FSD (plus I'm in NZ, so it's not currently available). I would expect it to behave like the dumb cruise control in my 2014 Mazda 3 SP25.
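The weighted-input idea above can be sketched in a few lines. This is a hypothetical illustration only; the sensor names, confidences, and distances are made up, and no shipping driver-assist system is claimed to work this way:

```python
# Hypothetical confidence-weighted fusion; all numbers are illustrative.

def fuse_distance(readings):
    """readings: list of (distance_m, confidence in [0, 1]) per sensor."""
    total = sum(conf for _, conf in readings)
    if total == 0:
        return None  # no trustworthy sensor left: alert / hand over
    return sum(dist * conf for dist, conf in readings) / total

# Morning sun overwhelms the front camera: its confidence collapses,
# so the lidar reading dominates the fused estimate.
camera = (18.0, 0.1)  # glared camera, low self-assessed confidence
lidar = (25.0, 0.9)   # unaffected lidar
print(fuse_distance([camera, lidar]))  # ~24.3 m
```

The point is that a glared or obstructed camera need not force a binary handover: its vote can simply count for less.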

colinstalter 2025-08-28 03:20

my AP turns off in even the lightest rain. And won’t turn back on even when the rain is gone unless I manually disable wipers.

Relevant-Doctor187 2025-08-28 03:28

If 2 out of 3 systems say collision. You flipping stop. Vision will drive into an ACME tunnel and wreck itself.
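The "2 out of 3" rule above is just majority voting. A toy sketch, purely illustrative (real AV stacks typically fuse raw sensor data rather than vote on per-sensor verdicts):

```python
# Toy majority vote over per-sensor collision verdicts; illustrative only.

def should_brake(camera: bool, radar: bool, lidar: bool) -> bool:
    """Brake when at least 2 of the 3 sensors report an imminent collision."""
    return camera + radar + lidar >= 2

# Painted "ACME tunnel" wall: the camera is fooled, but radar and lidar
# both see a solid obstacle, so the majority says stop.
print(should_brake(camera=False, radar=True, lidar=True))  # True
```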

Igotnonamebruh42 2025-08-28 03:38

End users do not care about how the system works as long as it works and doesn’t cost them much. No one said Waymo’s AI is superior.

baldycoot 2025-08-28 03:39

Most humans have multiple senses. Elon doesn’t even have one.

Armchairplum 2025-08-28 04:14

What timestamp is where it failed?

Malcompliant 2025-08-28 04:48

They do fall off vehicles. But you originally said that lidar is better than passive optical in all conditions, I'm glad to see you no longer believe that's the case. I don't think the argument is that it's too hard. I think the argument is that the added latency to process multiple sensors offsets any potential safety gains. The fact that Waymo has been struggling on the freeway for years and still doesn't allow passengers on the freeway supports this point of view. Latency matters a lot more at higher speeds.

Ahhhhhmikey 2025-08-28 05:20

100% agree. I’d love to hear someone describe a scenario where lidar actually helps. The person in the middle of the freeway at night, hiding in a shadow then jumps in front of the car maybe. Ok, so lidar solves for the edge case, but then presents false positives. And then you’ll need to train the AI to understand when all the edge case false positives are indeed false positives… If theoretically a car with vision only can match safety of the best drivers in the world, that’s success. Theoretically that should be possible, because best drivers only use eyes and ears. Only after achieving that success would it make sense to try to solve superhuman safety. Anyway, I agree with you. Everyone is so fixated on lidar helping… If I had a lidar sensor in my car beep at me anytime it saw something, I’m not sure that’d make me any better of a driver. It’d make me worse until I could somehow master when the sensor is indicating a true risk vs not.

alang 2025-08-28 06:07

That’s right! And that one input can be fooled sometimes too, so really it’s better if you just use dead reckoning! Oh wait sorry what I mean is “humans’ vision can be fooled and so can their hearing so the only solution is to either blind or deafen every driver!” Er no I mean “people almost always have different vision in each eye so the only solution is to put out one of their eyes!” Er no wait I mean the left brain and the right brain don’t always agree so…

Sharp_Technology_439 2025-08-28 07:09

Humans use eyes and brains for driving. The Tesla AI is already better in some aspects, e.g. it can track a few dozen visible cars at the same time, but the cameras are still lagging behind our biological eyes.

Ambitious5uppository 2025-08-28 08:41

Waymo goes on the freeway, they don't allow passengers for regulatory reasons. Volvo uses 1 lidar, 5 radar, 8 cameras and 16 ultrasonics. They are the brand who put safety above everything. They clearly don't think latency is an issue. His argument is that it's too hard... And no, what I said doesn't change the fact that lidar is better in every CONDITION. Not perhaps every potential situation. My argument is always that you need more than one sensor type. As every other manufacturer agrees. Besides even if you really want to base your entire decision on tyre visibility. Your argument is only based on the low power 905nm lidar sensors. Volvo uses 1550nm sensors, and so will Mercedes and Audi. 1550nm sensors can see tyres.

ChoMar05 2025-08-28 08:54

Cameras can be affected by different things than lidar, like difficult lighting. Plus, it's not an absolute case. Reducing speed is an acceptable, human-compatible strategy when dealing with heavy rain, fog, or otherwise degraded conditions. That's the big advantage of using multiple technologies: you can use different approaches to solve different problems. It's not "either or".

maxipanda8321 2025-08-28 10:08

That's like saying you don’t want two medical opinions, because you wouldn’t know what to do with them.

TheRuneMeister 2025-08-28 10:16

Again, you are just plainly ignoring the whole point of my response. You continue to use terms like ‘systems in tandem’ and continue to indicate that these are 2 different systems. Same system, different sensors. Lidar in this scenario can simply be considered a different camera sensor. Just because it does better than other manufacturers in specific scenarios doesn’t mean that Elon is correct. It just means that it does better than other manufacturers who haven’t yet had time to catch up.

maxipanda8321 2025-08-28 10:23

The whole idea is to combine different systems that have different problems and benefits to create a completely functional system.

gltovar 2025-08-28 10:55

Ok, now you are just being pedantic. I conceded in my reply that these are not independent systems. When I say ‘in tandem’ I mean that we have cameras and lidar sensors with their own firmware operations that generate bits to output to some source. Many people have a high-level mental model that the software processes data from these sensors independently, each building some kind of world model in isolation, which would mean there is a chance of a clash when the models don’t agree. BUT you and I understand that these sensors just feed in their raw data, and the computer is trained without caring what the bytes represent: just a slurry of raw world information, and that slurry is treated as one input from which it generates the output that decides how to drive. To be crystal clear, I said ‘in tandem’ because these cameras and sensors are very likely third-party products from outside the car manufacturers, and on a component level they are their own mini-systems, a black box to the car computer in terms of how they ‘gather’ information to turn into data for the car. So to the point of Elon being or not being correct, chuck that out the window; he is probably the largest Dunning-Kruger individual in history. Regardless, we need to define ‘success’. Personally, if a car can drive 10% better than a standard human does on average per mile, I think that is a massive success. It feels like people expect this tech to drive 100% error-free to count as a success, which I feel is an unreasonable goal. I would be willing to concede to a higher level of driving scrutiny, like being 3x safer than a human driver; of course, defining that is a task in itself. Then if Tesla gets there, we know it is possible to solve the problem vision-only. If another manufacturer gets there with lidar and vision, or with any other combo of sensor systems, then we have proof of that solution.
If both do it, then hurray for everyone. It is worth realizing that if Tesla is successful, that would mean fewer sensors are needed, in which case the system is cheaper and more compact to implement. Also, the raw training data is easier to generate, as Tesla has shown off its Unreal Engine generation of training data. Having to generate raw data that also has accurate lidar artifacts for volumetric particulate like dust, rain, snow, and fog is likely a tougher problem to tackle than it sounds. But that is neither here nor there. Ultimately it comes down to: ”Do you believe cameras are the only type of sensor needed to drive safer than a human, and that this will succeed within 5 years?” I say yes; I feel like you might say no.

OlivencaENossa 2025-08-28 11:31

Wait FSD doesn't work in the dark??

miked4o7 2025-08-28 13:28

when a human is driving, it's possible to imagine hearing a noise that informs your behavior even if you don't visually see it.

vinylflooringkittens 2025-08-28 13:31

You could say the same about multiple cameras as part of one system.

Dr_Pippin 2025-08-28 13:57

Except two conflicting opinions are challenging to deal with. I've had clients come to me saying they don't want me to even look at past medical records and just start over.

Dr_Pippin 2025-08-28 15:16

> The system in my Tesla is laughably worse at driving compared to a competent human in many situations. You are radically overestimating the general population's driving ability. They stare straight ahead and as long as all you are asking them to do is drive in a relatively straight line, they can manage. Start throwing any curveballs at them and it all falls apart. "Competent human" behind the wheel is few and far between.

Dr_Pippin 2025-08-28 15:20

> Oh and before I get hate: my 22 Model Y has radar and ultrasonic sensors Does it though?

Dr_Pippin 2025-08-28 15:20

It does.

MidWesternClipper 2025-08-28 15:31

I don't understand how that is good, or why it's necessary. Every time I've driven a car with them, they are ALWAYS at the wrong speed, and the speed keeps changing, and it irritates the crap out of me. In addition it disconnects people even more from the task of driving, the only thing they should be doing. So many cars with no lights and wipers going paying zero attention to anything. I just believe a car and driver should be predictable, steady, and engaged. but that kind of thinking is "crazy" these days.

DoomBot5 2025-08-28 15:34

Your wants and thinking don't line up with anyone else's, or even with your own words. Yes, driving should be the only task you're doing. So why are you messing with headlight and wiper controls? Those should all be automated so you can focus on the road.

Terrh 2025-08-28 15:51

>You are radically overestimating the general population's driving ability. They stare straight ahead and as long as all you are asking them to do is drive in a relatively straight line, they can manage. That's basically what AP1 can do sooooo yeah. I agree that overall, especially in NA, we have pretty low driver standards and training.

Dr_Pippin 2025-08-28 16:02

> That's basically what AP1 can do sooooo yeah. Yup. And I like AP more than FSD at times because it just drives one bloody speed and lets me decide if other traffic is an issue, rather than the constantly fluctuating speed FSD now does.

luscious_lobster 2025-08-28 16:47

How about cameras?

[deleted] 2025-08-28 17:27

[deleted]

demodulation 2025-08-28 18:16

Exactly, phantom braking is annoying if frequent.

SirWilson919 2025-08-28 19:19

You were implying that lidar is the reason Waymo makes fewer mistakes than Tesla. Honestly I think Waymo would be way better off if they dumped their entire lidar budget into on-board compute/batteries and used vision only. We really just need smarter AI, and then this is a solved problem.

Malcompliant 2025-08-28 19:20

Please cite the specific regulatory reasons you're referring to. Waymo got regulatory approval for paid rides on freeways here more than a year ago, but haven't launched yet - https://www.axios.com/local/san-francisco/2024/03/05/waymo-self-driving-cars-california-freeway-approval Volvo does a lot of safety marketing, but Tesla is actually better when you look at crash test results.

Ambitious5uppository 2025-08-28 23:56

Ah, well there you go then: Waymo is even more ahead than we knew. Do you know how insanely easy it is to make a full EV do well in standard crash tests? I mean it's beyond easy. Without an engine to work around, it's all crumple zone. BMW also do well in crash tests, but we know they cut corners literally everywhere they can get away with in terms of safety. Yes, Teslas score really well, but it's not because they prioritise it; they just make sure they'll pass the test. Volvo prioritise safety from every single angle, at all different speeds, and in various scenarios. They have safety systems for accident types no other car has, and probably never will have until it becomes part of the NCAP/IIHS tests for public consumption. With that said, if you just look at the scores, there's no clear winner between them when it comes to the test data; they both get top marks in all tests across all models. So your statement that Tesla is 'better' really isn't true. One thing commonly noted in comparisons is that Volvo has a lot more redundancies built in, which aren't really something you can test for. Volvo safety is far from just marketing; it's absolutely ridiculous to suggest this. There is no other manufacturer who does as much testing and development as they do. But of course they talk about it a lot, because why wouldn't you? So ignoring all of that, because it's not really relevant... The point is, they (who regardless of your opinion on who is 0.x% safer in any given crash scenario, clearly care a lot about safety) managed to get multiple sensors working together, which Tesla either can't figure out, or more likely Elon just won't get his pride out of the way and admit he made a mistake (I'm betting on the latter, since everyone else has managed it).

YR2050 2025-08-29 02:22

So you're going back to human coding instead of AI, Smart. /s

cedarCrest76 2025-08-29 02:56

Don’t know about Tesla’s newer models, but on my ‘21 Model 3, FSD shits the bed when the sun hits the cameras just right; an edge case, but it happens often enough. There are solutions for these cases: other self-driving companies are using infrared for low-visibility situations.

Malcompliant 2025-08-29 06:38

I think your claim that Tesla doesn't prioritize safety is nonsense. I remember the Model X launch, when the first thing they talked about was safety, and they spent so much time on it that people complained. If safety is so easy in an EV, why is Tesla safer than other EVs in crash tests? And look at the actual data, not just the scores: Tesla is better, full stop. What do you mean by "managed to get multiple sensors working"? Last weekend I rented a Tesla with FSD and it drove > 400 miles on urban, suburban, mountain, and rural roads plus freeways perfectly with no input from me whatsoever. I cannot do that in a Volvo, because they haven't actually managed to get it working. They are far, far behind Tesla. They can't even stop at an intersection and then make a turn when the light changes to green.

BadRegEx 2025-08-29 08:29

A school project sounds totally comparable to solving autonomous driving.

VideoGameJumanji 2025-08-29 16:29

You guys are being stubborn in thinking that you can have a two-sensor array where, in certain conditions, one is crashing out and the other is working, and the system can decide which to trust. Your misunderstanding is thinking of vision and lidar as redundant systems for each other; they are not.

VideoGameJumanji 2025-08-29 16:32

I use it on farm roads with no street lights at the dead of night and it drives fine wth are you talking about

WelpSigh 2025-08-29 16:32

Sensor fusion technology has been demonstrated to work consistently in Waymo vehicles (which use three sensor types - also radar), which are doing a quarter of a million rides per week. You guys keep saying it's impossible, but that is no longer true. Machine learning has advanced and so has self-driving technology.

VideoGameJumanji 2025-08-29 16:34

Armchair experts on self driving are going wild in this thread. A senior EE project experience is a crazy way to justify what you are saying lmao, that doesn’t put you anywhere near the realm of knowledge or experience as their entire ai R&D team. If throwing sensors at self driving was the solution then waymo would be a trillion dollar company and their cars wouldn’t be dog shit

VideoGameJumanji 2025-08-29 16:46

I’ve been driving out between states in the sun for a 10 hour road trip last weekend and never got a sun warning and I’m on HW3

VideoGameJumanji 2025-08-29 16:47

Half the comments are guys going full armchair expert, claiming lidar sensor fusion is the second coming of Christ while ignoring the computational cost and complexity of training an AI model to also understand what it is seeing in lidar data. The guys claiming the car still needs radar need to get a grip.

wish_you_a_nice_day 2025-08-29 18:59

Good. You don’t want one. It is no fun. It is not a warning. It is an emergency take over

Business_Pollution38 2025-08-29 23:57

Aren’t there use cases where humans rely more on sound than vision?

gltovar 2025-08-30 02:21

Sorry, I think you have to be more specific with this question. Are you asking if a 'standard issue human' with good vision and hearing would outperform a deaf person in some edge-case scenarios? I'm sure that would have to be the case, but do we prevent the deaf from driving? No. Next question: would a person with good vision and hearing outperform a theoretical deaf human with 360-degree vision who can process what they see at all angles simultaneously? One can only speculate, but I would imagine the person without blind spots would perform better. Merging the two questions: the theoretical 360-degree human with hearing would likely outperform the deaf one. That answer doesn't address an important factor: how much safer is one over the other? Seeing as regular deaf people can drive, it is likely marginal. So this does highlight that more variety in sensed information will increase performance, but if a system can be built using standard cameras alone that drives safer than 'good' human drivers, that is already a massive success. There may be an upper limit to how 'safe' a camera-only system can get, but that might just mean diminishing returns for the cost of extra equipment and development. The question at hand becomes: can we build a standard camera-only vision system that drives safer than people within the next 5 years? Some people say yes, some say no. 15 years ago I would have been in the 'no' camp. Based on my day-to-day usage of an HW3 Tesla, using it to drive 90% of the time, I think they will pull it off. But I also expect we will see more than one solution to this problem from different engineering groups.

SirWilson919 2025-08-30 15:14

They are better than human eyes, the problem is the AI isn't always smart enough to know what it's seeing. Prime example is speed bumps. You can clearly see them, the car just doesn't understand that it's a speed bump

CyberaxIzh 2025-08-31 08:12

LIDAR can see through fog. It uses IR that is much less scattered by it (scattering scales as the fourth power of the frequency).
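For scale, the fourth-power claim above works out like this. A back-of-envelope sketch assuming pure Rayleigh scattering; fog droplets are actually comparable in size to these wavelengths (Mie regime), so the real-world advantage of longer wavelengths is smaller than this idealization suggests:

```python
# Relative Rayleigh scattering intensity, which scales as 1/wavelength^4.
# Idealized illustration only; see the Mie-regime caveat above.

def relative_rayleigh(lam_nm: float, ref_nm: float = 550.0) -> float:
    """Scattering intensity relative to green visible light (550 nm)."""
    return (ref_nm / lam_nm) ** 4

for lam in (905, 1550):  # the two common automotive lidar wavelengths
    print(f"{lam} nm: {relative_rayleigh(lam):.3f}x the scattering of 550 nm")
```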

CyberaxIzh 2025-08-31 08:15

> Waymos are ridiculously expensive per vehicle Waymo uses Jaguar I-Pace as the base, which is by itself $80k. And they need to add a ton of custom hardware to it. That's why "unit cost" makes no real sense when talking about the Waymo vehicles right now. Once they are mass-produced, the cost is going to decrease a lot.

CyberaxIzh 2025-08-31 08:28

The current state-of-the-art in LIDARs is chirped modulation, and continuous wave amplitude/frequency modulated LIDARs. They can disambiguate between actual obstacles and fleeting reflections from snowflakes. And they are _especially_ good at finding moving objects because they can do direct Doppler measurement. And since they're using infrared, they can do _better_ than visual-spectrum systems. And the best thing, they don't really need complicated hardware, just better detectors and some electronics to modulate the laser.
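The Doppler point is easy to put numbers on. A rough sketch with illustrative values, using the standard relation f_d = 2v/λ for a coherent (FMCW) lidar:

```python
# Doppler shift measured by a coherent (FMCW) lidar: f_d = 2 * v / wavelength.
# Values are illustrative; 1550 nm matches the long-wavelength units
# discussed elsewhere in this thread.

WAVELENGTH_M = 1.55e-6  # 1550 nm

def doppler_shift_hz(closing_speed_mps: float) -> float:
    """Frequency shift of the return from a target closing at v m/s."""
    return 2 * closing_speed_mps / WAVELENGTH_M

v = 30.0  # ~108 km/h closing speed
print(f"{doppler_shift_hz(v) / 1e6:.1f} MHz")  # ~38.7 MHz
```

Even modest relative velocities produce MHz-scale shifts, which is why moving objects stand out so cleanly to these detectors.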

a9udn9u 2025-08-31 15:23

I really don't care which works better under what circumstances. Tesla is doing pure vision, the Chinese are doing lidar + camera. I'm 100% sure there are pros and cons in both paths. Whoever makes it work first is correct.

soapinmouth 2025-08-31 15:43

Do you have a source for what you are referencing? My understanding is that depending on severity performance is still impacted by fog and rain, it can not as a general statement "see through it". But yes it can do better than cameras depending on the specific LIDAR unit. Radar on the other hand does see through it.

CyberaxIzh 2025-08-31 16:23

LiDAR is affected by the rain and fog. It's still light, and it still gets absorbed. In general, they are affected about as badly as any vision-based systems: https://pmc.ncbi.nlm.nih.gov/articles/PMC10051412/ But longer wavelength LiDARs are less affected by fog than vision: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9076331 (page 7066, first chart).

SynAckPooPoo 2025-08-31 17:42

Waymo works fine in dust storms. https://www.reddit.com/r/waymo/s/31Q6N0gwvX

godnorazi 2025-08-31 17:46

Thats the point of software (AI)... the more data it has to work with, the better the decisions it can make

Sudden-Ad-1217 2025-08-31 21:07

LiDAR in fact works better than a camera in most conditions. That's just straight-up fact. Elon's continued downplaying of LiDAR is getting old.

[deleted] 2025-08-31 21:11

[deleted]

Sudden-Ad-1217 2025-08-31 21:20

Mark Rober has entered the chat…..

nhorvath 2025-09-01 14:21

you're not using one or the other, you are combining data to make better sense of what the sensors are seeing under different conditions.

jv9mmm 2025-09-02 00:12

Lidar systems can be very expensive, which is the real reason.

yangcj 2025-09-02 02:46

Because humans use eyes, cars should use cameras? Then humans use legs to run, so cars should use legs too?

noiamholmstar 2025-09-02 15:44

I've seen both. Sometimes it continues working in conditions where I can barely see the road, and other times it panics at drizzle. Granted these were at different software versions, so not necessarily apples to apples.

MidWesternClipper 2025-09-02 16:05

The less you think about the car and what it's doing, the less engaged you are, and the less you actually think about driving. Besides, auto headlights really seem to be batting zero on grey days and rainy days. If you're going to make auto headlights, skip the light sensor: just have the headlights be on/off with the car, or on/off with the car in D.

DoomBot5 2025-09-02 16:09

That's not how it works. In reality, the more you're engaged with these things, the more distracted you are from monitoring the road for hazards. Your reality does not match up with actual reality and laws that govern these systems.

jrherita 2025-09-11 09:37

My Tesla's cameras don't have floaters in them

MidWesternClipper 2025-09-13 15:35

I've been driving for 30 years, and it wasn't until these systems became the norm that I started to see nearly 10-20% of cars with either no lights at night or high beams on ALL the time, and 80% of vehicles without lights on during the rain (40% if you include DRLs, which only illuminate the front). Part of the blame is how POORLY the carmakers are implementing these systems. Again, the light sensors. But far worse are the dashboards that now look exactly the same with your lights on as they do with your lights off. After a while they added a little light to tell you your lights were on. Back in the day, if it was dark and you could not see your dashboard, your lights were off. That is how God intended cars to be! We even had digital clusters in the 80s that worked the same way.

Teslaaforever 2025-09-22 01:00

Meanwhile I was cleaning my windshield and got ready steering and FSD turned off in day light 😅

QaunainM 2026-01-11 09:32

Tesla FSD also can't handle heavy snowfall, so it's a bit pointless for Musk to roast lidar when his camera-based FSD has the exact same flaw.

xepion 2026-02-18 17:08

Well this aged well. Tesla is now adding lidar to the cars…. Looks to have been a cost decision after all

gltovar 2026-02-18 23:14

Source? The newest article I am finding is for an FCC filing for updated mm wave in cabin radar (passenger occupancy, not driving). Older lidar articles showcase various models with roof racks full of equipment including lidar sensors. Speculation is that this is to validate the camera only system's interpretation of world space, but is just added to engineering validation cars in their fleet, not for consumer production.
