Well that’s awkward.
And this is why FSD that “you need to supervise” is fucking useless.
Credit to them for still posting the footage
Wait, but Leon said in 2016 that ["a driverless Tesla will travel from L.A. to NYC by 2017"](https://share.google/4JkR4AaE5DkXF5tI1)? Including charging! That was exactly 9 years and 2 days ago and Tesla hasn't even "solved" wireless charging. To be fair, Musk **did** manage to open the burger joint he first promised back in 2018.
It's a Fucking Shit Device
I implore you all to watch the crash with audio, I actually laughed out loud
Honestly, I turned mine off a couple updates ago. It just became too unreliable—swinging way over or under the speed limit, drifting toward the right-side lane markings for offramps then correcting wildly, and even reacting to shadows from passing cars or trees. More than once it slammed on the brakes as if there were an obstacle when nothing was there.
🎵 sad trombone 🎵
Lidar would have easily seen that. Musk is a clown 🤡
What’s more stressful? Driving leisurely yourself or hovering your palms over the steering wheel, keeping your heel torqued up over the brake, and looking at the screen to continuously make sure it isn’t about to do something stupid?
Exactly. Anyone who has supervised a learner driver knows this. Except there are cues when a human learner is out of their depth. FSD will confidently drive into or over shit without warning.
Both people saw the object 8 seconds prior to impact lol
>Tesla influencers

Well, that's the final determination that we've failed as a society.
5 to 10 times more views after 2 days than most of his other videos after a month. Engagement going through the roof. Lots of viewers who never heard of his channel. Probably a couple of new subscribers. Free pr everywhere, including this sub. Pretty sure Bearded Tesla is stoked he hit that thing.
\*articulated seeing the object eight seconds prior to impact. They had to have seen it before that to formulate their response.
I've used FSD all the time for the past 2 years. I will say, 2 years ago it was not very good. Recently, I have been traveling around 240 mi round trip using FSD. There are some times when I disengage because I am not sure how it's going to react. Overall, I think the system works awesome; I normally never have any real issues with it. It seems to get better and better with updates. Of course it's not perfect yet, but it's definitely heading in the right direction.
lmfao
People are roasting them, but their eyes may have fooled them into thinking the ladder was flatter than it looked, and they didn't realize how large it was until it was too late. That, combined with how FSD really does change your behavior (i.e. you let FSD do its thing and assume it's gonna correct itself, so you don't really know if you should take over), plus watching them go flying, just makes this one of the funniest clips ever.
Well said. Before my Model S got totaled I had gotten to where I drove myself anywhere but on highways. Just easier mentally to go on and do things myself.
Again, to be fair, that burger joint is a complete mess and is now on life support [https://www.jalopnik.com/1938650/tesla-diner-drops-most-menu-options-cuts-hours/](https://www.jalopnik.com/1938650/tesla-diner-drops-most-menu-options-cuts-hours/)
Then don't use it. We all have a choice. You only want to complain, and that's ok. If you didn't complain, most wouldn't comment.
How do you feel that you still have to monitor FSD when it is engaged? Isn't the point of FSD that you can give up control to the car and software?
A piece of cake for lidar.
It's not Tesla's fault. It turns out that driving is a very complex and difficult task. It's why the human brain requires high levels of attention and focus and why people shouldn't talk on their phones while driving. Tesla could not have known this. They thought it was comparable to say, an iRobot bumping around the house. Everyone needs to cut Elon some slack. He'll figure this out and they'll have it ready in say, I don't know, about a year.
Imagine saying that TSLA's valuation could reach [$20 to $30 trillion](https://share.google/puQl3s2rwe8HZKH8Q) yet not being able to properly run a fast food restaurant.
Restaurants are such a low margin, high competition business. I guess it was worth a shot, but it looks like the writing is on the wall. Elon needs to focus on cars and batteries.
🤣🤣🤣
The failure is simple. Robotic systems are good in sealed environments with known hazards. They fail to cope with open environments with changing & random hazards. Humans have evolved to run multiple levels of environmental observation, tracking, on conscious & unconscious levels, data we can’t even program a robot to check, because we don’t realise all we do!
Yeah. I got downvoted galore for saying that using FSD was exactly the same, in my opinion, as driving. My hands are still "on the wheel" as far as I'm concerned because I'm always looking out for something to go wrong. I don't wish anyone harm, but I can't wait to see unsupervised Teslas dead on the shoulder of the highways and passengers calling an Uber to come get them.
Not even being able to hire someone else to run a fast food restaurant in one of the biggest cities in the country, either…
Why should everyone wait for Elon to solve it when the existing lidar solutions (Waymo, Apollo Go, etc.) can easily handle things like this?
I use it every day, going on 6 months. I stay attentive and let FSD drive the car. Might have an issue every 100 miles of driving, and never a problem on the highway. It's not perfect but it's pretty good, better than any of the lane assists on any other car. I have a '26 Model Y Launch. Might be the HW of the car I'm using FSD on.
I was joking. Everything with Elon is always "one year away".
As long as Tesla changes the name and owners assume the fiscal responsibility of driving a faulty system that constantly changes. Have at it.
> and they'll have **it ready in** say, I don't know, **about a year.**

This is essentially /s.
The experiment wasn't to see if they could react. It was to drive coast to coast on FSD. This is failure two.
Reduces the driving workload but does not eliminate it all. Supervised is Supervised. Stay attentive and be prepared to take over if required. Realize the technology limitations and use it as you want. Been using FSD on two different cars over 5 years. It gets better every year.
>There are some times that I disengage because I am not sure how it's going to react...

And you just nailed the problem with using an AI black box in a safety critical system. You can never be sure of how it's going to react, because it isn't necessarily going to react the same way every time. Without that ability, it can never be part of a viable product.
>My hands are still "on the wheel" as far as I'm concerned

...and as required by Tesla's operating instructions, plus some local driving laws and common sense. Anyone using FSD (Supervised) without their hands on the steering wheel is endangering themselves and everyone around them on the roads.
> the Model Y had a broken sway bar bracket and damaged suspension components To be fair, the car could have left the factory that way.
>We all have a choice.

Bullshit. Branch Elonians endanger the rest of us every day with this junk.
FSD you need to supervise is D
To me it seems more difficult than driving. In addition to processing all the normal stuff outside, the "human supervisor" has the added unknown of trying to predict what the car might do.
Tinfluencers
Oh, I'm sure they give up control, whether they'll admit to it or not.
I wonder if that would cover the deductible + raised premiums b/c he probably totaled his car for it.
You make a lot of excuses, but when you see something, you slow down, and if necessary, go around it. You do not plow into it at full speed.
Imagine if your human driver had 'issues' every 100 miles, you would find a different driver after the first problem!
Even if you're driving a Tesla? Not in my experience! 😉
I swear I fell down a ledge, hit my head and woke up in an alternate dimension.

>In 2016, Elon Musk infamously said that Tesla would complete a fully self-driving coast-to-coast drive between Los Angeles and New York by the end of 2017.

>The idea was to livestream or film a full unedited drive coast-to-coast with the vehicle driving itself at all times.

>We are in 2025 and Tesla never made that drive.

>Despite the many missed autonomous driving goals, many Tesla shareholders believe that the company is on the verge of delivering unsupervised self-driving following the rollout of its ‘Robotaxi’ fleet in Austin, which requires supervision from Tesla employees inside the vehicles, and improvements to its “Full Self-Driving” (FSD) systems inside consumer vehicles, which is still only a level 2 driver assist system that requires driver attention at all times as per Tesla.

Electrek is now saying stuff like this? What happened, Fred? Never got his Roadster from all those referral points, did he?
Full steam ahead!!!!
Regardless, if I see something in the road I try my best to drive around it. If your theory was correct, there’d be a line of cars pulled off that just hit the ladder.
Predictability really is important; that's when most human on human crashes happen: when someone does something unexpected, and then you have this entire unscripted melodrama about how things will unfold with fast moving objects in a limited space and unscripted reactions.
They also admitted 100% fault to insurance as well.
So you use it as a good cruise control?
Watch the video. The camera-based FSD simply did not register the obstacle when the drivers did. And the Tesla isn't better than any other car. That claim is objectively false, as many car magazine tests have shown, especially in Europe. Tesla is always behind carmakers like Mercedes, Kia or BYD, because camera-only FSD is shit.
Debris on the road, that's an edge case!!! That'll like almost never happen. /s
The problem is the rest of us *don't* get to choose when you put our kids' lives in danger because your fucking space car doesn't know how to stop for school buses.
Wait until they learn about fog, snow, heavy rain, night - or even pedestrians and cyclists!
Muskfans have been hating on Fred for years because they believe he’s anti-Tesla.
Later this year.
>Might have an issue every 100 miles of driving

I fail to see the point of FSD, if it doesn't...well: actually work.
It's the same as the Cybertruck "crossing" the Rubicon Trail video. Even if it's a complete debacle, their identity/income source is Teslaworld so not posting it is objectively worse for them. If not for content that they try to monetize, they're just sad people who are all on Elon's nuts for some reason.
judging by how they ran on the shoulder of a highway, I can't compliment them on an overabundance of intelligence, hence why they posted this.
> Tesla hasn't even "solved" wireless charging. He meant the snake chargers, that would auto-deploy when you pulled up to a tesla charger.
because the cop told them that, in order to claim insurance they'd have to have a police report, and the officer said they were at fault.
A traditional radar sensor would be enough.
We all get feedback loops with the videos recommended to us, etc...so it might seem like he's getting a ton of views. But I looked at a 6 day old video of his - 9.6k views and 85 comments vs "Part 1" of the coast to coast drive - 7.1k views and 69 comments I guess "Part 2" is where he hit pay dirt, with 50k views and 446 comments, but I wouldn't call that "viral". His subsequent videos have already died down to under 10k views, so this was a fleeting moment for his channel.
Thanks for this. That was pretty damn funny.
Crashed before 60 miles = Rise 4% in stock.
100% this. I've tried the FSD trials.... it's incredibly stressful. Much more so than driving myself.
>I stay attentive

No, you don't. I'm sure you think you do, but these two jackasses were "paying attention" and watched the car drive right into an obstacle that they themselves had noted with more than adequate time to react. System monitoring is not even a little bit the same as driving.
Compare the Mercedes Drive Pilot to Tesla FSD. There are few opportunities to experience Level 3, as it's mostly lane assist. I'm not saying FSD is perfect. It's not, but it's pretty good, and yes, better today than Mercedes Drive Pilot. Kia isn't even close.
And I agree. Camera is not the best solution for FSD for several reasons
That's not paying attention.
yeah I put 2000 miles on it this June in a week. It was fine, but the OP scenario is my #1 fear since I know it doesn't dodge potholes etc. yet.
Simple solution: don't pay for it and don't use it. Saves a lot of time complaining about it.
That's why the stock is up ONLY 4% today./s
If it's so useless, don't use it. Tell me another car with technology as advanced as that. Yes, it is still a work in progress. They have come a long way since the beginning. It will take some time, but there's no other car company that has the technology like Tesla. You may not like him or his products but truth is truth
The cherry on top is that owners pay a subscription for FSD and own all the liability.
Ride across country with hands off wheel - take a Greyhound bus. Ride across country with hands on the wheel - take an ICE or hybrid and not the Tesla.
I only use autopilot now, it’s easier to just let it handle the easy stuff
And this is why, 5 or 10 years ago, electrek was a useless publication, when it was only Elon Elon Elon.
If you actually watch the video… the driver is at fault here. They thought it was “roadkill” and wanted to drive over it. Even then, they could have switched lanes once they saw what it was… They were being idiots.
Faux Self Driving
Exactly. They were being idiots.
Car ran over road debris. Car continued down the road for half a mile in its own lane until the driver pulled the car over. Damage was done to the suspension. A quick fix at a muffler shop had them going again, but it appears the charging system was affected, so they are taking the car to a Tesla service center for a complete examination. End of story. Geesh!

And as usual, people always assume an unfortunate accident would have been avoided if they were in control... but there's no reason to believe that the driver would have done anything different. And if he did, he could have tried to avoid the debris, run directly over it, unlike FSD, cut a tire and run off the road. The bottom line is that the car continued safely down its lane after running over the object. Not all encounters with road debris end so well. If it had been a plastic bag blowing out from the side of the road, would you want the car to swerve violently to avoid it? ...No.
Full Self Derrr Fi-sting
If I were the insurance company, I'd refuse payment on the grounds of gross negligence on his part.
Diamondhand bagholders
Thankfully it's just a (mobilized surveillance utility) car and not a spaceship from Frederik Pohl's Gateway.
Well that wouldn’t have been FSD. The entire point of their trip.
I'll never understand this pseudo-FSD thing. I actually enjoy the act of driving, so I can't grasp the notion of "hey, the car is driving itself, but not really, so you can just sit there but be ready to take control at any random moment." Why not just drive normally, if the computer could randomly drive you off a cliff?
> people are roasting them but their eyes may have fooled them in to thinking the ladder was flatter than it looked.

They're roasting them because only an idiot runs over *anything*. I don't run over obvious cardboard, if there's any possibility I can safely miss it.

>And they didn't realize how large it was until it was too late.

Which is why you don't wait to react. Brake hard and look for a place to direct the vehicle. Don't spend seconds analyzing what it is; assume it's dangerous and react accordingly.

>That, combined with how fsd really does change your behavior i.e. you let fsd do its thing and you assume it's gonna correct itself so you don't really know if you should take over.

Great, so put more trust in a fundamentally broken autonomous system, rather than trusting your own judgement to be careful.
FSD has really gotten better with HW4 cars. No I'm not a fanboi as I have been kicked out of all other Tesla subs. But you still have to watch for things on the road. These guys are idiots.
Snow, they don't get snow in Texas!
Musk is a fraudster that has too much money & power. End of story.
None of the other road users agreed to be beta testers...
Why? They're an AI and robotics company LOL
I wonder how many Twitter influencers are familiar with the snake charger.
Well, it drove itself. Promise fulfilled!
Absolute Flim Flam. I'll give them props for posting the video anyway.
**[A Driverless Tesla Will Travel From L.A. to NYC by 2017, Says Musk](https://www.nbcnews.com/news/amp/ncna670206)** (2016)

> By the **end of next year**, said Musk, Tesla would demonstrate a **fully autonomous** drive from, say, “a home in **L.A.**, to **Times Square** ... without the need for a **single touch**, including the charging.”
>The hard part has just started. And there’s no telling how long it will take to get there. If someone is telling you that they know, they are lying. I don’t know. ***My best estimate is approximately 2-3 years and a new hardware suite.***

Oh, Fred. <shrug>
Rented a Nissan Rogue with lane assist, and have to say I prefer to drive the car myself than deal with the lane nagging and wheel twitching under my hands. Think I'll wait for something a bit beyond level 2, thanks very much.
Thanks for risking the lives of the drivers around you!
Do you use this braindead logic for anything else or is it only when it comes to Tesla? Why would we not talk about something that might affect all of us?
Anyone in this group actually ever own a Tesla? Never seen this group before and wondering what all the hate is about. They are 1 of the top 3 most purchased cars in the US and likely the safest automobiles on the road. I don't get it. Is this just more jealousy?
And the stock goes up.
No, supervised autonomous driving is one of the necessary steps towards progressing to actual FSD. Tesla calling it FSD, sure that is useless. But this concept itself is not.
If you actually read the article you'd know the point was to use FSD only, due to Musk's promises in 2016/2017. Running this test 8 years later, FSD only, was to test how BS his claims were. If the object were so easily detectable, the car should've easily maneuvered around it. Yes, they could've manually avoided it, but that screws up the whole premise.
Up 4%.
Fair point. Fine. lol
Road debris, ftw....
They lied to the officer and both said they thought it was a piece of paper in the road, so unless the insurer sees the video where they see the debris way in advance and say they think it's an animal, but choose to run it over anyway, they should be good.
(Jeremy Clarkson: "Oh no! Anyway.")
Oh, so they filed a false report with the police, recorded it, and posted it on YouTube? For fuck's sake, they just keep getting dumber.
Appreciate you considering a different view!
I’m convinced the only people that aren’t stressed by it are people that are just getting their licenses. Half the positive reviews online can’t be real people. Unless someone posts their vin and receipt with their “i love the car” post I don’t believe them. Tesla has tons of employees that they ask to write puffery reviews under multiple accounts.
Just inject some tax dollar funding from SpaceX rerouted through "totally real" people's PayPal and DogeCoin accounts, and oh yeah...
The biggest problem Tesla has with “Full Self Driving (Beta)” is its name! If they simply gave it the honest name “Tesla Advanced Driver Assistance”, 99% of their marketing and regulatory problems would disappear. Their leader and marketing team are just not willing to do the right thing here.
Musk garglers
They lied to the officer verbally, when asked if they considered moving out of the way, but I think they declined to have a police report written at that time. The officer said he'd be happy to write one, but suggested contacting their insurance company first and seeing what they wanted. Either way, it could violate California Vehicle Code §31 - False Information to an Officer. They might slip out of it by doubling down on the lie: claiming they did think it was paper, but only said the animal theory out loud, each keeping their personal paper theory to themselves.
He started this when he was 44 years old. He's now 54, eligible for an AARP senior discount. He'll be eligible for Social Security before this is working.
That's great, that means they have a real chance to make it in Panama
They're Elon acolytes in California. I'm guessing they're on Tesla insurance, which is paying out $1.21 for every $1 it collects in premiums. If there was ever an insurer with the incentive/data to drop coverage for gross negligence, it'd be Tesla.
What kind of person becomes a Tesla influencer?
They know it doesn't work. It's just to push blame off them.
Autopilot often feels like supervising a student driver who lacks an instinct for self-preservation. Stressful.
It's also an oxymoron. "Full Self-Driving (supervised)" is like saying "Free Beer! ($9.95)"
Well, we do know that a human driver without "FSD" could easily strike this object since we have evidence that one did (the trucker who hit it after them). It's hard to tell from the video since I feel like wide angle videos tend to flatten things, but it looked to me like it really blended in with the road and could have perhaps been a bit of an optical illusion, like the shape of the ladder was weird (a really flat rhombus), and since we seldom see things that shape it could be hard to make sense of what it was. Still a massive fail for Tesla, but I think it's more embarrassing for the claims of "FSD" than it is for these two guys specifically.
The guy's name is "BeardedTeslaGuy". People who make a beard or driving a Tesla their personality are already insufferable, let alone together!
Musk propped it up with a billion dollar buy - basically can't go tits up!
Probably not, but remember when someone intentionally crashed a plane in a fake accident for "content"?
Almost 10 years ago (2016) Tesla released a video of a Tesla car driving autonomously with a caption that said more or less that the only reason why there was a human driver in the car was because regulations required it. It's 2025 and there have been cars from other companies allowed to operate with ZERO human employees inside the cars for a few years now. Tesla still can't do that, not because they are not allowed, but because their tech is not capable of doing that. Let that sink in.
Other than the Nazi thing, this self-drive is a pipe dream and Tesla are no longer leaders.
If this doesn’t work on highways, how can it work in city driving?
It is easy to post a video where it completed a trip successfully. But I suspect for every video where it failed, there are 100 that people simply did not post. After all, who wants to post their failures?
Break and gas in case of phantom breaking
It doesn’t
If you have issues every 100 miles with it, that shit's nowhere near ready for unsupervised use. It'll likely never be.
Well I know here in Australia now I’m an unwilling participant to its testing, just from the fact I have to share the road with Teslas.
Man, how much did they spend on that thing just for it to be a failure?
certainly not a road kill there
Not on life support. Tesla sycophants still going like weirdos to enjoy a weirdo experience while bothering the neighbors who have to live with the 24hr monstrosity. So genius to combine old style diner experience with futuristic (failing) tech. Really novel, no one ever thought of that.
Honestly pathetic for Musk to keep bragging about his technology. Guy is smart, but his ego is his detriment. When will TSLA fold like the house of cards it is? At this point it's mostly lies and speculation bolstering its valuation.
It's been noted for years now that Level 2+, Level 3, and to some limited extent imperfect Level 4 autonomy is actually the most dangerous type of autonomy, precisely because it's "smart" enough to disarm you. It *can* drive itself in limited extents, and in some niche cases even end to end. It makes you think it's more capable than it actually is. And as a result, you wind up more on edge trying to prepare yourself to take control, or you trust it too much and fail to realize when you're in danger.

Level 1 stuff like cruise control was perfect because it didn't require *not* driving. You were still in full control of the vehicle; the vehicle simply automated certain aspects related to basic functions to make it easier and *less* brain taxing. Lane assist was also good because it could help keep you from veering into traffic or losing track of the road.

Level 4+ and 5 will be cool for the opposite reason: you genuinely *don't* have to drive, since it might as well be a rolling embodied AGI, like an actual digital chauffeur. It's like worrying about controlling a tram or train cart as a passenger.

Level 3, though, all the middle levels, is just conceptually awful. It's way too taxing on the brain, relies way too much on fooling you, and is too easy to trust with awful risks.

It's actually very similar to the current state of AI as well, with how LLMs are just good enough to convince you they're intelligent but are so brittle that even wording the same prompt slightly differently can result in wildly different levels of quality. But people unaware of these limitations get disarmed and think it's actually intelligent, or they get frustrated but keep insanely trying to make it work because it *can* sometimes work.
It's more the antsy "do I, do I not" of Level 2+ and Level 3 autonomy, because it's like a Potemkin village version of autonomy (how many times have I used the term Potemkin village to describe contemporary AI these days?). It *can* drive itself... but only in niche conditions, and if even a single thing breaks, you have to take over. And the ephemerality of life means you can never know *when* that break will happen. But because it can appear to drive itself well often enough, you get disarmed, relax, and lower your guard. And then *bam,* now you're paying both insurance premiums and doctor's bills.
Ok. It’s useless to the end user. Training the thing to create a usable product is Tesla's problem, not the customer's.
There are science experiments of linked hamster wheels that show that the hamsters in control of the wheel have far less stress than the hamsters that don’t have control, despite undergoing the same exercises and stressors.
That's like getting hit by a wave in the ocean, a chance of one in a million!
Brother didn't even read the post title.
Right. FSD is absolutely pointless.
OMG! What's more stressful, driving yourself or using adaptive cruise control? Yup, it's the former. Self driving isn't what is advertised, but it's not the nightmare you imagine, as it's mostly just ACC. The branding of the product and its price are the problem, not the comfort it adds to highway driving.

>What’s more stressful? Driving leisurely yourself or hovering your palms over the steering wheel, keeping your heel torqued up over the brake, and looking at the screen to continuously make sure it isn’t about to do something stupid?

Sheesh
When will Tesla fanboys wake up to Elon chronically over-promising and under-delivering!?!
Brake*, braking*. Unless you're talking about the brakes breaking
Tesla's self driving abilities are superior to a ~~human~~ stupid driver's.
What the hell is a potemkin village anyways?
Snake charmers? Snake oil charmers? Snake oil salesmen?
I think it will be Indian guys guiding the charging cable to your car with a flute
😆
Insurance companies probably use AI lawyers, who are about equal to self-driving cars. AI will make some people a lot more efficient, and that will displace a lot of workers, but it won't completely drive people out of the market.
Automotive engineer here… I really doubt that a simple automotive radar setup would have detected this. There might be a solution, yeah. BUT generally the ADAS radar systems used nowadays ignore objects that have zero velocity (are stationary) and don't have the dimensions of another vehicle. They do this to prevent phantom braking. Yes, I know it sounds counterintuitive, but this is how it actually works. The other thing is that they just omit (don't even detect) objects directly on or near the surface of the road, because those would also lead to a lot of phantom braking. This metal object in the video could have been detected more reliably by a LIDAR. In fact I'm 99% sure that only a camera + lidar combo could have reliably detected this as a dangerous obstacle.
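The stationary-object filtering described above can be sketched in a few lines. This is purely illustrative: the field names, thresholds, and units are assumptions for the sketch, not any vendor's actual API or tuning.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float        # range to the detection
    rel_velocity_ms: float   # relative radial velocity (negative = closing)
    cross_section: float     # radar cross-section estimate (arbitrary units)

def is_relevant_obstacle(obj: RadarReturn, ego_speed_ms: float,
                         min_cross_section: float = 10.0) -> bool:
    """Crude sketch of the filtering logic: an ADAS radar tracker typically
    discards detections that are stationary relative to the ground (ego speed
    plus relative velocity ~ 0) unless they look vehicle-sized, to avoid
    phantom braking on manhole covers, signs, overhead bridges, etc."""
    ground_velocity = ego_speed_ms + obj.rel_velocity_ms  # ~0 => stationary
    if abs(ground_velocity) < 0.5 and obj.cross_section < min_cross_section:
        return False  # stationary and small: filtered out, like the ladder here
    return True

# A ladder lying on the road, ego car at 30 m/s, so it closes at -30 m/s:
ladder = RadarReturn(distance_m=80.0, rel_velocity_ms=-30.0, cross_section=2.0)
print(is_relevant_obstacle(ladder, ego_speed_ms=30.0))  # → False (ignored)
```

The point of the sketch is the trade-off the commenter names: lowering `min_cross_section` or the velocity gate catches the ladder, but also flags every pothole and soda can, which is exactly the phantom-braking problem.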
Thank you. So, a radar sensor won't detect a wall unless the car keeps moving towards it, right?
So what a sensor can detect and what the SW makes out of the information are 2 different things. It also depends on the type of the radar. What you mentioned in your question might be the case for a "Doppler"-only (possibly continuous-wave) radar, but that alone is not really used in ADAS systems because of its limitations.

Anyways, there is a HUGE difference between a wall and an object lying on the road: a wall has a large area and extends upwards. Tesla could have trained their object classification models to recognize the kind of stuff in the video, and any company could make a radar that detects objects like this one. (I explain it a bit more in detail in the hope someone reads it.)

In both cases this would have led to a lot of phantom braking, because of the many harmless situations that look like this one to the sensors. In the case of the radar solution, a large portion of the radar signal would be reflected by any small objects AND even by the road itself! Bumps, speed bumps, potholes, even leaves!

It's important to mention that radar doesn't give you a high-resolution 2D image as a result! It gives you a lot of data, but it is actually already pre-processed by the radar's HW. What you get is something like information about 5 objects in front of the vehicle: relative speed, size (or radar cross-section), distance and angle (usually JUST the horizontal angle!!). Object at 50m (+-5m), velocity -20 m/s, area 5 (whatever unit they use) at angle 0 degrees (let's say it's in front of the car). That's it.

That's the exact reason why it is always combined with at least one on-board camera that can provide more exact information like shape, vertical position, and inputs for optical flow calculations and object classification. On the other hand, using a camera ONLY is also a shi\*\*y solution because it can't directly measure anything. It can't measure exact distances, and it can't measure relative speeds alone.
Even in a dual camera (stereo) setup it has its weaknesses: rain, sunshine, low brightness, too high brightness, brightness changes (tunnel in and out), dirt on the camera, resolution limitations, motion blur, camera sensor noise and so on.

So the conclusion is that you can create camera-based and radar-based solutions that detect walls (a perfect scenario for radar-based systems) and systems that detect the metal thingy that was in the video. However, in the metal part's case the problem is that it's lying on the road. A radar-based system can be tuned to be sensitive enough and shoot the beam at the needed angle, but that would lead to a lot (and I mean a LOT) of phantom braking; it would render it unusable in many, many cases (potholes, mountain roads, ramps, ...). I would say it's the same situation as with military radars, where there is a minimum angle/altitude below which targets can't be tracked. You can imagine it like this: the object in the video would "fly under the radar".

A purely camera-based system could be tuned and trained to recognize this, but like I mentioned, this would surely lead to phantom braking too. However, in my personal opinion, in this special case it might actually work SOMEWHAT better (under weather conditions like in the video) than a radar solution.

A LIDAR-based solution (which obviously has cameras and radars too) would work the best. Lidar supplies exact geometry data about the surroundings of the vehicle. Combining this with the information from the cameras and radars would possibly be the most reliable. BUT one must not forget that even a lidar has limited resolution: the smaller the object, or the further away it is, the harder it is to detect.
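The lidar advantage described here, direct geometry measurement, can be shown with a toy check. All numbers and names below are invented for illustration; a real pipeline would cluster and track points, not scan a flat list.

```python
# Toy illustration of why lidar geometry helps: given 3D points ahead of
# the car, flag anything that rises more than a threshold above the road
# plane. A camera has to *infer* this height; a lidar *measures* it.
def protrudes_above_road(points, road_z=0.0, height_threshold_m=0.15):
    """points: iterable of (x, y, z) tuples in metres, z up, road at z ~ road_z."""
    return any(z - road_z > height_threshold_m for (_, _, z) in points)

road_texture = [(20.0, 0.1, 0.01), (21.0, -0.2, 0.02)]  # bumps, lane paint
ladder = [(40.0, 0.0, 0.25), (40.3, 0.2, 0.22)]          # ~25 cm tall object
print(protrudes_above_road(road_texture))  # → False
print(protrudes_above_road(ladder))        # → True
```

The threshold plays the same role as the radar tuning above: too low and every pothole lip triggers braking, too high and a ladder slips through, but at least with lidar the height is a measured quantity rather than a guess.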
Thank you very much. I hit a gold mine of a response.
Lol. Sure. I'm glad it helped.
CA takes that seriously. My MIL’s doctor went to jail for 6 months for lying to a police officer. (He said his buddy was with him while a crime happened, turned out hospital records showed he was doing a surgery at that moment. Whoops. Couldn’t press charges because of his lie so cops were pissed)
The car was completely disabled because of that? My Volkswagen Rabbit in high school ran over a ladder and was fine.
https://en.wikipedia.org/wiki/Potemkin_village

>In politics and economics, a Potemkin village is a construction (literal or figurative) the purpose of which is to provide an external façade to a situation, to make people believe that the situation is better than it actually is.

So when I call LLMs a Potemkin village version of AGI, what I'm saying is that they're dressed up like they're these general-purpose AIs that can do anything and everything. They have the scaffolding, they have the front of it; you can ask ChatGPT to do a variety of different things that even a bundle of AIs ten years ago couldn't hope to do even a 100th as well as it can. With the slightest push, the entire thing falls apart.

They can generate essays and even whole novellas... until you actually read them, and realize that the text is obliterated by snowclones, appositive phrases, nominative absolute phrases ("noun did [X], a [Y]," or "[X] said [Y], his tone [Z]"), negation phrases ("it's not X, it's Y") etc., and never even seems to say much of anything, despite trying its damnedest with sophistic pseudo-profound "quote-bait" dialog.

You generate an AI slop image, and it looks amazing, until you zoom in and see that the fingers are bent wrong, the eyes are fuzzy and the iris seems to be bleeding its color, polyester fabric seems to be made out of latex, and also there's a dog shaped like Cthulhu in the background.

You try generating AI music, and you hear an eargasmic lick, and you want more of that. Good luck; it's never going to generate what's in your head no matter how many times you reroll it or describe it.

All this and more. It all fosters this sense that the AIs we have now aren't just broken or anything, but rather outright frustrating, because it's like we *can* automate some tasks... but arbitrarily not others. The automation works to a limited extent, but breaks very quickly.
It's supposed to augment human capability, but it works well enough that people don't realize how limited it really is and rely on it end to end (see "vibe coding"). And then everything breaks down, like the façade of a building falling in on itself after someone leans on it.
It's a shame because it's been this way since I joined the AI hype cycle literally two fucking years ago.
I'll never be Nazi-friendly enough to give money to fucking Tesla. That's all on you, capitalists.