Hacker News

keeda

I strongly believe LIDAR is the way to go and that Elon's vision-only move was extremely "short-sighted" (heheh). There are many reasons, but the one that drives it home for me multiple times a week is that my Tesla's wipers will randomly sweep the windshield for absolutely no reason.

This is because the vision system thinks there is something obstructing its view when in reality it is usually bright sunlight -- and sometimes, absolutely nothing that I can see.

The wipers are, of course, the most harmless way this goes wrong. The more dangerous type is when it phantom-brakes at highway speeds with no warning on a clear road and a clear day. I've had multiple other scary incidents of different types (swerving back and forth at exits is a fun one), but phantom braking is the one that happens quasi-regularly. Twice when another car was right behind me.

As an engineer, this tells me volumes about what's going on in the computer vision system, and it's pretty scary. Basically, the system detects patterns that are inferred as its vision being obstructed, and so it is programmed to brush away some (non-existent) debris. Like, it thinks there could be a physical object where there is none. If this was an LLM you would call it a hallucination.

But if it's hallucinating crud on a windshield, it can also hallucinate objects on the road. And it could be doing it every so often! So maybe there are filters to disregard unlikely objects as irrelevant, which act as guardrails against random braking. And those filters are pretty damn good -- I mean, the technology is impressive -- but they can probabilistically fail, resulting in things that we've already seen, such as phantom braking, or worse, driving through actual things.

This raises so many questions: What other things is it hallucinating? And how many hardcoded guardrails are in place against these edge cases? And what else can it hallucinate against which there are no guardrails yet?

And why not just use LIDAR that can literally see around corners in 3D?

jqpabc123

Engineering reliability is primarily achieved through redundancy.

There is none with Musk's "vision only" approach. Vision can fail for a multitude of reasons --- sunlight, rain, darkness, bad road markers, even glare from a dirty windshield. And when it fails, there is no backup plan -- the car is effectively driving blind.

Driving is a dynamic activity that involves a lot more than just vision. Safe automated driving can use all the help it can get.

Someone1234

I agree with everything you're saying; but even outside of Tesla, I'd just like to remind people that LIDAR as a complement to vision isn't at all straightforward. Sensor fusion adds real complexity in calibration, time sync, and modeling.

Both LIDAR and vision have edge cases where they fail. So you ideally want both, but then the challenge is reconciling disagreements with calibrated, probabilistic fusion. People seem to be under the mistaken impression that vision is dirty input and LIDAR is somehow clean, when in reality both are noisy inputs with different strengths and weaknesses.
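
"Calibrated, probabilistic fusion" sounds abstract, so here's a toy sketch of the simplest form of it (my own illustration, not any vendor's pipeline): combining two noisy range estimates by inverse-variance weighting, so the fused estimate leans toward whichever sensor is currently more trustworthy.

```python
# Toy probabilistic fusion: inverse-variance weighting of two noisy
# range measurements. Variances are assumed to come from calibration.

def fuse(z_cam, var_cam, z_lidar, var_lidar):
    """Return the minimum-variance combination of two noisy measurements."""
    w_cam = 1.0 / var_cam
    w_lidar = 1.0 / var_lidar
    z = (w_cam * z_cam + w_lidar * z_lidar) / (w_cam + w_lidar)
    var = 1.0 / (w_cam + w_lidar)
    return z, var

# Camera says 42 m (noisy), lidar says 40 m (tighter): the fused estimate
# lands near the lower-variance sensor.
z, var = fuse(42.0, 4.0, 40.0, 0.25)
print(round(z, 2), round(var, 3))  # fused estimate is close to 40 m
```

The catch is exactly the calibration the comment mentions: the weights are only as good as each sensor's noise model, so a miscalibrated variance gives you a confidently wrong fused output.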

I guess my point is: Yes, 100% bring in LIDAR, I believe the future is LIDAR + vision. But when you do that, early iterations can regress significantly from vision-only until the fusion is tuned and calibration is tight, because you have to resolve contradictory data. Ultimately the payoff is higher robustness in exchange for more R&D and development workload (i.e. more cost).

The reason Tesla needed vision-only to work (cost & timeline) is the same reason vision+LIDAR is so challenging.

ethbr1

The primary benefit of multiple sensor fusion from a safety standpoint isn't an absolute decrease in errors.

It's the ability to detect sensor disagreements at all.

With a single sensor modality, you have no way of truly detecting failures in that modality, other than hacks like time-series normalization (i.e., checking against expected scenarios).

If multiple sensor modalities disagree, even without sensor fusion, you can at least assume something might be awry and drop into a maximum safety operation mode.
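
A toy sketch of this idea (the threshold and mode names here are made up for illustration): even without any fusion, a bare cross-check between modalities is enough to notice that something is awry and drop into a degraded mode.

```python
# Cross-modality sanity check: no fusion, just disagreement detection.
# Thresholds and mode names are illustrative, not from any real AV stack.

def plan_action(cam_range_m, lidar_range_m, agree_tol_m=2.0):
    """Cross-check two modalities; on dropout or disagreement, fall back."""
    if cam_range_m is None or lidar_range_m is None:
        return "max_safety_mode"   # a sensor dropped out entirely
    if abs(cam_range_m - lidar_range_m) > agree_tol_m:
        return "max_safety_mode"   # modalities disagree: distrust both
    return "normal_operation"

print(plan_action(40.0, 40.5))  # sensors agree
print(plan_action(40.0, 12.0))  # vision hallucinating? lidar says otherwise
```

With one modality, the second branch simply cannot exist: there is nothing independent to compare against.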

But you'd think that the budget configuration of the Boeing 737 MAX would have taught us that tying safety-critical systems to single sources of truth is a bad idea... (in that case, a safety-critical modality fed by a single physical sensor)

jqpabc123

> The same reason why Tesla needed vision-only to work (cost & timeline)

But vision only hasn't worked --- not as promised, not after a decade's worth of timeline. And it probably won't any time soon either --- for valid engineering reasons.

Engineering 101 --- *needing* something to work doesn't make it possible or practical.

ra7

The complexity argument rings hollow to me. It’s a bit like saying distributed databases are complex because you have to deal with CAP guarantees. Yes, but people still develop them because they have real benefits.

It was maybe a valid argument 10 years ago, but in 2025 many companies have shown sensor fusion works just fine. I mean, Waymo has clocked 100M+ miles, so it works. The AV industry has moved on to more interesting problems, while Tesla and Musk are still stuck in the past arguing about sensor choices.

microtherion

> but then the challenge is reconciling disagreements with calibrated, and probabilistic fusion

I keep reading arguments like this, but I really don't understand what the problem here is supposed to be. Yes, in a rule based system, this is a challenge, but in an end-to-end neural network, another sensor is just another input, regardless of whether it's another camera, LIDAR, or a sensor measuring the adrenaline level of the driver.

If you have enough training data, the model training will converge to a reasonable set of weights for various scenarios. In fact, training data with a richer set of sensors would also allow you to determine whether some of the sensors do not in fact contribute meaningfully to overall performance.
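
As a toy illustration of that point (not a real driving network): to an end-to-end model, adding a modality just means widening the input vector, and training decides how much weight each slice of it deserves.

```python
# "Another sensor is just another input": the model doesn't care where
# features came from; it consumes one wider vector. Illustrative toy.

def concat_features(*sensor_feats):
    """Flatten per-sensor feature vectors into one model input."""
    out = []
    for feats in sensor_feats:
        out.extend(feats)
    return out

camera_feats = [0.2, 0.9, 0.1]   # e.g. learned image embeddings
lidar_feats = [12.5, 0.0]        # e.g. range / occupancy summaries
adrenaline = [0.3]               # any extra signal widens the input

x = concat_features(camera_feats, lidar_feats, adrenaline)
print(len(x))  # downstream layers simply see a 6-dimensional input
```

The counterargument from upthread still applies, though: the hard part isn't wiring the input, it's collecting enough jointly-calibrated, time-synced multi-sensor training data for the weights to converge sensibly.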

overfeed

> cost & timeline

It's really hard to accept cost as the reason when Tesla is preparing a trillion-dollar pay package. I suppose that can be reconciled if one considers the venture to be a vehicle (ha!) to shovel as much money as possible from investors and buyers into Elon's pockets. I imagine the prospect of being the world's first trillionaire is appealing.

Earw0rm

There's no particular reason to use RGB for this kind of machine-vision / cognition problem, either.

Infrared at a few different wavelengths, in addition to the visible range, seems like it would give a superior result.

maxlin

I think you hit the nail on the head. Obviously, when Tesla has saturated the potential of vision, they should bring in LiDAR if it can reasonably be added from a hardware point of view. Their current arguments make this clear: it would be surface-level thinking to add LiDAR and the kitchen sink now, complicating the system's evolution and axing scalability.

But we're far from plateauing on what can be done with vision. Humans can drive quite well with essentially just sight, so we're far from exhausting what can be done with it.

jmpman

Tesla has redundant front-facing cameras on their cars. In my 2019 Model 3, there are three front-facing cameras, each with a different angle of view, all behind the rear-view mirror, all encased in a small area lined with anti-reflective material. Living in an extremely hot climate, that small area, with its anti-reflective fuzz, has degraded, depositing a film on the window only in front of the cameras, obscuring all three cameras at the same time.

Now, my Tesla only recently started complaining when the sensors were obscured by this deposit, but that wasn’t always the case. I used to be driving down the freeway with Autopilot on, and it could barely track. Eventually I looked at the saved video footage and discovered my Tesla was virtually blind while driving me down the freeway at 85 mph. At least now, with recent updates, it warns me that it can’t see very well.

However, I question the resolution of the sensors. To drive legally in my state, you must have 20/40 vision. When I move my head around, I effectively have 20/40 vision all around my car. If I close one eye, I still have 20/40 vision. Does Tesla have effectively 20/40 vision in all 360 degrees? Maybe one of the front-facing cameras has optical resolution equal to 20/40, but do the rest of them? I’m skeptical, and expect I’m being driven by the equivalent of a human who couldn’t pass the vision test, or at best a human with just one eye who can pass it.

This isn’t even getting into redundancy in the electronics boards, connectivity from the electronics to the CPU, and redundancy in the processing. We are being asked to put our faith/lives in these non-redundant systems, but they’re not designed like Class A flight-critical systems on airplanes.
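
The 20/40 comparison can be made concrete with a back-of-envelope calculation. The camera specs below are hypothetical placeholders, not Tesla's published numbers; the formula is the interesting part.

```python
# Rough acuity comparison. 20/20 vision resolves ~1 arcminute of detail,
# so 20/40 resolves ~2 arcminutes; by Nyquist, resolving a 2-arcminute
# feature needs roughly 1 arcminute of angular sampling per pixel.
# Sensor width and fields of view below are made-up illustration values.

def arcmin_per_pixel(fov_deg, h_pixels):
    """Angular sampling of a camera across its horizontal field of view."""
    return fov_deg * 60.0 / h_pixels

NEEDED = 1.0  # arcmin/pixel needed to match 20/40 acuity (approx.)

for name, fov in [("narrow", 35), ("main", 50), ("wide", 120)]:
    a = arcmin_per_pixel(fov, 1280)
    verdict = "ok" if a <= NEEDED else "worse than 20/40"
    print(f"{name}: {a:.2f} arcmin/px -> {verdict}")
```

Under these assumed numbers, even the narrow lens samples coarser than 20/40 acuity; a wide-angle lens on the same sensor is several times coarser still, which is the trade-off behind the skepticism above.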

SoftTalker

> Vision can fail for a multitude of reasons --- sunlight, rain, darkness, bad road markers, even glare from a dirty windshield. And when it fails, there is no backup plan

So like a human driver. Problem is, automatic drivers need to be substantially better than humans to be accepted.

tarsinge

Humans have a brain, though. Current AI is nowhere near that, as every engineer knows, but laypeople seem to forget it amid all the PR.

brandonagr2

Lidar is not a backup to vision. In a Waymo, both lidar and vision must be working, so you actually have less reliability, as you now have two single points of failure.

Ocha

Yep. Same mistake Boeing made by making redundancy an optional upgrade on the MAX 8.

jqpabc123

Another example of what happens when management starts making engineering decisions.

CjHuber

Just imagine if Tesla had subsidized passive LIDAR on every car they ship to collect data. Wow, that dataset would be crazy, and it would even improve their vision models by providing ground truth to train on. He’s such a moron.

nolist_policy

This. It's also the reason Waymo is ahead, they have tons of high quality training data being constantly fed into their pipeline.

wombat-man

I think LIDAR was, and maybe still is, way more expensive. Initially units ran around $75k. Now they're more like $10k, which is better.

hoytschermerhrn

The new electric Volvos have LIDAR, proving that the technology has (at least now) approached mass-market feasibility.

kibwen

This is off by orders of magnitude. BYD is buying LIDAR units for their cars for $140.

realo

My floor-cleaning robot has a lidar, and I'm pretty certain that part did not cost $10k.

CMay

Even if Tesla wasn't using LIDAR, I think they did still use radar and ultrasonic detection for a while, which I'm sure contributed to their models some.

amelius

Your comparison to hallucination is spot on.

LLMs have shown the general public how AI can be plain wrong and shouldn't be trusted for everything. Maybe this influences how they, and regulators, will think about self driving cars.

b112

Well I wish this was true. But loads of DEVs on here will claim LLMs are infallible.

And the general public?! No way. Most are completely unaware of the foibles of LLMs.

Cornbilly

HN posters know better but a lot of them won’t be honest because they want to protect their investments and/or their employer.

jama211

No they don’t. Don’t lie.

BoiledCabbage

> Well I wish this was true. But loads of DEVs on here will claim LLMs are infallible.

No they don't. You're making a straw man rather than trying to put forth an actual argument in support of your view.

If you feel you can't support your point, then don't try to make it.

danans

> And why not just use LIDAR that can literally see around corners in 3D?

Based on what I've read over the years: it costs too much for a consumer vehicle, it creates unwanted "bumps" in the vehicle visual design, and the great man said it wasn't needed.

Yes, those reasons are not for technology or safety. They are based on cost, marketing, and personality (of the CEO and fans of the brand).

fooblaster

Lidar is being manufactured in China at the volume of millions a year by RoboSense, Huawei, and Hesai. BOM cost is on the order of a few hundred dollars, slightly more than automotive radar. The situation is a lot different in 2025 than in 2017.

beng-nl

I’ve always wondered about LiDAR - how can multiple units sweep a scene at the same time (as would be the case for multiple cars driving close together, all using lidar)? One unit can’t distinguish return signals between itself and other units, can it?

m4rtink

I think it can: it might encode something in the beam, use a slightly different wavelength, or even just pulse the laser so that the returns don't overlap if timed right.
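
The pulse-coding idea is easy to sketch. Below is a toy simulation (my own illustration, not a real lidar DSP chain): each unit transmits its own pseudorandom ±1 code, and correlating the received signal against your own code picks out your echo's delay while another unit's code largely averages out.

```python
# Toy demo of code-division interference rejection between two lidar units.
import random

def make_code(n, seed):
    """Pseudorandom +/-1 pulse code unique to one unit."""
    rng = random.Random(seed)
    return [rng.choice((-1, 1)) for _ in range(n)]

def correlate_at(signal, code, lag):
    """Correlation of the received signal with a code at a given lag."""
    return sum(signal[lag + i] * c for i, c in enumerate(code))

own = make_code(64, seed=1)    # our unit's code
other = make_code(64, seed=2)  # an interfering unit's code

# Received signal: our echo delayed by 10 samples, plus the other
# unit's echo delayed by 25 samples, overlapping in time.
n = 128
signal = [0.0] * n
for i, c in enumerate(own):
    signal[10 + i] += c
for i, c in enumerate(other):
    signal[25 + i] += c

# Scan all lags with our own code: the correlation peak should recover
# our true delay (10) because the other code is nearly orthogonal.
best = max(range(n - 64), key=lambda lag: correlate_at(signal, own, lag))
print(best)
```

Real systems use variants of this (along with precise pulse timing and wavelength filtering), but the principle is the same: make your own return statistically distinguishable from everyone else's.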

homefree

I use FSD in my Model S daily to commute from SF to Palo Alto, along with most of my other Bay Area driving. It currently does a better job than most people, and it drives me 95% of the time now. I haven't had the phantom braking.

I'm in a 2025 with HW4, but its dramatic improvement over the last couple of years (I previously had a 2018 Model 3) increased my confidence that Elon was right to focus on vision. It wasn't until late last year that I found myself using it more than not; now I use it on almost every drive, point to point (Cupertino to SF), and it handles it.

I think people are generally sleeping on how good it is, and the politicization means people are undervaluing it for stupid reasons. I wouldn't consider a non-Tesla because of this (unless it was a stick-shift sports car, but that's for different reasons).

Their lead is so crazy far ahead it's weird to see this reality and then see the comments on hn that are so wrong. Though I guess it's been that way for years.

The position against lidar was that it traps you in a local max, that humans use vision, that roads and signs are designed for vision so you're going to have to solve that problem and when you do lidar becomes a redundant waste. The investment in lidar wastes time from training vision and may make it harder to do so. That's still the case. I love Waymo, but it's doomed to be localized to populated areas with high-res mapping - that's a great business, but it doesn't solve the general problem.

If Tesla keeps jumping on the vision lever and solves it they'll win it all. There's nothing in physics that makes that impossible so I think they'll pull it off.

I'd really encourage people here with a bias to dismiss to ignore the comments and just go try it out for yourself in real life.

cpuguy83

This is extremely narrow minded. As another commenter pointed out, you are driving on easy mode in terms of environment and where a majority of the training was done.

This is not a general solution, it is an SF one... at best.

Most humans also don't get in accidents or have problems with phantom braking within the timeframe that you mentioned.

brandonagr2

> Most humans also don't get in accidents

Have you met any humans? Or seen people driving?

homefree

Oh please. People excuse and dismiss major accomplishments; you could send a skyscraper to Mars and people on HN would still call you a fraud.

The Bay Area has massive traffic and complex interchanges; SF has tight, difficult roads with heavy fog. Sometimes there’s heavy rain on 280. Highway 17 is also non-trivial.

What Tesla has done is not trivial and roads outside the bay are often easier.

People can ignore this to serve their own petty cognitive bias, but others reading their comments should go look at it for themselves.

oblio

You're basically driving on easy mode, in the Bay Area. Dry climate, sunshine all year round, pretty solid developed country infrastructure.

a123b456c

OK so you believe "Elon was right" and people should "ignore the comments" Hmm very interesting.

nova22033

> politicization

How is it politicization when TESLA THE COMPANY is saying Full Self Driving doesn't mean "Full" "Self" Driving?

If it is as good as you claim, why doesn't Tesla claim it's Full Self Driving?

ponector

> it drives me 95% of the time now

But what is the point to use it everywhere if you still need to pay attention to the road, keep hands on the steering wheel?

homefree

You don’t need hands on the wheel anymore, just looking out the window. It’s way more relaxed.

It’ll be nice when that’s not required anymore, but even today it’s way more comfortable.

kolanos

HW4 is really a game changer. I was absolutely floored by HW4 FSD during a recent test drive. Tesla is accomplishing some truly groundbreaking technical achievements here. But you wouldn't know it through all the Elon Musk noise (pro and con). I'd encourage anyone to take a test drive and put FSD through its paces. I went in with a super critical mindset and walked away stunned.

homefree

Yeah it’s amazing

anthem2025

I’m gonna go ahead and guess that by “super critical” you actually mean that you went in an Elon worshipper and left an Elon worshipper.

anthem2025

What lead? They are way behind Waymo.

Why would anyone listen to the opinion of someone who bought a Tesla in 2025?

The only people still buying them are musk fanboys.

pbhjpbhj

[flagged]

homefree

Thank you for exemplifying what I’m talking about. I should really buy more TSLA.

sixQuarks

So he does nazi salutes and is totally buddy buddy with Netanyahu. Ok

teleforce

>why not just use LIDAR that can literally see around corners in 3D?

LIDAR requires line-of-sight (LoS), hence it cannot see around corners, but RADAR probably can.

It's interesting to note that the second most popular Tesla post of all time is from 9 years ago, on its full self-driving hardware (second only to the controversial Cybertruck) [1].

>Elon's vision-only move was extremely "short-sighted"

Elon's vision was misguided because some of the technologists at the time, including him, seem to have really, truly believed that AGI was just around the corner (pun intended). Now most tech people have given up on the AGI claim, blaming the blurry definition of AGI, but for me the true killer AGI application has always been fully autonomous Level 5 driving with only human-level sensor perception, minus the LIDAR and RADAR. But the goal is so complex that I truly believe it will not be achieved in the foreseeable future.

[1] All Tesla Cars Being Produced Now Have Full Self-Driving Hardware (2016 - 1090 comments):

https://news.ycombinator.com/item?id=12748863

UltraSane

Camera-only might work better if you used regular digital cameras along with more advanced cameras, like event-based cameras that send pixels as soon as they change brightness and have microsecond latency, and/or Single Photon Avalanche Diode (SPAD) sensors, which can detect single photons. Having the same footage from all three would enable some fascinating training options.

But Tesla didn't do this.

dlcarrier

This looks to me like they are acknowledging that their claims were premature, possibly due to claims of false advertising, but are otherwise carrying forward as they were.

Maybe they'll reach level 4 or higher automation, and will be able to claim full self driving, but like fusion power and post-singularity AI, it seems to be one of those things where the closer we get to it, the further away it is.

sschueller

Premature? Is that what we call this now? It's straight up fraud!

Others are in prison for far less.

tombert

I was about to say this. Elon would go on stage and say something like “and this is something we can do today”, or “coming next year” in 2018. The crowd goes wild, the stock price shoots up.

The first time could be an honest mistake, but after a certain point we have to assume that it’s just a lie to boost the stock price.

mlindner

The stock price hasn't dropped though, the opposite rather.

tejohnso

I'm not sure it's fraud because there was always the fine print. But a company selling a car with a feature called Full Self Driving that does not in fact fully self drive, well, that's a company I don't buy from. Unfortunately others don't seem as offended and happily pay for the product, encouraging further b.s. marketing hype culture.

Just like politicians, it seems there are no repercussions for CEOs lying, as long as it's fleecing the peons and not the elite.

dlcarrier

Not in the US. There's a whole bureaucracy of advertising boards where a false advertising case can be heard and appealed before anyone with legal authority would even look at it, which pretty much never happens. Even then, it's a tort, so punishment beyond fines is pretty much nonexistent.

solardev

It's only illegal when the insufficiently rich do it.

DrillShopper

Always has been

It's a big club, and we ain't in it.

dreamcompiler

Not gonna happen as long as Musk is CEO. He's hard over on a vision-only approach without lidar or radar, and it won't work. Companies like Waymo that use these sensors and understand sensor fusion are already eating Tesla's lunch. Tesla will never catch up with vision alone.

Rohansi

While I don't think vision-only is hopeless (it works for human drivers) the cameras on Teslas are not at all reliable enough for FSD. They have little to no redundancy and only the forward facing camera can (barely) clean itself. Even if they got their vision-only FSD to work nicely it'll need another hardware revision to resolve this.

vbezhenar

I feel like our AI research in the physical world falls so far behind language-level AI that our reasoning might be clouded.

Compare Boston Dynamics and a cat. They are on absolutely different levels in their bodies and their ability to manipulate them.

I have no doubt that cameras-only would absolutely work for AI cars, but at the same time I feel that this kind of AI is not there yet. And if we want autonomous cars, it might be possible, but we need to equip them with as many sensors as necessary, not set any artificial boundaries.

moogly

> While I don't think vision-only is hopeless (it works for human drivers)

I guess you don't drive? You use more senses than just vision when driving a car.

bkettle

> it works for human drivers

Sure, for some definition of "works"...

https://www.iihs.org/research-areas/fatality-statistics/deta...

SalmoShalazar

Such utter drivel. A camera is not the equivalent of human eyes and sensory processing, let alone an entire human being engaging with the physical world.

mbrochh

Uh... why don't they put the cameras... into the car (it works for human drivers)???

formercoder

Humans drive without LIDAR. Why can’t robots?

cannonpr

Because human vision has very little in common with camera vision and is a far more advanced sensor, on a far more advanced platform (ability to scan and pivot etc), with a lot more compute available to it.

phire

Why tie your hands behind your back?

LIDAR based self-driving cars will always massively exceed the safety and performance of vision-only self driving cars.

Current Tesla cameras+computer vision is nowhere near as good as humans. But LIDAR based self-driving cars already have way better situational awareness in many scenarios. They are way closer to actually delivering.

Sharlin

And birds fly without radar. Still, we equip planes with it.

apparent

The human processing unit understands semantics much better than the Tesla's processing unit. This helps avoid what humans would consider stupid mistakes, but which might be very tricky for Teslas to reliably avoid.

randerson

Even if they could: Why settle for a car that is only as good as a human when the competitors are making cars that are better than a human?

systemswizard

Because our eyes work better than the cheap cameras Tesla uses?

dreamcompiler

Chimpanzees have binocular color vision with similar acuity to humans. Yet we don't let them drive taxis. Why?

matthewdgreen

I drove into the setting sun the other day and needed to shift the window shade and move my head carefully to avoid having the sun directly in my field of vision. I also had to run the wipers to clean off a thin film of dust that made my windshield difficult to see through. And then I still drove slowly and moved my head a bit to make sure I could see every obstacle. My Tesla doesn’t necessarily have the means to do all of these things for each of its cameras. Maybe they’ll figure that out.

dzhiurgis

So the robotaxi trial that's already happening is some sort of rendering or AI slop, and the rides we see aren't real?

crooked-v

So does anyone who previously bought it on claims that actual full self-driving would be "coming soon" get refunds?

garbagewoman

Hopefully not. They might learn a lesson from the experience.

blackoil

Hmm, you want to penalize the company and teach the customers a lesson, so give the money to Ford shareholders?

epolanski

This is fraud: he went in front of investors and said multiple times it was around the corner.

He told consumers: just buy the car, and the feature will come with an update. It didn't.

This is a scam, end of story.

Seven years of it.

insane_dreamer

Surprising that there hasn't been a class-action suit yet.

jojobas

>false advertising

I think you mean "securities fraud", at gargantuan scale at that. Theranos and Nikola were nowhere near that scale.

paulryanrogers

It is strange how Elon and Tesla get a pass on this. Tesla has contributed to the death of more people than Theranos. I guess he didn't rip off rich investors, except maybe the ones who died in their Teslas.

Perhaps it's that cars are more sacred than healthcare.

jeffbee

> Maybe they'll reach level 4 or higher automation

There is little to suggest that Tesla is any closer to level 4 automation than Nabisco is. The Dojo supercomputer that was going to get them there? Never existed.

ascorbic

And their H100s were diverted to build MechaHitler instead

gitaarik

But they're changing the meaning of FSD to FSD (Supervised). That means they no longer make any promises about unsupervised FSD in the future. They'll of course say they keep working on it and that things are progressing, but they don't have to deliver anymore. Just like they tell people getting into accidents that they should have kept their hands on the wheel, or else it's their own responsibility.

standardUser

What does Waymo lack in your opinion to not be considered "full self driving"?

The persistent problem seems to be severe weather, but the gap between the weather a human shouldn't drive in and weather a robot can't drive in will only get smaller. In the end, the reason to own a self-driven vehicle may come down to how many severe weather days you have to endure in your locale.

mkl

Waymo is very restricted in the locations it drives (limited parts of limited cities; I think no freeways still), and uses remote operators to make decisions in unusual situations and when it gets stuck. This article from last year has quite a bit of information: https://arstechnica.com/cars/2024/05/on-self-driving-waymo-i...

panarky

Waymo never allows a remote human to drive the car. If it gets stuck, a remote operator can assess the situation and tell the car where it should go, but all driving is always handled locally by the onboard system in the vehicle.

Interesting that Waymo now operates just fine in SF fog, and is expanding to Seattle (rain) and Denver (snow and ice).

phire

Geofencing and occasional human override meets the definition of "Level 4 self driving". Especially when it's a remote human override.

But is Level 4 enough to count as "Full Self Driving"? I'd argue it really depends on how big the geofence area is, and how rare interventions are. A car that can drive on 95% of public roads might as well be FSD from the perspective of the average drive, even if it falls short of being Level 5 (which requires zero geofencing and zero human intervention).

zer00eyz

Waymo has been testing freeway driving for a bit:

https://www.reddit.com/r/waymo/comments/1gsv4d7/waymo_spotte...

> and uses remote operators to make decisions in unusual situations and when it gets stuck.

This is why it's limited to certain markets and areas of service: connectivity for this sort of thing matters. Your robotaxi crashing because the human backup lost 5G connectivity would be a real, real bad look. No one is talking about their intervention stats. If they were good, I would assume someone would publish them for marketing reasons.

FireBeyond

> I think no freeways still

California granted Waymo the right to operate on highways and freeways in March 2024.

standardUser

All cars were once restricted in the locations they could drive. EVs are restricted today. I don't see why universal access is a requirement for a commercially viable autonomous taxi service, which is what Waymo is currently. And the need for human operators seems obvious for any business, no matter how autonomous, let alone a business operating in a cutting edge and frankly dangerous space.

gerdesj

No one does FSD yet - properly.

It initially seems mad that a human, inside the box can outperform the "finest" efforts of a multi zillion dollar company. The human has all their sensors inside the box and most of them stymied by the non transparent parts. Bad weather makes it worse.

However, look at the sensors and compute being deployed on cars. It's all minimums and cost-focused -- basically MVP, with deaths as a costed variable in an equation.

A car could have cameras with views everywhere for optical, plus LIDAR, RADAR, even a form of SONAR if it can be useful, microwave, and way more. Accelerometers and all sorts too, all feeding into a model.

As a driver, I've come up with strategies such as "look left, listen right". I'm British so drive on the left and sit on the right side of my car. When turning right and I have the window wound down, I can watch the left for a gap and listen for cars to the right. I use it as a negative and never a positive - so if I see a gap on the left and I hear a car to my right, I stay put. If I see a gap to the left but hear no sound on my right, I turn my head to confirm that there is a space and do a final quick go/no go (which involves another check left and right). This strategy saves quite a lot of head swings and if done properly is safe.

I now drive an EV, one year so far: an SAIC MG4, with cameras on all four sides that I can't record from but can use. It has lane assist (lateral control, which craps out on many A-road sections but is fine on motorway-class roads) and cruise control that will keep a safe distance from other vehicles (it works well on most roads and very well on motorways; there are restrictions).

Recently I was driving and a really heavy rain shower hit as I was overtaking a lorry. I immediately dived back into lane one, behind the lorry, and put cruise on. I could just see the edge white line, so I dealt with left/right and the car sorted out forward/backward. I can easily deal with both, but it's quite nice to be able to carefully delegate responsibilities.

panick21_

Put a Waymo on a random road in the world: can it drive it?

standardUser

For a couple decades you couldn't even bring your cell phone anywhere in the world and use it. Transformational technologies don't have to be available universally and simultaneously to be viable. Even when the gas car was created you couldn't use it anywhere that didn't have gasoline and paved roads, plus a mechanic and access to parts.

Kye

That's the real issue. If "can navigate roads" is enough then we've had full self-driving for a while. There needs to be some base level of general purpose capability or it's just a neat regional curiosity.

cryptoz

Many humans couldn't.

an0malous

How have they gotten away with such obvious false advertising for this long? It’s undeniably misled customers and inflated their stock value.

dreamcompiler

Normally the Board of Directors would fire any CEO who destroyed as much of the company's value as Musk has. But Tesla's board is full of Musk sycophants and family members who refuse to stand up to him.

enaaem

The only reason Tesla trades at a 190 P/E is Musk. Musk is really good at selling stock to retail investors.

sidcool

Board cares mostly about market cap and stock prices. Both have done well.

utyop22

Poor corporate governance is rife.

zpeti

> destroyed as much of the company's value as Musk has

Please, post numbers to back this up… please…

MangoToupe

Ah who needs numbers when we have time? Be patient

Eddy_Viscosity2

Who was going to stop them from lying?

vlovich123

SEC and FTC would be obvious candidates who historically would do this. States also have the ability to prosecute this via UDAP (unfair and deceptive practices) laws.

Tesla being the only major domestic EV manufacturer, Musk historically not wading into politics, and Musk/Tesla being widely popular for a time is probably why no one has gone after him. Not sure how this changes going forward with Musk being a very polarizing figure now.

1over137

>SEC and FTC would be obvious candidates who historically would do this.

Yeah, historically, as in: before many people here were born. It's been a long time since the SEC and FTC did such things.

MangoToupe

Not to mention there's got to have been insane pressure from the hill not to kill the golden goose.

randallsquared

The previous two administrations (Trump I and Biden) being somewhat anti-Tesla or anti-Musk was some part of what prompted Musk to get into politics in the first place. Given the Biden admin's hostility, I would have expected the SEC and FTC to have been directed to do all they could against him within bounds, and so my first guess would be that they did, in fact, do everything justifiable.

barbazoo

Maybe that’s what happens in late stage capitalism. The billionaires get so powerful that they become untouchable. He’s already shown that he uses his fortune to steer political outcomes.

SequoiaHope

SEC is one possibility

greekrich92

2007 called...

AbrahamParangi

I use self-driving every single day in Boston and I haven’t needed to intervene in about 8 months. Most interventions are due to me wanting to go a different route.

Based on the rate of progress alone I would expect functional vision-only self-driving to be very close. I expect people will continue to say LIDAR is required right up until the moment that Tesla is shipping level 4/5 self-driving.

rogerrogerr

Same experience in a mix of city/suburban/rural driving, on a HW3 car. Seeing my car drive itself through complex scenarios without intervention, and then reading smart people saying it can’t without hardware it doesn’t have, gives major mental whiplash.

rootusrootus

I would like to get my experience more in line with yours. I can go a few miles without intervention, but that's about it, before it does something that will result in damage if I don't take over. I'm envious that some people can go months when I can't go a full day.

mauvehaus

Where are you driving?! If the person you're replying to has gone 8 months in Boston without having to intervene, I'm impressed. Boston is the craziest place to drive that I've ever driven.

Pro tip if you get stuck in a warren of tiny little back streets in the area. Latch on to the back of a cab; they're generally on their way to a major road to get their fare where they're going and they usually know a good way to get to one. I've pulled this trick multiple times around city hall, Government Center, the old state house, etc.

meroes

Or when. Driving during peak commute hours really makes you a sardine in a box, and it's harder for intervention-worthy events to arise just by the nature of dense traffic.

diebeforei485

I am curious what vehicle you're driving and whereabouts you're driving it.

FollowingTheDao

Self driving is not the same as "autonomy". Musk lied to everyone with the Tesla self driving, the Boring Company, DOGE...wake up people...

globular-toast

Every company that does marketing lies to you.

FollowingTheDao

I agree, but no one is more obvious than Musk, yet people still keep falling for it. Specifically his “investors”.

herbturbo

> Based on the rate of progress alone I would expect functional vision-only self-driving to be very close.

So close yet so far, which is ironically the problem vision-based self-driving has. No concrete information, just a guess based on the simplest surface data.

potato3732842

On a scale from "student driver" to "Safelite guy (or any other professional who drives around as part of their job) running late", how does it handle Storrow and similar?

Like does it get naively caught in stopped traffic for turns it could lane change out or does it fucking send it?

rogerrogerr

I don't drive in Boston, but there is some impatience factor and it will make human-like moves out of correct-but-stopped lanes into moving ones. It'll merge into gaps that feel very small when it doesn't have other options.

Nitsua007

Small correction: LiDAR can’t literally see around corners — it’s still a line-of-sight sensor. What it can do is build an extremely precise 3D point cloud of what it can see, in all lighting conditions, and with far less susceptibility to “hallucinations” from things like glare, shadows, or visual artifacts that trip up purely vision-based systems.

The problem you’re describing — phantom braking, random wiper sweeps — is exactly what happens when the perception system’s “eyes” (cameras) feed imperfect data into a “brain” (compute + AI) that has no independent cross-check from another modality. Cameras are amazing at recognizing texture and color but they’re passive sensors, easily fooled by lighting, contrast, weather, or optical illusions. LiDAR adds active depth sensing, which directly measures distance and object geometry rather than inferring it.

But LiDAR alone isn’t the endgame either. The real magic happens in sensor fusion — combining LiDAR, radar, cameras, GNSS, and ultrasonic so each sensor covers the others’ blind spots, and then fusing data at the perception level. This reduces false positives, filters out improbable hazards before they trigger braking, and keeps the system robust in edge cases.

And there’s another piece that rarely gets mentioned in these debates: connected infrastructure. If the vehicle can also receive data from roadside units, traffic signals, and other connected objects (V2X), it doesn’t have to rely solely on its onboard sensors. You’re effectively extending the vehicle’s situational awareness beyond its physical line of sight.

Vision-only autonomy is like trying to navigate with one sense while ignoring the others. LiDAR + fusion + connectivity is like having multiple senses and a heads-up from the world around you.
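The cross-check idea above can be sketched in a few lines. This is purely illustrative: the `fuse` function, its thresholds, and the `Detection` shape are all made up, not any real stack's API. The point is that an obstacle only triggers a response when an independent active sensor corroborates the camera, which is exactly what kills glare-induced phantoms.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    bearing_deg: float   # direction of the candidate obstacle
    range_m: float       # camera's estimated distance to it
    confidence: float    # detector's own score, 0..1

def fuse(camera: Detection, lidar_range_m: Optional[float],
         range_tol_m: float = 2.0) -> bool:
    """Accept a camera detection only if an independent lidar return
    corroborates it at a similar range; otherwise demand near-certainty
    from the camera alone (all thresholds arbitrary)."""
    if lidar_range_m is not None and abs(lidar_range_m - camera.range_m) <= range_tol_m:
        return True                      # both modalities agree: treat as real
    return camera.confidence >= 0.95     # camera-only fallback, very high bar

# Glare-induced phantom: camera "sees" something at 30 m, lidar sees nothing.
phantom_brakes = fuse(Detection(0.0, 30.0, 0.7), lidar_range_m=None)
# Real obstacle: lidar measures ~29.5 m where the camera expects 30 m.
real_brakes = fuse(Detection(0.0, 30.0, 0.7), lidar_range_m=29.5)
```

Under this gate the phantom is dropped and the corroborated obstacle passes, which is the false-positive reduction the fusion argument is about.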

moomoo11

Honestly at that point of complexity I hope automakers just quit chasing FSD and go back to making actually good cars again.

Let the automated trucks figure it out if it’s an actual problem worth solving or we can just use trains or let truck driving be a decent middle class job.

jesenpaul

They made tons of money on the Scam of the Decade™ from Oct 2016 (see their "Driver is just there for legal reasons" video) to Apr 2024 (when they officially changed it to Supervised FSD), and now it's not even that.

mettamage

I’m not surprised. As a former Elon fan, it never struck me that he thought about this from first principles, whereas for SpaceX he did.

For as long as we can’t understand AI systems as well as we understand normal code, first principles thinking is out of reach.

It may be possible to get FSD another way but Elon’s edge is gone here.

fsmv

SpaceX is a success despite Elon. Maybe setting an extremely lofty goal helped somewhat but Gwynne Shotwell and all the actual engineers at SpaceX deserve the credit for their success.

tim333

Wikipedia says Gwynne Shotwell joined after the first successful launch. Elon founding it and getting a rocket up must count for something.

mettamage

How is it despite Elon? I don't know the history too well.

I agree that the team deserves most of the credit. I think that's the case in general. At best, a CEO puts down good framing/structure; that's it. ICs do the actual innovative work.

johnthewise

Napoleon didn't fight much at all in battles, either.

rlt

I used to think engineers should get all the credit for the successes of big engineering projects, and they should probably get more credit than they do, but over time I’ve realized how important good leadership is. SpaceX and Tesla absolutely would not be where they are today without Elon. Anyone who claims otherwise either hasn’t paid attention or is being disingenuous because they don’t like him.

SmarsJerry

Saying "despite Elon" is delusional. You probably watched some left wing propaganda at some point to come to that conclusion. Just because he supported Trump doesn’t take away from all his achievements.

ciconia

War Is Peace. Freedom Is Slavery. Ignorance Is Strength. FSD is... whatever Elon says it is.

goloroden

I think I’d call what Tesla did fraud. Or scam. Or both.

shadowgovt

"Full Self Driving (Supervised)." In other words: you can take your mind off the road as long as you keep your mind on the road. Classic.

Tesla is kind of a joke in the FSD community these days. People working on this problem a lot longer than Musk's folk have been saying for years that their approach is fundamentally ignoring decades of research on the topic. Sounds like Tesla finally got the memo. I mostly feel sorry for their engineers (both the ones who bought the hype and thought they'd discover the secret sauce that a quarter-century-plus of full-time academic research couldn't find and the old salts who knew this was doomed but soldiered on anyway... but only so sorry, since I'm sure the checks kept clearing).

arijun

Until very recently I worked in the FSD community, and I wouldn’t say I viewed it as a joke. I don’t know if I believed they would get to level 5 without any lidar, but it’s pretty good for what’s available in the consumer market.

shadowgovt

That's what I mean. Nobody I know thought there'd be a chance of getting to L4 (much less L5) without LIDAR. They doomed the goal from the gate and basically lied to people for years about the technological possibilities to pad their bottom line.

It's two steps from selling snake-oil, basically. Not that L4 or L5 are impossible, but people who knew the problem domain looked at how they were approaching it hardware-wise and went "... uhuh."

FireBeyond

> In other words: you can take your mind off the road as long as you keep your mind on the road.

They literally did this with Summon. "Have your car come to you while dealing with a fussy child" - buried far further down the page in light grey, "pay full attention to the vehicle at all times" (you know, other than your "fussy child").

IgorPartola

I don’t need self-driving cars that can navigate alleys in Florence, Italy and also parkways in New England. Here is what we really need: put transponders into the roadway on freeways and use those for navigation and lane positioning. Then you would be responsible for getting onto the freeway and getting off at the exit, but could take a nap in between. This would be something done by the DOT, supported by all car makers, and it would benefit everyone. LIDAR could be used for obstacle detection but not for navigation. And whoever figures out how to do the transponders, land a government contract, and get at least one major car manufacturer on board would make bank.
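For what it's worth, the geometry of lane positioning from transponders is simple. A toy sketch, assuming the car can range against two transponders at known positions along the lane centerline (the function name and all numbers are invented for illustration):

```python
import math

def lane_position(spacing_m: float, r1_m: float, r2_m: float):
    """Position relative to two centerline transponders at (0, 0) and
    (spacing_m, 0): returns (along-track x, lateral offset |y|).
    Standard two-circle intersection; resolving the sign of y needs a
    third transponder or a heading estimate."""
    x = (r1_m ** 2 - r2_m ** 2 + spacing_m ** 2) / (2 * spacing_m)
    y = math.sqrt(max(r1_m ** 2 - x ** 2, 0.0))
    return x, y

# Car 3 m past the first transponder, drifted 1.5 m off the centerline;
# the two measured ranges are the distances to (0, 0) and (10, 0).
x, offset = lane_position(10.0, math.hypot(3.0, 1.5), math.hypot(7.0, 1.5))
```

Two range measurements per update are enough to recover both progress along the lane and lateral drift, which is all lane-keeping needs between the on-ramp and the exit.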

hedora

We live in an area with sort of challenging roads, and I strongly disagree.

There’s an increasing number of drivers who can barely drive on the freeways. When they hit our area they cannot even stay on their side of the road, slow down for blind curves (when they’re on the wrong side of the road!), maintain 50% of the normal speed of other drivers, etc. I won’t order Uber or Lyft anymore because I inevitably get one of these people as my driver (and then watch them struggle on straight stretches of freeway).

Imagine how much worse this will get when they start exclusively using lane keeping on easy roads. It’ll go from “oh my god I have to work the round wheel thingy and the foot levers at the same time!” to “I’ve never steered this car at speeds above 11”.

I’d much rather self-driving focused on driving safely on challenging roads so that these people don’t immediately flip their cars (not an exaggeration; this is a regular occurrence!) when the driver assistance disables itself on our residential street.

I don’t think addressing this use case is particularly hard (basically no pedestrians, there’s a double yellow line, the computer should be able to compute stopping distance and visibility distance around blind curves, typical speeds are 25mph, suicidal deer aren’t going to be the computer’s fault anyway), but there’s not much money in it. However, if you can’t drive our road, you certainly cannot handle unexpected stuff in the city.

IgorPartola

You describe it as challenging but it sounds like realistically it is just badly designed roads. But fixing that aside, nothing really stops you from outfitting secondary roads with transponders in principle. In practice, it is easier to start with freeways because (a) they are much more uniform, (b) the impact of an accident at freeway speeds is much more deadly, (c) there are no pedestrians, bicycles, etc., and (d) the federal government has control over the freeways (it is a complex relationship but ultimately the feds have a say), which means it can mandate installing the transponders and pay for it. Once the system functions there it can be expanded until every driveway and parking spot is outlined.

hedora

They’re badly designed (outskirts of Silicon Valley). So is the electricity and internet, so transponders placed in shady spots would need something like a 30-day battery backup and a network other than wired, cell, WISP, or Starlink.

Vision-based systems would be more than adequate. Lidar or (god forbid) ultrasonic chirps would easily lead to superhuman safety and speeds.

I’m skeptical of transponder or network based systems. What happens during a natural disaster? Do the 10% of cars that lack drivers or steering wheels just stop and block the evacuation routes? That’d kill a lot of people in very graphic / high profile ways.

heeton

We already have transponders on freeways. They’re technically passive reflectors, but they reflect a high proportion of incident EM waves, in the visible spectrum, and exist between lanes on every major road in the US. Also known as white paint.

gilbetron

Following roads and lane markers and signs and signals is the "easy" part of autonomous driving. You could do everything you say and it wouldn't result in something that is any better than the current state of the art. Dealing with others on the road is the main problem (weather comes in close second). Your solution solves nothing, I'm afraid.

randunel

How would you know which signals to trust and which to ignore?

uoaei

Physics prevents detected objects from jumping unrealistically. Current systems seem not to account for that at all, reacting to objects which appear and disappear spontaneously. Sensor fusion is exactly the solution to this: use a variety of sensors as input to reliably identify actual obstacles. To fake all the sensors at once you'd need to fake vision, lidar, and transponder locations simultaneously.
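One way to picture that physics constraint: reject any frame-to-frame track update whose implied speed is impossible. A toy version (the 70 m/s bound and the flat 2D coordinates are arbitrary assumptions):

```python
def physically_plausible(prev_xy, new_xy, dt_s, max_speed_mps=70.0):
    """A tracked object can't teleport: reject an update whose implied
    speed between consecutive frames exceeds a physical bound."""
    dx = new_xy[0] - prev_xy[0]
    dy = new_xy[1] - prev_xy[1]
    implied_speed = (dx * dx + dy * dy) ** 0.5 / dt_s
    return implied_speed <= max_speed_mps

# A car moving 1 m between 0.1 s frames (10 m/s) passes the gate...
normal = physically_plausible((0.0, 0.0), (1.0, 0.0), 0.1)
# ...an "object" that jumps 50 m in the same interval (500 m/s) is
# flagged as a sensor glitch rather than something to brake for.
glitch = physically_plausible((0.0, 0.0), (50.0, 0.0), 0.1)
```

Real trackers do this with motion models and gating inside a Kalman-style filter, but the principle is the same: an obstacle that appears from nowhere at an impossible displacement should never reach the braking logic.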

jijijijij

Blockchain.

Just kidding.

Wait, no! Please. No!

How do I delete this???

guluarte

I thought we would have almost AGI by now? https://x.com/elonmusk/status/1858747684972048695


Tesla changes meaning of 'Full Self-Driving', gives up on promise of autonomy - Hacker News