Tesla Vision Update: Replacing Ultrasonic Sensors with Tesla Vision
348 comments·October 4, 2022
With the removal of radar, they showed some pretty convincing data that the radar was too noisy to be useful, especially with discriminating things like highway overpasses vs. stopped cars. They showed that vision could already outperform radar.
With these parking sensors it's different. They have no current alternative and it will cause a pretty significant loss of functionality. Disappointing move.
> They showed that vision could already outperform radar.
important to note that much like there is a wide range from bad cameras (“filmed with a potato”) to high resolution cameras, there is a wide variety of radars with different capabilities.
the radar unit in Teslas was pretty limited (basically designed for adaptive cruise control), and they showed that vision could outperform that radar (and have no interest in exploring non-vision approaches because “humans can drive with just eyes”)
Yeah, (un)fortunately compared with cameras, eyes have really high resolution and (most relevant) a truly excellent dynamic range.
I'd love to see the convincing data RE: radar - they're using the same radars as every other car that has emergency braking from the car in front of the car that's in front of you. I've not heard of many "phantom braking" sessions from these other vehicles - I may have missed something.
Allow me to take you for a drive in my Volvo V90. I turned the auto-braking feature off entirely because it spits false positives like crazy. Parked cars, signs, birds, cars travelling through roundabouts across my nose, nothing at all - they all trigger it, and the car stomps the brakes extremely hard. The 2017 VW Golf it replaced was FAR better - maybe two false positives in 5 years. The Volvo? Daily.
Karpathy covered it during a conference keynote here: https://youtu.be/g6bOwQdCJrc?t=1368
It also is pretty common with other vehicles, they just don't get as much press as Tesla:
I got a ton of phantom braking just two weeks ago when I rented a 2020 Toyota Corolla with this feature. I've also gotten it in a 2019 Tesla Model 3, of course.
I drive a 2018 Hyundai IONIQ EV. The forward radar is hit-and-miss in heavy rain: the car will often barrel quite happily into stopped traffic when it’s raining hard, requiring a stomp on the brake pedal.
Just a couple weeks ago I was driving on the motorway around 4am, no other traffic for miles, and the forward collision alert sounded and the car slammed on the brakes for ~quarter of a second and then released as if nothing had happened and we merrily resumed our journey. Having the brakes slam on at a smidge over 100 km/h gave me a sore shoulder and quite the rush of adrenaline, quite the rush at 4am!
So yes. The radar does have a mind of its own.
Not braking per se, but on all cars I've driven recently (ie some Seat Alteca, our BMW F10 5-series and maybe 2 more a bit longer ago) there is sometimes collision warning in situation where no collision is about to happen.
Seat was pretty bad in this, had it rented for 2 weeks in Sicily few weeks ago and first few times all screens start flashing like crazy in normal situations is very distracting (since I wasn't sure what the heck is happening, but it looked sinister). BMW is pretty subtle with this, and only happened few times in past year.
I imagine if this warning system would be also connected to breaks, bad things would be happening.
They were using the radar for lane guidance. That's why they had to have a whitelist of locations where the radar was false triggered by the roadside environment. It never actually worked properly and they just papered it over with a hack to temporarily blind their input.
Radar has a longer wavelength than visible light. Thus making it better at some stuff and worse at others. It's a crime to not be using as much of the EM spectrum as possible for an application like this.
Some pretty convincing data that they handpicked to make themselves look good and pretend they don't need a radar, a lidar and that vision only is good enough.
Perpetuating a pattern of lies from Tesla, ever since they started on self driving.
You don’t need to take Tesla’s word for anything. Their latest vision stack is in the hands of ten thousand customers, some of whom regularly post their drives on YouTube.
More data shouldn't actually hurt, it can be use as bayesian prior on the camera data when the vision stuff is uncertain if nothing else.
But that’s the thing. Tesla wasn’t able to tell when the radar was “uncertain”. So when do you trust one data source more than the other? If it was so easy to label misinformation as such, the misinformation would probably not have been communicated in the first place. Tesla explained this at AI day 1.
> With these parking sensors it's different. They have no current alternative and it will cause a pretty significant loss of functionality. Disappointing move.
I'm especially worried by dark garages / parkings. Tesla cams have already enough problems when outside is too dark.
> They showed that vision could already outperform radar.
Except on emergency vehicles parked in a lane at an oblique angle, which Teslas did not recognize, and plowed into at speed. I wonder what unknown secondary effects this change will bring.
> Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.
I have observed that this is how vision is currently implemented, but does it have to be this way? I can pull up to a concrete wheel stop in my Toyota without vision and without the sonar enabled even though my eyesight is occluded by the hood because I know where it was and how far I have moved. Concrete wheel stops do not flicker out of existence when you stop looking at them, they should be able to monitor the wheel speed sensors and shift the 3D map of the world into your camera blind spots, perhaps showing a "hidden" wireframe on the cameras.
It would be inconvenient if you were unable to pull forward because a tumbleweed or (more likely) plastic bag rolled through, but you could back up and try again or the human could decide to ignore the beeping.
Granted, I'm not a domain expert, but we do this in my field of industrial robotics when building models with 3D vision. The computer can composite multiple profiles into a single higher-resolution image, can return data about that model when the camera on the EOAT has moved such that the field of view is limited, and can provide faults when the model does not match a previous image because something has been added or removed while the camera wasn't watching.
Possible, sure. But not trivial, and there is no pressing need to get rid of these sensors right now. Why the rush, if there is no supply chain issue?
The inverse is an issue though.
The car parks and turns off, with no obstacles in front.
While parked a child/dog/bucket of concrete materializes directly in the front blindspot.
Next time the car drives, it has no knowledge of this obstruction.
This isn't a huge problem for a car driven by a human, since you can and should check around the car before driving, or will have other context clues that there is a child or dog or whatever running around your car, but this seems incompatible with the stated goal of making all current cars into Robotaxis.
> The parking sensors on the Tesla are really good and very useful.
I respectfully disagree.
I think their primary use case is parking. Tesla does a really good job with automatic parking, and it IS useful especially for parallel parking.
But honestly, I have curbed my rims and hung up my underbody front spoiler on a curb and they did not help.
Really there is an opportunity for cameras to protect the car from nearby curbs and park in the same way.
But like you said, current telsa camera placement doesn't have enough coverage, especially in front.
At a minimum, I think they should add a front camera if they're going to remove the ultrasonic sensors.
and realistically, they should solve the ultrasonic problem of curbs with the cameras.
Tesla is probably the WORST at automatic parking. Here’s a comparison against some other competitors.
My own model Y has maybe successfully automatically parked 2/10 times. Most of my friends and coworkers have similarly low success rates.
Lol that's horrendous. I had a Mercedes with automatic parking assist back in 2016 that would do it perfectly 10/10 times, used it all the time too(narrow streets in the UK). What's shown in the video is just hilariously bad.
This is pure comedy, thanks for posting.
I'm surprised the they pulled out an older Tesla and that parked quite well after all the duds.
If yo are that bad in parallel parking, you don't pass your drivers test in Germany.
autopilot v1 does pretty well, and it does it entirely via the ultrasonic sensors afaik. I don't know how v2 and v3 work in practical comparison.
I don't use the automatic parking, but I find the ultrasonics quite useful in my own parking, maybe I've been more lucky or just more hyper careful, but I've never scratched the front or rims on a curb. The feedback about the shape of the obstruction and the distance it is away from you is great, much better than my 2014 era vehicle. I would like to see some lower sensors, maybe a single pixel lidar to look for curbs or something, but I think that is going the wrong way with component count for them.
To just say that these features that are pretty standard on any other car at this price point are "coming soon" is laughable with Telsa's delivery cadence. I'll wait for vision based parking, I'm sure it's coming right on the heels of full self driving in 2019 (not here,) smart summon in 2019 (delivered in what, 2021, and the only thing I've seen it do is nervously back out of a spot then try to merge into a surface street instead of pick me up,) The Tesla Semi, Tesla Cyber truck, and now a robot, which is hilarious because they couldn't even get the million dollar industrial robots to work on their line, can't say I've got a lot of faith in some 20k robot from Tesla.
No, they overpromise and basically just don't deliver, why anyone would take Tesla's word for this I don't know.
Having slightly more than average (researched the industry as an outsider) knowledge of industrial robotics. It turns out million dollar industrial robots are kind of like enterprise software. You "buy" an ERP/CRM/etc but it doesn't just work out of the box, theres weeks and months of work-hours needed to try and get it actually integrated into a company, and it still might not work since software has bugs, humans are fallible, and huge software with endless features sometimes forgets which ones still work properly in combination with others.
Industrial robots are kinda like this too. You can buy an arm based on physical specs like how large it is, how precise it is with certain weights of objects (because it may be less precise with larger objects), how quickly it can move around, how quickly it can get from precision point A to precision point B (because precision and speed are a tradeoff once weight is involved due to momentum), how much power it uses, is it hydraulically powered or electrically powered, available off the shelf end effectors provided by the manufacturer... etc.
Then you have to install it, as in mechanically bolt it to the ground (which might have issues with load cantilevering), mark safety areas humans cant be in when its operating (that could require redesigning the entire floor plan of a building depending on how much spare room there is around everything because people still have to get around to other important things when the robot is doing its job), then you have to program the robot (which can in some ways be simple xyz coordinate motion, but also you need to tie it into some form of process control software so it knows when to do its job, and other things around it know when to do theirs, and process control systems are software, and may not be compatible, or have bugs, etc), then you have to maintain it (spare parts, breakdown rates, warranty turnaround times, industrial equipment built to be used, is sometimes used to do things that will wear them out, but its at a known rate that was planned for, but is not lived up to, requiring replanning or additional costs)
Its a web of complexity that can lead to what seems simple "just buy a robot to lift this thing from here to there and then buy another robot to weld it to the car" ... into a project that takes piles of money, months of work-hours by multiple different disciplines from construction to programming, and anything going wrong dominos down the line making it take even longer and cost more...
I wouldn't hold failing to get industrial robots working against them. Feel free to continue holding all the other opinions though :) everyone is entitled to our opinions after all... i just had something relevant to say about the robots.
Automatic parking is part of a premium package, I don't have it. I do rely on the sensors often, particularly on the front ones.
This removal is mindboggling without adding a front camera.
> downright dishonest to characterize it as some kind of leap forward
Does anyone expect anything else from Tesla at this point?
Does anyone expect anything else from any corporation? Of course they are not gonna say "I'm sorry, we had to make your experience worse", of course they will try to spin it into something positive. Almost every for-profit company does this. Not to take away that it's shitty, but it's reality.
> The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this, and downright dishonest to characterize it as some kind of leap forward.
My 4 year old Ford Focus doesn't show an outline of obstacles around the car, but still provides Park Assist and Autopark. Especially Park Assist is something that everyone expects from every reasonably equipped car nowadays. So yeah, maybe understandable if it's due to supply difficulties, but still a bad move...
My 2021 Jetta GLI doesn't have these features. :(
It looks like they are transparently trying to raise their margins. I guess they are more supply-constrained than they thought. The "smart car of the future" from Tesla now has no sensors other than cameras, and not very many of those.
> It looks like they are transparently trying to raise their margins. I guess they are more supply-constrained than they thought. The "smart car of the future" from Tesla now has no sensors other than cameras, and not very many of those.
Don't worry, it won't be a problem as long as they can keep the marketing budget up.
Tesla's marketing budget is actually very small and even decreased last year.
> There are multiple tells that this move, just like the removal of radar, is supply-chain and cost-driven and not actually driven by engineering.
I like the idea of coalescing multiple sensors into one, but I can't shake off the idea that relying on vision alone when you can sense depth through US or LIDAR is a terrible idea. You can fake depth data from multiple cameras, but it takes more processing and additional input should be a good thing. Did anyone ever try to fool a Tesla using the Looney Tunes painted tunnel trick?
> Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.
Any sensible navigation software should "know" objects don't cease to exist. I hope Tesla's does it.
High-end as defined by what besides price? Not necessarily snark, I haven't heard too much good about fit and finish...
People who have never driven a car worth more than $50,000 love the luxury feel of their Teslas. It's been really interesting to see how many people had a Tesla as their first expensive car.
As a person who’s owned some luxury cars in their life, going from a BMW to a Model3 feels like a really big downgrade in so many ways.
I can give a list of reasons why I didn’t buy an M3P at the time but quality, lack of repair options and really shitty interior topped the list.
Not having carplay/android auto or satellite radio were minor but still annoying in a car that cost so much money.
There was a fun stat that model S cars when then first came out had bay far the largest proportion of first-time-in-this-price range buyers. I can't fathom that someone coming from his/her 10th s class would characterize Teslas as luxury.
This is the thing people dont realize, as someone who has owned a luxury car before, the Tesla Model 3 is not what i would call a "luxury car" despite them playing in the same price space as the BMW 3 Series.
It's still one of the best cars i've ever owned but the fit, finish and feel are not luxury. My 2013 Ford Focus felt more solid than the model 3 i own.
I guess they think their computer vision and AI can catch up to and exceed the visual processing capabilities of vertebrates with their millions of years of evolution. I think they are wrong. Multi-media sensor data fusion is the only sane path for the foreseeable future, in my lay opinion.
They are wrong. A truly unbelievable coincidence coming across this article, I just got rear-ended by a Model 3 this morning.
Low speed, I'd figure about 10 mph tops: A softball to end all softballs for AEB, and somehow I got hit with considerable force. We've had radar based systems that would have avoided this since the 2000s.
I work in the self-driving space and while it's not an apples to apples comparison because of cost, all I could think is how our sensor stack would have allowed recognizing the car I stopped for, and probably the car it stopped for, let alone my car stopping.
I've already harped on how stupid the "FSD Beta" is, but I quite literally the absurdity of it shoved in my face just this morning only to come and read this. Why is Tesla even being allowed to run this circus at this point?
Was the Tesla being driven in the FSD mode? or was this a simple human error?
>an catch up to and exceed the visual processing capabilities of vertebrates with their millions of years of evolution
I mean tons of technology outperforms humans by a mile... plenty of other computer vision systems already outperform people so not sure why Tesla would be exempt here due to 'evolution'.
We don't have any computer vision system that outperforms mammals, let alone humans, at general vision, Tesla or not - especially on real 3D vision. We do have computer systems that outperform humans at extremely specialized tasks, such as facial recognition in photos.
Because driving isn't some narrow task. It has a lot of variability in conditions. Human brain is very adaptable, yet we require 16+ years of training before they can take the wheel.
Nah I think it's a logistics and cost thing. Adding several extra sensors dramatically increases the complexity of logistics, building the car and the overall costs.
Because if the hardware isn't available they can't deliver the car and recognize the revenue. They can wait, but if they're gonna remove it anyway it lets them make more cars with less costs in shorter time. Seems logical if it is indeed parity.
Pardon me for being out of touch, but cars park themselves now?
That's pretty neat
Lexus did it in 2006 with the LS460.
I feel like Lexus is greatly underappreciated as a technology brand. They might be behind the curve on some things, but when they're ahead of the curve, the execution is always superb.
Another thing I missed the memo for, 360 degree birds eye view cameras. Game changer for parking.
Of course also missing from the Tesla due to the lack of a front bumper camera that would allow for it...
In some weird way having multiple cameras + touchscreen, smaller mirrors and beeping parking sensors made it all much more distracting.
I used to parallel park in seconds, now it's an ordeal.
Life is difficult without backup camera tho.
It's most likely a hardware availablity issue.
They have run out of sensors, and they don't want to pause production until they have produced more.
At the same time, all other manufacturers stopped production until parts were available again. Which is the reasonable thing to do. Only they got ridiculed for how bad the supply chains are, because Tesla continued shipping, that Tesla shipped what is normally considered half finished cars was blissfully ignored.
Seems like the "upstanding" thing to do would be to ship without (and credit for the omission) and offer free retrofit once available.
That would be an extreme logistical and costly nightmare.
Tesla doesn't operate the dealership model, which all tend to double up as a service station. So it's far more difficult for customers to get to the telsa service stations. Since there's less, that also means they'd be insane waits when the product comes back in.
My understanding is that Ford has done this for non-critical features as of late.
My 2018 VW GTI locked that functionality behind a several thousand dollar interior package or a few hundred dollar dealer software update :( . I literally have the sensors and software, it's just turned off because I don't like leather seats.
I chose an EV6 over a Model 3, and every day I'm given reasons as to why that was the right choice.
These all look like push factors and are serious enough on their own.
But I don't even need them.
I'm pulled by Hyundai. Their recent design language has me and I'd argue they have the better all around packages.
Personally, Telsa aren't even the top dog anymore. It'll take time for others to see it.
I agree that Hyundai is killing it, and I will seriously consider the ioniq 6 when it is available.
However, I don't think you can discount Tesla here. They produce ~ 5x more model 3's than Hyundai can produce ioniq 5's. And the Tesla battery / motor specs are much better.
I don't see any real competition for Tesla at their price point, specs, and production capabilities. Which is frustrating, because I want an EV but I will not buy a Tesla.
Which price point? Lots of non-Tesla EVs selling for a wide range of price points right now.
Totally with you on the Ioniq 6 — very attractive, and will have better service options should something go wrong. Would also consider Polestar 2. Maybe the Ford Mustang Mach-E. Maybe.
The right answer to the problem of "Tesla not having enough sensors" is NOT to rush into the hands of a brand who can't make an engine that doesn't grenade itself.
Oh, your engine made it past 50K? Good thing you're cars are gonna get stolen since they were too cheap to include immobilizers!
With respect, it's not a good look to live based on out-of-date facts.
Do you own Ioniq? I am looking at EVs now, and top of typical things like range, reliability, etc. that you need to worry about, I also don’t want a make that is moving into “heated-seat as a service” territory. So far I haven’t heard about such crap from Hyundai, but I don’t know anyone who owns newer model.
I've worked on 3d vision systems before and have experimented using different types of sensors in multi-sensor setups, and I'd argue that overwhelmingly it's a software problem.
More sensors help, but passed a point it's extremely marginal, plus you're throwing the majority of the raw data away anyway.
I do agree though - assuming processing power isn't a performance bottleneck here, more data is always better. But there's a reason humans only have two eyes and not three or four, for example. Two eyes are really all you need.
Completely off topic, but there's a fascinating list of animals with more than two eyes. For example the four eyed fish which uses one pair of eyes to see below water and one pair to see above water, when it's swimming along the surface.
> More sensors help, but passed a point it's extremely marginal, plus you're throwing the majority of the raw data away anyway.
Unless other sensors break, then they're suddenly useful redundancy. In a safety critical product like this, that alone should preclude the removal of sensors.
> But there's a reason humans only have two eyes and not three or four, for example. Two eyes are really all you need.
You still get two, both for depth sensing and for redundancy; and some animals like spiders have 6-8 eyes.
>But there's a reason humans only have two eyes
You also have ears, a body that allows to move your face, your eyes can move too, and the brain has a model of the world including a model on how people thing and how would they act.
So will your Tesla have some eyes that can turn and look in all directions? would look cool to have snail like eyes.
this seems like a major downgrade disguised as questionable upgrade
Elon said sensor fusion doesn't work on Twitter once.
Elon musk's fat ego > an entire discipline according to the lemmings of this world I guess...
Dude if you are going 85-90 miles an hour and you aren't leaving AT LEAST 2 cars in between you, please stay off the road. You should be keeping 3 seconds of distance between you and the car in front of you. At 90 miles per hour that's a hell of a lot more than 2 car lengths.
It's possible they didn't mean at the same time. Minimum 1 in all situations, including going 5mph in traffic, where 2 might be a bit conservative.
Yeah: driving with a 2-car gap in stop-and-go traffic is NOT normal.
op is posting about the people who cut him off. If I use adaptive cruise control at 2 or 3 (stopping distance) car lengths, then I'm just cut off every time, even though there is nowhere to go.
The end to this unsafe and traffic causing behavior is something I look forward to in more self driving cars. (Fewer drivers racing dangerously for "pride" points to end up at the same traffic light in 500')
> op is posting about the people who cut him off. If I use adaptive cruise control at 2 or 3 (stopping distance) car lengths, then I'm just cut off every time, even though there is nowhere to go.
Then your entire state, or at least, the residents of it that drive on the freeway need be in prison. Or, at least, to have their licenses revoked.
Just to clarify, when it says "1 car length", the car adjusts the amount of space depending on speed. So its not literally just one car length.
Agreed. "Parity" doesn't mean in all cases, it means in the majority of cases and this seems like a edge case (where 2 car lengths isn't good enough).
> because the gap is so large
I drive with a six car gap and even that was a close-call when someone had a contact about 3 cars ahead.
The good part is that if the car in front of you flashes the brake lights, that seems to trigger the braking. And some of the newer cars blink hazards automatically when braking hard.
My Subaru is also vision only (Eyesight), which also has the cameras too close together (in my opinion) to detect a car ahead braking, but it did not slow down quickly when facing a stationary car facing the wrong way around (my feet went down before the car did anything).
All of these need more parallax, at least better than a human to do this well. But they do react to visual cues like brake lights and hazards. Though sometimes, it sees a motorcycle nearby as a car in the distance simply due to the lack of parallax.
"more sensors good" is not always a good thing, when the sensors disagree with each other. Sometimes more sensors just mean more corner cases and braking at bridges throwing shadows on the road.
So just removing something feels wrong, when I don't see them improving the thing they're putting more eggs into.
I think this is where Tesla is not credible right now, when they say they're improving something in the car, but it is a thing I cannot see, check, confirm or test.
> "more sensors good" is not always a good thing, when the sensors disagree with each other.
When they disagree, one of them is wrong. Would you rather have just the wrong sensor or both the wrong one and the good one as inputs to decide what to do?
> Vision-only vehicles are still limited to a max of 85mph and 2 car minimum following distance for autosteer while radar is 90mph and 1 car following distance.
Is that common knowledge, what's the source of that I'm curious.
What's the point of using car lengths as a metric of follow distance.
It isn't really car-length, is it speed-adjusted to scale larger at high speeds and closer at low speeds.
But the display on the screen shows a car with a line and the numbers 1-7 as you adjust it so people shorthand refer to it as "car-lengths"
I thought I remembered being taught in driver's ed that you should leave a 1 car gap for every 10mph of speed. 1-2 cars at 85mph is clearly not enough distance to stop.
That is similar to what car length adaptive cruise control does. In heavy, fast traffic, you won’t get your 7-9 car length though, due to frequent cutting in and weaving around you.
> The two car following distance constantly leads to cars cutting in-between you and the car in front of you because the gap is so large.
stop caring about this.
its like a neon sign saying that you're an unsafe driver.
I drove a new M3 in Germany last week and was blown away by how well the assisted driving features worked. Below 60 kmh it also allows you to never touch the steering wheel while on the highway in a traffic jam.
For my purposes it seemed superior to the current Tesla system and after watching some comparison videos, my conclusion seemed accurate.
Phantom braking is much improved on the FSD branch even during highway driving.
Went from 2022.16 to beta 10.69 and it was night and day difference. Haven’t had a single phantom brake in weeks.
Is Tesla planning on replacing the autopilot branch with FSD? Because right now autopilot is trash and it’s basically a shareware/free trial of FSD and I am not willing to upgrade based on my current experience with AP.
Yup 2022.24 should have a lot of the autopilot changes from the FSD branch. I haven’t tried it personally but reports are that phantom braking is reduced there too.
i have not had any phantom breaking in months also.
This could just mean it's safer now.
You've experienced many false positives (braking going off for no reason), which is not a surprise, because they're lower cost than false negatives (braking not going off when needed), so if you work in Tesla AI you're going to accept lots of them in exchange for a tiny reduction in false negatives.
But maybe 1 in 1000 people will get to experience the true positive (braking going off when needed) that would have otherwise been a false negative, and their life was saved as a result of the updates, unbeknown to them.
> Since then we often have the car slam brakes out of the blue (once waking up my sleeping family and causing both kids to cry), and a few other times.
And you still drive this car today? In the same mode?
I assume these new Teslas will have a bumper camera. No way they’re planning to rely on memory.
In that case they would need to maintain software for both loadouts or install additional camera to the cars they shipped up until now.
They do though! They can easily disable features like Autopilot, Autopark, Summon, etc fleetwide in a snap if they wanted to.
Legally, I'm pretty sure they couldn't.
Technically, this is true for any product which has a pathway for software updates. Arguably this includes nearly all models of car on sale today, which can have their software updated — even if the process is more cumbersome.
The key difference is that a Tesla vehicle can conveniently perform comprehensive whole-of-car updates while it's sitting in your garage, whereas many other cars can only have some systems updated during a service visit.
Regardless, I don't see how software updates being convenient could serve as the defining criteria for a manufacturer having "ownership over the vehicles they sell".
> Legally, I'm pretty sure they couldn't.
Somewhat related question: do Teslas have a EULA or other agreement on purchase? And if so, does it include an arbitration clause?
They specifically call that out in the doc.
"At this time, we do not plan to remove the functionality of ultrasonic sensors in our existing fleet."
The reason why they decided to remove ultrasonics is that combining contradictory signals from cameras and ultrasonic is not straightforward. What should have the higher precedence? So they ultimately decided that cameras can do what ultrasonics can do, but even better.
They won't remove ultrasonics from the delivered cars, but they will probably disable them in coming months.
>The reason why they decided to remove ultrasonics is that combining contradictory signals from cameras and ultrasonic is not straightforward.
It's fairly well known that nearly any kind of sensor fusion "is not straightforward". If that's a fact Tesla only recently discovered -- causing a change in their lineup only now -- well, I'll just say it speaks to Tesla's confidence being greater than their technological prowess.
I don't think that's the underlying issue here myself. I think this change is likely caused by an effort to get away from certain vendors and ship cars more rapidly.
>What should have the higher precedence?
Again, this isn't a new problem, and there are volumes of resources on it. It's a tough problem, but not a new one.
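To make the "precedence" question concrete, one textbook approach is confidence-weighted fusion with health gating: drop sensors that report themselves unreliable (mud on a camera, etc.), then weight the rest by confidence. A toy sketch, with entirely invented numbers and names, not any real system's design:

```python
# Toy sketch of one standard answer to "which sensor wins": weight each
# sensor's distance estimate by its reported confidence, and drop sensors
# flagged as unhealthy. Illustrative only -- not how Tesla does it.

def fuse_distance(readings):
    """readings: list of (distance_m, confidence, healthy) tuples.
    Returns a confidence-weighted distance, or None if nothing is trustworthy."""
    usable = [(d, c) for d, c, healthy in readings if healthy and c > 0]
    if not usable:
        return None  # no trustworthy sensor: fall back to the driver
    total = sum(c for _, c in usable)
    return sum(d * c for d, c in usable) / total

# Camera and ultrasonic disagree; the ultrasonic is trusted more at close range.
print(fuse_distance([(0.8, 0.3, True), (0.5, 0.9, True)]))   # pulled toward 0.5
print(fuse_distance([(0.8, 0.3, False), (0.5, 0.9, True)]))  # camera dropped
```

The hard engineering is not this arithmetic but estimating the confidences and health flags honestly, which is where "not straightforward" actually bites.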
How could vision possibly replace the front bumper ultrasonics? Their primary function for me is showing proximity to objects at low speed, and many of those objects aren’t visible to the front camera because the hood is in the way.
I am with you on this; there is no way the vision system can get down to <1 in. accuracy.
When I pull into my garage sometimes it recognizes my shelf as a car ffs....
The decision about what sensor to use is more easily made when one of the sensors provides no data, or clearly disturbed data. Like in case of fog, mud on a camera, pouring rain, etc.
My phone often signals me that I have to clean the lenses of my camera, so I'm sure it's possible to detect unreliable sensors. Sure, you have to decide on all sorts of tipping points, and in the end that is probably expensive to develop and the reason the sensors are dropped.
If fog or rain is so intense that objects within a few feet of your car are obscured from vision, I'd suggest that maybe you shouldn't be operating the car at all, regardless of any assistance technologies it might possess.
My brother's Tesla signalled that the side cameras were faulty or needed cleaning when we drove down a country lane in the dark. This did not raise my confidence in the Autopilot system.
This sounds like a problem well suited for ML, which is basically the thing most of us expect Tesla to be best at. Just throw sensor data at it and let the situations in which certain values should get precedence manifest in the weights and biases.
I expect Waymo to be the best at it. Waymo has a bunch of sensors on their cars to provide accurate information, because they know that cameras aren't always reliable. Tesla may not be right here.
An autonomous vehicle is operating on a road network designed for humans with eyes. If we are aiming for the kind of “common sense” required for parity with a human driver it seems intuitive that the vehicle would have sensory inputs in common.
Cars do not use ultrasonic sensors for autonomy. They are for low-speed situations (such as parking) and are designed to give more information than the driver is otherwise capable of gathering.
While I generally agree that to get parity you would use common sensory input, I would hope the goal is to go beyond human level and navigate in, e.g., fog or heavy rain, things radar would help with. Ultrasonics probably don't buy them much at this point as long as the car has good object permanence. Think about parking in a garage: if you can map objects to spatial memory adequately, then maybe ultrasonics aren't necessary. But if an object is out of sight, out of mind, you probably need the ultrasonics to help avoid hitting it.
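The "object permanence" idea above amounts to dead reckoning: remember where an obstacle was last seen and shift it in the car's frame as the car moves, even after it drops out of view. A completely hypothetical sketch (class name and numbers are mine, not any shipping system's):

```python
# Toy illustration of spatial memory for parking: an obstacle seen once
# stays known as the car creeps forward and it leaves the camera's view.
# Entirely hypothetical; ignores odometry error, which is the hard part.

class ObstacleMemory:
    def __init__(self):
        self.obstacles = {}  # name -> (x_m, y_m) in the car's frame; +x is ahead

    def observe(self, name, x, y):
        self.obstacles[name] = (x, y)

    def ego_moved(self, dx, dy):
        # As the car moves by (dx, dy), remembered obstacles shift the
        # opposite way in the car's frame.
        self.obstacles = {n: (x - dx, y - dy)
                          for n, (x, y) in self.obstacles.items()}

    def nearest_ahead(self):
        ahead = [x for x, y in self.obstacles.values() if x > 0]
        return min(ahead) if ahead else None

mem = ObstacleMemory()
mem.observe("garage_wall", 2.0, 0.0)  # wall seen 2 m ahead
mem.ego_moved(1.5, 0.0)               # creep forward 1.5 m; wall drops below the hood
print(mem.nearest_ahead())            # 0.5: still known without a sensor return
```

In practice the whole scheme lives or dies on odometry accuracy and on objects that move while out of view, which is why out-of-sight obstacles are exactly where ultrasonics earn their keep.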
It does seem rather underpowered if you compare it with the requirements of state-of-the-art deep learning models in other areas, though that's not an apples-and-oranges comparison.
The output of the “occupancy network” isn’t displayed to the user right now. It doesn’t do classification, it’s just raw voxels.
The real bummer is that without ultrasonics the car can't see the ground in front of it. Hopefully Tesla adds a bumper camera.
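For anyone unfamiliar with what "raw voxels" means here: an occupancy grid just marks which fixed-size 3D cells around the car contain something, with no attempt to classify what that something is. A minimal sketch, with a made-up grid resolution:

```python
# Minimal sketch of a voxel occupancy grid: occupancy only, no object
# classes. The 0.25 m resolution is illustrative, not Tesla's.

VOXEL_SIZE = 0.25  # metres per voxel edge (assumed)

def to_voxel(point):
    """Map a 3D point (x, y, z) in metres to an integer voxel index."""
    return tuple(int(c // VOXEL_SIZE) for c in point)

def build_occupancy(points):
    """Set of occupied voxel indices from detected surface points."""
    return {to_voxel(p) for p in points}

# Two nearby surface points fall in the same voxel; a third is elsewhere.
grid = build_occupancy([(1.0, 0.1, 0.3), (1.1, 0.2, 0.3), (3.0, 0.0, 0.5)])
print(len(grid))  # 2 occupied voxels
```

The point is that "shelf" vs. "car" never enters the picture: the grid only says a cell is occupied, which is why it can represent gates and shelves it has no label for.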
> The output of the “occupancy network” isn’t displayed to the user right now.
I may be misunderstanding, or experiencing something else, but I think it is shown if you're on FSD. It shows up as grey (elevated) regions around the visuals, not the standard "not a road" grey. The gates for my community began showing up in this update, but they look like 2-3 ft tall triangles, not a gate... though still recognized, I guess?
As stated by Tesla just a few days ago, the occupancy network is not currently shown in the customer-facing visualisation. (Some unrecognised objects are represented by generic placeholders. This is derived from, but not the output of, the occupancy network.)
This video links to the relevant citation, and if you continue watching for around ten seconds you can see examples of the occupancy network visualised: