Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.
Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.
The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!
All these years, I always thought all self-driving cars used LiDAR or something to see in 3D / through fog. How was this allowed on the roads for so long?
Originally the Model S had front-facing radar and ultrasonic sensors all round, and the car combined that information to corroborate its visual interpretation.
According to reports years ago, the radar saved Teslas from multiple pileups when it detected crashes multiple cars ahead (that the driver couldn’t see).
Elmo, in his infinite ego, demanded both the radar and ultrasonics be removed: since he can drive without that input, the car should be able to as well… also it’s cheaper.
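Roughly, the fusion idea being described is something like the sketch below: a minimal Python illustration (hypothetical types and thresholds, not Tesla’s actual logic) of requiring a radar return to corroborate a camera detection before slamming on the brakes.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical detection types for illustration; real AEB stacks are far more involved.
@dataclass
class CameraDetection:
    distance_m: float   # range estimated from vision (monocular depth is noisy)
    confidence: float   # 0..1

@dataclass
class RadarReturn:
    distance_m: float          # measured range
    closing_speed_mps: float   # Doppler-measured closing speed

def should_emergency_brake(cam: CameraDetection,
                           radar: Optional[RadarReturn],
                           range_tolerance_m: float = 5.0) -> bool:
    """Corroborate a vision obstacle with radar before hard braking.

    If radar reports a return at roughly the same range, the vision
    detection is trusted. With no radar at all, the system has to rely
    on vision alone and accept more false positives or negatives.
    """
    if cam.confidence < 0.5:
        return False
    if radar is None:
        # Camera-only: no second opinion available.
        return cam.confidence > 0.9
    return abs(cam.distance_m - radar.distance_m) < range_tolerance_m
```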
Exactly, my previous car (BMW) once saved me in the fog by emergency braking for something I wasn’t able to see yet. My current car (Tesla) shuts down almost all safety features when the cameras can’t see anything, so I doubt it will help me in such situations. The only time my Tesla works well is in perfect conditions, but I don’t live in California.
If you were driving at a speed at which the low visibility would have gotten you into an accident due to some obstacle you weren’t able to see yet, you were driving too fast. Simple, isn’t it?
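The arithmetic behind “too fast for the visibility” is just stopping distance versus sight distance. A rough sketch with assumed reaction time and braking deceleration (the exact numbers vary by driver, car, and road):

```python
def stopping_distance_m(speed_mps: float,
                        reaction_time_s: float = 1.5,
                        decel_mps2: float = 7.0) -> float:
    """Distance covered while reacting plus distance to brake to a stop."""
    reaction = speed_mps * reaction_time_s
    braking = speed_mps ** 2 / (2 * decel_mps2)
    return reaction + braking

speed = 40 * 0.44704  # 40 mph in m/s (~17.9 m/s)
print(round(stopping_distance_m(speed)))  # ~50 m: if fog hides anything closer
                                          # than that, 40 mph is too fast
```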
While true, it’s still nice that super-human senses are looking out for the driver on their behalf. It’s also nice if super-human senses allow for earlier, more graceful braking rather than standing hard on the brakes because of a late warning.
Fog is one example, but sudden blinding glare is another situation that could be mitigated by things like radar and lidar. A human driver may unexpectedly be blinded while driving at an unsafe speed, with no way of knowing in advance that the glare was coming.
They do.
But “all self-driving cars” practically means just Waymo at this point.
Level 4 autonomy is the point at which a human is no longer required to be able to intercede at any moment, and thus no longer has to be actively paying attention and sober.
Tesla is not there yet.
On the other hand, this is an active attack against the technology.
Mirrors or any super-absorber (possibly Vantablack or similar) would fuck up LIDAR, which is a good reason for diversifying the sensors.
On the other hand I can understand Tesla going “Humans use visible light only, so in principle that has to be sufficient for a self-driving car as well”, because in principle I agree. In practice… well, while this seems much more like click-bait than an actual issue for a self-driving taxi, diversifying your input chain makes a lot of sense in my book. On the other hand, if it cost me 20k more down the road and cameras reached the same safety, I’d be a bit pissed.
The whole idea is they should be safer than us at driving. It only takes fog (or a painted wall) to conclude that won’t be achieved with cameras only.
You had a lot of hands in this paragraph. 😀
I’m exceptionally doubtful that the related costs were anywhere near this number, and it’s inconceivable to me that cameras only could ever be as safe as having a variety of inputs.
Musk’s ethos is clear, both in business and government. He will make whatever short term decisions his greed and the ketamine tell him to make, and fuck whatever happens down the road. Let’s not work so hard to sanewash him like the media has Trump.
Well, I do still think that cameras could reach “superhuman” levels of safety.
(Very dense) fog makes the cameras useless; a self-driving car would have to slow way down or shut itself off. If cameras are only one part of a variety of inputs, they drop out all the same, reducing the available information. How would you handle that then? If the car still has to drop out / slow down just as much, you gain nothing.

e: my original interpretation is obviously wrong, you get the additional information whenever the environment permits.

And for the painted wall: cameras should be able to detect that. It’s just that Tesla presumably hasn’t implemented defenses against active attacks yet.
Lidar cost has been dropping rapidly. Pretty sure several years ago (about when Tesla first started announcing it would be ready in a year or two) it was in the tens of thousands. But you’re right, more current estimates seem to be in the range of $500–2,000 per unit, and 0–4 units per car.
From what I remember there is distressingly little oversight for allowing self-driving cars on the road, as long as the company is willing to be on the hook for accidents.
Tesla uses cameras only; I think Waymo uses lidar.
Most non-Tesla brands that have some sort of self-driving functionality use lidar and/or radar. I’ve got a BMW iX and as far as I know it uses cameras, radar, lidar, and ultrasonic sensors.
It’s the only sensible approach. Not only is the notion that “humans use just their eyes too” completely wrong (otherwise how would we be able to tell that something is off with the car “with our butt”?), computers are also not even remotely close to our understanding and rapid interpretation of the world around us, or to cooperation beyond what’s pre-programmed, which is necessary to deal with unforeseen circumstances. Cars must offset this somehow, and the simplest way to do so is with vast sensor suites that give them as much information as possible. Of course many humans also utterly fail at cooperation and defensive driving, but that’s another problem.
I remember reading that Tesla only uses cameras for its self-driving. My 2018 Honda uses radar for the adaptive cruise, so the technology exists; Musk is just an idiot.
Does it? My 2023 model throws a shit fit if it’s cold and I assume the camera covers are iced over.
It probably has cameras as well, for lane guidance etc.
My Mazda complains if the windscreen is dirty for the same reason.
Radar doesn’t detect stopped objects at high speed. It’d hit the wall too on radar alone.
This has to be solved by vision and/or lidar.
Unless your car is traveling faster than the speed of light, radar will detect objects in front of it. But yeah, I was trying to imply that for a complex system like self-driving, Musk is a buffoon for relying on a single system instead of creating a more robust package of sensors.
They get filtered out and the car will not act on them, because there is so much noise from stationary objects all around you.
The radar in all cars is used to detect moving objects and the change in velocity of those objects.
Radar will not prevent running into this wall at 40 mph.
People can downvote me all they want, but that doesn’t change anything.
Only vision and/or lidar would stop for that wall at 40 mph.
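To illustrate what that filtering means in practice: a stationary wall closes on you at exactly your own speed, the same as every sign and guardrail, so a naive clutter filter throws it away. A toy sketch (illustrative only, not any vendor’s actual tracker):

```python
from dataclasses import dataclass

@dataclass
class RadarTarget:
    range_m: float
    closing_speed_mps: float  # Doppler-measured speed toward the car

def is_relevant(target: RadarTarget, ego_speed_mps: float,
                stationary_margin_mps: float = 1.0) -> bool:
    """Naive clutter filter: keep only targets that are moving over the ground.

    A target whose closing speed roughly equals the ego speed is moving at
    ~0 m/s over the ground: a sign, a guardrail... or a painted wall.
    Rejecting those removes clutter, but also removes the wall.
    """
    ground_speed = ego_speed_mps - target.closing_speed_mps
    return abs(ground_speed) > stationary_margin_mps

ego = 40 * 0.44704                                        # ~17.9 m/s (40 mph)
wall = RadarTarget(range_m=60.0, closing_speed_mps=ego)   # stationary obstacle
lead_car = RadarTarget(range_m=60.0, closing_speed_mps=5.0)
print(is_relevant(wall, ego))      # False -> filtered out as stationary clutter
print(is_relevant(lead_car, ego))  # True  -> tracked as a moving vehicle
```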
Money.
They generally do.