In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.
The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also hitting a mannequin of a child. The Tesla was also fooled by simulated rain and fog.
This isn’t unfair. Tesla is being compared to the other, better autopilot systems that use both LIDAR and radar in addition to visible-light and infrared cameras to sense the world around them.
Teslas only use visible-light and infrared cameras. Neither LIDAR nor radar would have been deceived by the wall.
So out of interest I looked it up.
The new BYD cars that are coming out also have self-driving, probably to compete directly with Tesla.
However, they use lidar, radar, cameras, infrared cameras, and ultrasonic sensors, and all of them have to be working or the car won’t go into self-drive. So other companies consider even one failed sensor enough to disable self-driving, yet Tesla claims it’s perfectly safe to drive around with those sensors not even installed, let alone functional.
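To make that contrast concrete, here is a minimal sketch of that kind of fail-closed gating logic. The sensor names and the "every sensor must be healthy" policy are assumptions for illustration, not BYD's actual software:

```python
# Hypothetical "every sensor must be healthy or self-drive is disabled" check.
# Sensor names and policy are illustrative assumptions, not any vendor's code.

REQUIRED_SENSORS = ["lidar", "radar", "camera", "infrared_camera", "ultrasonic"]

def self_drive_allowed(sensor_health: dict) -> bool:
    """Allow self-driving only if every required sensor reports healthy."""
    return all(sensor_health.get(name, False) for name in REQUIRED_SENSORS)

# Example: a single failed sensor (radar) disables self-driving entirely.
status = {"lidar": True, "radar": False, "camera": True,
          "infrared_camera": True, "ultrasonic": True}
print(self_drive_allowed(status))  # False -> driver must take over
```

The point of a fail-closed design like this is redundancy: one sensor dropping out is treated as losing the safety margin, not as something the remaining sensors can quietly paper over.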
So yeah that’s a real bad look.
I don’t see how this test demonstrates anything is better. It is a gimmick designed to exploit a specific sensor and get an intended result. Show me a realistic test if they want to be useful, or go full Looney Tunes if they want to be entertaining.
The cartoon wall is just an exaggerated illustration of why a vision-only system sucks compared to other systems.
If you watch the rest of the video, you’ll see that Tesla’s vision-only system is inferior to cars using radar/lidar in other, more realistic situations, like heavy rain and fog.