In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead, with the electric vehicle plowing right through it instead of stopping.
The footage was damning enough, with slow-motion clips showing the car crashing through not only the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.
No, I’m not trying to be generous. If it were just a dumb decision, it would totally have been his. This is like a recent-graduate-engineer vendetta kind of decision. I don’t think he’s actually smart enough to try to make this bad decision.
Back in like 2016, a huge part of the grift was that Tesla was a tech company and their system would get smarter forever as it got more data.
Using an expensive sensor that just detects objects directly, instead of relying on computer vision and machine learning, is kinda like an admission of failure.
that’s fair