In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also hitting a mannequin of a child. The Tesla was likewise fooled by simulated rain and fog.

  • Simulation6@sopuli.xyz · 1 day ago

    If the disengage-to-avoid-legal-consequences feature does exist, then you would think there would be some false-positive incidents where it turns off for no apparent reason. I found some with a search, which are attributed to bad software. Owners are discussing new patches fixing some problems and introducing new ones. None of the incidents caused an accident, so maybe the owners never hit the malicious code.

    • AA5B@lemmy.world · edited · 18 minutes ago

      The given reason is simply that it will return control to the driver if it can’t figure out what to do, and all the evidence is consistent with that. All self-driving cars have some variation of this. Still, yes, it’s suspicious when it disengages right when you need it most, and I don’t know of data showing whether this is a pattern or just an artifact of certain well-publicized cases.

      Even those false positives are entirely consistent with the AI being confused, especially since many of these scenarios get addressed by software updates. I’m not trying to deny it, just saying the evidence is not as clear as people here are claiming.
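
      As a rough illustration of that kind of handover logic (hypothetical names and thresholds, not Tesla’s actual code), a confidence-gated disengagement check might look like this:

      ```python
      # Sketch of a confidence-gated handover: hypothetical names and
      # thresholds, illustrating the pattern, not Tesla's implementation.
      from dataclasses import dataclass

      MIN_CONFIDENCE = 0.6  # below this, hand control back to the driver

      @dataclass
      class PlannerState:
          confidence: float  # 0.0-1.0, how sure the planner is of its plan

      def should_disengage(state: PlannerState) -> bool:
          """True when the system 'can't figure out what to do'.

          Genuine ambiguity (fog, a painted wall) and a bug that wrongly
          deflates confidence both trip the same check, which is why false
          positives get attributed to bad software.
          """
          return state.confidence < MIN_CONFIDENCE

      if __name__ == "__main__":
          print(should_disengage(PlannerState(confidence=0.4)))  # True
      ```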

    • FuglyDuck@lemmy.world · 15 hours ago

      If it randomly turns off for no apparent reason, people are going to be like ‘oh, that’s weird’ and leave it at that. Tesla certainly isn’t going to admit that their code is malicious like that, at least not until the FBI is digging through their memos to show it was. And maybe not even then.

      • AA5B@lemmy.world · 12 minutes ago

        When I tried it, the only unexpected disengagement was on the highway, but it just slowed and stayed in lane, giving me lots of time to take over.

        Thinking about it afterwards, possible reasons include the following (a rough sketch of that fallback behavior follows the list):

        • I had cars on both sides, blocking me in. Perhaps it decided that was risky or that they occluded its vision, or perhaps one moved toward me and there was no room to avoid it.
        • It was a little over a mile from my exit. Perhaps it decided it had no way to switch lanes while being blocked in.
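
        The slow-and-stay-in-lane behavior described above matches a common “minimal risk maneuver” fallback pattern. A minimal sketch, with hypothetical names and values rather than any vendor’s actual code:

        ```python
        # Sketch of a minimal-risk fallback: alert the driver, hold the
        # lane, and bleed off speed while waiting for a takeover.
        # Hypothetical names and values, not any vendor's actual code.
        import time

        COMFORT_DECEL_MPS2 = 1.5   # gentle braking rate, m/s^2
        TAKEOVER_GRACE_S = 8.0     # how long to wait for the driver
        TICK_S = 0.1               # control-loop period

        def fallback(speed_mps: float, driver_took_over) -> float:
            """Decelerate in lane until the driver takes over, the car
            stops, or the grace period expires."""
            print("chime: please take over")
            deadline = time.monotonic() + TAKEOVER_GRACE_S
            while time.monotonic() < deadline and speed_mps > 0.0:
                if driver_took_over():
                    break  # driver has control; stop intervening
                # hold_lane() would run here: steer to lane center only,
                # with no lane changes while confidence is low
                speed_mps = max(0.0, speed_mps - COMFORT_DECEL_MPS2 * TICK_S)
                time.sleep(TICK_S)
            return speed_mps

        if __name__ == "__main__":
            # demo: the driver responds immediately
            fallback(30.0, driver_took_over=lambda: True)
        ```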
    • Dultas@lemmy.world · 22 hours ago

      I think Mark (who made the OG video) speculated it might be the ultrasonic parking sensors detecting something and disengaging.