In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car crashing not only through the styrofoam wall but also into a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • PersnickityPenguin@lemm.ee · 14 days ago

    Yeah but that’s milliseconds. Ergo, the crash was already going to happen.

In any case, the problem with Tesla Autopilot is that it doesn’t have radar. Without radar it can’t reliably judge depth, and there have been many instances where a Tesla has crashed into a large, clearly visible object.

    • sudo@programming.dev · 14 days ago

      That’s what’s confusing me. Rober’s hypothesis is that without lidar the Tesla couldn’t detect the wall. But to claim that Autopilot shut itself off before impact means the Tesla detected the wall and decided impact was imminent, which undercuts his point.

      If you watch the in-car footage, Autopilot is on for all of three seconds, and by the time it’s on, impact is already unavoidable. That said, Teslas should have lidar and should probably do something other than disengage before hitting the wall, but I suspect their cameras were good enough to detect the wall through lack of parallax or something like that.
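
      The parallax idea above can be sketched with the standard pinhole relation Z = f·B/d: as a camera moves, a feature’s depth is focal length times camera travel divided by how far the feature shifts in the image. A real road would make nearer paint shift more than distant paint; on a flat painted wall, every feature shifts by the same amount. This is only a hedged illustration of the geometry the commenter is guessing at — the function name and all numbers are made up, and nothing here reflects Tesla’s actual vision pipeline.

      ```python
      # Motion-parallax depth sketch (illustrative values only).
      # Pinhole model: depth Z = focal_px * baseline_m / disparity_px.

      def depth_from_disparity(focal_px: float, baseline_m: float,
                               disparity_px: float) -> float:
          """Triangulated depth of one tracked feature between two frames."""
          return focal_px * baseline_m / disparity_px

      f_px = 1000.0   # assumed focal length in pixels
      b_m = 0.5       # assumed camera travel between the two frames, meters

      # Two features painted to *look* near and far on the mural.
      # On a genuinely flat wall they move by the same number of pixels,
      # so triangulation assigns them the same depth — a giveaway that
      # the "road" is actually a vertical plane.
      near_paint = depth_from_disparity(f_px, b_m, disparity_px=25.0)
      far_paint = depth_from_disparity(f_px, b_m, disparity_px=25.0)
      print(near_paint, far_paint)  # identical: uniform depth, flat surface
      ```

      Uniform disparity across the whole “road” is exactly the lack-of-parallax cue: the painted perspective fools the texture, but not the geometry.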