In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.
The footage was damning enough, with slow-motion clips showing the car crashing not only through the styrofoam wall but also through a mannequin of a child. The Tesla was also fooled by simulated rain and fog.
Not sure how that helps in evading liability.
Every Tesla driver would need superhuman reaction speed to respond within 17 frames, about 680 ms (I didn't check the recording framerate, but 25 fps is the slowest reasonable), i.e. less than a second.
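For what it's worth, the conversion above is just frames divided by framerate; a quick sketch (the 25 fps figure is an assumption, as noted):

```python
# Hypothetical helper to sanity-check the reaction-window arithmetic above.
def frames_to_ms(frames: int, fps: float) -> float:
    """Convert a frame count at a given framerate to milliseconds."""
    return frames / fps * 1000

# 17 frames at an assumed 25 fps:
print(frames_to_ms(17, 25))  # 680.0 ms, well under typical human reaction time margins
```

At a higher recording framerate (say 30 or 60 fps) the window would be even shorter, so 680 ms is the most generous estimate.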
It’s not likely to work, but the system handing control back to the human after it has determined a crash is imminent isn’t accidental.
Anything they can do to mire the proceedings, they will do. It’s like how corporations file junk motions just to wear plaintiffs down until they give up.
The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of dangerous decision-making followed by peacing out in the last 0.1.