The death of a pedestrian during a test drive of a driverless vehicle (even with a human backup driver behind the wheel) calls into question not just the technology, which apparently failed to detect the pedestrian crossing a busy roadway and so neither braked nor swerved, but also the notion that driving is nothing more than a set of instructions that can be carried out by a machine.
The backup driver, who looked down briefly at a computer just before impact and appeared surprised by it, evidently had confidence in the inventors of driverless cars.
Certainly, a real human driver might also have hit this pedestrian, who was crossing a busy street at night with her bicycle. But, of course, as a friend of mine pointed out, there is a big difference in the public mind between a human driver hitting and killing a pedestrian and a robot doing so. If the incident had involved a human driver in a regular car, it would probably have been reported only locally.
But the real story is "robot kills human." Even worse, it happened as a seemingly helpless human backup driver looked on. The optics are the absolute worst imaginable for the driverless car industry.