A jury might recoil at the extended nature of Vasquez’s distraction: she gazed downward 23 times in the three minutes preceding the crash. Video from an inward-facing camera could prove damning.

But those factors generate mixed feelings. Some experts say it’s inherently difficult for humans to monitor automated systems and that “automation complacency,” a phenomenon that invites inattention from people who are supposed to ensure machines work as intended, set Vasquez up for failure.

“I’m somewhat sympathetic to Ms. Vasquez because I have some understanding of how challenging this job is and, in some ways, how inevitable these kinds of cases are, whether it’s Tesla Autopilot or GM Super Cruise or a self-driving Uber,” Halfon said.

Whether because of automation complacency or other reasons, humans often bear the brunt of the blame when machines malfunction. A research paper by Madeleine Elish coined the phrase “moral crumple zone” to describe the way humans are assigned moral and legal responsibility for the failures of automated systems. Look no further than the two fatal Boeing 737 MAX aircraft crashes in 2018 and 2019 to see the inclination to blame the pilots.

“There were initial attempts to say those were pilot error and that the pilots failed to understand the way the system worked, or they failed to respond appropriately,” said Daniel Hinkle, senior state affairs counsel for the American Association of Justice. “‘Moral crumple zone’ is almost too abstract. When you are designing your system to ensure a human is there to absorb responsibility, it’s a human crumple zone.”

Important distinctions exist in the Tempe crash. It involved a self-driving vehicle, but only a test vehicle, one that required human oversight.

An investigation by the National Transportation Safety Board, the federal agency charged with probing notable crashes, concluded the probable cause of the crash was Vasquez’s failure to monitor the road ahead. But the board also cited Uber’s safety culture and automation complacency as contributors. How those conclusions are treated by a jury will be a key question.

“It’s persuasive in that this is the finding of a high-level government agency,” Gottehrer said. “But just because the NTSB determined a probable cause, that’s not the same as the burden of proof needed to establish liability or criminal or civil culpability. … If you are telling a jury someone is criminally negligent, the question is, ‘Did they fall below the reasonable-care standards we’d expect from a safety driver?’ And that’s a whole new concept right there.”

So is the concept of an automated driver. Though NHTSA issued an interpretation in 2015 that stipulated a self-driving system could be considered the driver of a vehicle, Arizona law considers a driver to be the person sitting behind the wheel.

Legal precedent suggests there can be co-operators of vehicles, stemming from unusual cases, such as one in which one person was pushing a stalled vehicle while another steered. But in this matter, a prosecutor’s office has already absolved Uber of criminal liability. (The company reached a civil settlement with Herzberg’s family, the details of which are undisclosed.)

That’s troublesome, Hinkle suggests, because Uber had disconnected two safety systems: the factory-installed City Safety auto-brake system supplied by Volvo and an internal fail-safe developed by Uber called Reflex. Further, the company had decreased the number of human safety drivers in its test vehicles from two to one, a move that eliminated another safeguard.

“They left her alone in the vehicle to monitor it all by herself,” Hinkle said. “It doesn’t absolve her of responsibility in any way, but it clearly implicates Uber in the crash.”
