By John R. Quain

When Self-Driving Fails, Who's Responsible?

Updated: Sep 18, 2020


The safety driver involved in the fatal 2018 Uber accident in Tempe, Arizona, was charged with negligent homicide this week. The news turns what had been a theoretical issue into a real, practical question: who is responsible when an autonomous vehicle makes a fatal error?


Rafaela Vasquez, 46, was behind the wheel monitoring the Uber autonomous vehicle at the time of the crash, which killed a pedestrian who was walking her bicycle across a multi-lane road at night. Ms. Vasquez pleaded not guilty and faces up to six years in prison, depending on how the charges are ultimately framed.


Some attorneys OntheRoadtoAutonomy has spoken with argue that an entirely new body of law is needed to cover such cases. Others say current laws are sufficient, essentially keeping responsibility in the hands of the driver at all times, autonomous driving features or not.


Shortly after the accident, Uber reached a settlement with the family of Elaine Herzberg, the victim of the crash. The terms of the agreement, including any payments, were not made public. Then last year, prosecutors in Arizona decided that Uber would not face criminal charges in the crash.


But the Uber accident raises some important issues. For example, why shouldn't the failure of an autonomous or advanced driver assistance system (ADAS) be treated the way we would treat a structural failure or, say, the poor design of an ignition switch? In the Uber case, there was clear technical incompetence. Even the most basic ADAS would have avoided the accident (or at least prevented Ms. Herzberg's death), let alone a system that was supposed to be using lidar. So why wasn't Uber held responsible for the failure?


It's an issue that critics have raised many times about Tesla and its over-hyped driver assistance program. Called “Autopilot,” Tesla's system is often touted as self-driving software even though it lacks the basic technology to make that possible. The software has been involved in a number of fatal accidents, but no criminal charges have been filed against the company, with authorities usually laying the blame on the driver using the Autopilot system. (RTFM, as if it were a matter for tech support.)


Similarly, the local Arizona authorities have treated the Uber accident as a simple case of distracted driving, but that's not accurate. Early in the investigation, police said the crash was “entirely avoidable” and that Ms. Vasquez was watching “The Voice” instead of watching the road. It's clear from the in-cabin video that she was distracted. But it's also true that Ms. Vasquez was operating a car that, while experimental, was supposedly able to handle driving tasks with a minimum level of safety. Indeed, this did not involve a so-called edge case: correctly perceiving and classifying a person pushing a bicycle is Autonomous Driving 101, a basic capability of most ADAS. So the crash was “entirely avoidable” from a technical point of view as well, with or without a driver. That fact would seem to put some responsibility in the lap of the manufacturer.


So far, the prevailing attitude among authorities has been to give manufacturers tremendous latitude and place the blame squarely on the driver. Judging from the Uber case, that attitude isn't going to change any time soon.

