“Moral Crumple Zones”: When your self-driving car crashes, you could still be the one who gets sued

Image from: International Telephone and Telegraph Corporation. May 1957. Advertisement. Broadcasting · Telecasting, 139.

“In a self-driving car, the control of the vehicle is shared between the driver and the car’s software. How the software behaves is in turn controlled — designed — by the software engineers. It’s no longer true to say that the driver is in full control… Nor does it feel right to say that the software designers are entirely in control.
“Yet as control becomes distributed across multiple actors, our social and legal conceptions of responsibility are still generally about an individual. If there’s a crash, we intuitively — and our laws, in practice — want someone to take the blame.
“The result of this ambiguity is that humans may emerge as ‘liability sponges’ or ‘moral crumple zones.’”

At Data & Society’s Intelligence and Autonomy forum in March 2015, “moral crumple zone” emerged as a useful shared term for the way the “human in the loop” is saddled with liability for the failure of an automated system.

In this essay in Quartz, Madeleine Clare Elish and Tim Hwang explore the problem that “moral crumple zone” names, with reference to cruise control, self-driving cars, and autopilot.